"Buy Me A Coffee"

  • 0 Posts
  • 25 Comments
Joined 1 year ago
Cake day: June 13th, 2023



  • Not sure if I entirely understand what you’re asking, but here’s my setup, which sounds similar-ish and might help.

    I’ve got essentially 3 machines

    1. Download machine - contains Sonarr/Radarr/NZBGet, etc. This machine isn’t very powerful, but it has A LOT of RAM.
    2. A NAS - this is where everything gets downloaded to. Primarily this machine just has a lot of HDD space.
    3. Jellyfin box – Decent RAM and a beefy CPU for transcoding.

    The download machine has a network share so it downloads directly to the NAS, into a dedicated /downloads/ folder. Once a download completes, Sonarr etc. move it to its correct media folder (rough example of the share mount below).

    Finally the Jellyfin machine is monitoring the media folders for changes.

    I assume you could set up something similar with Plex instead of Jellyfin, storing the fully downloaded files on a separate machine behind a network drive so Plex can see them. Essentially the NAS for you would be two machines: one (the seedbox) for the partial downloads and a local NAS for the fully downloaded files?

    Anyway, not sure if that’s what you’re looking for.
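    For reference, the share mount on the download box is nothing fancy. Something like this works (a minimal sketch, assuming a Samba/CIFS share exported by the NAS; the //nas/media path, mount point and credentials file are placeholders):

    ```bash
    # On the download machine: mount the NAS share that holds /downloads and the media folders.
    # //nas/media, /mnt/nas and the credentials file are placeholders -- adjust for your setup.
    sudo mkdir -p /mnt/nas
    sudo mount -t cifs //nas/media /mnt/nas \
        -o credentials=/etc/samba/nas-credentials,uid=1000,gid=1000

    # Sonarr/Radarr/NZBGet are then pointed at the mounted paths, e.g.
    #   downloads folder:  /mnt/nas/downloads
    #   media roots:       /mnt/nas/tv and /mnt/nas/movies
    # Keeping downloads and media on the same share means the post-download
    # "move" is a rename on the NAS instead of a copy back over the network.
    ```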


  • I’m also running Ubuntu on my main machine at home. (I have a Mac and do Android development for my day job.)

    But at home, I do a lot of website and backend dev.

    1. Code in VSCode
    2. Build using docker buildx
    3. Test using a local container on my machine
    4. Upload the tested code to a feature branch on Git (self-hosted server)
    5. Download that same feature branch on a Raspberry Pi for QA testing.
    6. Merge that same code to develop.
       6a. That kicks off a CI build that deploys a set of docker images to DockerHub.
    7. Merge that to main/master.
    8. That kicks off another CI build.
    9. SSH into my prod machine and run docker compose up -d (rough shell sketch below)
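    In shell terms the non-CI parts of that flow are roughly this (repo, image names and hosts are placeholders; steps 6a and 8 happen on the CI server, not locally):

    ```bash
    # 2-3. Build with buildx and smoke-test the image locally (names are placeholders)
    docker buildx build -t registry.example.com/myapp:feature-x .
    docker run --rm -p 8080:8080 registry.example.com/myapp:feature-x

    # 4. Upload the tested code to a feature branch on the self-hosted Git server
    git push origin feature-x

    # 5. Pull that same branch on the Raspberry Pi for QA
    ssh pi@raspberrypi 'cd ~/myapp && git fetch && git checkout feature-x && git pull'

    # 6-8. Merging to develop and then main/master kicks off the CI builds
    #      that push the docker images to DockerHub -- nothing to run by hand.

    # 9. Deploy on prod
    ssh me@prod.example.com 'cd /opt/myapp && docker compose up -d'
    ```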

  • Unless you have an account, there’s no easy way to get access to the content on the page. Once you have an account, there’s technically nothing stopping you from just saving the HTML file to your computer.

    Something else you can try though, assuming you don’t have an account, is to just turn off JavaScript. If the site lets you partially load the content and then asks you to create an account to read more, they usually just block the content by having JavaScript add an opaque overlay. With JavaScript disabled, nothing is there to add the overlay and you can keep reading. (Quick example of the same idea below.)
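    The same idea taken one step further: fetching the raw HTML without a browser never runs any JavaScript at all, so if the text is server-rendered it’s just sitting in the file. Hypothetical URL, obviously.

    ```bash
    # No browser, no JavaScript, no overlay -- if the article text is in the
    # served HTML it will be somewhere in article.html.
    curl -sL 'https://example.com/some-article' -o article.html
    ```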





  • marsara9@lemmy.world to Lemmy@lemmy.ml · “Lemmy content aggregator bot list” · 1 year ago

    Maybe. The 2nd idea I’ve got: if no one has replied after, say, 24 hrs, something like 75-80% of your posts are like that, and you have at least 100 such posts, you get added to the list? (Rough sketch of that check below.)

    Main concern I see about something like this is false positives and how someone real could end up getting blocked.

    I definitely want to think on this some more but it might have some legs.
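    Very rough sketch of that check, assuming Lemmy’s GET /api/v3/user endpoint (the field names and the page-size cap are from memory, so treat it as pseudocode more than anything):

    ```bash
    #!/usr/bin/env bash
    # Rough heuristic sketch: lots of posts, almost none of them ever answered.
    # Endpoint and field names are from memory -- verify against the Lemmy API.
    USER="$1"                       # e.g. somebot@lemmy.world
    INSTANCE="https://lemmy.world"  # placeholder instance

    # A real version would paginate (the API caps 'limit') and only count
    # posts that are at least 24h old as "unanswered".
    curl -s "$INSTANCE/api/v3/user?username=$USER&sort=New&limit=50" | jq '
      (.posts | length) as $total
      | ([ .posts[] | select(.counts.comments == 0) ] | length) as $unanswered
      | { total: $total,
          unanswered: $unanswered,
          looks_like_a_posting_bot:
            ($total >= 100 and $unanswered >= ($total * 0.75)) }
    '
    ```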


  • …I wonder if there’s a programmatic way to detect these bots? Some sort of analysis on their posting behavior?

    If they’re playing nice they’ll have the bot flag checked in their profile, so maybe build a list of any flagged bot that creates posts? Most of the “good” bots just reply to comments. Anyway, just thinking out loud, but I’m thinking I could easily add a public API to my search engine that just returns a list of “posting bots”… (quick example of the flag check below)
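    For the “playing nice” case the flag is already in the API, so the check is basically a one-liner (field name from memory, so double-check it):

    ```bash
    # Is this account self-flagged as a bot?
    curl -s 'https://lemmy.world/api/v3/user?username=somebot' |
      jq '.person_view.person.bot_account'
    ```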


    1. Yes most trackers have something on their website to let you know what your ratio is, what you’re downloading and how long you’ve been seeding those files.
    2. With the trackers I’m familiar with, yes – seeding for 9d 23h 59m 59s is the same as seeding for 0s. You’ll still get tagged with an HnR (Hit and Run).
    3. You can shut down as much as you like. But again, the trackers that I’m familiar with have a cap on the number of HnRs you can have on your account, so you might have action taken against you if you’re seeding 5 different torrents and decide to shut down.
    4. Don’t know.
    5. The rest don’t appear to be questions so not sure how to respond.





  • “Let’s say I just sent a request from my non-existent server with my user id…”

    Who or what is going to send this request if not some server that implements ActivityPub? This could be a Lemmy or Mastodon or Kbin instance… Or anything else that implements ActivityPub.

    “…and just every time I wanted to check whether I got replies I would query the other server (which a Lemmy server would do to get notifications about replies or upvotes)”

    ActivityPub works via pushes, so there’s nothing to query. There HAS to be some server for the other side to send that data to and to store it. (Rough example of what such a push looks like below.)
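    To make the “push” concrete: when someone replies to you, their server does roughly the equivalent of this against YOUR domain (real deliveries are also HTTP-signature signed, so a bare curl like this would be rejected; it’s only to show the shape):

    ```bash
    # The sender POSTs the activity to the recipient's inbox URL.
    # Something has to be listening at that URL to accept and store it --
    # that's the server you can't skip.
    curl -X POST 'https://your-domain.example/inbox' \
      -H 'Content-Type: application/activity+json' \
      -d '{
            "@context": "https://www.w3.org/ns/activitystreams",
            "type": "Create",
            "actor": "https://lemmy.world/u/someone",
            "object": {
              "type": "Note",
              "content": "the reply text",
              "inReplyTo": "https://your-domain.example/post/123"
            }
          }'
    ```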


  • So you can’t just send data from a domain. There has to be a service running behind that domain name to do something.

    Without a server, it’d be like asking "why do I need tires on my car?" Well, it’s not going to go anywhere without them.

    Now this could be a private instance with only you as the single user, and it could federate with the rest of the fediverse. But you still have to run some software to do that.

    Now in theory I guess someone could come up with a slim version of Lemmy that only has a single user and doesn’t let you post or comment directly on that instance, but again, something has to be running on a server behind that domain.



  • So here’s my current setup (each one is a separate docker container):

    Download machine: (has lots of RAM and HDD space)

    • Nginx (for reverse proxy)
    • Sonarr (TV)
    • Radarr (movies)
    • Prowlarr (organizing download sources)
    • qBittorrent (make sure to bind it to the WireGuard interface; rough example below)
    • WireGuard (VPN for qBittorrent)
    • NZBGet (Usenet)
    • SABnzbd (also Usenet; some providers work better with SABnzbd for whatever reason)
    • Portainer agent (for remote docker management)
    • Watchtower (for automatic updates)

    TV machine: (can transcode)

    • Nginx
    • Jellyfin (to transcode and actually watch the content)
    • Portainer agent (for remote docker management)
    • Watchtower (automatic updates)

    I’m not aware of a single container that has all of this bundled together though.
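    Since the “bind qBittorrent to the WireGuard interface” part is the bit that usually trips people up, here’s roughly how those two containers get wired together, shown with plain docker run and the linuxserver.io images (paths are placeholders; in practice I manage it all through compose/Portainer):

    ```bash
    # WireGuard container owns the VPN network namespace.
    # Your provider's wg0.conf goes in /opt/wireguard/config (placeholder path).
    docker run -d --name wireguard \
      --cap-add=NET_ADMIN \
      -v /opt/wireguard/config:/config \
      -p 8080:8080 \
      lscr.io/linuxserver/wireguard:latest

    # qBittorrent reuses WireGuard's network stack, so all torrent traffic goes
    # through the VPN. Its WebUI port is published on the wireguard container
    # above because this container has no network of its own.
    docker run -d --name qbittorrent \
      --network=container:wireguard \
      -e WEBUI_PORT=8080 \
      -v /opt/qbittorrent/config:/config \
      -v /mnt/nas/downloads:/downloads \
      lscr.io/linuxserver/qbittorrent:latest
    ```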