That’s why I pay for Plex Pass. That’s why I go through the trouble of configuring it.
The truth is that, if I wanted to, I could just set up a UPnP server and tell people to use BubbleUPnP and the like, and it would basically work. I know I could definitely use it to play stuff from my tablet or TV. But the problem is my parents. I’d like them to use my content as well, and they use it constantly. That’s the whole reason I am using Plex: it feels similar to playing stuff from other streaming sites.
I am not opposed to Plex investing dev time in weird features or in trying to monetize in additional ways. But these changes shouldn’t come at the expense of Plex’s main job. Making the main page so complicated to use, pushing users toward ‘Discover’, filling things up with ‘Live TV’ from obscure internet channels. It’s a lot of pollution. It’s worse when these changes can happen overnight. If my parents are used …
Every. Single. Time I update Plex I have to re-pin all of the libraries on all of my clients. It’s frustrating as shit. Why is this happening and how can I make it stop?
Dear airport security, it’s not a bomb I swear:
Everything in the safe case with the power cables punching out
Out of the case so you can see everything
My media is basically my psychological safety blanket/emotional support animal, so I want minimal downtime without access to it, even while on the move. This setup lets me take a large portion of my library with me while travelling, and it can run entirely off a normal USB power bank if I don’t happen to have easy access to a power point, e.g. riding on a bus/train/airplane/etc. or waiting in a terminal. If there is a power point handy, then I can just power this directly off a USB charger.
I wanted a self-contained server/Wi-Fi network so I’m not wrangling a rat’s nest of hard drives out in the open hanging off a laptop. All of this can stay in my bag, leaving me with just an unencumbered client device (phone/tablet/video glasses/etc.) in hand, able to seamlessly switch between whatever device may be convenient …
As per the title.
Tons of changes:
Release Notes - SABnzbd 3.6.0
=========================================================
## Changes since 3.5.3
- Significantly increased performance by using the yEnc-decoding
library of u/animetosho. Usenet articles are now decoded using
specialized CPU instructions (SIMD) on x86 and ARM systems.
- Create and restore a backup of configuration and database.
- Show source of lower download speed (CPU or disk).
- Added keyboard shortcuts (`P`ause, `A`dd, `S`tatus, `C`onfig).
- Result of the `Deobfuscate` step is listed in History details.
- `Path` of `Default` category will be used if category doesn’t have one.
- Disabling `api_warnings` prevents showing `Access Denied`
information to the external client.
- Jobs with `Force` priority will always skip the duplicate check.
- Added `ext_rename_ignore` to add custom extensions that should
be ignored during the `Deobfuscate` step.
- Removed Indexer Feedback Integration.
- Removed …
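As an aside for anyone curious what the yEnc decoding mentioned above actually does: each encoded byte is the original value plus 42 modulo 256, with “critical” characters escaped by an `=` and an extra offset of 64. A scalar Python sketch (nothing like the SIMD implementation, and the function name is mine):

```python
def yenc_decode(data: bytes) -> bytes:
    """Scalar yEnc decode: undo the +42 offset, handling '=' escapes.

    CR/LF bytes are line breaks in the encoded stream, not payload.
    """
    out = bytearray()
    escape = False
    for b in data:
        if b in (0x0D, 0x0A):  # skip CR/LF line breaks
            continue
        if escape:
            out.append((b - 64 - 42) & 0xFF)  # escaped byte: extra -64
            escape = False
        elif b == 0x3D:  # '=' marks an escaped critical character
            escape = True
        else:
            out.append((b - 42) & 0xFF)
    return bytes(out)
```

The SIMD versions get their speed from decoding many bytes per instruction instead of this one-byte-at-a-time loop.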
Funny post I came across on r/gaming- thought I would share 🙂:
https://www.reddit.com/r/gaming/comments/v5qbxj/according_to_the_manual_to_duke_nukem_3d/
Hi all,
We were “down” for 2-3 hours today because we were upgrading some hardware to AMD Zen 3 CPUs, adding more RAM, and switching our NVMe ZFS pool over to a striped mirror pool.
We’ve been back online for an hour or so now and all seems to be working fine. If you encounter any issues please drop me a PM.
Cheers.
I know there are a lot of “resellers” out there and I don’t know who to trust, so I don’t end up paying for the same service twice. Which indexer would best suit me? I use NewsHosting as my Usenet provider.
This is geared towards newbies, but I see people talking about incomplete downloads (takedowns) all the time, and the first thing they ask about is “acquiring more servers” or “which server is best for completions,” when they should really be asking “how do I find another copy of the same content on my current server?” or “how do I download a file before it gets removed?”
Questions about “acquiring more servers” or “which server is best for completions”
Chances are the content you are trying to download is available on your current server, or the copy you wanted was removed by the time you tried. You typically don’t need multiple servers or full backbone coverage to find what you want; you just have to spend a little time and effort searching for it, or set up your downloads to be automated. Now, you may need another indexer or two, or even a private NZB forum, to locate your NZB file, but if what you …
Several series have two-part episodes merged into a single file, so they show as incomplete when they aren’t. When I filter by series with missing episodes, they show up. Is there a way to mark them as complete so they don’t show?
I am having issues with a lot of my torrents stalling as they have no seeds, while there are lower quality copies that have tons. Is it possible to get Sonarr to download the most seeded one first, and then upgrade?
I believe this isn’t an isolated use case, but for a couple of reasons that I’ll list below, I am hoping that Sonarr will someday allow multiple versions of a file to exist. Is this something that is feasible, or is it inherently impossible to change how Sonarr works to allow this?
The two main reasons I want two simultaneous releases (for me specifically, a 1080p release and a 4K HDR release):
I understand …
Take the following custom format taken from trash guides for example:
I am confused about what Required and Negate mean exactly.
Am I right in saying that for Required, the release title must match the condition; in this case, it must contain the string ‘Remaster’ in the title?
And for negate, it must not match?
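To make my question concrete, here is how I currently picture the matching logic; this is an assumption about how the conditions get evaluated, not Sonarr’s actual code, and the helper names are mine:

```python
import re

def condition_matches(title: str, pattern: str, negate: bool = False) -> bool:
    """One custom-format condition: a case-insensitive regex test
    against the release title, inverted when Negate is set."""
    found = re.search(pattern, title, re.IGNORECASE) is not None
    return not found if negate else found

def format_applies(title: str, conditions: list) -> bool:
    """My assumption: a format applies when every Required condition
    passes and, if any non-required conditions exist, at least one
    of them passes too."""
    required = [c for c in conditions if c.get("required")]
    optional = [c for c in conditions if not c.get("required")]
    if not all(condition_matches(title, c["pattern"], c.get("negate", False))
               for c in required):
        return False
    if optional:
        return any(condition_matches(title, c["pattern"], c.get("negate", False))
                   for c in optional)
    return True

# e.g. a single Required condition on the word "Remaster":
conds = [{"pattern": r"remaster", "required": True}]
```

Under that reading, a Required + Negate condition would mean the title must NOT match the pattern for the format to apply. Is that right?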
Hi guys,
I have a seedbox from Seedhost.eu. I tried to create a ratio group in ruTorrent that deletes the torrent and data coming from Sonarr after 2 hours. This is the ratio group that I set up:
Min : 1000, Max : 3000, UL : 20000, Time : 2
When I do this, however, some of my files do get deleted like I want them to, but some of them stay, even though they all come from the same indexer and the same setup.
I also tried setting a seeding time limit in Sonarr directly and had the same result: some torrents get deleted while others don’t.
Does anyone have any advice on how to solve this?
Thank you in advance, and I hope you have a wonderful day!