Some might know, some may not, but Plex has hidden advanced server settings that are accessible here: https://support.plex.tv/articles/201105343-advanced-hidden-server-settings/
Which ones do you use and/or suggest? Just interested to see what others might be using.
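For reference, these hidden settings live as attributes in the server's Preferences.xml, so here's a rough sketch of how one can be flipped with a small script instead of hand-editing (stop the server first). The file path is the typical Linux location and the setting name is just a placeholder, so adjust both for your install:

```python
# Hedged sketch: toggle a hidden advanced setting by editing Preferences.xml.
# Stop Plex Media Server before running this; the path is the usual Linux
# location and "SomeHiddenSetting" is a placeholder, not a confirmed name.
import xml.etree.ElementTree as ET

PREFS = ("/var/lib/plexmediaserver/Library/Application Support/"
         "Plex Media Server/Preferences.xml")

tree = ET.parse(PREFS)
root = tree.getroot()                 # hidden settings are attributes on <Preferences>
root.set("SomeHiddenSetting", "1")    # placeholder name/value
tree.write(PREFS, encoding="utf-8", xml_declaration=True)
print("Updated", PREFS)
```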
Just got a good deal on a Beelink. For $170 I think it's going to be a pretty decent Plex server. I have been running Windows-based Plex servers for 10 years… Think I am going Linux/Ubuntu this time. Those of you who made that switch: are you glad you did?
Checked with a few buddies and they're all seeing the same thing after the most recent update to 1.40.0.7998-7000.
For context, this is running off a Synology NAS right under the TV, on the same network. I hadn't had this issue until about a week ago.
TIA!
Currently have NZBGeek. Looking to add either NZBPlanet or DrunkenSlug, or maybe both. What would you recommend? What do people end up having the most success with?
Edit: I got an invite to DrunkenSlug from one of you, thanks so much. A bunch of people DMed me, and a few people in the comments were looking for invites, if anyone wants to hook them up.
I wanted to try Eweka and see how well it works in comparison to Frugal, which I currently use. If I set both at 0, what would happen? Would that be a good gauge of which is better for my use case?
Does anyone know how it determines which one to pull from?
UsenetPrime Valentine's Day Specials
With all the services unfortunately increasing their prices, hopefully these deals will help everyone out a little…
* $14 - 1TB Prime Block + 1TB Bonus XSNews Block
* $14 - 2TB Non-Expiring Prime Block
* $14 - Monthly Unlimited Prime + 500GB Bonus XSNews Monthly (both reset monthly)
Happy Valentine's Day, everyone!
I have NZBGeek right now, but it misses some stuff. By "missing" I mean that on NZBGeek's website episodes 1-12 will be there, episode 13 is simply missing, then 14-28 will be there. All of season 6 is missing for another show, stuff like that. I'm looking for an indexer I can join relatively soon, within the next few days, that can hopefully pick up the slack on the few things that are missing. I saw people mention that su and finder are good, but I'm just looking for everyone's opinions on something that has open registration right now.
For the last week, Prowlarr has been saying NZB.su is down. When I test the indexer, it just spins until I get this result:
Unable to connect to indexer, indexer’s server is unavailable. Try again later. Http request timed out
NZB.su reports: > API Hits Today: 36
Also, VIP expires in 258 days
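As a quick sanity check outside Prowlarr, here's the kind of bare newznab caps request I've been testing with. The api subdomain/URL is my assumption about how NZB.su exposes its API, so treat it as a rough sketch rather than the official endpoint:

```python
# Rough sketch: see whether the indexer's newznab API answers at all,
# independent of Prowlarr. The base URL is an assumption; use your own API key.
import urllib.request
import urllib.parse

BASE = "https://api.nzb.su/api"      # assumed endpoint, adjust if yours differs
API_KEY = "YOUR_API_KEY"

url = BASE + "?" + urllib.parse.urlencode({"t": "caps", "apikey": API_KEY})
try:
    with urllib.request.urlopen(url, timeout=30) as resp:
        print("HTTP", resp.status)
        print(resp.read(500).decode("utf-8", errors="replace"))
except Exception as exc:             # timeouts / DNS / TLS errors all land here
    print("Request failed:", exc)
```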
I can see this is not a terribly new thing, but I have only just realised that some release groups are generating their own DoVI layers. They are scoring highest with my custom formats, which is a potential problem if they are trash.
Quote from their release notes:
DV/HDR10+ generating process: HEVC stream was extracted from the source MKV, converted to prores 422HQ (qscale=3), and imported into Davinci Resolve Studio 18.6. Scenes were detected in Resolve. Tuning analysis setting for DV algorithm was set to 1 (Most Highlight Detail/Most Mapping). Mastering Display setting was matched to the Base Layer (1000-nit, P3). Output blanking was set to match the 2.40 aspect ratio (T280/B280). DV (v4.0, P3-D65) and HDR10+ analyzed in Resolve and exported. HDR10+, then DV injected into base layer source and remuxed.
Would I be right to guess this is a red flag and stick with Hybrid DVs?
If so, are there any other release groups doing this that I should also avoid?
In Connect, under the Settings tab. Maybe it's obvious to some, but I don't understand why Sonarr needs to be linked to my Plex server. Can someone explain what the benefits are?
While doing a little cleanup in my Plex folders, I noticed Radarr has downloaded some files that are too big for my needs: 50+ GB for a 2.5-hour movie, when 10 GB would be plenty.
To rein it in I updated my quality settings.
Now: how do I have Radarr take care of those huge files and re-download a more sensible size?
Searching around for an answer did not yield much luck: I only found a Reddit post on the same topic:
https://www.reddit.com/r/radarr/comments/6g15ma/help_delete_big_file_sizes_and_redownload/
where it didn't seem to be possible.
The post is 7 years old; perhaps Radarr has gained the ability to do this since then?
(Meanwhile, I'm backing up those huge versions just in case the only solution is to delete the files I want re-downloaded, but it's going to take a long while as they amount to several terabytes.)
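In case there's no built-in way, here's roughly what I'd script against the Radarr v3 API as I understand it: list movies, flag any file over a size threshold, delete that file, and kick off a fresh search so my updated quality settings pick a smaller release. Host, API key, and threshold are placeholders, so please double-check before pointing it at a real library:

```python
# Hedged sketch (Radarr v3 API as I understand it): find movie files over a size
# threshold, delete them, and trigger a new search so the updated quality
# settings grab a smaller release. Host, API key, and threshold are placeholders.
import requests

RADARR = "http://localhost:7878"
HEADERS = {"X-Api-Key": "YOUR_API_KEY"}
MAX_BYTES = 10 * 1024**3          # ~10 GB cap for my needs

movies = requests.get(f"{RADARR}/api/v3/movie", headers=HEADERS).json()
too_big = [m for m in movies
           if m.get("movieFile") and m["movieFile"]["size"] > MAX_BYTES]

for m in too_big:
    print(f"{m['title']}: {m['movieFile']['size'] / 1024**3:.1f} GB")
    # Delete the oversized file...
    requests.delete(f"{RADARR}/api/v3/moviefile/{m['movieFile']['id']}",
                    headers=HEADERS)
    # ...then ask Radarr to search for a replacement.
    requests.post(f"{RADARR}/api/v3/command", headers=HEADERS,
                  json={"name": "MoviesSearch", "movieIds": [m["id"]]})
```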
I relocated a ton of movie files on disk from a root directory into alphabetically organized folder structures (ABC, DEF, GHI, JKL, etc.) to make it easier to locate a specific movie when digging around in the file structure. Radarr is monitoring the movies, of course, and now can't recognize that the movies are on the file system, so it keeps searching for them in NZBGet, resulting in a lot of duplicate downloads. I'd like to re-import the relocated movies into Radarr to clear all this up, but it indicates that the movies already exist / are monitored, so it won't let me. How can I force Radarr to scan all the movies over again and index them anew? It isn't feasible to manually edit the file paths in Radarr given that I relocated about 1100 movies. I'd ideally like to simply have Radarr scan the library and find the files in their new locations. Is that something easily done that I'm missing somewhere?
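If there turns out to be no one-click way in the UI, here's a rough sketch of what I'd try against the Radarr v3 API: repoint each movie's path at its new alphabetized location (with moveFiles disabled, since the files are already where they belong), then refresh so Radarr rescans the disk. The URL, API key, root folder, and bucket scheme are all assumptions about my layout, so I'd test on a couple of movies first:

```python
# Rough sketch (Radarr v3 API, untested against this exact layout): repoint each
# movie's path to the new ABC/DEF/... structure and refresh/rescan. Assumes the
# movie folder name itself didn't change, only the parent directory.
import os
import requests

RADARR = "http://localhost:7878"
HEADERS = {"X-Api-Key": "YOUR_API_KEY"}
ROOT = "/movies"                                 # placeholder root folder

def bucket(title: str) -> str:
    """Map a title to its ABC / DEF / GHI ... bucket (placeholder scheme)."""
    first = title.upper()[0]
    groups = ["ABC", "DEF", "GHI", "JKL", "MNO", "PQR", "STU", "VWX", "YZ"]
    return next((g for g in groups if first in g), "0-9")

for movie in requests.get(f"{RADARR}/api/v3/movie", headers=HEADERS).json():
    folder = os.path.basename(movie["path"].rstrip("/"))
    new_path = os.path.join(ROOT, bucket(movie["title"]), folder)
    if movie["path"] != new_path:
        movie["path"] = new_path
        # moveFiles=false: files were already moved on disk, just update the record
        requests.put(f"{RADARR}/api/v3/movie/{movie['id']}?moveFiles=false",
                     headers=HEADERS, json=movie)
        # Refresh & Scan so Radarr picks the file up at its new location
        requests.post(f"{RADARR}/api/v3/command", headers=HEADERS,
                      json={"name": "RefreshMovie", "movieIds": [movie["id"]]})
```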
Going to give my basic setup and what I’m trying to accomplish since there may be a better way to handle this: I have a seedbox with a 1TB drive, I have a local computer with a 60TB drive. Transmission runs on the seedbox but everything else is running on the local computer. The seedbox is mounted with rclone mount to the local computer.
Currently Sonarr searches, sends to Transmission, and when the download completes, copies the file into the local Plex folders. I periodically delete files off the seedbox as it nears capacity, but I tend to leave stuff as long as possible for seeding purposes, and I have a small set of files that I never delete to maintain ratio/BP/etc.
What I would like to do is to download files with the seedbox but use the local computer to permanently (although slower) seed the files Sonarr is grabbing.
The current method I'm thinking of is to periodically run rclone copy to copy files from the mount to a local folder; this way, when I …
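Here's the rough shape of that periodic copy step as I'm picturing it, just a sketch: wrap rclone copy in a small script that only grabs files old enough to be finished downloading, and run it from cron or a timer. I'd probably point it at the remote directly rather than at the mount path, and the remote name, paths, and 15-minute age cutoff are all assumptions for my setup:

```python
# Sketch of the periodic copy step: pull completed files from the seedbox remote
# into a local seeding folder. Remote name, paths, and --min-age are assumptions.
import subprocess

REMOTE = "seedbox:downloads/complete"   # placeholder rclone remote:path
LOCAL = "/mnt/storage/seeding"          # placeholder local seeding folder

cmd = [
    "rclone", "copy", REMOTE, LOCAL,
    "--min-age", "15m",     # skip files still being written/downloaded
    "--transfers", "4",
    "--progress",
]
result = subprocess.run(cmd)
print("rclone exited with", result.returncode)
```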