This is just an observation and others may have alternative solutions which I’d like to hear.
As the title says: I find the forum, in its current layout, tiresome to use for reporting problems or searching for information. There are too many sections, and duplicate threads for the same issue fragment the discussion and let official responses go unseen.
A system similar to GitHub’s issue tracker would be useful, where all support issues could be consolidated, tagged appropriately, and duplicate issues cross-referenced to the original thread. I presume Plex uses a similar system internally to track issues.
Version 1.33.0 has been released for Plex for Windows and Plex for Mac.
I can hear you from here: “Um, you skipped 1.32?” Yep! An issue found in 1.32 testing wasn’t resolved until the 1.33 timeline. At that point, looking forward to 1.33 was a fitting way to bid adieu to 1.32.
This release improves default video quality, provides selectable Video Quality in Settings, fixes many UI issues reported in 1.31, and preps us for modern automatic update control in the UI.
Like our Plex HTPC preview, this release automatically installs future updates at startup. If this is troublesome, you can disable it manually. An upcoming release will add proper UI controls for confirming, enabling, disabling, and checking for updates.
New
Fixed
Hi everyone,
I just wanted to let you know that the site is encountering some weird traffic issues, and I am trying to find out what is going on, as something is creating a lot of connections and killing php-fpm. This started a couple of hours ago. I have never seen a load of over 400 on the server, and that is what I have been seeing since the issues started.
I will keep Reddit updated as much as I can.
Edit 1: The site might be down for a couple of days, as I will migrate it to a new, better server. I was planning to do this anyway, but this issue helped me speed up the process.
Edit 2: Someone is abusing the API by sending the same requests every second. I wonder why neither Cloudflare nor the built-in protections detect it (a sketch of what such detection could look like follows these edits).
Edit 3: The site is more or less operational. I am starting the migration to the new server.
Edit 4: The site is up on the new server. If you encounter any issues, feel free to contact me.
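For anyone wondering what “built-in” detection might look like, below is a minimal sketch of a per-IP sliding-window rate limiter, the kind of check that would flag a client repeating the same request every second. The window size and request budget are arbitrary assumptions for illustration, not the site’s actual configuration.

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10.0  # assumed sliding window
MAX_REQUESTS = 20      # assumed per-IP budget within the window

# per-IP history of recent request timestamps
_history = defaultdict(deque)

def allow_request(client_ip: str) -> bool:
    """Return False once an IP exceeds MAX_REQUESTS inside the window."""
    now = time.monotonic()
    hits = _history[client_ip]
    # evict timestamps that have aged out of the window
    while hits and now - hits[0] > WINDOW_SECONDS:
        hits.popleft()
    if len(hits) >= MAX_REQUESTS:
        return False  # reject, e.g. answer with HTTP 429
    hits.append(now)
    return True
```

In practice you would enforce something like this in front of php-fpm (e.g. at the reverse proxy), so abusive connections are dropped before they consume a worker.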
Because it’s almost Father’s Day, all our block accounts are now 35% off for a limited time! Do yourself or your loved one a favour and pick up this special right now!
500GB BLOCK now just €9.09 / $9.74! Order here: https://www.vipernews.com/sign-up/?plan=44
1000GB BLOCK now just €15.59 / $16.24! Order here: https://www.vipernews.com/sign-up/?plan=47
2000GB BLOCK now just €27.94 / $29.24! Order here: https://www.vipernews.com/sign-up/?plan=50
For those of you looking to cash in on this Father’s Day special: you can order multiple blocks and they will all be accumulated under your account, so buying 3x 2000GB blocks gives you a whopping 6000GB block!
Hurry, this Father’s Day special won’t last very long!
The ViperNews team
I currently have Newshosting with 55 connections. Somehow I found a deal where it’s about $2.50 a month. It’s the only service I have, and while they say they have the best retention, there are definitely a lot of files missing too much information to be repaired. Do you use a backup service, and if so, are there any free ones that might be slower but are worth using in the event that what I’m looking for on Newshosting is damaged beyond repair?
Hey folks,
First off, WOW, how did it take me 20 years of torrenting to dip my toes into Usenet… I’m thrilled with the experience thus far.
With that said, I have read contradictory info all over the place regarding indexer preference. Almost everything I can google seems to be promotional or outdated.
I went with Geek after seeing it mentioned frequently throughout this sub, and without much blowback in the comments.
Geek has been fantastic so far! Several TV shows I was unable to find anywhere on torrents for years are just sitting there waiting on Usenet; however, there are missing episodes scattered throughout these newfound series.
I have read tons of highly contradictory information regarding indexer overlap versus the benefits of using multiple indexers. Is just Geek going to cover what’s out there for the most part? Should I expect to see any benefit from subscribing to another indexer? Or two? Which would you recommend to pair with Geek?
I’ve been using SABnzbd for many years. Just a few days ago, I thought I’d give NZBGet a shot, since there was an update 10 days ago and I heard that it’s theoretically less hardware-intensive than SABnzbd.
Anyway, NZBGet is cleanly installed (latest version) at all its default settings. Provider settings are identical to SABnzbd: 40 simultaneous connections, SSL enabled.
I get an average of 35–40MB/s download speed with SABnzbd (which is close to the maximum download throughput of my ISP). I get an average of 15–20MB/s download speed with NZBGet (roughly half the speed).
What settings do I need to change (from their defaults) on NZBGet to get the equivalent download speed of SABnzbd?
Just a wild guess… do I need to disable DirectWrite and increase the article cache to 1024MB in NZBGet? I didn’t want to start experimenting; I’d rather get a definitive answer.
If I do need to start making tweaks and changes to NZBGet to …
v3.0.6.1266 has been released to main today.
As always, go to System -> Updates or re-pull your Docker container to get updated.
Release Notes
Does this include .NET / .NET Core?
No, we’re still using Mono. .NET is tentatively planned for v4.
What branch should I use?
`main`
[See here for more details](https://wikijs.servarr.com/sonarr/settings#updates)
You really should not be on `develop` unless you plan to actively assist in troubleshooting potentially broken builds.
Common Problem Warning:
It’s 2021: Sonarr v3 now validates SSL certificates. This will likely …
I want to use Sonarr with Jellyfin, but it’s not listed on the “Add connections” page, and Google is being WAY worse than usual.
I’ve been fighting my new system install for four days now; no matter what I did, I just could not get any Sonarr container (linuxserver or hotio) to run with any kind of reliability on any OS on my ODROID-XU4. I tried different versions of Ubuntu and Armbian, all with the same result. I would get the weirdest crashes and errors, different every single time I tried to access the web UI. Exactly the same configuration on a Pi 4 would work without so much as a single hiccup. It turns out that every single time the mono process is handed over from a small core to a big core, or vice versa, everything breaks. This should not happen in this day and age, but it does, and it was a pain to pin down.
Workaround: force the mono process onto one set of cores (big cores are 4, 5, 6, 7 and small cores are 0, 1, 2, 3), or just disable a full set of cores, and the problem will go away.
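If disabling cores outright is too drastic, the same pinning can be done from userspace with `taskset -p -c 4-7 <pid>`, or from Python via the Linux-only `os.sched_setaffinity` call. A minimal sketch, assuming the process is simply named `mono` and the big cores are 4-7 as noted above:

```python
#!/usr/bin/env python3
# Pin every running mono process to the ODROID-XU4's big cores (4-7).
# The core numbers and process name are assumptions from the post above.
import os
import subprocess

BIG_CORES = {4, 5, 6, 7}  # the small cores would be {0, 1, 2, 3}

# look up the PIDs of all running mono processes
result = subprocess.run(["pgrep", "mono"], capture_output=True, text=True)
for pid in result.stdout.split():
    os.sched_setaffinity(int(pid), BIG_CORES)  # Linux-only affinity call
    print(f"pinned PID {pid} to cores {sorted(BIG_CORES)}")
```

If you run Sonarr in Docker, passing `--cpuset-cpus="4-7"` to `docker run` achieves the same effect without having to touch the process after startup.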
Search terms (a collection of the more popular errors I would get):
Got a SIGSEGV while executing native code. This usually indicates a fatal …
Hello folks! I recently got my Radarr/Sonarr/Jackett box up and running and I am loving the experience thus far. I am just wondering about best practices for managing multiple libraries of different quality for the same movies. I have quite a bit of storage and keep several libraries:
4K/1080p Remux - for my main essential collection
1080p Stream Quality - Bitrate capped at 15Mbps - Most everything from 4K/1080p Remux + some tertiary trash sequels to fill out sets
720p Stream - Bitrate capped at 5Mbps - Scaled down library of classics that I stream on the go frequently.
480p Stream - Bitrate capped at 1Mbps - A neglected library, but it holds copies of the essentials when I can find them; used only when traveling to places with atrocious or rural mobile-only bandwidth.
From everything I have read thus far it seems that running an instance of Radarr for each library is the (only?) way to go. I just wanted to make sure that I wasn’t missing some major feature that let …