So I originally posted this: https://www.reddit.com/r/PleX/comments/m8vrdh/building_the_ultimate_plex_server_guide/
**Windows 10 Guide**
which got a lot of upvotes, so I decided to go ahead and create
Part 1 of the guide: https://docs.google.com/document/d/1vl3uckEy2Pk-EiCVNAjr2DnmBmLdZ2To/edit#
Now I’d love to use Reddit as the official area for comments, debate, and feedback so I can correct the current article and continue to build out the other parts of the series. Let me know if you all have found this style of guide useful (although it’s very easy for Part 1, it should give you the gist of how the rest of the guides will go).
Coming Next:
Part 2: Optimizing and Extending Plex (https://www.reddit.com/r/PleX/comments/m9piq7/the_ultimate_plex_guide_part_2_optimizing_and/)
Hello World,
Plex’s auth servers were down earlier today, and as usual, we had the numerous posts complaining about it and demanding that Plex implement local authentication. In these posts, you’ll find tips and tricks for allowing local IPs without login, people bashing Plex and recommending alternatives, and fans defending Plex.
In my opinion, users should not expect Plex to ever implement local auth, simply because the moment they do, they will go out of business.
I believe that having control over the auth servers is the only thing that stops Plex’s software from getting cracked. If local auth were implemented, you’d see cracked versions with premium features on the market. And of course, Plex knows that a good portion of their users are not the type to pay for digital content.
A secondary point is collecting information and telemetry. Plex is attractive to video-on-demand platforms because it can offer a solid base of users with some (a lot?) of …
**Windows 10 Guide**
Part 2 of the guide: https://docs.google.com/document/d/1t7g_ruqcu8Xw7MnpdKU2tw8taXmMyT00lhzvqpQt5ug/edit?usp=sharing
Part 1 of the guide: https://www.reddit.com/r/PleX/comments/m9gt84/the_ultimate_plex_guide_part_1_starting_plex_with/
Now I’d love to use Reddit as the official area for comments, debate, and feedback so I can correct the current article and continue to build out the other parts of the series.
What’s Coming Next:
Part 3: Outside tools to automate media (The mods have said this section may not be suitable for this subreddit, so I will think about an alternative venue).
Part 4: The nuances of hosting with a Dynamic IP (This may be split into two parts)
I remember when Giganews was the king of Usenet providers. Giganews was the first provider to offer 90 days of retention when the other providers were still around 30 days; they had the best peering and mirrors, RAID storage for all their articles, and almost 100% completion of everything. In any forum discussing Usenet, Giganews fanboys would jump in to proclaim how great Giganews was. Today, Giganews is still overpriced, but the service has just gone down the tubes: speeds are awful, and even posts under 1,000 days old have completion issues. So what happened? The company just seemed to stop caring! I would like to hear others’ thoughts. The Usenet industry owes its survival, at least in the U.S., to Giganews: if the company hadn’t fought Perfect 10, or had lost that case, there’d be no binary Usenet anymore!
https://torrentfreak.com/usenet-provider-giganews-wins-landmark-copyright-battle-170124/
I moved off Supernews to Newsdemon but have now realized that gives me redundant coverage with my UExpress block, so I was looking for another main provider to go with. I know Eweka is the favorite, but I wasn’t sure how they do for US-based folks. I was also considering Frugal for their current sale. I have Ninja as well for their retention, and I’ve been fine with it given the deal I’m on (~$30/year).
Hi all,
Yummmyyyy!
The cheesesteak was born in the 1930s, created by Pat and Harry Oliveri, hot dog vendors who just so happened to put steak on the grill. In 1940, the brothers opened their own restaurant in South Philly, Pat’s King of Steaks. The cheesesteak kept evolving, and in the 1950s Kraft Foods developed Cheez Whiz, which became the number-one cheese choice for the cheesesteak sandwich!
In 1966, Pat’s King of Steaks gained a rival across the street when Geno’s Steaks opened its restaurant. Being pretty friendly rivals, they got together to start a National Cheesesteak Day in the USA, and so National Cheesesteak Day was born!
And as it happens, at StingyUsenet we love cheesesteaks! That’s why, from right now until Tuesday the 30th of March (14:00 CET), you’ll get a 30% discount!
Not a customer? Click below:
https://stingyusenet.com/en/order/?promocode=cheesesteak-day-2021
Already a customer? Click below: …
I find Usenet tools such as NZBGet or SABnzbd way better than anything available in the torrent world, so I was thinking of trying them as daily drivers for getting my Linux ISOs. I just finished firing up a Debian VM with NZBGet, got an account on altHUB, and grabbed trials on NewsHosting, Eweka, TweakNews, and Supernews. From my current understanding, having more servers should make it easier for the client to get the posts, and I believe those providers are all on different backbones (I’ve put my server config sketch after the questions below).
For new content, it just works great. The problem is when NZBs are 1,000 days old or more. Even though most providers claim 4,000 days of retention, it seems that if I’m looking for Season 1 of series X, it just won’t happen. With that in mind, I have a few questions:
- Is it possible that the NZBs on altHUB point to articles available on other Usenet servers? If that is the case, how should I pick my provider?
- Is having more than one indexer relevant? At least for TV Shows, NZBGeek seemed to have the …
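Edit: since a few people asked how I set up the multiple servers, here’s roughly what the news-server section of my nzbget.conf looks like. Hostnames, credentials, and connection counts are just placeholders from my trials; Level 0 is the main server, and NZBGet only falls back to higher levels for articles the main one is missing:

```
# Main (unlimited) provider, tried first for every article
Server1.Name=NewsHosting
Server1.Level=0
Server1.Host=news.newshosting.com
Server1.Port=563
Server1.Encryption=yes
Server1.Username=myuser
Server1.Password=mypass
Server1.Connections=20

# Backup on a different backbone, only used when Level 0 misses an article
Server2.Name=Eweka
Server2.Level=1
Server2.Host=news.eweka.nl
Server2.Port=563
Server2.Encryption=yes
Server2.Username=myuser
Server2.Password=mypass
Server2.Connections=10
```

The idea is that a backup on a different backbone only costs you bandwidth when the main provider has a takedown or retention gap, which is exactly the old-season problem I’m hitting.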
I have an NZBGeek account, and anytime I try to download something it gives me an “invalid API key” error. I checked, and it’s using the one listed on my profile.
Solved: the TBA episode title is preventing the transfer. The fix is only available in Sonarr v3. Under Settings -> Media Management -> Importing there is a setting labeled “Episode Title Required”; changing this from “Always” to “Never” or “Only for Bulk Season Releases” should fix the issue.
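If you’d rather flip the setting through the API (handy if you run several instances), something like this should work. It’s just a sketch against Sonarr v3’s /api/v3/config/mediamanagement endpoint; the field name and values below are what I see on my own instance, so double-check them against yours:

```python
import requests

SONARR = "http://localhost:8989"          # your Sonarr v3 base URL
HEADERS = {"X-Api-Key": "your-api-key"}   # Settings -> General -> API Key

# Fetch the current media management config
cfg = requests.get(f"{SONARR}/api/v3/config/mediamanagement", headers=HEADERS).json()

# Relax the episode-title requirement so TBA episodes still import
cfg["episodeTitleRequired"] = "never"  # or "bulkSeasonReleases"

# Push the updated config back under its own id
resp = requests.put(
    f"{SONARR}/api/v3/config/mediamanagement/{cfg['id']}",
    json=cfg,
    headers=HEADERS,
)
resp.raise_for_status()
print("episodeTitleRequired is now:", resp.json()["episodeTitleRequired"])
```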
The newest Marvel shows, “WandaVision” and “The Falcon and the Winter Soldier,” have been screwing up my Sonarr, and I’m wondering if anyone else has had the same problem.
For some reason, the episodes won’t transfer after being downloaded. The Activity tab shows them as downloaded/downloading with a yellow mark. The episode downloads, but then everything just stops. There’s nothing in the logs showing a failure to transfer or anything, just the
Report sent to qBittorrent. The.Falcon.and.the.Winter.Soldier.S01E02.1080p.DSNP.WEBRip.DDP5.1.Atmos.x264-NOGRP[rartv] (4:41am)
line when it downloads, and it never even tries to …
Hey guys, I was wondering how you handle Sonarr grabbing shows with the same or similar names to the one you’re trying to find. As an example, I added DuckTales (the old ’80s/’90s version), and for the most part it works, but I noticed that every few episodes one will actually be from the newer DuckTales show. There are a few other shows I have that are like this as well. Is there a way to make sure you’re getting the correct show for each episode without manually going through them?
I have Radarr running on my seedbox, with rclone moving files from the seedbox to my Pi at home running Plex. Radarr is set to mark movies unmonitored after the file is deleted; that had been working fine, but it seems to have stopped.
I have it updating now to see if that fixes it, but this seems to have started out of nowhere. Is there anything I should check after the update?
I have a drive used by my Nextcloud installation with all my series on it, but I can’t set any path on that drive as my root folder in Sonarr. I created the sonarr user under the www-data group, and while I can manually change every series to that path, Sonarr still can’t import anything there.
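For what it’s worth, here’s the quick permissions check I ran to see what the sonarr user can actually do on that drive (the path is just an example; substitute your real root folder):

```python
import grp, os, pwd, stat

path = "/mnt/nextcloud/data/series"  # example path from my setup

st = os.stat(path)
print("owner:", pwd.getpwuid(st.st_uid).pw_name)
print("group:", grp.getgrgid(st.st_gid).gr_name)  # expecting www-data here
print("mode: ", stat.filemode(st.st_mode))        # sonarr needs group rwx

# Sonarr also needs execute (traverse) permission on every parent directory,
# which is easy to miss on a Nextcloud data dir
parent = os.path.dirname(path)
while parent != "/":
    print(parent, stat.filemode(os.stat(parent).st_mode))
    parent = os.path.dirname(parent)
```

If the group isn’t www-data all the way down, or group execute is missing on a parent, that would explain why Sonarr can see the path but not import into it.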
I’ve downloaded some 4K movies but can’t play them through Plex, because transcoding them is apparently super CPU-heavy. Now I either want to remove the 4K versions and re-download everything in 1080p, or keep the 4K versions and download 1080p copies alongside them. I’ve already set the profile to 1080p, but it won’t re-download (probably because it already has a better version). Do I need to do this manually?