Categories
Computer Programming Web

Plex, Synology, Xfinity and Me

For Christmas last year I bought myself a Synology DS920+ NAS. Nice thing, expandable, and so far I’ve been quite happy with it! With the expense of drives and other issues, I have been slowly expanding its capacity from one drive to the maximum of four (without the expansion bay). Additionally, I’ve moved some of my TCP hosting applications to this, and it’s been working pretty well!

Now, the house where this thing lives is under the umbrella of Xfinity/Comcast. I have had many issues there, namely with connectivity: a flaky cable, an old Netgear N300 router/modem, and a few other things.

I also have been a Plex customer/lifetime Plex Pass holder for over 11 years. But only with the purchase of this NAS have I been able to finally get a server that actually works reliably up and running.

Over the past week, and even sporadically since I imported my media library in January, I’ve had weird connection issues, namely the remote kind, or the kind that only crops up when you use the Android app (which forces you to resolve through app.plex.tv). I did manage to get my custom domain’s certificate changed over (the one this very blog uses). Additionally, I managed to get remote access turned on with a bit of port forwarding, and it seemed to work okay for a while. However, if I stayed on Plex’s “Remote” settings tab, I would see the connection sporadically drop out and then resolve itself a few seconds or minutes later.

All in all, I haven’t noticed enough problems to really start to complain. I’ve been able to watch and listen to my stuff on my phone and laptop, and that’s all I want. New media I can upload via Samba from the laptop, or over the network using the extraordinarily slow Synology file manager via the web interface.

However, today was the clincher.

I have the entire collection of Harry Potter audiobooks, and in lieu of my normal podcasts during the commute to and from work, I’ve wanted to listen to these instead. Additionally, it’s nice to pull them up from anywhere and listen when I’m working on stuff. All of this on the Android.

Yet this week, PlexAmp, the audio-only client for Plex, was unable to access my server at all, even on the local network. Additionally, pulling up the audiobooks through the regular Plex app would work, but shortly after the screen blacked out due to lack of visual content, the audio would pause between tracks, forcing me to unlock the phone, after which play would resume almost immediately.

I decided to try and “fix” it today. I figured a good first step was to delete all authorized devices from my Plex account. Except, despite the warnings, I also deleted the Plex Media Server from my account. Mind you, the server itself was okay: still running, no problems. But it was no longer tied to my account, and I had no easy way of re-claiming it because, well, it’s a headless NAS.

I found a way to reset the login token in Preferences.xml and did so.
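For the curious, here’s a rough sketch of what that looked like over SSH. The exact path to Preferences.xml depends on which Synology Plex package you’re running, and the attribute name below is simply what I found in my own file, so check yours before copying anything:

# Find the config file first; the path differs between Plex package versions.
sudo find /volume1 -name Preferences.xml -path "*Plex Media Server*"

# Stop Plex in Package Center, back the file up, then strip the stale token
# (run these from whatever directory the find command turned up).
sudo cp Preferences.xml Preferences.xml.bak
sudo sed -i 's/ PlexOnlineToken="[^"]*"//' Preferences.xml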

Then I couldn’t even access it via the custom domain connection. UGH..

So I found out how to simulate a connection to the media server as localhost via PuTTY’s SSH tunnels, since an unclaimed Plex Media Server still lets you access it from localhost. From there, I was able to get to all of my media and, more importantly, the server settings. I was asked to log in, did so, and then received a big orange “Your server is unclaimed. Remote access is not available until you claim the media server.” I took this to mean that I was essentially still locked out, even though I was, in my browser, still logged into Plex.
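PuTTY sets this up through its Connection > SSH > Tunnels page, but for reference, the equivalent plain OpenSSH command looks something like the following (swap in your own NAS user and address; 32400 is Plex’s default port):

# Forward local port 32400 to the NAS's own localhost:32400 over SSH.
ssh -L 32400:localhost:32400 admin@nas.example.lan

# Then browse to http://localhost:32400/web on this machine and the server
# treats the connection as local.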

I clicked the claim button, and it just sat there for a few minutes, then timed out. At one point, I was able to see a barely distinguishable “Unable to claim server” error, which quickly disappeared.

Over and over I tried this, tweaking settings, removing my custom domain certificate, replacing it, completely uninstalling Plex from the NAS, reinstalling an old version, all to no avail.

All online resources said that there were essentially three things that could be wrong:

  • Bad/expired user token
  • Improper port forwarding
  • DNS Rebind Protection on the router or ISP

Well, the first two were easily tested and already verified. Plain old removing the authentication strings from the Preferences.xml file was what caused this issue in the first place, and I knew that port forwarding was working fine on my router, so it had to be something with DNS.

Besides, whenever I logged into localhost:32400 (via the tunneled SSH remotely), I never saw my profile pictures or anything show up, and all server logs seemed to indicate that during the “claim” process, Plex itself was unable to contact the authentication servers.

I know they were having upstream problems earlier this morning, but https://status.plex.tv indicated that this was working just fine when I checked it. Must be something on my end.

Unfortunately, this “DNS Rebind Protection” is not a thing in my N300 router. There is no place I can enter a dnsmasq-style rebind-domain-ok=/plex.direct/ exception. I have no internally running DNS server; I do everything through Cloudflare and my registrar, and Cloudflare doesn’t offer Rebind Protection as far as I know. At least, not for free.
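If you want to check whether rebind protection is actually what’s eating your connection, one rough test is to resolve a hostname that should come back as a private address against both the router and a public resolver, then compare. The hostname below is a made-up placeholder; your server’s real plex.direct name shows up in Plex’s own connection URLs and logs:

# Ask the router's DNS first (substitute your router's LAN address)...
nslookup 192-168-1-50.abc123.plex.direct 192.168.0.1

# ...then ask Cloudflare directly.
nslookup 192-168-1-50.abc123.plex.direct 1.1.1.1

# If the router comes back empty or NXDOMAIN while Cloudflare returns the
# private 192.168.x.x address, something in the local DNS chain is applying
# rebind protection.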

My router does have a standard “manually configure DNS” settings section, though, and months ago I had configured it to 1.1.1.1/1.0.0.1 (Cloudflare).

Yet this was not working, and doing the same “Claim” process over and over with endless tweaks to Plex and the settings file was getting frustrating.

That’s when I decided to step out just one level and look into the Synology network settings and found… Network > General > Manually Configure DNS Server.

What could it hurt? I’d been seeing warnings and errors all over Plex for the past two hours, and if, for some reason, the NAS was contacting Comcast’s DNS directly, even though it’s SUPPOSED to resolve through the router and on to Cloudflare, that could still be the issue.

So I switched it to Manual, entered the Cloudflare nameservers there as well, clicked Save, and switched back over to localhost:32400. Back to the General settings tab and, lo and behold… it was suddenly Just Working!
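If you want to confirm the change actually took hold on the NAS itself, a quick look over SSH is enough. These paths are the usual Linux ones; DSM may shuffle things slightly between versions:

# The resolver file should now list the Cloudflare servers.
cat /etc/resolv.conf

# And name resolution from the NAS itself should succeed without complaint.
nslookup plex.tv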

I must also note that I’d spent the past three days adding old DVD rips to my library, none of which had automatically grabbed correct metadata. They couldn’t be manually matched with an online title either; every search returned an immediate “nothing found.” All my old media was okay, but none of the new stuff had updated itself the way I saw it doing during the first few weeks of running this thing.

Now with the NAS DNS settings properly pointing to Cloudflare, I had no issues matching the new titles.

So if your router doesn’t have access to this DNS Rebind Protection business, but you’re still seeing the “Unable to claim” error and you’ve tried everything, try manually setting the DNS configuration on the actual server machine instead of just hoping it will use the servers in your modem or router like it’s supposed to.

FORCE your stuff to contact Cloudflare or Google or something that won’t screw with you.

Categories
Web Youtube

The Reddit Thing

Yesterday I got a notification from Youtube stating that COPPA/Made for Kids rules were being enforced come January. I’ve largely ignored these messages, as my channel is not that big, not very expansive, and not terribly interesting. However, when these settings apply to everyone, and I am a member of everyone, well, maybe it’d be worth paying attention to.

And boy, was it worth it! The Made for Kids deal coming through Youtube does a number of things to videos, which I imagine more money-minded people will care about:

  1. Only child-friendly ads/no ads
  2. No comments
  3. No watch later

After reading a few articles and watching a few videos on what counts as “Made for Kids,” I realized a lot of my stuff actually does count, even if I didn’t actually make it for kids. What comes to mind are my old LEGO animations, Minecraft sessions, and a short Disney Christmas clip that was uploaded as a ‘check out this 30 second clip encouraging duck cannibalism’. My account isn’t going to drag the kiddies in by the bucketload, but there are some things. Therefore, I’ve taken Youtube’s instruction and marked my channel as “NOT Made for Kids,” but I’ll mark the few videos that are piecemeal, and deal with it.

Well, the ad-nerfing I don’t care about, as I don’t monetize my channel at all. But commenting? That’s the biggest deal to me. Since I don’t care as much about view counts, comments are really the biggest dopamine rush that Youtube provides for me on my stuff. Namely, on that Donald Duck Christmas clip.

So what did I decide to try to do? I started a subreddit at reddit.com/r/ppsstudios!

Besides, I like to reserve my “brand” of PPSStudios anywhere I can. So why not squat on that page, if nothing else?

Right now, it’s marked as completely private, so nothing should be visible (even though 8 people were lurking around last I checked), but eventually my grand plan is to drop in a link and direct video viewers to that page via an end screen and video description on every Youtube video.

Or perhaps for my Dungeon World campaign on another Youtube channel, there can be some conversations there. I don’t know.

Is there a Reddit replacement for the WordPress Comments system? Sorta like Disqus?

I’ll have to think on this.

For now, though, feel free to check in sometime when I’ve got a little time to set it up!!

https://reddit.com/r/ppsstudios

Categories
Games Life Web Writing

Diplomacy

I just finished listening to No Dumb Questions Episode 53 – What Would Happen Every Time You Restarted Earth? I have to say, the discussion definitely got me onto two things, which I briefly mentioned in my comment on Reddit (I don’t know if my thoughts will take off at this point, but I wanted to put them out there).

Categories
Graphics Photoshop Web Writing

Christmas Again

So I guess I could say Merry Christmas. Then again, I WANT to say, “Hey it’s that time of year where I break out my Christmas theme for a blog!” I haven’t done this in a long while, and given that I’ve moved over to WordPress, I wonder if it’s possible.

Stay tuned for theme-like edits in the next day or two. As soon as I find time to break out my old copy of Photoshop and a few stupid photos of myself which I may already have somewhere on this server, I’ll drop in a new banner image of some sort!

Categories
Programming Web

Super Duper Status Update

It works! I was a bit annoyed that the WordPress Atom feed was XML only. However, I do consider myself pretty good at googling things, and so I found the PHP extension called “SimpleXML”, which solved a LOT of stuff for me.

I used to display the first five Blogger titles on the homepage in a simple list with links.

Categories
Web Writing

Blogger Migration

So you may be wondering: what’s up with all these blog redesigns, Daniel?

Well, I got sick of Blogger. A whole lot. And about a month ago, I decided to do something about it.

Categories
Computer Programming Web

Manual LetsEncrypt for cPanel

Jump to Renewal Instructions

Additionally, EFF has deprecated the use of aptitude/yum/dnf and other package managers for deploying Certbot on Debian-based systems. Instead, they recommend using Snap.
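For anyone setting Certbot up fresh that way, the Snap route looks roughly like this; it’s from memory of EFF’s instructions, so check their current docs before copying blindly:

# Install certbot via snap and make it available on the PATH.
sudo snap install --classic certbot
sudo ln -s /snap/bin/certbot /usr/bin/certbot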


At work I recently collaborated with our hosting provider to move our company website onto a cPanel-based host. Up until now, there has been no way of running our site over SSL/TLS, which has been quite frustrating after having discovered LetsEncrypt and its ease of use. Basically, with this certificate signer, I have no reason to figure out the handshaking and signing process by hand the way the old command-line SSL tooling required.

Well, our hosting provider’s version of cPanel has not really been extended to allow for LetsEncrypt, even though multiple people on the cPanel forums say there’s a plugin available. It seems they don’t mind forcing me to pay yet another fee on top of everything to get an annual signature from the two default signers they have enabled in the system.

This made me wonder. Certbot, which generates the certificates and private keys and runs the signing requests automatically, has always advertised a “certonly” option, and on their website I see instructions for a “manual” option as well. That sounded like exactly what I was looking for, since my scenario is this: I have a website on a host that does not have LetsEncrypt enabled, but that does allow me to upload certificates and keys from an offline source.

Here is my process of installing a LetsEncrypt SSL/TLS DV certificate on a cPanel site not equipped to generate one automatically.

Create a new certificate with any subdomains we’d need using certbot certonly -d c-pwr.com,www.c-pwr.com --manual

Certbot warns you that the IP of the computer you’re generating the certificate on will be shared with them, even though it’s not the server on which the cert will ultimately be installed. Type Y.

Without any “challenges” option in the original command, certbot assumes you’re using the HTTP ACME challenge, which involves uploading a text file to your site. Using cPanel’s file manager, I simply do this.

Once the first file in acme-challenges is created, certbot asks us to create another file in the same place with a different string as its contents.

Once both files are created and saved to this location, we probably should verify that the URLs certbot is pointing to are actually visible from the public web.

Knowing that I can access the challenge files from my browser, I assume certbot will also be able to access them, presumably from a curl command or something, so I let it continue.
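If you’d rather check with curl than a browser, it’s just a plain HTTP fetch of the paths certbot printed. The token filename below is a made-up placeholder; use the ones certbot actually gives you:

# Each of these should print the challenge string, not a 404 or an error page.
curl http://c-pwr.com/.well-known/acme-challenge/EXAMPLE_TOKEN
curl http://www.c-pwr.com/.well-known/acme-challenge/EXAMPLE_TOKEN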

If we get the standard certbot success message, we now see that it has created our certificate, chain, and private key files in certbot’s standard location. (I’m using the PPA repository through aptitude, so certbot installs the latest versions of my certificates to /etc/letsencrypt/live/c-pwr.com/, which are actually symbolic links into /etc/letsencrypt/archive/c-pwr.com/; every time we renew, it archives the old files and creates new ones.)
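For reference, the layout on the certbot machine looks roughly like this; the archive numbers will differ depending on how many times you’ve renewed:

ls -l /etc/letsencrypt/live/c-pwr.com/
# cert.pem      -> ../../archive/c-pwr.com/cert1.pem
# chain.pem     -> ../../archive/c-pwr.com/chain1.pem
# fullchain.pem -> ../../archive/c-pwr.com/fullchain1.pem
# privkey.pem   -> ../../archive/c-pwr.com/privkey1.pem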

I can now copy the contents of both /etc/letsencrypt/live/c-pwr.com/cert.pem and /etc/letsencrypt/live/c-pwr.com/privkey.pem up to cPanel in its SSL interface.

After this, I head over to the Manage SSL Sites tool and install this certificate as-is. It automatically detects the domains I specified in the original certbot command and applies the certificate to them.

Renewal

At this point, I have no idea how renewal will work. Since LetsEncrypt certificates are only valid for three months, this will become an issue sometime in August. I HOPE the acme-challenge files will remain the same, but if they don’t, it should be a simple task to recreate the files as above, then copy the new files in manually, assuming certificates and private keys can be edited once created in cPanel.

Renewing is super simple, but with this method must be run differently from an automated certbot renew.

  1. Run certbot certonly -d c-pwr.com,www.c-pwr.com --manual again.
  2. I am asked to create new ACME challenge files on the webserver, which I do.
  3. Since the cert already existed in /etc/letsencrypt/live, it detected this as a renewal and did not prompt me to upload certificates a second time!
  4. I logged into cPanel and created two text docs in the File Manager as instructed, hit Enter in my local server’s command line, and it did everything from there.
  5. 2018-08-01: I forgot that I also need to update and re-copy cert.pem and privkey.pem to cPanel SSL/TLS Status in order for it to actually update, as cPanel just emailed and said my cert was expiring in ten days.
    • cPanel > SSL/TLS > Install and Manage (Manage SSL Sites)
    • Scroll down and select the old domain in the dropdown.
    • sudo cat /etc/letsencrypt/live/c-pwr.com/cert.pem
    • sudo cat /etc/letsencrypt/live/c-pwr.com/privkey.pem
    • Copy the certificate and private key text to the crt and key fields in cPanel.
    • Click Install Certificate.

Additionally, I needed to manually set up my .htaccess file to redirect any http requests to the https version. This is usually done automatically by certbot during an automatic installation, and is embedded in the /etc/apache2/sites-available/000-default.conf file, but since I don’t have access to this, .htaccess will have to do.
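For anyone in the same boat, the redirect itself is just standard mod_rewrite. A minimal sketch: something along these lines in the site’s document root handles it (written as a shell heredoc for copy-paste convenience; public_html is whatever cPanel calls your document root):

cat >> public_html/.htaccess <<'EOF'
# Send any plain-http request to the https version of the same URL.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
EOF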

Categories
Computer Programming Web

A Fun Adventure in PGP

So I got curious about PGP keys and signing and encrypting with them. I managed to figure out how to use the semi-popular gpg4win (the standard Windows port of GnuPG) with its built-in Kleopatra GUI, Outlook add-ins, and all the other fun stuff.
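For the command-line-inclined, the same signing and encrypting that Kleopatra wraps in a GUI looks roughly like this with the gpg binary that gpg4win bundles (the recipient address is a placeholder):

# Generate a keypair interactively.
gpg --full-generate-key

# Sign and encrypt a file for a recipient whose public key you've imported.
gpg --sign --encrypt --recipient alice@example.com message.txt

# Decrypt (and verify the signature on) something sent to you.
gpg --decrypt message.txt.gpg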

Categories
Computer Programming Web

PHP Access Control List

A quick little Access Control List (ACL) snippet I made for PHP/HTML. Enjoy!

<?php

$acl = array(
    // Populate with IP/Subnet Mask pairs.
    // Any zero bit in the subnet mask acts as a wildcard in the IP address check.
    array("192.168.10.24","255.255.255.255"),
);

$acl_allow = false;
foreach ($acl as $entry) {
    $mask = ip2long($entry[1]);

    // The visitor matches when it falls in the same masked network as the entry.
    if ((ip2long($_SERVER['REMOTE_ADDR']) & $mask) == (ip2long($entry[0]) & $mask)) {
        $acl_allow = true;
        break;
    }
}

if ($acl_allow) {
    // Put all test stuff here!! Only visible to ACL.
    phpinfo();
} else {
    echo "<a href='http://this-page-intentionally-left-blank.org/whythat.html' target='_blank'>This page intentionally left blank.</a>";
}

?>

 

Categories
Computer Web

Expired Domains – A Headache (but a learning experience)

Hey all! It’s been a few months, I know. But I wanted to share an experience I had with my recent domain name headache.

So as you may or may not know, I’ve owned ppsstudios.com since May 2013. I purchased it via Google Apps, which in turn set my registrar to eNom. Both are useful services and work reasonably well. I was attracted mostly to the (back in 2013) $10/year with free ID protection deal that Google Apps offered. Since then they’ve gone up to $12/year, but that’s not really an issue.