Bristlerock

0 Followers
0 Following
31 Posts

While I agree with scrubbles about you eventually wanting public services covered, and so the initial pain is worth it in the long run, it can be done with an internal DNS server. I started this way, still use it (mostly for Gitlab CE, which needs a name), and have the SWAG+Authelia setup for public-facing stuff that I mentioned above.

Dashboards like Heimdall and Homepage do the job nicely, but if you want to give the internal DNS thing a try, this is how I set mine up internally:

  • Pick a non-routable TLD so that mistakes won't accidentally leak out to the Internet. Lots of people choose a *.lan or *.local domain, but that's not a good idea (.local is reserved for mDNS by RFC 6762, and unofficial TLDs can have unintended consequences). Using a *.arpa domain is a better option, e.g. whatever.arpa (RFC 8375 sets aside home.arpa for exactly this). But it's your call.
  • Set up a DNS server on your LAN that forwards external lookups to AdGuard Home. I've got this running on my NAS, using its native DNS Server service.
  • On this new DNS server:
    • Create a forward zone file for your new domain, add the SOA, configure it to only accept zone transfer requests from your AGH IP, and add A and CNAME records as needed to map each service.whatever.arpa. to an IP.
    • Create a reverse zone file for your LAN subnet, with the same SOA and zone transfer permissions as above, and add PTR records to map each IP.in-addr.arpa back to a service.whatever.arpa.
  • Lastly, on AGH in Settings > DNS Settings...
    • Upstream DNS servers: Add the following to whatever you currently have there: [/whatever.arpa/]your.new.dns.ip
    • Private reverse DNS servers: Add your new DNS IP, tick "Use private reverse DNS resolvers" and untick "Enable reverse resolving of clients' IP addresses".
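
For illustration, the forward and reverse zones above might look like this in BIND-style syntax (all names and IPs here are hypothetical placeholders, and your NAS's DNS service may give you a GUI rather than raw zone files):

```
; forward zone: whatever.arpa (example names/IPs)
$ORIGIN whatever.arpa.
$TTL 86400
@        IN SOA   ns.whatever.arpa. admin.whatever.arpa. (
             2024010101 ; serial
             3600       ; refresh
             900        ; retry
             1209600    ; expire
             86400 )    ; minimum
@        IN NS    ns.whatever.arpa.
ns       IN A     192.168.1.2
gitlab   IN A     192.168.1.10
git      IN CNAME gitlab.whatever.arpa.

; reverse zone: 1.168.192.in-addr.arpa (for a 192.168.1.0/24 LAN)
$ORIGIN 1.168.192.in-addr.arpa.
10       IN PTR   gitlab.whatever.arpa.
```

With zones like these in place, the [/whatever.arpa/] upstream rule in AGH sends only those lookups to the internal server.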

My client devices all use the AGH as their DNS server. Lookups to internal addresses get forwarded to my internal DNS server and everything else gets done by AGH. This lets me browse to http://service.whatever.arpa on my network without issue.

This is how I do it. It works internally and externally, though it's more than OP needs. :)

To add to what's been said (in case it's useful to others), it's worth looking at SWAG and Authelia to do the proxying for services visible to the Internet. I run them in Docker containers; together they do all the proxying, take care of the SSL certificate and auto-renew it, and add MFA to the services you run that support it (browser-based access, MFA-aware apps, etc).

Another thing I like about SWAG's setup is that you pick which services/hostnames you want to expose and name them in the SUBDOMAINS environment variable in Docker (easy to remove one if you take a service down for maintenance, etc). Each then gets its own config file in Nginx's proxy-confs directory (e.g. wordpress.subdomain.conf) that handles the https://name.domain -> http://IP:port redirection for that service, assuming the traffic has passed whatever MFA and geo-whitelisting you have set up.
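
As a sketch, the relevant bits of a SWAG Compose service might look like this (the domain and subdomain names are examples; see linuxserver's docs for the full variable list):

```yaml
# hypothetical excerpt of a docker-compose.yml for SWAG
services:
  swag:
    image: lscr.io/linuxserver/swag
    environment:
      - URL=example.com
      - SUBDOMAINS=wordpress,gitlab   # drop one here to stop exposing it
      - VALIDATION=http               # or dns, paired with a DNSPLUGIN
    ports:
      - "443:443"
    volumes:
      - ./swag:/config
```

Each name in SUBDOMAINS then pairs with a proxy-confs file like /config/nginx/proxy-confs/wordpress.subdomain.conf.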

I also have Cloudflare protecting the traffic (proxying the domain's A record and the wildcard CNAME) to my public address, which adds another layer.

GitHub - linuxserver/docker-swag: Nginx webserver and reverse proxy with php support and a built-in Certbot (Let's Encrypt) client. It also contains fail2ban for intrusion prevention.
The Honeynet Project, related to the SANS Institute when I last checked, has a lot of resources on honeypots that are worth a look, if you haven't already.

Yeah, the container I used requires your Steam ID as an environment variable.
GitHub - lloesche/valheim-server-docker: Valheim dedicated gameserver with automatic update, World backup, BepInEx and ValheimPlus mod support


That's a really open-ended question. It depends entirely on your interests, appetite for risk, etc.

Might be worth looking at, from a Docker perspective:

  • AdGuard Home (I think it's better than Pi-Hole)
  • Wireguard or similar. Great for reaching your services when away from home.
  • Audiobookshelf. Audiobooks. There are good apps.
  • Calibre-Web. Ebooks.
  • RSS feed reader, for non-social media websites you visit. Plenty to choose from: FreshRSS, TT-RSS, Sismics, etc.
  • Gitlab CE. If you're a developer or can otherwise make use of version control.
  • Gotify. Alerting on your containers. Has a good mobile app.
  • Heimdall. A dashboard for everything you're running.
  • Komga. If you're into manga. The best iOS app is meh, but the best Android app is awesome.
  • Mealie. Recipe database.
  • Paperless-ngx. Excellent for storing your PDFs and other digital life.
  • PhotoPrism. Basically Google Photos.
  • Portainer. Great for managing Docker containers/stacks.
  • qBittorrent. Guess what that's for.
  • SWAG with Authelia. SWAG does reverse proxying with a Let's Encrypt certificate, and automatically renews it for you. Authelia provides MFA (Authy, Google Authenticator, etc) on top of it.
  • Vikunja. Todoist or Toodledo without having to pay for features.
  • Wallabag. Basically Pocket.
  • Watchtower. Automatically updates containers for you. Can exclude the ones you don't want to update, etc.
  • Webtrees. Family tree research, if that's your thing.
  • YouTransfer. Useful for sharing files without having to use Dropbox, etc.
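
If it helps, most of these drop into a single Compose stack. A minimal sketch with two of them (the image names are the real ones; the ports and paths are examples to adjust):

```yaml
services:
  adguardhome:
    image: adguard/adguardhome
    ports:
      - "53:53/tcp"
      - "53:53/udp"
      - "3000:3000/tcp"   # initial setup UI
    volumes:
      - ./adguard/work:/opt/adguardhome/work
      - ./adguard/conf:/opt/adguardhome/conf
    restart: unless-stopped
  watchtower:
    image: containrrr/watchtower
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock   # lets it update sibling containers
    restart: unless-stopped
```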
Searched "tdr" before replying, and was inexplicably happy. :)

I have zero problem with curated or algorithmic timelines. I have a 100% problem when there isn't a chronological timeline option.

It's simple really: give me a permanent chronological option without the dark-pattern fuckery of having to reset it periodically, or fuck off forever.

Every time a social media site has offered, pleaded, cajoled or forced me to take a non-chronological timeline, I've refused. And if that refusal eventually becomes impossible (no option, addons no longer work, etc), I take my eyeballs elsewhere.

You're not an edge case. :)

Yeah, it makes for a nice workflow, doesn't it? It doesn't earn you the "fully automated" achievement, but it's not much of a chore. :)

Have you considered something like borgbackup? It does good deduplication, so you won't have umpteen copies of unchanged files.

I use it mostly to back up my daily-driver laptop to my NAS, and the Gitlab CE container running on the NAS acts as the equivalent for its local Git repos, which are then straightforward to copy elsewhere. I haven't got it scripting anything like bouncing containers or DB dumps, though.
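
For anyone curious, a nightly borg run is only a few lines. This is a sketch with made-up paths (it also assumes the repo was already created with borg init):

```shell
#!/bin/sh
# hypothetical repo path and passphrase handling; adjust to taste
export BORG_REPO='ssh://nas/volume1/backups/laptop'
export BORG_PASSPHRASE="$(cat "$HOME/.config/borg/passphrase")"

# deduplicated, compressed archive named by date
borg create --stats --compression zstd \
    "::laptop-{now:%Y-%m-%d}" \
    "$HOME/Documents" "$HOME/Projects"

# thin out old archives, keeping recent history
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6
```

Run it from cron or a systemd timer and the dedup means repeated runs only store changed chunks.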

Agreed. The lack of varied examples in documentation is my most common tripping point. When I hate myself, I visit SarcasmStackOverflow to find examples, and then check those against the module's documentation.

And it's definitely become an easier process as I've read more documentation.