I want to migrate Home Assistant from a container on an old NUC to HAOS in a VM on a new machine.

Thought it should be easy: take a backup in HA, download it, restore it. But fuck me, with the default settings HA takes nearly 2 hours to create a backup of my (admittedly large) data. I'm not going to accept 2+ hours of data loss.

Turns out they now encrypt backups by default (good, probably). Turning that off gets it down to 15 minutes, but it looks like it's bottlenecked by single-threaded gzip.

Home Assistant, I know I can do better than you.

Let's try streaming a raw tar file over the network to a machine that can compress faster.

On the HA machine:

time tar -cvf - --exclude='backups/*.tar' --exclude='*.db-shm' --exclude='*.log' --exclude='*.log.*' --exclude='tts/*' * | nc -Nl 2222

And on my M2 Pro Mac:

time nc sadpunk 2222 | gzip > ha-manual.tar.gz

9:28.20. Better than the 15 mins.

But we can do better! Let's give it some pigz.

time nc sadpunk 2222 | pigz > ha-manual.tar.gz

3:44.93. That'll do nicely. Down from nearly 2 hours to just under 4 minutes.

lol Home Assistant. Now the new system fails to import the backup. The web UI returns a 500 after a while. I guess I’m really going to be doing this the hard way then.

Jesus fucking christ WHAT THE FUCK, HOME ASSISTANT.

So, the backup needs to be a tarball containing a file called backup.json (metadata) and the gzipped tarball I made.

The upload failed because HA expects the tarball to contain the entry ./backup.json, and I had added it as just backup.json. What the actual fuck.
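For the record, getting that ./ prefix into the archive is just a matter of how you hand the paths to tar. A minimal sketch (the files here are placeholders standing in for the real metadata and data):

```shell
# Placeholders: the real backup.json holds HA's backup metadata, and the
# real ha-manual.tar.gz is the pigz-compressed stream from earlier.
echo '{}' > backup.json
: > ha-manual.tar.gz
# Passing ./backup.json (rather than backup.json) makes tar store the
# entry with the ./ prefix that HA's importer looks for.
tar -cf ha-wrapped.tar ./backup.json ./ha-manual.tar.gz
# The listing shows ./backup.json and ./ha-manual.tar.gz
tar -tf ha-wrapped.tar
```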

I've written up the full saga: https://pub.jamesog.net/3lysnfelse22w

This is published on https://leaflet.pub/ which uses ATproto to store your posts. It's pretty nice. Much nicer than my Hugo setup that constantly breaks, anyway.

@jamesog that’s quite the migration effort. I considered doing something similar for replacing my ubiquiti gateway with the newer one, then I just yeeted the old one and plugged everything into the new one. 30 minutes later everything was back online.
@phredmoyer In some ways I have regrets about buying into the Home Assistant ecosystem. I hate large Python software projects and I wish it were easier to migrate the data to a proper TSDB. I could probably have started from scratch and been happy enough with the data that's also in Prometheus, but it's useful to have all the data in HA so I can review how solar PV has been working year-on-year.
@jamesog I always treated my Home Assistant container as being close to ephemeral. Every 6-12 months I would just copy out the automations/config yamls (which I stored under version control anyway) then drop that into a new installation. I was never bothered about historical data in HA since I scraped HA's Prom exporter separately to drop the entities that I had no interest in (most of them).
Having said all that I still haven't rebuilt my HA after moving house 3 years ago so 🤷
@29821632 I've got it all in Prometheus too, but there are some places where HA's more useful for looking at the data, mostly on the Energy dashboard. It's a bit tricky to reproduce that in Grafana. I guess HA keeps its own aggregations too, which I'd probably need to create recording rules for in Prom.
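Something like one recording rule per aggregation, roughly — a sketch only, and the metric name here is a stand-in, not a real series from HA's Prom exporter:

```yaml
# Hypothetical rule approximating HA's hourly energy aggregation.
groups:
  - name: ha_energy
    rules:
      - record: home:energy_kwh:increase1h
        expr: increase(hass_sensor_energy_kwh[1h])
```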
@jamesog Ah that's fair, especially for smart energy monitoring integrations. My HA was purely for controlling things, either manually with a toggle/widget or a chain of integrations, so for that I only needed the instantaneous data. Main integration I had was presence/absence detection using phone MAC addresses appearing on the Unifi wifi controller which would arm/disarm the Blink cameras. It worked well but the database would get massive, hence me blatting it quite frequently.