There used to be a startup where you could buy intentionally thermally inefficient computers to install in your house as radiators. They were networked, and the company would sell off distributed computing power and split the money with you, essentially subsidising your domestic heating.
This should be the model for LLM/ML: no huge data centres, just very cheap electric heating for households. But that won't happen in the current bubble, because investment is locked into a different model.

@_thegeoff It's not just the financial model. Data centres have redundant power to the site coming from different suppliers through different routes. They have redundant network connections, again coming from different suppliers through different routes. They have a lot of physical security, with multiple security cameras observing each other on physically separated networks, because generally if you have physical access you can get access to all the data. They have three-phase power, because the big multi-GPU racks for AI applications are rated for tens of kW.

People's homes do not have any of these things. The model might work for niche cases like companies with low security requirements who only want to colocate one server rack and don't care about uptime or having a fast network connection. It would only work for the big boys if the houses were purpose-built to support it, with dedicated secure space and services, and even then, if it's one server rack per house, you'd need a city's worth of houses to fit a data centre's worth of racks.
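The power and scaling argument above can be sketched as a back-of-the-envelope calculation. The figures here are assumptions for illustration, not from the thread: a multi-GPU rack drawing ~30 kW ("tens of kW"), a typical UK single-phase domestic supply with a 100 A main fuse at 230 V, and a large data centre holding ~5,000 racks.

```python
# Back-of-the-envelope: one AI rack vs one house (all figures are illustrative assumptions).
RACK_POWER_KW = 30                    # assumed draw of a multi-GPU AI rack ("tens of kW")
HOUSE_SUPPLY_KW = 230 * 100 / 1000    # UK single-phase 100 A main fuse at 230 V ≈ 23 kW
RACKS_IN_LARGE_DC = 5000              # assumed rack count for a large data centre

# A single rack can exceed the entire electrical supply of a typical house,
# before the house uses any power itself.
print(f"whole-house supply: {HOUSE_SUPPLY_KW:.0f} kW; one rack: {RACK_POWER_KW} kW")

# At one rack per house, matching one data centre needs a small town's worth of homes.
print(f"houses needed at one rack each: {RACKS_IN_LARGE_DC}")
```

Even with generous assumptions, the mismatch holds: the rack alone saturates or exceeds a domestic connection, which is why the post argues the houses would have to be purpose-built.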

More plausible today is using a neighbourhood heat network, where warm water from the data centre feeds water-source heat pumps in other buildings in the neighbourhood. These are getting popular with city planners, but the problem with getting data centres onto them is that people don't want to live next to a data centre, and operators don't want to build new ones on expensive land in existing neighbourhoods.

@tuftyindigo Precisely my point. We could have the advantageous elements of ML, and free heating.
But we're stuck with shonky chatbots (for public consumption), a potentially economy-destroying bubble, and no free heating, because of locked-in investment models.
@_thegeoff If your point was that putting servers in domestic houses just doesn't meet the requirements of people who want to rent or host servers, and building new houses that could benefit from free heating doesn't meet the requirements of people who want to live in houses, then yeah, spot on. But it sounded like you think this idea could work well and investors are the only obstacle.
It is a good idea, but none of the participants (data centre builders, data centre customers - which includes everyone who uses a computer or phone - and people who live in houses) want what it implies.
@_thegeoff There is a startup that is building a distributed data centre out of compute units that heat domestic hot water tanks in people's houses. It's only available to people with fibre-to-property internet connections, and is being promoted by Octopus Energy.
@_thegeoff I am 100% convinced that the end game for LLMs and similar models is running them locally. It's the only long-term cost model that remotely makes sense.

@simonbp @_thegeoff and it's kind of the death knell for generalist models. You can get some perhaps-amusing but not very *useful* outputs from a generalist model that fits on consumer hardware.

Specialist models that run locally are often pretty great, but the economics don't support the "infinite growth" myth, so the market craze is looking directly past this use-case.

@SnoopJ @simonbp On the other hand, you, I, and maybe a few thousand other people with a particular interest in <insert ML usage> (say SETI, for me) basically fediverse our radiators. Honestly, that sounds like the future. Written by @cstross, until it all goes very weirdly wrong.

@_thegeoff And because the workload is completely different and cannot be distributed, for reasons of both security and efficiency.