* Make software that works on older devices, the older the better.
* Make software that will keep on working for a very long time.
* Make software that uses the least amount of total energy to achieve its results.
* Make software that also uses the least amount of network data transfer, memory and storage.
* Make software that encourages the user to use it in a frugal way.

#FrugalComputing

@wim_v12e you know what, let me plug good old fashioned AI* here.

Well-understood, high performance for its intended uses, runs on a single laptop, and is still closer to AGI than LLMs: soar.eecs.umich.edu

It runs locally, and it has been updated over decades.

Rough around the edges, but none of this "become dependent on us and require huge compute" garbage.

*It's not quite as brittle as GOFAI is known to be.

@Unampho @wim_v12e I'd argue that even some of the stuff OpenAI did before LLMs is closer to AGI than LLMs are.