Something has changed about the unprompted responses from people who see me using the MNT (Pocket) Reform over the last month. I'm not just getting "whoa, this is cool and nerdy"; I'm hearing more people say, unprompted, "Oh, this seems important, because maybe we won't be able to get computers we control soon."

It's uncomfortable, in a way, that this is starting to become more of the zeitgeist, because I think it's true.

I really do think we need computers we can control and hack on and advance. Because we're in real trouble if we can't.

A friend has been saying "we need more routes to getting chips, we need more routes to manufacture PCBs, people have no idea how fragile 'being able to control your computer' is right now", and I agree with that.
@cwebber my only prayer is that LLMs do not get good at things like trace-length matching, EMI tolerance, clock domains, et cetera

@cwebber
PCB lead times are already slowly going crazy; this was raised in another thread the other day. Meanwhile, VPS prices are also spiking as these companies slowly tighten their grip on the market and infrastructure.

Compute as a service: it's finally on the table, and the capitalists are ravenous.

@rusty__shackleford @cwebber

Between botslop poisoning the wells of information, surveillance invading every layer of our lives, and the market manipulations pushing hardware out of reach, we need alternatives and we need them NOW.

They are enclosing the digital commons. To have any real communications or organizing, there's got to be avenues to survival outside the rapidly gnashing corporate jaws.

@violetmadder @cwebber

Agreed, I have been taking a pretty hard stance on this due to the current geography and climate:

Re: on developers & google:
https://mastodon.social/@rusty__shackleford/116131656159734497

Re: on negotiations:
https://mastodon.social/@rusty__shackleford/116137652128111142

@violetmadder @rusty__shackleford @cwebber I have been looking for alternatives for the future myself.

Raspberry Pis aren't quite there yet. I have run Beowulf clusters on Pine64s (similar hardware) in the past, but the hardware wasn't quite enough then... about 10 years ago (yikes).

If anyone figures something out... I will be paying attention.

@cwebber Language and branding are important in conveying ideas and setting expectations for the general public. For example, Apple managed to convince people that “a Mac” was a totally different type of thing from “a PC”; if it's different, maybe it's better. And an “iPhone” is not the same type of thing as a “computer”, so don't expect to be able to do everything a “computer” can; it's a new, different thing.

So I think we need a word that means a local-first computer that you actually control. “PC” is already taken; “sickobox” is (alas) a hard sell.

I propose we use the word #cyberdeck. A *cyberdeck* means a local-first computer that you actually control. My #PinePhone with #postmarketOS is a prototype cyberdeck. @modal are trying to make a ready-for-primetime cyberdeck.
@cwebber YES & thank you both for speaking out about this
@cwebber yeah definitely, on all counts. artisanal small-batch silicon etching is kind of the dream we've been hoping those smart chemists will figure out, but, right now... well, it's barely a thing...

@ireneista @cwebber

Also isolation milling for PCBs, maybe trying to home-brew chips from the 80's instead of trying to compete with current sizes, etc.

(IMHO, we don't need all that computation power, we simply need to use it better. (Personally, I believe GUIs to be a hype and collective infantilization of most of the user base, but that is an entirely other topic. (Oh no, this post is devolving into a tree structure, stopping now.)))

@wakame @ireneista @cwebber

> (IMHO, we don't need all that computation power, we simply need to use it better.

There is a point, however, beyond which you can't get any more efficiency without implementing the function in hardware.

Even just doing modern cryptography on a 6502 will be painful.

And yes, ensuring adequate data integrity & security essentially requires such cryptography.
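A back-of-envelope sketch of why the 6502 point holds. The cycle counts below are illustrative assumptions (the 6502 has no hardware multiply, so an 8x8 multiply is a shift-and-add loop), not measurements of any real implementation:

```python
# Rough estimate: one 255-bit field multiplication (the core operation
# of Curve25519) on a 6502. All cycle counts are assumed for
# illustration, not benchmarked.

LIMBS = 32                 # a 255-bit number stored as 32 eight-bit limbs
CYCLES_PER_MUL8 = 130      # assumed cost of one 8x8 shift-and-add multiply
CYCLES_PER_ACCUM = 20      # assumed cost of accumulating a partial product
CLOCK_HZ = 1_000_000       # a classic 1 MHz 6502

partial_products = LIMBS * LIMBS  # schoolbook multiplication: 32 x 32 limbs
cycles_per_fieldmul = partial_products * (CYCLES_PER_MUL8 + CYCLES_PER_ACCUM)
ms_per_fieldmul = cycles_per_fieldmul / CLOCK_HZ * 1000

print(f"{partial_products} partial products, ~{cycles_per_fieldmul:,} cycles")
print(f"~{ms_per_fieldmul:.0f} ms per field multiplication at 1 MHz")
```

A full X25519 key exchange needs on the order of a few thousand such field multiplications, so even this toy estimate puts a single handshake in the minutes range — hence "painful".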

> artisanal small-batch silicon etching is kind of the dream we've been hoping those smart chemists will figure out, but, right now... well, it's barely a thing...

It's awfully expensive and much of it is horrendously toxic. That's mostly the issue.

> (Personally, I believe GUIs to be a hype and collective infantilization of most of the user base, but that is an entirely other topic.

There is a large number of things that cannot be represented efficiently or intuitively in text.

Text-mode interfaces and vector graphics can cover for a lot of what now gets done with bitmap interfaces, yes.

And then of course there is literally all the digital painting & drawing that mostly can't be done without bitmap graphics (at least not without ridiculous efficiency issues).

@lispi314 @ireneista @cwebber

There are definitely uses for graphics. Or GUIs.

What I abbreviated beyond the point of comprehensibility, or even context:

Many GUI designers (actually: programmers) seem to believe that one more line, one more gradient, one more toolbar full of icons will somehow make the GUI more usable, "nicer", etc.

Same people who couldn't set a sensible tab order if their life depended on it.

A candy-bar vending machine where you type in two digits and read the price off an 8-segment display is IMHO at least twice as usable (on the Foobinger-Nonsens-Scale) as its touchscreen equivalent.