Remember the "One Laptop Per Child" project, which developed a low-cost laptop for children in developing countries? I was always amazed by one feature in particular: the "View Source" button.

When you pressed it, the source code for the currently running application would open. This was supposed to encourage tinkering with the software on your device! <3

I've been pondering what it would take to build that button on modern machines. Has anyone seen something like that?

(Prototype in next toot.)

You'd roughly need to:

- Figure out which program is currently focused
- Figure out the Git repo of this software
- Clone it into a temporary directory
- Set up the required tools to start hacking on it and compile it
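The steps above could be sketched roughly like this in Bash. To be clear, this is my own guess at an approach, not the actual view-source-button script: the tool names (xdotool), the attribute heuristic, and the `meta.homepage` lookup are all assumptions.

```shell
#!/usr/bin/env bash
# Sketch of the four steps, assuming an X11 session with xdotool,
# plus nix and git on PATH. Not the real view-source-button script.
set -euo pipefail

# Step 1: resolve the executable behind the currently focused window.
focused_exe() {
  local pid
  pid=$(xdotool getwindowpid "$(xdotool getactivewindow)")
  readlink "/proc/$pid/exe"
}

# Step 2: guess a nixpkgs attribute from the executable name,
# e.g. /nix/store/...-gimp-2.10/bin/gimp -> gimp (a heuristic; often wrong).
attr_from_exe() {
  basename "$1"
}

# Steps 3 + 4: clone the upstream repo into a temp dir and enter a dev shell.
# meta.homepage is frequently not a Git URL, so a real script
# would need a smarter source lookup than this.
hack_on() {
  local attr=$1 url dir
  url=$(nix eval --raw "nixpkgs#$attr.meta.homepage")
  dir=$(mktemp -d)
  git clone "$url" "$dir/$attr"
  cd "$dir/$attr" && exec nix develop "nixpkgs#$attr"
}

# Only run when invoked with --run, so the helpers can be sourced separately.
if [[ "${1:-}" == "--run" ]]; then
  hack_on "$(attr_from_exe "$(focused_exe)")"
fi
```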

As a quick prototype, I wrote a li'l Bash script that does some of these things. It makes heavy use of #nix and #nixpkgs:

https://codeberg.org/blinry/view-source-button

It enters a "dev shell" with the required tools already in the PATH, and even sets up a Git remote so you can start contributing. :D


@blinry

Or have the entire system built around being interpreted, like Python or C#. Maybe C# would even be the better option, as its JIT compiler is better in my eyes. And it integrates better with that XML-based GUI definition language Microsoft had.

Edit: WPF XAML was it.

@agowa338 @blinry I used to have the dream of a runtime that worked like a dynamic language interpreter, but rather than being specific to one language each call could invoke a different interpreter for whichever language was needed for the method being called. My original goal was to not have to recreate every library in every language, but the more I thought about it the more other potential benefits I saw. Even wrote my undergraduate thesis about how it might be done

@ShadSterling @agowa338 @blinry that's one of the very long term goals of my post-scarcity computing project, but it's not one I expect to hit in my lifetime.

"To summarise, again:

Code is data. The internal representation of data is Don't Know, Don't Care. The output format of data is not constrained by the input format; it should suit the use to which it is to be put, the person to whom it is to be displayed...."

/Continued

@ShadSterling @agowa338 @blinry

"Thus if the person to whom my Java code is reflected back is a LISP programmer, it should be reflected back in idiomatic LISP syntax; if a Python programmer, in idiomatic Python syntax. Let us not, for goodness sake, get hung up about syntax; syntax is frosting on the top. What's important is that the programmer editing the code should edit something which is clearly understandable to him or her." -- me, obviously.

https://www.journeyman.cc/blog/posts-output/2006-02-20-postscarcity-software/

Post-scarcity Software

For years we've said that our computers were Turing equivalent, equivalent to Turing's machine U. That they could compute any function which could be computed. They aren't, of course, and they can't, for one very important reason. U had infinite store, and our machines don't. We have always been store-poor. We've been mill-poor, too: our processors have been slow, running at hundreds, then a few thousands, of cycles per second. We haven't been able to afford the cycles to do any sophisticated munging of our data. What we stored — in the most store intensive format we had — was what we got, and what we delivered to our users. It was a compromise, but a compromise forced on us by the inadequacy of our machines.

The thing is, we've been programming for sixty years now. When I was learning my trade, I worked with a few people who'd worked on Baby — the Manchester Mark One — and even with two people who remembered Turing personally. They were old then, approaching retirement; great software people with great skills to pass on, the last of the first generation programmers. I'm a second generation programmer, and I'm fifty. Most people in software would reckon me too old now to cut code. The people cutting code in the front line now know the name Turing, of course, because they learned about U in their first year classes; but Turing as a person — as someone with a personality, quirks, foibles — is no more real to them than Christopher Columbus or Noah, and, indeed, much less real than Aragorn of the Dunedain.

The Fool on the Hill
@simon_brooke @agowa338 @blinry that’s way more ambitious than the runtime I dreamed of; part of the reason for making it one giant FFI is that, even aside from different runtime in-memory type representations, the different semantics and scoping rules make arbitrary translation extremely challenging. Encapsulation seemed to me more tractable. But methods written to be attached to objects of my dream runtime could be easier to manipulate that way