What's the bare minimum #C89 needed to connect to the console in any OS and, in Windows (inc. XP), Linux & MacOS, enable raw mode? Asking for a friend.

Maybe there's a bare-bones framework out there for "give my app a terminal window and make the OS go away" that doesn't involve installing gigabytes of compiler suites and libraries :|

#programming

I'm looking at this tutorial to build a minimal text-editor in C: https://viewsourcecode.org/snaptoken/kilo/index.html I don't need the text-editor, I don't want to use C for that, but it does provide the bare minimum to get a raw-mode terminal and I could build a VM / Forth using that and then forget about the OS and work in script from there upwards.

Bear with me whilst I work this out; my head has been in a spin for a few days with the infinite possibilities and loose ends.

#Forth is almost wilfully, aggressively impenetrable. I respect it for what it achieves with so little but it's not for me. I realised that unlike #uxn, what I really want isn't a stack-machine but a register-machine.

People keep talking about making "simple" software but really what I keep seeing is toys. I see this in uxn with its 8-bit CPU and 64 KB program space, or PICO-8 with its ridiculous 128x128 resolution and 8192-token limit. Limits like these don't actually make software simple; they place a greater burden on the programmer to do really low-level engineering that's just a distraction from the program they want to write. You only have to try writing an 8-bit division routine from scratch once to know what a brutal multi-day excursion that can be. If the programmer is having to manually implement allocation and virtual memory -- things that should be completely transparent to the program -- that's not simpler software.

Simple software can be serious; it can be a replacement for big, bloated software. My ideal VM would be 32 (or 64) bits wide and assume it's running on top of an already existing OS with hundreds of megs of RAM. Manual memory management and virtual memory won't be the programmer's concern; they won't even have to deal with them directly. Even a 15-year-old computer is going to be able to run a single VM burning up hundreds of megs with absolute ease.

#retrocomputing #retrodev #programming

Why 32-bit instructions and not something more compact / variable?

Besides alignment benefits, the truth is that CISC is faster and more efficient for interpreters because it means fewer fetch/decode iterations for the amount of work done. The overhead of separate load and store instructions in a minimalist RISC design is greater than that of a single combined load+store instruction!

In a CISC design you want a variety of addressing modes to maximise throughput. A 32-bit instruction affords us this flexibility, and by aligning everything to nybbles we get something very neat: you can easily read and type instructions in hexadecimal!

In the image is my current design for src-dst instructions. Since in an interpreter opcodes 0-255 will just be a big switch statement, we use multiple opcodes to vary addressing mode and long/byte/word width for each instruction, and don't have to encode those bits into the instruction!

#programming

Another failure of toy VMs is memory -- often just a single allocation where both program and data reside. This places an engineering burden on the programmer to manage resources all within the same space. Maybe I'm barking up the wrong tree, but in my VM the program code is execute-only; not even read-only! The program code exists in its own address space and *cannot* read or write to itself.

The VM is given a byte-orientated memory space for the program to read/write to, but the code can also allocate additional memory spaces or load files into their own memory space, doing away with the need to manually manage unrelated resources within the same contiguous region -- let alone arbitrary size limits like 64KB!

#programming #xom

With about 30 instructions and some combination of addressing modes applied, even with a full complement of 256 opcodes the implementation would still be trivial, but writing an assembler for this thing would be a nightmare. There are tools for that, but they create another dependency and another rewrite if you want to self-host.

We need to go ~~deeper~~ simpler.

We need an ISA which we can read as words from a text file and execute with almost no context, i.e. one word at a time without lexical scoping.

Which means... oh dear

Forth (or Lisp)

Most of Forth's weirdness stems not from RPN but from its von Neumann architecture: code that writes self-modifying code, all intermixed with whatever the program being executed is doing. This makes perfect sense if you're writing a #Forth against bare metal -- it's fast, compact and flexible -- but if you're writing a VM or interpreter on top of an OS then for the love of all that's sane in programming, use Harvard architecture!

I've failed to write a working Forth probably 4 times now because I've struggled with the dividing line between assembly language and scripting language. This boils down to the fact that instructions are fundamentally different from what those instructions manipulate. In a scripting language you need metadata from the source file to give error messages that point to the source. This all goes awry when the program constructs code on the fly as part of running the program.

Nobody in this day and age questions why their #C program can't modify itself whilst running and why every C program doesn't have a built-in C REPL. The compiler has solved the code-generation problem already, x86 binary doesn't need to be interactive.

If we take lessons from XOM (above), then we separate code and data into separate spaces (Harvard). The file that is fed in is assembled and constitutes the only instructions that can execute. The code *operates* upon a byte-orientated data-space; instructions cannot read instruction space, that is to say that instructions themselves don't have to deal with the structural layout of instructions (bytes, words, metadata etc), which is the pain point of Von Neumann Forth programs.

#programming

But I might be entirely wrong; I'm a designer first, programmer second.

I have tried to design a low-level scripting language that's stack-based underneath but operates left-to-right. I warn you that it's incomplete and probably full of fundamental flaws: https://gist.github.com/Kroc/62fd60dda68f667e0e4d94c9e08bf2af

I will probably rework some of these ideas into what I'm building with a closer eye on practical implementation over aesthetics.

#programming #forth


Hooooly crap this is crazy cool. I added the ability for my Forth-like scripting language to read words from the call-site when calling a function, i.e. getting parameters. This allows for some very interesting program structure, like creating infix operators using the native Forth primitives:

#programming #forth

I've hit the jackpot here -- we can define a function that defines functions and build an easy-to-read, left-to-right #python / #lua &c. style scripting language even though it's a stack machine underneath. This is very powerful!

#programming #forth

#Programming just makes you want to scream sometimes! There's no standard way in #C to get the path of the executable, i.e. to open files next to the exe rather than at the current working directory!

There are platform-specific APIs, libraries and so on, but you are instantly excluding all kinds of systems (will this work on Amiga? EPOC32??). I found one solution that uses a variety of methods to deduce the path by mimicking what most OSes do anyway to locate an executable but it's just another agonising pain point of trying to make the most basic software work on computers these days: https://stackoverflow.com/questions/1023306/finding-current-executables-path-without-proc-self-exe/34271901#34271901


Honestly this is a dead-end; whilst I've written my first C program (https://codeberg.org/Kroc/xom) -- some very clean, well-commented code (practically unheard of, I know) that compiles without warnings on a PowerPC Mac and on Windows XP on a Pentium III -- the difficulties of designing a programming language are ultimately a sidetrack.

I would use Lua+SDL2; for example, there's lite (https://github.com/rxi/lite), which builds a text-editor out of a basic C-to-SDL2 wrapper as a front-end to Lua, but it doesn't handle my screen-scaling, and updated forks like Pragtical (https://github.com/pragtical/pragtical) add Meson, LuaJIT and other layers of complexity that make them impractical or impossible to compile on old systems.

Why the hell does it even matter if my code compiles on XP or PPC Mac? Why not just accept reality and target the machine you use all the time? Because if my code can compile on old hardware, then it'll do so and run fine on even the most esoteric Linux base or window manager. I don't want to be tied to Debian-based, KDE|GNOME OSes any more than I want to be bound to Windows 11.

When your tools will no longer compile on/for Windows 10 and you don't want to swallow the AI pill, are you going to port GCC/Clang yourself? What if the same thing happens to KDE|GNOME?

#programming #c #KDE #GNOME

I've learnt enough C to be portable even when compiler suites and OSes aren't, so I'm going to just write a text-editor in C and use SDL2 to get an interface. I want to create a modern dark-mode Windows 3.1 GUI because I think the world has gone f*cking insane and nobody is making clear, usable interfaces with borders any more.

Rough first SDL2 code to just draw on a window manually. I've discovered that SDL2 can't really give me the scaling factor from the OS [pic1]. That's fine, as I don't want to be 'stretching' bitmaps into a blurry mess; rather, everything is drawn as lines and boxes, and when the scale is increased we can increase the spacing without the lines being fractional [pic2], and above some scale factor (e.g. 4k screens) switch to thicker lines as well [pic3] (there are bugs with this, lol :P)

#programming #sdl2 #retrocomputing

Compiled and ran first time, no warnings, on Mac OS X 10.5 PowerPC :) You see, we can have side-by-side software releases for modern and retro machines!

#programming #retrocomputing #mac

The #UI framework is going to be called OUI (pronounced "we" like the French word) for "OldUI" but the text-editor is going to be called either "peek", "meek" or maybe "pling!", if you have any suggestions. If you're interested in either C89 & SDL2 on old computers and/or/xor retro UI design consider reaching out here or on Discord https://discord.gg/mKkYfA4B (7-day link)

The goal is using the lessons of old UI but not as a slavish reproduction; I still want dark-mode, HiDPI and anti-aliasing, i.e. it won't be limited to just Win3.1, I'd like to offer plenty of customisation and perhaps other drawing paths to mimic Mac OS's "Platinum", QNX Neutrino and anything else with a solid #UX foundation.

#programming #retrocomputing

Publish early, publish often: https://codeberg.org/Kroc/oui

I know I'm very much reinventing the wheel here, especially because the wheel has been abandoned and whatever they're calling wheels these days require a supercar and a subscription at minimum to use and no one can build a wheel any more without a hundred people and a plan to burn money until a vulture swoops in to buy the burning husk for how bright it doth shine.

I *wish* this wasn't necessary, like I could just write some UI code and it'll run on any platform and it'll continue to do so forever no matter what new thing they add further down the line. Oh, wait, that already exists! It's called Visual Basic 6! I could write a GUI [to track the IP] in VB6 right now and it would run on Windows 95 through Windows 11 and via WINE on *nix.

Microsoft went on a bit of a bender in the 90s with COM, their object model, but they had the right idea if only lacking in execution (ActiveX...). If you want to embed a web-browser in your app you hook into the COM object and you're away. Different programming language? Doesn't matter. Different OS? Doesn't matter. Different *Endianness*!? Doesn't matter. Different physical location than the object? Doesn't matter, it works over the network!

Software is less modular than ever before; now "modular" just means the programmers used folders this time! Want to render a LibreOffice document inside your app? A PDF? A media player? How many years you got left?

#programming #windows #vb6 #retrocomputing