so, apparently hacking #scheme is going to get even more fun with B.L.U.E., a sane, extensible, lisp-y, language-agnostic build system, and #Ares, the interactive hacking tool we always sensed was missing from our work. Yes, we now have insightful backtraces in #guile!

The future has come!

https://codeberg.org/lapislazuli/blue
https://git.sr.ht/~abcdw/guile-ares-rs

@abcdw @shepherd

#guix #fosdem #fosdem2026 #blue #lisp #repl #buildsystem #reproducibility #hacking #fun #coding #interactiveprogramming

B.L.U.E - Build Language User Extensible. A generic build system crafted entirely in Guile. (Codeberg.org)

@dthompson Yeah it's annoying. I'm looking forward to catching the news about your #repl #guile #wasm work and what else is happening in #spritely world!

@akkartik Since #Forth is just so great for super concise code, allow me to add another example, this one transpiling (a subset of) Forth into GLSL for livecoding shaders. It uses my old 2015 CharlieVM, and you can find all the example source snippets in the readme here:

https://github.com/thi-ng/charlie

The REPL itself live at:
https://forth.thi.ng/

The attached screen capture shows 4 shader examples (the longest one is 12 lines of code).

#Livecoding #REPL #GLSL #Shader #Transpiler

Trivial memory card game in the #commonLisp #repl. (Er, using #McCLIM presentations).
https://screwlisp.small-web.org/lispgames/memory-game/

A good beginner example.

As I reveal in the thrilling conclusion

https://screwlisp.small-web.org/lispgames/memory-game/#conclusions

this small post is a step on the way to using xhtmlambda to generate a kitten webgame soon.

```
CLIM-USER> (reveal (elt *cards* 3))
T
CLIM-USER> (disp)
X,X,3,1,2,3
G3648,G3647
NIL
CLIM-USER> (reveal (elt *cards* 0))
NIL
CLIM-USER> (disp)
1,X,3,X,X,3
G3644,NIL
```

Sharpsign #lispgames #programming

A memory game in McCLIM

Wait, so #Git has a #REPL 😱

git add --interactive (https://git-scm.com/docs/git-add#_interactive_mode) opens up an interactive prompt that lets you review, revert, update, patch, and edit changes to the repository. I mean, yeah, it's not really a REPL (a proper REPL is much more!), but an interactive interface is nice nonetheless!

I doubt any person I know actually knows and uses that, so I might as well be the first.


In the #Python programming language, the new #REPL introduced in Python 3.13 (2024) added colorization to the #interactive #interpreter, similar to the interface seen in later versions of #PyPy. Python 3.14 (2025) and Python 3.15 (2026) continue improving the REPL, extending colorization to the Python #syntax itself.
This shows the differences between the interfaces of the #Python #REPL interpreters; Python versions 3.9, 3.10, 3.11, and 3.12 retained relatively similar interpreter features in interactive mode.
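
A quick way to check which REPL you will get, sketched in Python below; treat `_pyrepl`, `PYTHON_BASIC_REPL`, and the color variables as my own notes on CPython 3.13+, not something stated above:

```
# A minimal sketch, assuming CPython 3.13+: _pyrepl is the internal module
# backing the new REPL, so its presence is an implementation detail, not a
# public API.
import sys

if sys.version_info >= (3, 13):
    try:
        import _pyrepl  # pure-Python REPL that ships with 3.13+
        print("new colorized REPL available")
    except ImportError:
        print("this build falls back to the classic prompt")
else:
    print("classic REPL only")

# Documented knobs for 3.13+:
#   PYTHON_BASIC_REPL=1            -> force the classic REPL
#   PYTHON_COLORS=0 or NO_COLOR=1  -> turn colorization off
```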

At this point, I think I'm satisfied with the vim-go plugin providing me with a stoopid simple template for prototyping very basic example programs.

It's definitely not a #REPL like I'm used to with #Python or running from the #CLI, but it's a bit of a useful workflow to get started. *shrug*

These keybindings help a bit:

```
augroup go
autocmd!
autocmd BufNewFile,BufRead *.go setlocal
\ noexpandtab
\ tabstop=4
\ shiftwidth=4
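" <SID>build_go_files() is assumed to come from the vim-go tutorial's
" example vimrc (it runs :GoTestCompile or :GoBuild depending on the file);
" if it isn't defined, vim-go's <Plug>(go-build) mapping is a simpler stand-in.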
autocmd FileType go nmap <leader>b :<C-u>call <SID>build_go_files()<CR>
autocmd FileType go nmap <leader>d <Plug>(go-doc)
autocmd FileType go nmap <leader>f <Plug>(go-fmt)
autocmd FileType go nmap <leader>i <Plug>(go-info)
autocmd FileType go nmap <leader>l <Plug>(go-lint)
autocmd FileType go nmap <leader>r <Plug>(go-run)
autocmd FileType go nmap <leader>v <Plug>(go-vet)
autocmd FileType go nmap <leader>t <Plug>(go-test)
autocmd FileType go nmap <leader>c <Plug>(go-coverage-toggle)
augroup END
```

#Golang #Vim

While I was working on this, the article Python Numbers Every Programmer Should Know appeared on the orange website. In #LuaLang, and on a 16-bit target, these overheads are smaller -- for example, a number weighs 10 bytes instead of 24 -- but overheads don't have many places to hide on a small, slow machine.

(Btw numbers cost 7 bytes each in 8-bit Microsoft BASIC so Lua isn't gratuitously inefficient here, even by the standards of 50 years ago.)

One place that makes the overhead really obvious: a 64K segment holds a table of at most 4,096 entries. That's 40,960 bytes, and Lua's strategy is to double the allocation every time it wants to grow the table; 2 x 40,960 exceeds a 64K segment, so 4,096 entries is the growth limit.
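
A back-of-the-envelope sketch of that growth limit, in Python for convenience; the 10-byte number size and the 64K segment are the figures quoted above, and the doubling loop is my own illustration:

```
# Doubling growth inside one 64 KiB segment, using 10-byte numbers.
# The loop stops as soon as the next doubling would no longer fit.
SEGMENT_BYTES = 64 * 1024   # 65,536
NUMBER_BYTES = 10           # per-number cost on this 16-bit Lua port

entries = 1
while entries * 2 * NUMBER_BYTES <= SEGMENT_BYTES:
    entries *= 2

print(entries, entries * NUMBER_BYTES)   # -> 4096 40960
```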

On a 640K machine, after deducting the ~250K (!) size of the interpreter (which is also fully loaded into RAM), you'll get maybe five full segments free if you're lucky. So that's like maybe 20,000 datums total, split across five tables.

Meanwhile a tiny-model #Forth / assembly / C program could handle 20,000 datums in a single segment without breaking too much of a sweat!

That efficiency has costs in programmer time, of course: worrying about data types, limits, overflows, etc. The kinds of things I was hoping to avoid by using Lua on this hardware -- and to its credit, it does a good job of insulating me from them. Its cost is that programs must be rewritten in some other language once they're out of the rapid prototyping phase and reasonable speed / data capacity becomes important.

I'd estimate the threshold where traditional interpreters like Lua become okay for finished/polished software of any significant scope is somewhere around 2MB RAM / 16MHz. So think, like, a base-model 386. Maybe this is why the bulk of interpreters available for DOS come via DJGPP, which requires a 386 or better anyway.

#BASIC was of course used on much smaller hardware, but was famously unsuited to speed or to large programs / data.

I know of success stories for #Lisp in kilobytes of memory, but I'm not quite sure how they do it / to what extent the size of the interpreter and the overhead of data representation (tags + cons cells) eat into available memory and limit the scope of the program, as seen with other traditional interpreters.

This is beginning to explain why #Forth has such a niche on small systems. It has damn near zero size overhead on data structures (the only overheads are the interpreter core (a few K) and storing string names in the dictionary, which can be eliminated via various tricks). ~1x size and ~10x speed overhead is the bargain of the century to unlock #repl-based development. However, you're still stuck with the agonizing pain of manual memory management and numeric range problems / overflows. Which is probably why the world didn't stop with Forth, but continued on to bigger interpreters.

#retrocomputing

Python Numbers Every Programmer Should Know - a cheat sheet of real-world timing and memory numbers to guide performance-sensitive decisions. (Michael Kennedy's Thoughts on Technology)

✨ Behold, the latest marvel from the "cutting-edge" tech minds: an "enhanced" #REPL for Common #Lisp that promises to revolutionize your coding experience by doing...well, exactly what a REPL already does. 🤯 But don't worry, it comes with the added bonus of making you question why you ever thought coding was fun in the first place. 🥳
https://github.com/atgreen/icl #cuttingedge #techinnovation #codinghumor #softwaredevelopment #HackerNews #ngated

GitHub - atgreen/icl: Interactive Common Lisp, an enhanced REPL