project #permathread: glsl-view, a GLSL shader host for livecoding.

EDIT: scroll down for the successor toolkit wgsl-view 😉

It's designed in tandem with #alv, but speaks generic OSC and should be easy to integrate elsewhere. The OSC interface is unopinionated and tries to expose GLSL language features as directly as possible (within reason); there are no magic builtin uniforms and no additional sampler info injected, for example. So far, the only niche feature is loading videos into 3D textures.

past threads:

https://merveilles.town/@s_ol/111393188626738475
https://merveilles.town/@s_ol/114162299639261929
https://merveilles.town/@s_ol/114973650565149124

did a poorly implemented retro tunnel scroller effect while looking for a new aesthetic for the next gig and somehow it ended up in spooky october colors ;)

mosquito drawn by bandmate @mgpjm for our last concert at the "giovani zanzare" festival ("young mosquitoes")

apart from the mosquito sprite, the only other asset is this single hexagon wall segment that I whipped up in Inkscape in 5min and want to replace with more hand-drawn scans instead :)

I've been migrating glsl-view (Zig + OpenGL + GLSL) to a new wgsl-based toolkit (Rust + wgpu/Vulkan + WESL/WGSL) that heavily relies on (a fork of) texture-share-vk to share textures between separate processes directly on the GPU.

The new toolkit splits the monolithic "shader host that can load images, videos, streams as textures" up into a set of tools like

- tsv-view: show a texture-share-vk texture in a window
- tsv-video-stream: stream any ffmpeg source into a 2D TSV texture
- tsv-video-buffer: load N frames of any ffmpeg stream into a 3D TSV texture
- wgsl-render: receive a shader, uniform values and TSV texture bindings and render it to another TSV texture

So using these tools I basically set up a multimedia pipeline/graph by spawning processes and sending OSC messages from #alv.

TSV works fine but is a little temperamental; eventually I'd like to move to PipeWire with DMABuf as the media interconnect, and maybe do OSC-over-PipeWire as well (but I haven't gotten the former to work yet even in isolation, and the latter isn't supported in PipeWire yet).

What's pretty neat is that with http://wesl-lang.dev I now have a module system! That means I can trivially consume https://lygia.xyz :)

I'm also finding some interesting metaprogramming techniques between #alv and the modules, like basic polymorphism in this raymarching library:

pic one is the user code, pic two is the library implementation.

First the user declares the scene sample type they want to use by defining a shader module that contains a "Sample" type, an initial value, and a helper function that extracts the "distance" float from that type. This means they have complete autonomy over what material data they need (e.g. material identifiers, surface UVs, or whatever else).

That shader module can be passed to an alv function that returns another shader module implementing common distance-field operations (union, difference, etc.) on top of these primitives.

Now the user can define a second shader module that contains the scene function (using the SDF utils), and pass that back to the library, which uses it to provide "castRay" and "calcNormal"; the primary user module can include these to render the scene.
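A rough sketch of what the user-facing side of this pattern could look like (all type, function, and module names here are illustrative, not the actual library API):

```wgsl
// user module: declare the scene sample type (names are made up)
struct Sample {
    distance: f32,
    material: u32,   // user-chosen payload: material id, UVs, whatever
}

// initial value used at the start of a march
const INITIAL: Sample = Sample(1e20, 0u);

// helper the library uses to extract the signed distance
fn get_distance(s: Sample) -> f32 {
    return s.distance;
}

// second user module: the scene function, written against the SDF
// utils module the library generated from the declaration above
fn scene(p: vec3<f32>) -> Sample {
    let sphere = Sample(length(p) - 1.0, 1u);
    let ground = Sample(p.y + 1.0, 2u);
    return sdf::opUnion(sphere, ground);   // hypothetical generated util
}
```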

Unlike GLSL #define-type approaches, you could even instantiate multiple scenes with different result types if you wanted to.

#theWorkshop

spinny cube go brrr

bit more detail on the #alv part of this: inside a $shader"…" you can interpolate values of different types:

Numbers, Booleans, and arrays of (arrays of) these are assembled into a uniform struct (one per shader). Textures and samplers are declared as globals with a unique name. The interpolation is replaced with a reference to that name in both cases.
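As a rough illustration, the generated declarations might look something like this (struct layout, binding indices, and names are made up for the sketch):

```wgsl
// hypothetical output for a shader that interpolates two numbers,
// a boolean, and one texture + sampler
struct Uniforms {
    speed: f32,
    scale: f32,
    invert: u32,   // bool can't live in a uniform buffer, so an int stands in
}
@group(0) @binding(0) var<uniform> u: Uniforms;

// interpolated textures/samplers become uniquely named globals
@group(0) @binding(1) var tex_0: texture_2d<f32>;
@group(0) @binding(2) var samp_0: sampler;
```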

When you interpolate a shader module, that module is added to the set of dependencies of the generated module, and the reference expands to the referenced module's fully qualified name. That happens to work both in an import statement (to give it a local name or import module members directly) and as an inline reference:

(def depmod $shader"fn double(v: f32) -> f32 { return v*2; }")

// import alias:
import $depmod as aliased_name;
let six = aliased_name::double(3);

// import specific members:
import $depmod::double;
let six = double(3);

// direct reference:
let six = $depmod::double(3);

Here's the alv code and the generated WESL modules for a simple example that distorts and draws an input texture:

#theWorkshop

I was a bit hesitant about the switch to WGSL initially, but despite the backlash that it got online, I'm actually mostly finding it a step up from GLSL in terms of ergonomics.

The only two major gripes I have wrt livecoding at the moment are not being able to assign to / modify swizzled vectors (`p.xy *= math::rotate2d(r)`) and needing to reassign function parameters to make them modifiable (as seen in the examples above).
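Concretely, the two workarounds end up looking something like this (assuming a lygia-style `rotate2d(angle) -> mat2x2<f32>` helper):

```wgsl
fn warp(p_in: vec3<f32>, r: f32) -> vec3<f32> {
    // gripe 2: function parameters are immutable, so they have to be
    // copied into a local var before they can be modified
    var p = p_in;

    // gripe 1: `p.xy *= rotate2d(r);` is rejected, since swizzled
    // vectors aren't assignable; the vector has to be rebuilt instead:
    p = vec3<f32>(rotate2d(r) * p.xy, p.z);

    return p;
}
```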

Next on the list would be the lack of function overloading, which makes for somewhat awkward conventions like `center`, `center2`, `center3`, etc. That's mostly annoying in combination with the fact that WESL lacks wildcard imports as of right now.
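For illustration, the combination of the two looks roughly like this (the module path and names are made up, not actual lygia paths):

```wgsl
// no overloading: one name per arity...
import lygia::space::center::{center2, center3};

// ...and no wildcard imports: every member must be named explicitly
fn demo(uv: vec2<f32>, p: vec3<f32>) -> vec2<f32> {
    let a = center2(uv);
    let b = center3(p);
    return a + b.xy;
}
```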