please enjoy: my Wasm-hosted, Wasm-targeting build of Clang/Clang++/LLD: a self-contained, 25 MiB (gzipped) pure function
https://www.npmjs.com/package/@yowasp/clang
here's how you can use it to "just-in-time" compile and run any C (or C++) code you'd like:
you can also compile executables using printf(), fopen(), etc; indeed in theory you could bootstrap (recompile) LLVM using this very compiler! although CMake makes it fairly annoying to figure out how to run it

this C compiler is really fast! it can compile and link a simple C program in ~65 ms on my machine. (this involves spawning three Wasm "processes")

on the same machine, running the same command with a very similar Clang build natively takes ~80 ms.

this is fast enough for update-as-you-type live coding!

@whitequark Be careful, someone's gonna try to compile the linux kernel with that in hope it's gonna be faster
@whitequark Catherine I’m worried if you don’t stop we’ll discover it’s webassembly all the way down
@s0 I'm actually more focused on building _up_ than _down_; I did this so that I could compile CXXRTL simulation code and immediately run it in the browser
@whitequark well consider me deeply impressed & respectful as always.
@whitequark @s0 I look forward to the day I can get steam achievements for formally verifying an AXI interconnect.
@whitequark @s0 and now i'm picturing a character designer screen where you get to choose AMBA, Avalon, or Wishbone instead of a gender
@azonenberg @whitequark @s0 What about an alternative universe with (widened) ISA/68000/maybe 6502 buses used internally?

@snowfox @whitequark @s0 I don't think any of those support multiple pipelined transactions so it would be super slow.

Would be interesting to imagine what a pipelined variant would look like though

@s0 move over string theory,

@s0 @whitequark not bad, actually. Build once, optimise closer to the end user CPU

@mo @s0 @whitequark hopefully the WebAssembly Modules work will make this a more common way of providing interoperable libraries with well-defined interfaces.
@whitequark This is awesome 😀
@whitequark this is cursed beyond belief, w-why?!
@k i've compiled the entire FPGA toolchain (synthesis and P&R) to WebAssembly, and most of the verification options (whether Verilator or CXXRTL) output C++. so you need to be able to compile C++ code to simulate RTL
@whitequark I don't usually follow people from mastodon social because the instance is unmoderated but you've just earned it, holy hell.
@k see https://yowasp.org for details
YoWASP

Unofficial WebAssembly-based packages for Yosys, nextpnr, and more

@whitequark this is great

import { runClang } from "npm:@yowasp/clang";

// Tagged template: c`...C source...` compiles the source to a Wasm module,
// instantiates it, and returns a JS function wrapping the exported `run()`.
async function c(statics, ...rest) {
  const { "a.out": wasm } = await runClang(
    [
      "clang",
      "-nostartfiles",
      "-Wl,--no-entry",
      "-Wl,--export=malloc",
      "-Wl,--export=free",
      "input.c",
    ],
    {
      "input.c":
        String.raw({ raw: statics }, ...rest) +
        "\n" +
        'char* __Run(char* input) __attribute__((export_name("__Run"))) { return run(input); }\n',
    },
  );
  const module = await WebAssembly.compile(wasm);
  // Stub out every WASI import with "errno 52" (ENOSYS: function not implemented).
  const enosys = () => 52;
  const instance = await WebAssembly.instantiate(module, {
    wasi_snapshot_preview1: new Proxy({}, { get() { return enosys; } }),
  });
  return (...args) => {
    // Marshal the arguments in as a NUL-terminated JSON string...
    const encoded = new TextEncoder().encode(JSON.stringify(args));
    const inp = instance.exports.malloc(encoded.length + 1); // +1 for the NUL terminator
    const memory = new Uint8Array(instance.exports.memory.buffer);
    memory.set(encoded, inp);
    memory[inp + encoded.length] = 0;
    const ptr = instance.exports.__Run(inp);
    instance.exports.free(inp);
    // ...and unmarshal the NUL-terminated result (if any) the same way.
    if (ptr) {
      const u8 = new Uint8Array(instance.exports.memory.buffer, ptr);
      const str = new TextDecoder().decode(u8.subarray(0, u8.indexOf(0)));
      instance.exports.free(ptr);
      return JSON.parse(str);
    }
    return null;
  };
}

const run = await c`
  #include <string.h>
  #include <stdio.h>

  char* run(char* input) {
    char* out = 0;
    input[strlen(input) - 1] = 0; // drop the closing ']' of the JSON argument array
    asprintf(&out, "\"hello, %s", input + 2); // skip the leading '["'; the trailing '"' closes the JSON string
    return out;
  }
`;

console.log(run("world")); // → hello, world