The next big things to do are building out an audio subsystem for the program so you can hear the patch, and filling in some important gaps in the editor, like being able to place new tiles.

One of the things I'm really excited about with the design of the compiler is that if you modify a patch while it's running, the oscillators don't reset, so you should be able to make a program while performing it :3
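for anyone curious how that can work, here's a toy sketch of the idea (my illustration, not the actual compiler): keep each oscillator's phase in a table keyed by a stable node id, so a recompile rebuilds the program but reuses whatever state survived the edit.

```python
import math

class Patch:
    """Toy recompile model: per-node oscillator phase lives in a table
    keyed by a stable node id, so recompiling reuses surviving state."""
    def __init__(self):
        self.phase = {}   # node id -> phase in cycles, survives recompiles
        self.nodes = []   # (node id, frequency in hz)

    def compile(self, nodes):
        # keep phase for ids that still exist, drop phase for deleted nodes
        self.phase = {nid: self.phase.get(nid, 0.0) for nid, _ in nodes}
        self.nodes = nodes

    def sample(self, dt):
        # advance every oscillator by dt and sum their outputs
        out = 0.0
        for nid, freq in self.nodes:
            self.phase[nid] = (self.phase[nid] + freq * dt) % 1.0
            out += math.sin(2.0 * math.pi * self.phase[nid])
        return out
```

editing the node list and recompiling leaves a running oscillator's phase untouched, which is the whole trick.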

I got the audio hooked up to it and everything seems to work right :3
I am now the operator, with my virtual calculator.
the screenshot doesn't show off how cool this is: when you're using this to edit constants in the patch, it updates their values as you input expressions (so, every time you press equals or input a second operator), and so you can hear the change to the patch as you are doing math
I don't feel like being mad at linux right now, so I'm gonna figure out how to record this another time
I've got this hooked up to my audio convolver (a simple slow morphing fm patch convolved with the sound of a metal water bottle with a little bit of water in it being perturbed) and it sounds mesmerizing. Mym described it as "birds in a pinball machine" :3
I've been listening to it for like an hour and I'm so zoned out now 😎
I made it so you can rearrange the nodes now :)

I added a way to actually place tiles now instead of hard coding them!

finally, after five weeks and change of building a visual programming language from scratch, i can now build a working patch starting from nothing :D

The natural thing to implement next would be save and load, but pygame doesn't provide anything for conjuring the system file io dialog windows (SDL3 has it, but pygame is SDL2 under the hood iirc). I'll worry about that another time I guess.

I'm probably going to go with an xml format so it's easy to extend. I want it to be easy to make custom controls in other applications, e.g. game engines, so this feels like the obvious choice. also, I'm one of the five people who likes xml.
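as a sketch of what I mean (all of the element and attribute names here are hypothetical, not a final schema), a patch round-trips through the stdlib easily:

```python
import xml.etree.ElementTree as ET

def save_patch(tiles, wires):
    # tiles: [(id, type, x, y)], wires: [(src_id, dst_id)] -- made-up schema
    root = ET.Element("patch", version="1")
    for tid, kind, x, y in tiles:
        ET.SubElement(root, "tile", id=str(tid), type=kind, x=str(x), y=str(y))
    for src, dst in wires:
        ET.SubElement(root, "wire", src=str(src), dst=str(dst))
    return ET.tostring(root, encoding="unicode")

def load_patch(text):
    root = ET.fromstring(text)
    tiles = [(int(t.get("id")), t.get("type"), int(t.get("x")), int(t.get("y")))
             for t in root.iter("tile")]
    wires = [(int(w.get("src")), int(w.get("dst"))) for w in root.iter("wire")]
    return tiles, wires
```

and because it's xml, another application can add its own elements and attributes without breaking this loader.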

i'll probably end up adding envelope generators and midi before i implement save and load, since it's not really worth using without those, but it really depends on which way the wind is blowing when i next work on this
I'm really excited because all of the hard stuff is done now. I mean, none of it was hard to implement or design, it's just that I made a bunch of architectural decisions up front and five weeks later the result is a thing that 1) works, 2) has stayed pretty low-complexity across the entire implementation, and 3) has obvious paths to implementation for all of the missing features and areas I want to improve.
and so further effort put into developing this will have an extremely favorable ratio of time spent actually making the program more useful vs wallowing in tech debt
I made a short sequence of boops entirely out of oscillators and arithmetic. There's no sequencer or envelope generator. #mollytime
and this is the patch, or at least most of it. it probably looks complicated, but I did not put a lot of thought into it
from my experimentation so far since I got all this working end-to-end, I think modality worked as a useful convenience for standing up a visual programming language quickly, but it adds too much friction to this one. In particular, when making patches I want to be able to alternate between placing/moving stuff and connecting stuff, so those two modes need to be merged for sure.
I'm not really making good use of multitouch because I've been developing this with a mouse. I'd like to be able to touch two things to hear what they sound like connected or disconnected, and then double tap to connect or disconnect when a quick connection is possible. Mouse mode will need to work differently.
Another verb I would like to implement is disconnecting a wire and connecting a new wire in the same frame. The current flow for that is too long even with a touch screen, and I think anything that isn't a single action would be.
Being able to suspend/resume updating the patch that is playing while making edits is also a thing I want to implement for similar reasons, but I think being able to splice parts quickly is useful in different ways.
I added a flip/flop instruction. It's mostly a toy, but it's helped me clarify how I want to handle envelopes.
here's what that patch sounds like
I'm sorry everyone but I have to confess to a sin. See that chain of flip flops up there two posts ago? That's a number. That's a number where every bit is a double precision float 😎
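if you want to see how a chain of flip flops adds up to a number, here's a toy reconstruction (assuming rising-edge toggle flip flops with each stage clocked by the inverse of the previous bit; mollytime's actual trigger semantics may differ):

```python
class ToggleFlipFlop:
    """Toggles its output on a rising edge of its clock input.
    The state is a float, because every signal here is a float."""
    def __init__(self, idle=0.0):
        self.q = 0.0
        self.prev = idle  # what the clock input rests at between edges

    def tick(self, clock):
        if self.prev <= 0.0 < clock:  # low -> high transition
            self.q = 1.0 - self.q
        self.prev = clock
        return self.q

def make_chain(n):
    # later stages are clocked by the *inverse* of the previous bit, which
    # idles high, so initialize prev accordingly to avoid a spurious edge
    return [ToggleFlipFlop(idle=0.0 if k == 0 else 1.0) for k in range(n)]

def ripple_count(stages, clock):
    # bit k flips when bit k-1 falls from 1 to 0: a binary up counter
    value, signal = 0, clock
    for k, ff in enumerate(stages):
        signal = 1.0 - ff.tick(signal)
        value += int(ff.q) << k
    return value
```

pulse the clock and the chain counts up in binary, one double precision float per bit 😎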
I came up with a pretty good approximation of a dial tone on accident while implementing a mixing node. #mollytime
and here's the patch for that sound
Also, now that I have tile types with multiple inputs and/or multiple outputs, I went and cleaned up manual connect mode to make it a little clearer what's going on.
today I added an envelope generator and some basic midi functionality. and then I made this absolute dog shit patch :3 #mollytime
it'll be a bit of work, but I want to eventually add a convolution verb to my synthesizer so I don't have to run the convolver as a separate program. I'll probably also end up rewriting it to use an FFT, because it would be nice to have longer samples and maybe run several convolutions simultaneously. That might be "get a better computer" territory though.
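for reference, the FFT trick is that convolution in the time domain is pointwise multiplication in the frequency domain, so a long impulse response costs a couple of transforms instead of one multiply-accumulate per impulse sample per output sample. a textbook sketch (pure python, nothing like production code):

```python
import cmath

def fft(x):
    # recursive radix-2 Cooley-Tukey; len(x) must be a power of two
    n = len(x)
    if n == 1:
        return x[:]
    even, odd = fft(x[0::2]), fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

def ifft(x):
    # inverse via the conjugate trick
    n = len(x)
    y = fft([v.conjugate() for v in x])
    return [v.conjugate() / n for v in y]

def fft_convolve(signal, impulse):
    # zero-pad both to a power of two >= len(signal) + len(impulse) - 1,
    # multiply spectra pointwise, transform back
    out_len = len(signal) + len(impulse) - 1
    n = 1
    while n < out_len:
        n *= 2
    a = fft(list(signal) + [0.0] * (n - len(signal)))
    b = fft(list(impulse) + [0.0] * (n - len(impulse)))
    return [v.real for v in ifft([x * y for x, y in zip(a, b)])][:out_len]
```

a real-time version would use overlap-add on fixed-size blocks, but the spectral multiply is the same.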
I had this really great ambient patch going where it was these washes of noise convolved with a short clip from the chrono cross chronopolis bgm, and I recorded 3 minutes of it, but audacity just totally mangled it. There's this part that sounded like a storm rumbling in the distance, but it was just the patch clipping, and I guess that's the danger zone for recording and playback.
so you're just going to have to pretend you're hovering gently in a vast warm expanse of sky and ocean with no obvious notion of down, as a storm roils off in the distance, during an endless sunset. the moment is fleeting, yet lasts forever
saving and loading is probably going to be the next highest priority feature now. you can do a lot with simple patches, but it would be nice to be able to save the more interesting ones
I've got saving and loading patches working 😎 that was easy
I've been very careful so far to make it impossible to construct an invalid program, but I'm eventually going to add a reciprocal instruction (1/x), and was pondering what to do about the (1/0) case. I had a great idea: instead of emitting infinity or crashing, what if it just started playing a sound sample as an easter egg? I'd have to add another register to the instruction to track the progress through the sample though.
I haven't picked out a good sample yet for the (1/0) case. A farting sound is too obvious and puerile. The most hilarious idea would be to start playing a Beatles song on the theory that if you messed up the copyright autotrolls would kill your stream, but I don't want the RIAA coming after my ass.
I might just have the rcp component explode if you feed it a zero.
ok it's less sensational than stuffing an easter egg in here, but I realized I can simply have it refuse to tell you the answer if you try to divide by zero
I'm thinking of adding a tile type that could be used to construct sequences. It would take an input for its reset value, a clock input, and an input for its update value. It outputs the reset value until the clock goes from low to high, at which point it copies the current update value to its output. This would let you build stuff like repeating note sequences with two tiles per note + a shared clock source, and you could use flipflops to switch out sections.
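a minimal sketch of that tile's behavior (assuming a rising edge means the clock crossing zero going up; the names are mine):

```python
class HoldTile:
    """Emits its reset value until the first rising clock edge, then
    latches the current update input on each rising edge after that."""
    def __init__(self, reset_value):
        self.value = reset_value
        self.prev_clock = 0.0

    def eval(self, clock, update):
        if self.prev_clock <= 0.0 < clock:  # low -> high transition
            self.value = update
        self.prev_clock = clock
        return self.value
```

feed a note frequency into the update input and share one clock across a row of these, and you get a repeating sequence for two tiles per note.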
I made a crappy oscilloscope for debugging. Every frame it draws a new line for the magnitude of the sample corresponding to that frame. This doesn't show the waveform accurately, because it's just whatever the main thread happens to get when it polls for the latest sample; it's mostly meant to help verify that the magnitude matches expectations. A red line indicates clipping happened. #mollytime
And here's what an eccentric patch looks like in scope view. I'm using the RNG node as if it were an oscillator, but I didn't try to correct the range, so the samples are all positive. Also it clips like crazy. Anyways here's 3 minutes and 29 seconds of hard static to relax and study to. #mollytime
personally, I highly recommend the 3 minutes and 29 seconds of hard static
I reworked the oscilloscope so that it now shows the range of samples in the time frame between refreshes. Here's a short video of me playing around with inaudible low frequency oscillators. Also, I added a "scope" tile which you can use to snoop on specific tile outputs in place of the final output. I'm planning on eventually having it show the output over the regular background when connected, so you don't have to switch modes to see it.

The scope line is sometimes a bit chonky because it's time synced, and recording causes it to lag a bit. It normally looks a bit better than in the video.

You can also make it worse by taking successive screenshots :3
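the min/max reduction the reworked scope does amounts to something like this (my guess at the shape of it, not the actual drawing code):

```python
def scope_ranges(samples, columns):
    """Reduce an audio buffer to per-column (min, max) pairs, so each
    refresh draws the full range of the samples it covers instead of
    whatever single sample the main thread happened to poll."""
    step = max(1, len(samples) // columns)
    ranges = []
    for i in range(0, len(samples), step):
        chunk = samples[i:i + step]
        ranges.append((min(chunk), max(chunk)))
    return ranges[:columns]
```

drawing a vertical line per (min, max) pair is what makes fast oscillations show up as a filled band rather than aliased garbage.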

mym tells me this is ghost house music #mollytime
and here's most of that patch
one of the things I think is neat about programming is the way fundamental design decisions of the programming languages you use inevitably express themselves in the software you write. for javascript it's NaN poisoning bugs. for c++, it's "bonus data". unreal blueprints have excessive horizontal scrolling on a good day. blender geometry nodes evolve into beautiful deep sea creatures. as I've experimented with mollytime, I've noticed it has a wonderful tendency to summon friendly demons
I have all these big plans for using procgen to do wire management ( https://mastodon.gamedev.place/@aeva/114617952229916105 ) that I still intend to implement eventually, but I'm finding their absence unexpectedly charming at the moment
anyone up for some doof and boop music #mollytime
a good test to see if your speakers can emit 55 hz or not
I whipped up a version of the doof-and-boop patch that automates the alternating connections with a pile of flip flops and I've been grooving out to it for the last two hours 😎
I tried doing a live jam for some friends tonight: I had the drum machine patch going with a simple synth voice on top, which I intended to drive with the midi touch screen keyboard I made back in January, but I quickly found it to be unplayable due to latency somewhere in the system. I'm not sure exactly where, as I haven't attempted to measure yet.

I figure if I move the touch screen keyboard into the synth and cut out the trip through the midi subsystem that might make the situation a little better (again, depending on where the real problem is)

Anyways, it got me thinking, I think processing input events synchronously with rendering might be a mistake, and it would be better if I could process input events on the thread where I'm generating the audio.

I added an instruction called "boop" that implements a momentary switch, and built a small piano out of it, and also built a metronome in the patch. That is very playable, so the latency must have been something to do with having one program implement the midi controller and the other implement the synth. I think I had this same problem with timidity a while back, so maybe it's just linux not being designed for this sort of thing.
with one program acting as both the midi controller and the synthesizer it's pretty playable, but sometimes touch events seem to get dropped or delayed. that's probably my fault though, there's a lot of quick and dirty python code in that path, though I'm not sure why it only hitches occasionally
k, I've got tracy partially hooked up to mollytime (my synthesizer). with mollytime running stand-alone playing the drum machine patch (low-ish program complexity), the audio thread (which repeatedly evaluates the patch directly when buffering) wakes up every 3 milliseconds, each batch of patch invocations takes about 0.5 milliseconds to run, and most individual patch evals take about 3 microseconds each.
almost a third of each patch invocation is spent asking alsa if there are any new midi events, despite there being no midi devices connected to mollytime, but that's a smidge under a microsecond of time wasted per eval, and it's not yet adding up enough for me to care so whatever
anyways, everything so far is pointing toward the problem being in the UI frontend I wrote, which is great, because that can probably be solved by simply writing better code, unless the problem is like a pygame limitation or something
I switched pygame from initializing everything by default like they recommend to just initializing the two modules I actually need (display and font) and the audio thread time dropped, probably because now it's not contending with the unused pulse audio worker thread pygame spun up. The typical patch sample eval is about 1.5 microseconds now, and a full batch is now under a millisecond.
tracy is a good tool for profiling C++, but the python and C support feels completely unserious :/
making building with cmake a requirement to use the python bindings and then only providing vague instructions for how to do so is an odd choice.
using cmake is a bridge too far, so I figure I'll just mimic what the bindings do. it turns out they're a bit over-engineered, since they're meant to adapt C++ RAII semantics to python decorators, and I don't want to pay the overhead for that when I could just have a begin and end function call to bind instead. that will probably still have to wrap new and delete like they do though, because there is no straightforward C call for this.
The relevant C APIs are provided as a pile of macros that declare static variables that track the context information for the begin and end regions. This seems to be on the theory that you would never ever ever want to wrap these in a function for the sake of letting another language use them in a general purpose way. The inner functions they call are all prefixed with three underscores, which is basically the programmer way of saying "trespassers will be shot without hesitation or question"
also there's this cute note in the docs saying that if you use the C API it'll enable some kind of expensive validation step unless you go out of your way to disable it, which you shouldn't do, but here's how 🙄

@aeva with a lot of objects in memory, occasional hitches sounds suspiciously like gc generation 2. If you haven't already...

import gc
gc.set_debug(gc.DEBUG_STATS)

Usually I escalate to gc.callbacks, then discover the cycles are created by a (networking-related, in my cases) library.

I once got it so I could've turned off gc but the pace of allocations was then low enough that we chose to tolerate hitches in case of future cycles.

gc.freeze() seemed a possible next step but I didn't try.

@aeva oh yeah and you will probably notice if you also have to escalate to gc.callbacks that the "elapsed" in gc.DEBUG_STATS is often a blatant lie. :/
@pteromys it could be GC, but there's other reasonable explanations, so I'm just measuring things right now to see what's actually going on
@aeva Linux audio is not so straightforward (it's a mess). Are the two programs communicating via jack?
@djpanini via alsa midi
@aeva mm ok. I am not so expert, since I usually only play with daws and synthesizers and don't program anything. But in your situation I would give jack a try even if it sounds counterintuitive. Probably pipewire-jack nowadays.
@aeva hmmn. I run realtime kernels and was pleasantly surprised that more recent (6 voice) supercollider patches ran without dropping frames with a QuNeo midi controller, on a raspi 3+ at 1 GHz. I CAN crush the box by holding 16 pads at the same time, though :-) That's with jackd running at rt priority -P98 at 44.1 kHz. On a 2.8 GHz dual core, it's rock solid. If you're running recent hardware, it can be made as stable as apple's core audio.
@poetaster shouldn't be necessary. the audio generation itself is very stable - it runs in a real time thread, and doesn't drop frames. the ui frontend is just hitching, which causes touch screen input events to be processed late, or to appear dropped if they're more rapid than the refresh rate (likely because successive touch start/stop events processed on the same frame may get merged)
@poetaster but also, while I'm still following the guiding light of scratching my itches and not worrying about commercial viability, it is still true that I would like to make this available to other people eventually, and requiring a custom wizard kernel or a specific linux distro is out of the question (it helps that I don't want that for myself either)
@poetaster regular linux should be capable of maintaining a 60 hz or better frame rate for normal video games though, so this should be a problem that's relatively "easy" to solve

@aeva

Boop, eh, do you have the voice of a gui tar
or a cat tar.?

@aeva Badoofboop could be a good name for ur whole app.
@aeva ok what about hotkeys for the side buttons? or do you just prefer using the buttons?
@cxxvii it's designed for a touch screen monitor, which I have, but it's an external monitor that I usually don't bother to go get first
@cxxvii I'm planning on making control schemes for game controllers as well as mouse+keyboard eventually, but I want to nail down the core functionality first before I do that
@aeva I think... maybe... you should make a tile that's a switch you can toggle on and off and it just cuts the path temporarily... and then maybe it has a control pin too?
and also, visualization of what is "powered" if that makes sense???
@efi I am thinking of doing something like that. Originally I was thinking of having a way of defining a control surface that's separate from the patch, but I've been thinking more lately that it is preferable to just put the controls in the patch and not hide the patch away
@aeva yesssss, knobs and sliders are easy~ 🎶
@aeva You're a mouse-jay!
@aeva
funny lines
straight as the crow flies
music time
@aeva this is a lot
@efi it's not as bad as it looks XD
@aeva but I have tiny brain
@efi its ok. store bought brains are valid too. maybe.
@aeva I have to go buy more?? bwuuuuuuh =w=
@aeva How can I tell if the fudge i have is loud or not?
@EndlessMason if you have a bass note and a treble note play at the same amplitude, the bass note will sound really quiet because human perception reasons, so to fix that, you drop in the loud fudge instruction to amplify or diminish the amplitude based on the note's expected frequency
@aeva Oh, i see, so it fudges the loudness, rather than being a fudge that is loud
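as a toy illustration of the idea (the curve and constants here are made up for the example, not the real instruction, and nothing like a proper equal-loudness contour):

```python
import math

def loud_fudge(amplitude, freq_hz, reference_hz=1000.0, strength=0.5):
    """Crudely compensate for human loudness perception: boost notes
    below the reference pitch and cut notes above it, scaled by how
    many octaves away from the reference they are."""
    octaves = math.log2(reference_hz / freq_hz)
    gain_db = strength * 6.0 * octaves   # ~3 dB per octave at strength=0.5
    return amplitude * (10.0 ** (gain_db / 20.0))
```

so a 110 hz bass note gets amplified and a 4 khz treble note gets diminished, and the two read as roughly similar loudness.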
@aeva it feels like something out of an extremely average platforming game from 1997