I added a flip/flop instruction. It's mostly a toy, but it's helped me clarify how I want to handle envelopes.
here's what that patch sounds like
I'm sorry everyone but I have to confess to a sin. See that chain of flip flops up there two posts ago? That's a number. That's a number where every bit is a double precision float 😎
I came up with a pretty good approximation of a dial tone by accident while implementing a mixing node. #mollytime
and here's the patch for that sound
Also now that I have tile types with multiple inputs and/or multiple outputs, I went and cleaned up manual connect mode to make it a little more clear what's going on.
today I added an envelope generator and some basic midi functionality. and then I made this absolute dog shit patch :3 #mollytime
it'll be a bit of work to do, but I want to eventually add a convolution verb to my synthesizer so I don't have to run it as a separate program. I'll probably also end up rewriting it to be a FFT because it would be nice to be able to have longer samples and maybe run several convolutions simultaneously. That might be "get a better computer" territory though.
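For reference, the core of an FFT convolution is pretty small. Here's a minimal sketch using numpy (the standard technique, not mollytime's actual code): transform both signals, multiply the spectra, transform back.

```python
# Sketch of FFT-based convolution (assumes numpy; not mollytime's code).
# Frequency-domain convolution is O(n log n) instead of O(n*m), which is
# what makes long impulse responses and multiple simultaneous convolutions
# affordable.
import numpy as np

def fft_convolve(signal, impulse):
    # Zero-pad to the full linear-convolution length to avoid the circular
    # wraparound a raw FFT multiply would cause.
    n = len(signal) + len(impulse) - 1
    size = 1 << (n - 1).bit_length()  # next power of two for FFT speed
    spectrum = np.fft.rfft(signal, size) * np.fft.rfft(impulse, size)
    return np.fft.irfft(spectrum, size)[:n]
```

A real-time version would use overlap-add on fixed-size blocks rather than transforming the whole signal at once, but the math is the same.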
I had this really great ambient patch going where it was these washes of noise convolved with a short clip from the chrono cross chronopolis bgm, and I recorded 3 minutes of it but audacity just totally mangled it. There's this part that sounded like a storm rumbling in the distance, but it was just the patch clipping, and that seems to be the danger zone for recording and playback I guess.
so you're just going to have to pretend you're hovering gently in a vast warm expanse of sky and ocean with no obvious notion of down, as a storm roils off in the distance, during an endless sunset. the moment is fleeting, yet lasts forever
saving and loading is probably going to be the next highest priority feature now. you can do a lot with simple patches, but it would be nice to be able to save the more interesting ones
I've got saving and loading patches working 😎 that was easy
I've been very careful so far to make it impossible to construct an invalid program, but I'm eventually going to add a reciprocal instruction (1/x), and was pondering what to do about the (1/0) case. Then I had a great idea: instead of emitting infinity or crashing, what if it just started playing a sound sample as an easter egg? I'd have to add another register to the instruction to track the progress through the sample though.
I haven't picked out a good sample yet for the (1/0) case. A farting sound is too obvious and puerile. The most hilarious idea would be to start playing a Beatles song on the theory that if you messed up the copyright autotrolls would kill your stream, but I don't want the RIAA coming after my ass.
I might just have the rcp component explode if you feed it a zero.
ok it's less sensational than stuffing an easter egg in here, but I realized I can simply have it refuse to tell you the answer if you try to divide by zero
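A minimal sketch of "refuses to answer" as I read it (my interpretation, not the actual mollytime implementation): on a zero input, the tile just holds its previous output instead of emitting anything new.

```python
# Hypothetical reciprocal tile that refuses to answer on divide-by-zero:
# it holds its last valid output rather than emitting infinity or crashing.
class RcpTile:
    def __init__(self):
        self.last = 0.0  # whatever the tile reports before any valid input

    def eval(self, x):
        if x != 0.0:
            self.last = 1.0 / x  # normal case: compute and remember
        return self.last         # zero case: repeat the previous answer
```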
I'm thinking of adding a tile type that could be used to construct sequences. It would take an input for its reset value, a clock input, and an input for its update value. It outputs the reset value until the clock goes from low to high, at which point it copies the current update value to its output. This would let you build stuff like repeating note sequences with two tiles per note + a shared clock source, and you could use flipflops to switch out sections.
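The described tile fits in a few lines. This sketch makes two assumptions of mine (the clock counts as "high" when above zero, and every rising edge re-latches the update input):

```python
# Sketch of the proposed sequencer/latch tile. Outputs the reset value
# until the first rising clock edge; each rising edge copies the current
# update input to the output.
class LatchTile:
    def __init__(self):
        self.prev_clock = 0.0
        self.latched = None  # nothing latched until the first rising edge

    def eval(self, reset, clock, update):
        if self.prev_clock <= 0.0 < clock:  # low -> high transition
            self.latched = update           # latch the update input
        self.prev_clock = clock
        return reset if self.latched is None else self.latched
```

Two of these per note plus a shared clock gives you a repeating sequence, as the post describes.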
I made a crappy oscilloscope for debugging. Every frame it draws a new line for the magnitude of the sample corresponding to that frame. This does not show the wave form accurately because it's just whatever the main thread happens to get when it polls for the latest sample. This is mostly meant to help check the magnitude to verify that it matches expectations. A red line indicates clipping happened. #mollytime
And here's what an eccentric patch looks like in scope view. I'm using the RNG node as if it were an oscillator, but I didn't try to correct the range, so the samples are all positive. Also it clips like crazy. Anyways here's 3 minutes and 29 seconds of hard static to relax and study to. #mollytime
personally, I highly recommend the 3 minutes and 29 seconds of hard static
I reworked the oscilloscope so that it now shows the range of samples in the time frame between refreshes. Here's a short video of me playing around with inaudible low frequency oscillators. Also, I added a "scope" tile which you can use to snoop on specific tile outputs in place of the final output. I'm planning on eventually having it show the output over the regular background when connected, so you don't have to switch modes to see it.
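The min/max idea boils down to something like this per scope column (illustrative names, not mollytime's actual code):

```python
# Sketch of the reworked scope: instead of plotting one polled sample per
# frame, summarize everything that played since the last refresh as a
# vertical min-to-max line, flagged red if it clipped.
def scope_column(samples, clip_level=1.0):
    lo, hi = min(samples), max(samples)
    clipped = hi >= clip_level or lo <= -clip_level
    return lo, hi, clipped  # draw a line from lo to hi; red when clipped
```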

The scope line is sometimes a bit chonky because it's time synced, and recording causes it to lag a bit. It normally looks a bit better than it does in the video.

You can also make it worse by taking successive screenshots :3

mym tells me this is ghost house music #mollytime
and here's most of that patch
one of the things I think is neat about programming is the way fundamental design decisions of the programming languages you use inevitably express themselves in the software you write. for javascript it's NaN poisoning bugs. for c++, it's "bonus data". unreal blueprints have excessive horizontal scrolling on a good day. blender geometry nodes evolve into beautiful deep sea creatures. as I've experimented with mollytime, I've noticed it has a wonderful tendency to summon friendly demons
I have all these big plans for using procgen to do wire management ( https://mastodon.gamedev.place/@aeva/114617952229916105 ) that I still intend on implementing eventually, but I'm finding their absence unexpectedly charming at the moment
aeva (@[email protected])

Attached: 1 image I put it to the test by translating my test patch from my python fronted to a hypothetical equivalent node based representation without regard for wire placement and then stepped through my wire placement rules by hand. I'm very pleased with the results, it's a lot clearer to me what it does than the python version is at a glance.

Gamedev Mastodon
anyone up for some doof and boop music #mollytime
a good test to see if your speakers can emit 55 hz or not
I whipped up a version of the doof-and-boop patch that automates the alternating connections with a pile of flip flops and I've been grooving out to it for the last two hours 😎
I tried doing a live jam for some friends tonight where I had the drum machine patch going along with a simple synth voice on top of that which I intended to drive with the midi touch screen keyboard I made back in January, but I quickly found that to be unplayable due to latency in the system. I'm not sure exactly where as I haven't attempted to measure yet.

I figure if I move the touch screen keyboard into the synth and cut out the trip through the midi subsystem that might make the situation a little better (again, depending on where the real problem is)

Anyways, it got me thinking, I think processing input events synchronously with rendering might be a mistake, and it would be better if I could process input events on the thread where I'm generating the audio.
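The shape of that would be something like the following sketch, using the standard library's thread-safe queue (illustrative, not the actual mollytime code): the UI thread never touches the patch directly, it just pushes events that the audio thread drains at the top of each buffer fill.

```python
# Sketch: hand input events from the UI thread to the audio thread, so the
# patch sees them as soon as the next buffer is generated instead of
# waiting on the render loop.
import queue

events = queue.Queue()

def ui_thread_on_key(note, velocity):
    events.put(("note_on", note, velocity))  # cheap, never blocks audio

def audio_thread_fill_buffer(patch):
    # apply all pending input before evaluating any samples
    while True:
        try:
            patch.handle(events.get_nowait())
        except queue.Empty:
            break
    # ... then evaluate the patch for each sample in the buffer
```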

I added an instruction called "boop" that implements a momentary switch, and built a small piano out of it, and also built a metronome in the patch. That is very playable, so the latency must have been something to do with having one program implement the midi controller and the other implement the synth. I think I had this same problem with timidity a while back, so maybe it's just linux not being designed for this sort of thing.
with one program being both the midi controller and the synthesizer it's pretty playable, but sometimes touch events seem to get dropped or delayed. that is probably my fault though, there's a lot of quick and dirty python code in that path, though I'm not sure why it only hitches occasionally
k, I've got tracy partially hooked up to mollytime (my synthesizer). with mollytime running stand-alone playing the drum machine patch (low-ish program complexity), the audio thread (which repeatedly evaluates the patch directly when buffering) wakes up every 3 milliseconds, each batch of patch invocations takes ~0.5 milliseconds to run, and most patch evals take about 3 microseconds each.
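Those figures are self-consistent, which is worth a quick back-of-envelope check (the 48 kHz sample rate is my assumption; the post doesn't state it):

```python
# Sanity check on the profiling numbers from the post.
sample_rate = 48_000   # assumed; not stated in the post
wakeup_s = 0.003       # audio thread wakes every 3 ms
batch_s = 0.0005       # a full batch of evals takes ~0.5 ms
eval_s = 0.000003      # one patch eval takes ~3 us

samples_per_wakeup = sample_rate * wakeup_s  # ~144 samples to produce
evals_per_batch = batch_s / eval_s           # ~167 evals fit in a batch
headroom = wakeup_s / batch_s                # ~6x real-time headroom
```

~167 evals per batch against ~144 samples per wakeup lines up, with about 6x headroom before the audio thread would miss its deadline.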
almost a third of each patch invocation is spent asking ALSA if there are any new midi events, despite there being no midi devices connected to mollytime, but that's a smidge under a microsecond of time wasted per eval, and it's not yet adding up enough for me to care, so whatever
anyways, everything so far is pointing toward the problem being in the UI frontend I wrote, which is great, because that can probably be solved by simply writing better code, unless the problem is something like a pygame limitation
I switched pygame from initializing everything by default like they recommend to just initializing the two modules I actually need (display and font) and the audio thread time dropped, probably because now it's not contending with the unused pulse audio worker thread pygame spun up. The typical patch sample eval is about 1.5 microseconds now, and a full batch is now under a millisecond.
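The change amounts to swapping pygame's catch-all init for its documented per-module init calls (the module list here is just the two the post mentions):

```python
def init_minimal():
    import pygame
    # Instead of pygame.init(), which initializes every subsystem
    # (including the mixer and its audio worker thread), bring up only
    # what the frontend actually uses:
    pygame.display.init()
    pygame.font.init()
```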
tracy is a good tool for profiling C++, but the python and C support feels completely unserious :/
making building with cmake a requirement to use the python bindings and then only providing vague instructions for how to do so is an odd choice.
using cmake is a bridge too far, so I figure I'll just mimic what the bindings do. it turns out they're a bit over-engineered, as they're meant to adapt C++ RAII semantics to python decorators, and I don't want to pay the overhead for that when I could just bind a begin and an end function call instead. it'll probably still have to wrap new and delete like they do, though, because there's no straightforward C call for this.
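The begin/end shape would look something like this on the python side; zone_begin/zone_end are hypothetical stand-ins for whatever the native profiler ends up exposing, backed here by a list so the sketch runs on its own:

```python
# Sketch: explicit begin/end profiling calls instead of RAII scope guards,
# wrapped in a context manager so python code stays tidy.
from contextlib import contextmanager

trace = []  # stand-in backend; the real calls would cross into C++

def zone_begin(name):
    trace.append(("begin", name))

def zone_end():
    trace.append(("end",))

@contextmanager
def profiled(name):
    zone_begin(name)
    try:
        yield
    finally:
        zone_end()  # always close the zone, even on exceptions
```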
The relevant C APIs are provided as a pile of macros that declare static variables that track the context information for the begin and end regions. This seems to be on the theory that you would never ever ever want to wrap these in a function for the sake of letting another language use them in a general purpose way. The inner functions they call are all prefixed with three underscores, which is basically the programmer way of saying "trespassers will be shot without hesitation or question"
also there's this cute note in the docs saying that if you use the C API it'll enable some kind of expensive validation step unless you go out of your way to disable it, which you shouldn't do, but here's how 🙄
C++ RAII semantics are so universally applicable to all programs (sarcasm) that even the C++ standard library provides alternatives to scope guard objects if you don't want to use them. come on man

tragic. pygame.display.flip seems to always imply a vsync stall if you aren't using it to create an opengl context, and so solving the input events merging problem is going to be a pain in the ass. it is, however, an ass pain for another night: it is now time to "donkey kong"

EDIT: pygame.display.flip does not imply vsync! I just messed up my throttling logic :D huzzah!

😎
ideally the ordering of those events would be represented in the patch evals but they just happen as they happen. It's plenty responsive with a mouse though. I'll have to try out the touch screen later and see if the problem is basically solved now or not.
ok I can go ape shit with a mouse and it works perfectly and like I mean double kick ape shit, but the dropped events problem still persists with the touch screen ;_;
it's possible that I'm doing something wrong with the handling of touch events and it's something i can fix still, but now i'm wondering if there's a faster way to confirm if "going ape shit" playing virtual instruments is a normal intended use case for touch screen monitors by their manufacturers and the people who wrote the linux infrastructure for them, or if they all were going for more of a "tapping apps" vibe
and by "going ape shit" i just mean gently flutter tapping two fingers rapidly to generate touch events faster than i could with just one finger, such as to trill between two notes or repeatedly play one very quickly. doing that on one mouse button to generate one note repeatedly very fast feels a lot more aggressive
if i do the same on my laptop's touch pad (which linux sees as a mouse) the same thing happens, but if I really go ham on it such that it engages a click with the literal button under the touch pad, then the events all go through just fine. this is why i'm starting to think there's some filtering happening elsewhere
i wonder if there's a normal way to ask linux to let me raw dog the touch screen event stream without any gesture stuff or other filtering, or if this sort of thing isn't allowed for security and brand integrity reasons

someone brought up a really good point, which is that depending on how the touch screen works, it may be fundamentally ambiguous whether or not rapidly alternating adjacent taps can be interpreted as one touch event wiggling vs two separate touch events

easy to verify if that is the case, but damn, that might put the kibosh on one of my diabolical plans

ok great news, that seems to not be the case here. I made little touch indicators draw colors for what they correspond to, and rapid adjacent taps don't share a color.

bad news; when a touch or sometimes a long string of rapid touches is seemingly dropped without explanation, nothing for them shows up in evtest either D: does that mean my touch screen is just not registering them?

side note: if you try to use gnome's F11 shortcut to take a screen recording, it doesn't give you the option to pick which screen, and it just defaults to recording the main screen. i assume this too is security

I want to add a pair of instructions for switching between (0, 1) range and (-1, 1) range. so

u = s * .5 + .5

and

s = u * 2 - 1

what are good short names for these operations?

EDIT: can be up to three words, less than 7 letters each preferably closer to 4 each
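Written out as code, with placeholder names (the naming is the unsolved part; "unipolar"/"bipolar" here are just stand-ins):

```python
# The two proposed range-conversion instructions, plus the round-trip
# property that makes them a useful pair.
def to_unipolar(s):  # (-1, 1) -> (0, 1)
    return s * 0.5 + 0.5

def to_bipolar(u):   # (0, 1) -> (-1, 1)
    return u * 2.0 - 1.0
```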

@aeva boolToNormal and vice-versa? 
@aeva unit is a sensible name for one side, but the other uhhh full scale maybe?
@aeva actually unit could well be -1 to 1 too i hate this
@halcy I would have added these instructions weeks ago but I'm at a loss of what to name them lol

@aeva yeah I feel that problem even having just briefly thought about it and read the replies. oof.

Anyways it was half a joke but actually ac and dc don’t even feel that terrible

@aeva ultra comedy option make the logic and lerping also operate on -1 to 1 input ranges

@halcy so far there's no logic gates yet, and everything that needs a clock signal is set up such that I can drive it with a square wave.... and I've considered reworking lerp to use this, and I'm also considering having samplers take 0 to 1 for the forward range and -1 to 0 for an inverted range for reasons

so you're not far off the mark, it's just that the 0-1 range is still mathematically useful though

@aeva so like unit2fullscale / fullscale2unit or something to that end

now I‘m curious actually, do you have a type system of some kind for these?

@aeva you need to uhhhhhhhhhhhh `apt remove libannoyaevatremendously`

@aeva The new catch phrase for anything that breaks?

Computer overheating? Must be security.

@jkaniarz a bit generous to assume it's broken and not just over designed or straight up incompetent
@aeva The computer or the security system? Where I work, it’s both.
@aeva libinput or evdev? Pretty sure I've touched the rawness of touchpad events through there.
@aeva oh, or, are you thinking a layer below those? hmmmm...
@aud i have no idea how this works below SDL, so probably one of those two things you just said. i don't want to go so low i need to write my own drivers that sounds like a bad time
@aeva ah okay, then yeah! I think evdev is what you want, maybe. You can even just straight up read the events directly into the terminal with it.
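A sketch of what that looks like with the python-evdev package (the device path is hypothetical; assumes the package is installed and you have permission to read the device node):

```python
def dump_raw_touches(path="/dev/input/event5"):  # hypothetical device node
    import evdev  # python-evdev package, assumed installed
    dev = evdev.InputDevice(path)
    for event in dev.read_loop():              # blocking raw event stream
        if event.type == evdev.ecodes.EV_ABS:  # absolute-position events,
            print(evdev.categorize(event))     # below any gesture layer
```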
@aeva @aud fortunately (????????) you can use perf probe to put a "tracing probe" in one or more of pretty much any function inside your kernel and its loaded modules and output a variety of values that local variables may hold, so ...
you don't need to write a driver of your own at least! no compilation necessary! just look for a suitable spot in the driver you already have where the events are at the desired amount of raw-ness.
but from the rest of the thread it sounds like you might have found out what you wanted to know?
@aeva lmk if you find a way, i might need to figure that out myself eventually
@aeva ioctls on raw device descriptors? or do you mean like reading the stream?
@agentultra i like your funny words
@aeva You are deeper beyond where I code, so forgive me if this is dumb. I believe mobile touch screens have multi finger tracking. So: If touch 1 and touch 2 are too fast or similar finger recognition, the touch event array could be stuck at length 1 and simply move the existing event to position 2, like a drag/teleport instead of reading a second touch. Might be that you are tapping faster than the touch device can track, or that both fingers are being read as the same one? Just thoughts.
@aeva I'm thinking especially if I use "on click" in unity engine for example, it responds to tap 1 but might miss fast DDR/Mario Party tapping events because it still thinks the first tap hasn't left the screen. Because the taps might be too fast for the framerate/updating of the control array. Musical play speed probably bumps against this. I know it's a problem for a lot of the rhythm mobile stuff and why a lot of them use holds instead of like, piano run inputs.
@aeva On the other hand, I do have one of those USB drum pad input things. And that has never lost a tap, far as I can tell. Though that probably works very differently to a typical touch screen device.
@darkgriffin this makes me think that if a tap suddenly warps by some distance, it could be interpreted as a release plus a second tap on the same frame, but still supporting drag gestures might make that difficult
@darkgriffin oh dang that's a really good point
@aeva wait what profiling tool is this
GitHub - wolfpld/tracy: Frame profiler

@glyph tbh I'd rather be using pix but *gestures at all this linux* you work with what you've got
@aeva my interest is (obviously?) piqued by the fact that this directly supports python, which I don't think pix does? in any case: thanks
@glyph I would say its support for python is greatly overstated. You'd have a better time using PIX with python via the ffi imo.
aeva (@[email protected])

making building with cmake a requirement to use the python bindings and then only providing vague instructions for how to do so is an odd choice.

Gamedev Mastodon
@glyph tracy is so committed to C++ RAII style scope guards that it's really obnoxious to map it to the semantics of other languages. I ended up improvising a call trampoline to make it work without their official bindings, but that can only measure at function level granularity (which as it happens the official bindings also are limited to function level granularity)
@aeva to be fair, if you ask the C++ standard library whether it prefers chocolate or vanilla ice cream, it asks for pistachio, toothpaste and burnt rubber
@rygorous @aeva now with Reflection, we can finally move sizeof from the core language into the standard library, where it belongs.
@GyrosGeier @aeva or compiler-implemented std::same_size_as trait and Peano arithmetic. ez
@aeva my favorite example being <random>, which answers "can we have a standard RNG that sucks less" with a resounding "here's like 10 that suck more, but in different ways, and by the way FUCK YOU for asking"
@aeva side quest unlocked:
Create C API profiler
Accept/Deny
@pupxel Deny. I'm just going to make a trampoline where python calls a C++ function to set up a profiling scope, and then that C++ function immediately calls a python callback for the thing to be profiled, and routes the return values back to the original caller
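The trampoline's shape, sketched in pure Python (in the real version, native_profiled_call would be a C++ function holding a Tracy zone open around the callback; the name is a placeholder):

```python
# Pure-Python sketch of the call trampoline. The C++ version would open a
# profiling zone on entry and close it via the scope guard on return.
def native_profiled_call(name, callback, *args):
    # C++ side: a scope guard opens a zone named `name` here.
    result = callback(*args)  # bounce straight back into Python
    # ...the zone closes when the guard destructs, on return.
    return result
```

This is why it can only measure at function-level granularity: the profiled region is exactly one callback invocation.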