Say you are working with eight sensors on the body. Is there a (free/open source) library for live processing of that data in different ways, like taking rolling averages, measuring changing energy levels, etc.? Gesture classification as well, but also the basic stuff. There must be standard ways of doing this in e.g. Pure Data?
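(The "basic stuff" asked about here, rolling averages and energy levels, is a few lines of numpy. A hedged sketch, not taken from any library mentioned in the thread; the fake signal and window size are made up for illustration:)

```python
import numpy as np

def rolling_mean(x, w):
    """Centered rolling average over a window of w samples (same-length output)."""
    kernel = np.ones(w) / w
    return np.convolve(np.asarray(x, dtype=float), kernel, mode="same")

def rolling_energy(x, w):
    """Short-time energy: rolling mean of the squared signal."""
    return rolling_mean(np.asarray(x, dtype=float) ** 2, w)

# Fake accelerometer channel: 100 quiet samples, then 100 samples of movement
rng = np.random.default_rng(0)
sig = np.concatenate([0.1 * rng.standard_normal(100),
                      2.0 * rng.standard_normal(100)])
energy = rolling_energy(sig, 20)
print(energy[:100].mean(), energy[100:].mean())  # burst has much higher energy
```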
@yaxu I don't think I've seen "higher level" stuff like whole classification tools, other than things tied to specific hardware (and not F/OSS). It's task-specific enough that people tend to just roll their own thing. There are good base libraries such as labstreaminglayer, though.
@yaxu (a tool tied to specific hardware: https://biosignalsplux.com/products/software/opensignals.html. Despite the name it is, of course, not open whatsoever, and it has a few plugins to analyze specific kinds of data that cost... a lot of money)
@yaxu In general, I'd recommend getting familiar with scipy and an ML library of your choice (sklearn / pytorch / tensorflow / god knows what), if you aren't already, and then using those + labstreaminglayer to build up what you need. Or maybe someone has built something based on LSL that matches your problem roughly, and you can go from that - i.e. there are lots of tools for EEG stuff specifically already. I don't know the motion side as well, but I know colleagues did record motion data through it.
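A minimal sketch of that scipy/sklearn recipe: window the sensor stream, extract simple features per window, and train an off-the-shelf classifier. In a live setup the windows would come from a pylsl `StreamInlet` (via `pull_sample()`); here synthetic data stands in so the example is self-contained. The feature choice, window size, and labels are all assumptions for illustration, not anything from the thread:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def features(window):
    # Per-channel mean and standard deviation of one window of samples
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

rng = np.random.default_rng(42)
n_channels = 8  # e.g. eight body sensors

def fake_windows(amplitude, n=50):
    # Synthetic stand-in for windows pulled off an LSL stream
    return [amplitude * rng.standard_normal((64, n_channels)) for _ in range(n)]

X = np.array([features(w) for w in fake_windows(0.1) + fake_windows(1.0)])
y = np.array([0] * 50 + [1] * 50)  # 0 = "still", 1 = "moving"

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict([features(1.0 * rng.standard_normal((64, n_channels)))]))
```

Swapping the fake windows for real pulled samples is the only change needed to run this live.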
@halcy great thanks for the pointers!
@yaxu (I _should_ mention that there _is_ a graphical tool, if you prefer that, with wide industry use, called "labview". All my experience with it suggests that you want to stay very far away from it - even a simple HRV detector was a huge pain to make and nigh undebuggable - but maybe it's a good keyword to look for if there's anything similar but better, if that's your jam.)
@halcy @yaxu
Since you mention labview... I'd also like to throw in vvvv, see http://visualprogramming.net where we focus on all things IO, 3D and more in a modern visual live programming environment.
@yaxu
Marije Baalman uses her own microcontroller, OSC and SuperCollider. Worth having a sniff around!
@wendy @yaxu +1 for Marije! Ping @nescivi
@wendy @yaxu @nescivi she’s currently working on a book detailing all kinds of interaction strategies, due out this fall: https://justaquestionofmapping.info/
@yaxu pinging @kf (who I think uses body sensors with supercollider)
@mathr @yaxu ping ping! Yes, indeed, that's been our party trick since 2014 or so. We're using #minibee accelerometers strapped to a dancer's wrists or ankles. I haven't found any standard libraries, but I've been experimenting with the GRT toolkit for basic posture recognition (https://gitlab.com/kflak/minibee-posture-recognition), in addition to a bunch of classes in #supercollider for working with the data to trigger/control events (https://gitlab.com/kflak/minibeeutils).
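(Posture recognition of the kind described above boils down to comparing incoming accelerometer frames against trained examples. A toy nearest-centroid sketch of that idea in Python - explicitly not the GRT algorithm or the SuperCollider classes linked, and the frames and labels are invented for illustration:)

```python
import numpy as np

class PostureRecognizer:
    """Toy nearest-centroid classifier: one flattened accelerometer
    frame per posture example. Illustrative only."""

    def fit(self, frames, labels):
        frames, labels = np.asarray(frames, dtype=float), np.asarray(labels)
        self.classes_ = np.unique(labels)
        # One centroid per posture class
        self.centroids_ = np.array(
            [frames[labels == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, frame):
        # Nearest centroid by Euclidean distance
        dists = np.linalg.norm(self.centroids_ - np.asarray(frame, dtype=float),
                               axis=1)
        return self.classes_[np.argmin(dists)]

# Two postures, x/y/z acceleration per wrist sensor (hypothetical values)
train = [[0, 0, 1, 0, 0, 1], [0, 0, 0.9, 0, 0.1, 1],    # "down"
         [1, 0, 0, -1, 0, 0], [0.9, 0.1, 0, -1, 0, 0]]  # "out"
labels = ["down", "down", "out", "out"]

rec = PostureRecognizer().fit(train, labels)
print(rec.predict([0, 0.05, 0.95, 0, 0, 1.05]))  # prints: down
```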
@mathr @yaxu once upon a time I wrote a few blog posts detailing some of the principles at work: https://roosnaflak.com/tech-and-research/minibee-tutorials/, and we are currently doing a project called 100 sketches (https://roosnaflak.com/100-sketches/) focusing on movement/sound interaction with accompanying sc source code.
@yaxu
Are you aware of http://www.wekinator.org/ ? It's a standalone tool for standard ML tasks, talking OSC.