Mainly use Bitwig, but also dabble in Usine for anything extra.
| Audius | https://audius.co/xiso |
| SoundCloud | https://soundcloud.com/x-iso |
| Bandcamp | https://xiso.bandcamp.com |
| YouTube | https://www.youtube.com/channel/UC34LPe6Al2mLA5GQGBP205w |
@sean_ae And since I was having a blast with Usine at the time, and it does support video data, I made a synth around this idea, which looked like this:
https://www.youtube.com/watch?v=C8DW8F0mqEE
Basically three oscillators fed by the RGB or HSL data of the video: you pick an X or Y line to scan, it reads the pixel data along that line, and that gets mixed with traditional oscillator shapes, which double as a fallback in case the picture is basically one color fill. The result goes into an oscillator wavetable array, then further processing and some modulators to keep it alive.
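For illustration, here is a minimal offline sketch of that scanline-to-wavetable idea in Python, not the actual Usine patch: it reads one line of pixels from an image, turns a single color channel into a bipolar wavetable, and crossfades toward a plain sine when the scanline is basically one color fill. The file name, the flatness weighting, and the table size are all assumptions made up for the example.

```python
# Hypothetical sketch of the scanline-to-wavetable idea; assumes numpy and Pillow.
import numpy as np
from PIL import Image

def scanline_wavetable(image_path, line=0.5, axis="y", channel=0, table_size=2048):
    """Scan one horizontal (axis='y') or vertical (axis='x') line of pixels
    and turn a single color channel into a bipolar wavetable."""
    img = np.asarray(Image.open(image_path).convert("RGB"), dtype=np.float32) / 255.0
    if axis == "y":                       # pick a row at the given relative height
        row = int(line * (img.shape[0] - 1))
        pixels = img[row, :, channel]
    else:                                 # pick a column at the given relative width
        col = int(line * (img.shape[1] - 1))
        pixels = img[:, col, channel]

    # Resample the scanline to the wavetable length and center it around zero.
    table = np.interp(np.linspace(0, len(pixels) - 1, table_size),
                      np.arange(len(pixels)), pixels)
    table = table - table.mean()

    # Fallback: if the picture is basically one color fill, the scanline is flat,
    # so crossfade toward a plain sine based on how little variation there is.
    flatness = 1.0 - np.clip(table.std() * 8.0, 0.0, 1.0)   # arbitrary scaling
    sine = np.sin(np.linspace(0, 2 * np.pi, table_size, endpoint=False))
    mixed = (1.0 - flatness) * table + flatness * sine
    peak = np.max(np.abs(mixed))
    return mixed / peak if peak > 0 else sine

# Example use (hypothetical file): wavetable = scanline_wavetable("frame.png", line=0.5)
```

Running this per video frame and writing the result into an oscillator's wavetable slot would approximate the "on the fly" part described above.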
@sean_ae Hey Sean, I wanted to ask about one thing, related to the 444 playlist I once discovered on YouTube (I think there were more) and then the 2020 live streams that seemed to feature using a video feed as data to manipulate audio or even MIDI. What did you try in the end, and what seemed useful with regard to extracting live data from a picture/video?
Personally, around 2018 the idea occurred to me that you could actually extract a wavetable from a picture/video on the fly, instead of the traditional sonogram method.
Got myself a GuliKit KingKong 2 Pro gamepad recently, and since its controls are super precise, I decided to test it out in my DAWs as well.
There's a controller script to use it in Bitwig, but it still has some things to iron out, so instead I use my Usine gamepad patchbay rack to convert any button press or axis movement into MIDI data, which then gets passed to a virtual MIDI port. It somehow still crashed the audio engine once.
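For reference, a rough Python sketch of the same button/axis-to-MIDI conversion (not the Usine rack itself), assuming pygame for the gamepad and mido with the rtmidi backend for the output; the port name, CC numbers, and note numbers are arbitrary choices for the example.

```python
# Hypothetical gamepad-to-MIDI bridge; assumes pygame and mido (python-rtmidi backend).
import pygame
import mido

pygame.init()
pygame.joystick.init()
pad = pygame.joystick.Joystick(0)   # first connected gamepad
pad.init()

# Open a virtual MIDI output port for the DAW to listen on.
out = mido.open_output("Gamepad MIDI", virtual=True)

AXIS_CC_BASE = 20        # axis 0 -> CC 20, axis 1 -> CC 21, ...
BUTTON_NOTE_BASE = 36    # button 0 -> note 36, button 1 -> note 37, ...

while True:
    for event in pygame.event.get():
        if event.type == pygame.JOYAXISMOTION:
            # Map the axis range [-1, 1] to the CC range [0, 127].
            value = int((event.value + 1.0) * 63.5)
            out.send(mido.Message("control_change",
                                  control=AXIS_CC_BASE + event.axis,
                                  value=max(0, min(127, value))))
        elif event.type == pygame.JOYBUTTONDOWN:
            out.send(mido.Message("note_on",
                                  note=BUTTON_NOTE_BASE + event.button, velocity=127))
        elif event.type == pygame.JOYBUTTONUP:
            out.send(mido.Message("note_off",
                                  note=BUTTON_NOTE_BASE + event.button, velocity=0))
    pygame.time.wait(5)   # avoid busy-looping
```

Note that virtual ports via rtmidi only exist on macOS and Linux; on Windows you would route through a loopback driver such as loopMIDI instead.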