I finally got the data transfer from the AISC110C high-speed camera sensor working!

It's a 5€ chip that outputs 80x120 video at up to 40k fps.

Data is read out by a Xilinx Spartan-7 FPGA and transmitted over USB 3 with a Cypress FX3, each on its own little PCB.

The front PCB is exchangeable, which makes this a neat modular platform. I already have an analog video frontend with the ADV7182 and am working on a Camera Link interface.

Videos are coming tomorrow, I need more light for the high framerates.

A lighter being lit, recorded at 2000 fps, played back at 25 fps.

Mastodon won't accept the video directly, so I had to convert it to GIF, which introduced the dithering.

I'll try to find a better way to convert...

@stdlogicvector what was your original video container and codec? (I ask because I'm still trying to figure out what works best. So far, webm works well, but according to https://docs.joinmastodon.org/user/posting/#media an MP4 container with H.264 should work "better", in the sense that transcoding probably won't change the file.)
(and in my case, the magical invocation would be `ffmpeg -i video.mpeg -c:v libx264 -b:v 1000k -preset slower out.mp4`)

@funkylab The camera outputs raw Y8/GREY8 video. I only specified the .avi file extension and ffmpeg did the rest.
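In case anyone wants to replicate this without camera hardware: raw GREY8 carries no header, so when feeding ffmpeg the bare frame stream (rather than an AVI), the pixel format, frame size and rate all have to be spelled out on the command line. A minimal sketch with synthetic blank frames (hypothetical filenames, assumes ffmpeg is installed):

```shell
# 100 blank 80x120 frames of raw 8-bit grey, 1 byte per pixel
head -c $((80*120*100)) /dev/zero > raw.y8

# Raw video has no metadata, so geometry and rate go on the command line
ffmpeg -y -f rawvideo -pix_fmt gray -video_size 80x120 -framerate 25 \
    -i raw.y8 out.avi
```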

Your spell works! Thanks!

(Recorded at 3000 fps, played back at 25 fps.)

@stdlogicvector ah right, low-res. Drop the bitrate cap `-b:v 1000k`, you're staying below that anyway. Add, in its place, a `-pix_fmt gray` (the source is 8-bit, so no need for a 10-bit format), because YUV color space sure is boring if all you have are colorless shades of grey.
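Spelled out in full, with a synthetic test clip standing in for the camera footage (hypothetical filenames, assumes ffmpeg built with libx264; `gray` rather than a 10-bit format, since the source is 8-bit):

```shell
# Stand-in for the camera video: 1 s of 80x120 test pattern
ffmpeg -y -f lavfi -i testsrc2=size=80x120:rate=25:duration=1 video.avi

# Amended transcode: no -b:v cap, monochrome pixel format
ffmpeg -y -i video.avi -c:v libx264 -preset slower -pix_fmt gray out.mp4
```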

@funkylab It works and files are smaller now!

(Recorded at 1000 fps, played back at 25 fps.)