Profile pic by Ole Johannsen
| homepage | https://urs-waldmann.de/computer-vision/ |
Thrilled to announce that my dissertation is now published! 🎉 You can check it out here: https://kops.uni-konstanz.de/handle/123456789/71609
The study of collective behaviour among animals hinges on accurately estimating and tracking their movements. Central to our research at the Centre for the Advanced Study of Collective Behaviour is the challenge of estimating the poses and tracking the movements of multiple animals in their natural habitats.

One of the primary challenges in studying animal collective behaviour is the scarcity of data: existing pose datasets predominantly focus on human subjects, with a notable deficiency in annotated data for animals. Although some animal datasets exist, they often span many species, limiting their utility for specific research purposes. Moreover, annotated multi-instance animal data remains sparse compared to human datasets, further widening the resource gap. We propose novel approaches to address this disparity and to advance pose estimation and tracking methodologies for collective behaviour studies.

First, we leverage pigeon data collected in controllable indoor environments to train models that perform reliably in the wild. We also demonstrate that a model trained on data containing a single pigeon can stably and accurately predict keypoints for multiple pigeons, providing an alternative route for domain shift to other species. This model tracks and estimates the 3D poses of up to ten pigeons at interactive speed.

Second, we explore unsupervised label propagation, which obviates the need for annotated data to propagate poses through video sequences. Our pipeline effectively tracks the posture of objects that are small relative to the frame size, broadening its applicability.

Third, our 3D pose estimation pipeline, trained exclusively on synthetic data, robustly predicts keypoints from multi-view silhouettes and is thus robust to transformations that leave silhouettes unchanged, such as variations in texture and lighting. By leveraging synthetic data, this method narrows the domain gap in settings where real-world annotations are scarce.

Lastly, to the best of our knowledge, we are the first to offer a pipeline for neural rendering of textures, facilitating downstream tasks such as individual re-identification. Operating at interactive speeds, our method offers an efficient alternative to existing approaches based on convolutional neural networks (CNNs) and vision transformers. We believe our contributions promote systematic advancements in the study of animal collective behaviour and offer novel methodologies for 3D pose estimation and individual re-identification.
Our first lecture of the day at the #KonstanzSchoolCollectiveBehavior is #FumihiroKano talking about using #gaze #tracking to study #collective #animal #cognition. @cbehav
Paris gives first glimpse at AI's Olympic future
https://www.axios.com/2024/08/01/olympics-games-ai-athlete-initiatives?utm_source=flipboard&utm_medium=activitypub
I'm happy to present the last paper from my thesis!
Lisa Li and I set out to build a model of fly walking which is based on 3D kinematics data, handles perturbations, and includes sensorimotor delays. (This was supervised by Bing Brunton and @tuthill )
We set up a new modeling framework, generated fly walking with kinematics matched to real data, developed a simple metric for quantifying the similarity of trajectories, and found constraints on delays for robust walking!
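The post doesn't spell out the trajectory-similarity metric; as a minimal illustrative sketch (not the paper's actual method), one simple choice is the mean per-frame Euclidean distance between two equal-length kinematic trajectories:

```python
import numpy as np

def trajectory_distance(traj_a: np.ndarray, traj_b: np.ndarray) -> float:
    """Mean per-frame Euclidean distance between two trajectories.

    Each trajectory is an array of shape (frames, dims), e.g. joint
    angles or keypoint coordinates over time. Frames must align.
    """
    assert traj_a.shape == traj_b.shape, "trajectories must align frame-by-frame"
    # Distance at each frame, then averaged over the whole trajectory
    return float(np.mean(np.linalg.norm(traj_a - traj_b, axis=1)))

# Example: a straight 2D walk vs. the same walk offset by one unit
t = np.linspace(0.0, 1.0, 50)
walk = np.stack([t, np.zeros_like(t)], axis=1)
shifted = walk + np.array([1.0, 0.0])
print(trajectory_distance(walk, shifted))  # → 1.0
```

The function name and the averaging choice here are assumptions for illustration; the preprint linked below describes the metric actually used.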
https://www.biorxiv.org/content/10.1101/2024.04.18.589965v1
#neuroscience #drosophila #walking #preprint
1/7