@bwaber Thanks for checking out my talk about #DeepMReye! Would love to hear more about the things you would want to see to fully trust it. More datasets? More types of viewing behavior? More scanning sequences, etc.?

The paper includes ~300 subjects scanned with a total of 15 sequences on 5 MRI scanners, but a lot more has been done since the paper was published. Happy to chat :)

#DeepMReye has a new #streamlit app! 🎊 Camera-less #eyetracking in #fMRI has never been easier. https://github.com/DeepMReye/DeepMReye#option-4-streamlit-app

◦ Install & open
◦ Pick a pretrained model
◦ Upload fMRI data
◦ Download gaze coordinates

The app makes #DeepMReye even more accessible, as no coding is required. If you use it, be aware that the pretrained models are not perfect: validate your results or train your own model using our notebooks!
https://github.com/DeepMReye/DeepMReye/blob/main/notebooks/deepmreye_example_usage.ipynb
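The validation step recommended above boils down to comparing decoded gaze with a ground truth, e.g., camera-based eye tracking from a calibration run. A minimal self-contained sketch of that comparison (illustrative values and function names, not the DeepMReye API), using the Pearson correlation as the agreement metric:

```python
import math

def pearson_r(a, b):
    """Pearson correlation between two equal-length sequences."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    sd_a = math.sqrt(sum((x - mean_a) ** 2 for x in a))
    sd_b = math.sqrt(sum((y - mean_b) ** 2 for y in b))
    return cov / (sd_a * sd_b)

# Hypothetical gaze x-coordinates (degrees of visual angle):
# camera-based ground truth vs. positions decoded from fMRI data.
true_gaze = [-5.0, -2.5, 0.0, 2.5, 5.0, 2.0, -1.0, -4.0]
decoded   = [-4.6, -2.9, 0.3, 2.1, 5.4, 1.7, -0.8, -3.7]

r = pearson_r(true_gaze, decoded)
print(f"agreement with ground truth: r = {r:.2f}")
```

A high correlation on held-out data is a good sign; a low one suggests the pretrained model does not transfer to your data and retraining is warranted.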

Huge kudos to @cyhsm for creating the app!

Our paper on #DeepMReye - Magnetic resonance-based eye tracking using deep neural networks - was just made #OpenAccess by @NatureNeuro

It was originally published under the wrong license in 2021. This release brings it full circle and makes code, data, and paper accessible to everyone! #OpenScience

Code: https://github.com/DeepMReye/DeepMReye
Data: https://osf.io/mrhk9/
Paper: https://www.nature.com/articles/s41593-021-00947-w

w/ co-authors @cyhsm & Christian Doeller

Thanks so much, @RemiGau, for putting this together, and more generally for contributing to #DeepMReye over the past months! It is such a useful extension and will make it easier for many people to run #DeepMReye on #fMRIprep outputs and the like.

#DeepMReye now has a wrapper for #BIDS data! https://pypi.org/project/bidsmreye/

It is great, for example, for decoding gaze position in #fMRI datasets preprocessed with #fMRIprep.
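For intuition about why camera-less decoding works at all: the MR signal in and around the eyeballs co-varies with gaze direction, so a model can be trained to map those voxel time series to gaze coordinates. DeepMReye does this with a convolutional neural network on 3D eyeball voxels; the toy sketch below swaps that for a plain least-squares decoder on simulated data, purely to illustrate the principle (none of this is the actual model or API):

```python
# Toy illustration of camera-less gaze decoding (NOT the DeepMReye model).
# Idea: eyeball-voxel intensities co-vary with gaze position, so a model
# can map voxel time series -> gaze coordinates.
import random

random.seed(0)

# Simulate gaze x-positions (degrees) and two "eyeball voxels" whose
# signal depends linearly on gaze, plus measurement noise.
n_trs = 200
gaze = [random.uniform(-10, 10) for _ in range(n_trs)]
voxels = [(0.8 * g + random.gauss(0, 0.5),
           -0.3 * g + random.gauss(0, 0.5)) for g in gaze]

# Fit a linear decoder w on the first half via the normal equations
# (ordinary least squares, 2 features, no intercept for simplicity).
xtx = [[0.0, 0.0], [0.0, 0.0]]
xty = [0.0, 0.0]
for (v1, v2), g in zip(voxels[:100], gaze[:100]):
    xtx[0][0] += v1 * v1; xtx[0][1] += v1 * v2
    xtx[1][0] += v2 * v1; xtx[1][1] += v2 * v2
    xty[0] += v1 * g; xty[1] += v2 * g
det = xtx[0][0] * xtx[1][1] - xtx[0][1] * xtx[1][0]
w = [(xtx[1][1] * xty[0] - xtx[0][1] * xty[1]) / det,
     (xtx[0][0] * xty[1] - xtx[1][0] * xty[0]) / det]

# Decode gaze on the held-out half and check accuracy.
pred = [w[0] * v1 + w[1] * v2 for v1, v2 in voxels[100:]]
mean_err = sum(abs(p - g) for p, g in zip(pred, gaze[100:])) / len(pred)
print(f"mean absolute decoding error: {mean_err:.2f} deg")
```

In real data the voxel-gaze relationship is high-dimensional and nonlinear, which is why DeepMReye uses a deep network rather than a linear map, but the train-on-known-gaze / predict-on-new-data logic is the same.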

Thank you, @RemiGau, for this amazing contribution to our package! w/ @CYHSM
---
RT @NauMatt
#DeepMReye is out!
Use #deeplearning to perform #eyetracking in #fMRI without a camera! https://www.nature.com/articles/s41593-021-00947-w

@cyhsm & I are thrilled to finally sha…
https://twitter.com/NauMatt/status/1457742155859038217

Just migrated servers!

I am a #CogNeuro postdoc at NIH passionate about visual perception, gaze behavior, and memory.

I use computational tools (incl DNNs), human brain imaging & eye tracking to characterize how #vision, #gaze, and #memory interact on the levels of brain activity & subjective experience, and I develop open-source #machinelearning approaches to advance how brain activity & behavior can be studied together (e.g., #DeepMReye https://github.com/DeepMReye).
Excited to connect here!
