RE: https://mathstodon.xyz/@gconstantinides/116448779816230899

We're hiring for a project on #SpikingNeuralNetworks and #neuromorphic computing, to start in October this year, for 36 months. Can hire at pre- or post-PhD level. Feel free to email me informally, or apply at the link below. Please do share with your networks if you know someone who would be interested.
#ComputationalNeuroscience

Incorporating structural #plasticity in #SpikingNeuralNetworks (#SNN) enables dynamic #synaptic connectivity, reflecting the #brain's adaptability. By modeling synaptic growth and pruning based on #calcium concentration, we can simulate processes such as #learning and #MemoryFormation. In this post, I reproduce the #NESTSimulator tutorial on structural plasticity, demonstrating its impact on network stability and #homeostasis:

๐ŸŒ https://www.fabriziomusacchio.com/blog/2026-02-01-structural_plasticity/

#CompNeuro #Neuroscience #NeuralNetworks

Lukas Ziegler (@lukas_m_ziegler)

์ฅ์˜ ์ด‰์ˆ˜๋ฅผ ๋ชจ์‚ฌํ•œ 24๊ฐœ์˜ ์ธ๊ณต ์ˆ˜์—ผ๊ณผ ์ŠคํŒŒ์ดํ‚น ๋‰ด๋Ÿฐ์„ ํ™œ์šฉํ•ด ์ฃผ๋ณ€์„ ํƒ์ƒ‰ํ•˜๋Š” ๋กœ๋ด‡ 'WhiskEye'๋ฅผ ์†Œ๊ฐœํ•˜๋Š” ๋‚ด์šฉ์ž…๋‹ˆ๋‹ค. ์‹ ๊ฒฝ๊ณ„ ๋ชจ์‚ฌ์™€ ์žฅ์†Œ ์ธ์‹ ๊ธฐ๋Šฅ์„ ๊ฒฐํ•ฉํ•œ ๋ฐ”์ด์˜ค๋ฏธ๋ฉ”ํ‹ฑ ๋กœ๋ด‡์œผ๋กœ, ์ž„๋ฒ ๋””๋“œ ๋กœ๋ณดํ‹ฑ์Šค์™€ ์‹ ๊ฒฝ๋ชจ์‚ฌ ๊ธฐ์ˆ ์˜ ์‘์šฉ ์‚ฌ๋ก€๋ฅผ ๋ณด์—ฌ์ค๋‹ˆ๋‹ค.

https://x.com/lukas_m_ziegler/status/2010646049828831313

#whiskeye #robotics #spikingneuralnetworks #biomimicry
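
Purely to illustrate the idea (this is not the WhiskEye code): whisker deflections are continuous signals, and a common way to hand them to spiking neurons is delta encoding, i.e. emit an ON or OFF spike whenever the deflection moves up or down by a fixed step.

```python
import numpy as np

# Illustrative delta (threshold-crossing) spike encoder for a continuous
# whisker-deflection trace. Hypothetical sketch, not the WhiskEye pipeline.
def delta_encode(signal, threshold=0.05):
    on, off = [], []
    ref = signal[0]
    for t, x in enumerate(signal):
        if x - ref >= threshold:    # deflection rose by one step -> ON spike
            on.append(t)
            ref = x
        elif ref - x >= threshold:  # deflection fell by one step -> OFF spike
            off.append(t)
            ref = x
    return on, off

t = np.linspace(0.0, 1.0, 1000)
deflection = 0.3 * np.sin(2 * np.pi * 5 * t)  # toy 5 Hz whisker sweep
on_spikes, off_spikes = delta_encode(deflection)
print(len(on_spikes), len(off_spikes))
```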

Lukas Ziegler (@lukas_m_ziegler) on X

This robot explores the world like a mouse! 🐁 That's an OG project, guess the year ;) A mouse-inspired bot called WhiskEye travels the world with 24 artificial whiskers and spiking neurons that mimic a nervous system. Recognizing familiar places is essential for engineering…

Postdoc fellowship opportunity for ECRs (<3 yrs post-PhD). Note that if you want to apply to work with me as your mentor, our dept has an internal deadline of Dec 4th so please email me asap. Our internal process is shorter than the full application.

https://royalcommission1851.org/fellowships/research-fellowships

#Neuroscience #ComputationalNeuroscience #SpikingNeuralNetworks

Reminder: if you missed the #SNUFA spiking neural network and neuromorphic workshop earlier this month, all our talks were recorded and are now available to watch.

https://www.youtube.com/playlist?list=PL09WqqDbQWHEyTZPLMKEi9-RD6kax0tCR

#Neuroscience #ComputationalNeuroscience #SpikingNeuralNetworks

SNUFA 2025 Workshop

Spiking neural networks as universal function approximators (SNUFA) online workshop 2025. For more see http://snufa.net/2025/

Psst - #neuromorphic folks. Did you know that you can solve the SHD dataset with 90% accuracy using only 22 kB of parameter memory by quantising weights and delays? Check out our preprint with Pengfei Sun and Danyal Akarca:

https://arxiv.org/abs/2510.27434

Or check out the TLDR thread on Bsky:

https://bsky.app/profile/did:plc:niqde7rkzo7ua3scet2rzyt7/post/3m5jpksani22m

#SpikingNeuralNetworks #ComputationalNeuroscience #Neuroscience
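
For a sense of scale, some back-of-envelope arithmetic (the comparison precisions are mine; the actual architecture and delay encoding are in the paper):

```python
import math

# How many parameters fit in a 22 kB budget at different precisions?
# Back-of-envelope only; see the paper for the real SHD architecture.
budget_bits = 22 * 1024 * 8  # 22 kB expressed in bits

for name, bits in [("fp32", 32.0),
                   ("int8", 8.0),
                   ("ternary (~1.58-bit)", math.log2(3))]:
    print(f"{name:>20}: {budget_bits / bits:>9,.0f} params in 22 kB")
```

Ternary weights buy you roughly 20x more parameters per byte than fp32, which is what makes room for the quantised delays.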

Exploiting heterogeneous delays for efficient computation in low-bit neural networks

Neural networks rely on learning synaptic weights. However, this overlooks other neural parameters that can also be learned and may be utilized by the brain. One such parameter is the delay: the brain exhibits complex temporal dynamics with heterogeneous delays, where signals are transmitted asynchronously between neurons. It has been theorized that this delay heterogeneity, rather than a cost to be minimized, can be exploited in embodied contexts where task-relevant information naturally sits contextually in the time domain. We test this hypothesis by training spiking neural networks to modify not only their weights but also their delays at different levels of precision. We find that delay heterogeneity enables state-of-the-art performance on temporally complex neuromorphic problems and can be achieved even when weights are extremely imprecise (1.58-bit ternary precision: just positive, negative, or absent). By enabling high performance with extremely low-precision weights, delay heterogeneity allows memory-efficient solutions that maintain state-of-the-art accuracy even when weights are compressed over an order of magnitude more aggressively than typically studied weight-only networks. We show how delays and time-constants adaptively trade-off, and reveal through ablation that task performance depends on task-appropriate delay distributions, with temporally-complex tasks requiring longer delays. Our results suggest temporal heterogeneity is an important principle for efficient computation, particularly when task-relevant information is temporal - as in the physical world - with implications for embodied intelligent systems and neuromorphic hardware.
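
A minimal sketch of the mechanism the abstract describes, assuming per-synapse integer delays and ternary weights (all names and sizes below are mine, not the paper's): each presynaptic spike is delivered several time steps into the future, so temporally dispersed evidence can be aligned without needing precise weights.

```python
import numpy as np

# Heterogeneous per-synapse delays with ternary weights (illustrative sizes,
# not the paper's architecture). A spike from input j reaches output i only
# after delay[i, j] time steps.
rng = np.random.default_rng(1)
n_in, n_out, T, max_delay = 64, 32, 100, 8

w = rng.choice([-1, 0, 1], size=(n_out, n_in))         # ~1.58-bit ternary weights
delay = rng.integers(0, max_delay, size=(n_out, n_in))  # learned integer delays
in_spikes = rng.random((T, n_in)) < 0.05                # toy input spike raster
buf = np.zeros((T + max_delay, n_out))                  # future synaptic input

for t in range(T):
    for j in np.nonzero(in_spikes[t])[0]:
        # Deliver this spike into each target neuron's future input slot.
        buf[t + delay[:, j], np.arange(n_out)] += w[:, j]

# buf[t] is the delay-aligned input each output neuron would integrate at
# step t; neuron dynamics (LIF, thresholds, learning) are omitted.
print(buf.shape)
```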


Spiking NN fans - the #SNUFA workshop (Nov 5-6) agenda is finalised and online now. Make sure to register (free) soon. (Note you can register for either day and come to both.)

Agenda: https://snufa.net/2025/
Registration: https://www.eventbrite.co.uk/e/snufa-2025-tickets-1549418545579

Thanks to all who voted on abstracts!

#Neuroscience #ComputationalNeuroscience #SpikingNeuralNetworks

SNUFA 2025

Spiking Neural networks as Universal Function Approximators
"A few weeks ago a Chinese research team presented the model 'SpikingBrain 1.0', an #AI built on #SpikingNeuralNetworks. The technique is claimed not only to consume less energy but also to get by without Nvidia chips and without large volumes of data." 🤔
Lina Knees via #Handelsblatt

Message for participants of the #SNUFA 2025 spiking neural network workshop. We got almost 60 awesome abstract submissions, and we'd now like your help to select which ones should be offered talks. To take part, follow the "abstract voting" link at:

https://snufa.net/2025/

It should take <15m. Thanks! ❤️

#neuroscience #SpikingNeuralNetworks #ComputationalNeuroscience
