My keynote on AI Policy in Africa, from IndabaX Rwanda 2023 at #ICLR2023, is now available to view online. Thank you to the organisers of IndabaX Rwanda and Black in AI for having me.

https://iclr.cc/virtual/2023/14362

@[email protected] @[email protected] @[email protected] @[email protected]
#AfricanMachineLearning

Keynote 1: AI Policy in Africa

How many of those who want to create “AGI benefiting ALL of humanity” & are concerned about
“existential risks” were at #ICLR2023 in Kigali? We know they show up to the conference rotations held in their usual places, where visa regimes bar most people in the world. But here I am, blabbering on about the global apartheid system of visas that restricts the movement of non-white people. Genius minds like Hinton & others can’t be bothered with such minuscule issues: they’re thinking about HUMANITY!

And that is it for #ICLR2023.

Thank you to all who followed this thread and attended @iclr_conf in Kigali. It was a pleasure organising this and working to make it the strongest conference it can be.

FIN/n

Some of the posters at the #AfricaNLP workshop at #ICLR2023

#AfricanMachineLearning
#AfricanNLP

Our poster at the AfricaNLP workshop at #ICLR2023. Rozina Myoya, from our lab @DSFSI_Research, will be at the poster session.

#AfricanNLP
28/n

I'm presenting a poster in the Sparse Neural Network workshop @sparsenn at #ICLR2023 on "Efficient Real Time Recurrent Learning through combined activity and parameter sparsity". Come by if you're around!

Link to paper: https://arxiv.org/abs/2303.05641

Efficient Real Time Recurrent Learning through combined activity and parameter sparsity

Backpropagation through time (BPTT) is the standard algorithm for training recurrent neural networks (RNNs), which requires separate simulation phases for the forward and backward passes for inference and learning, respectively. Moreover, BPTT requires storing the complete history of network states between phases, with memory consumption growing proportional to the input sequence length. This makes BPTT unsuited for online learning and presents a challenge for implementation on low-resource real-time systems. Real-Time Recurrent Learning (RTRL) allows online learning, and the growth of required memory is independent of sequence length. However, RTRL suffers from exceptionally high computational costs that grow proportional to the fourth power of the state size, making RTRL computationally intractable for all but the smallest of networks. In this work, we show that recurrent networks exhibiting high activity sparsity can reduce the computational cost of RTRL. Moreover, combining activity and parameter sparsity can lead to significant enough savings in computational and memory costs to make RTRL practical. Unlike previous work, this improvement in the efficiency of RTRL can be achieved without using any approximations for the learning process.
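The cost/memory tradeoff the abstract describes can be illustrated with a tiny sketch. The snippet below is a hypothetical minimal example of exact RTRL for a vanilla tanh RNN (with identity input weights for brevity), not code from the paper: the influence matrix `P` stores the sensitivity of the hidden state to every recurrent weight, so memory is O(n^3) but independent of sequence length, and the per-step `W @ P` product is the O(n^4) term that sparsity is meant to tame.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 4, 5                              # tiny state size, short sequence
W = rng.normal(scale=0.5, size=(n, n))   # recurrent weights
h = np.zeros(n)
# Influence matrix P_t = dh_t / dvec(W), shape (n, n*n): memory O(n^3),
# but it does NOT grow with T (unlike the state history BPTT must store).
P = np.zeros((n, n * n))

for t in range(T):
    x = rng.normal(size=n)               # external input
    pre = W @ h + x                      # pre-activation
    h_new = np.tanh(pre)
    D = np.diag(1.0 - h_new**2)          # Jacobian of tanh
    # Direct dependence of pre on vec(W): pre_i touches only row i of W,
    # giving a block structure with the previous state h in each block.
    dpre_dW = np.kron(np.eye(n), h)      # shape (n, n*n)
    # RTRL update. The (n x n) @ (n x n^2) product costs O(n^4) per step,
    # which is the term that activity/parameter sparsity can cut down.
    P = D @ (W @ P + dpre_dW)
    h = h_new

# Gradient of any loss L(h_T) w.r.t. W falls out of one vector-matrix product,
# online, with no backward pass through time.
dL_dh = np.ones(n)                       # e.g. loss = sum(h_T)
grad_W = (dL_dh @ P).reshape(n, n)
```

If many units are inactive at a step, whole rows of `D` are (near-)zero and whole columns of `W` can be skipped in `W @ P`, which is the intuition behind combining activity and parameter sparsity.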


Join Jade Abbott (@alienelf), our Lelapa AI @LelapaAI co-founder at the Practical Machine Learning for Developing Countries workshop today.

27/n

---
RT @LacunaFund
TODAY! If you are at #ICLR2023 go check out these Lacuna Fund grantees present.

AfricaNLP - @asmelashteka, @Shmuhammadd

Machine Learning for Remote Sensing - @j_nabende

Practical Machine Learning for Developing Countries - @RKiire, @alienelf, @asmelashteka
https://twitter.com/LacunaFund/status/1654365313566400512

The final day of #ICLR2023. At the Practical Machine Learning for Developing Countries workshop, Kathleen (@siminyu_kat) kicked off the day with "Honoring Kiswahili with Technology and Community". 25/n

Why we fight
---
RT @gneubig
I had to travel 26 hours and spend $2000+ to join #ICLR2023 in Rwanda.
But people in Africa have to do this every time a conference is held in US.

What happens when we make it easier to participate?

1530% higher registrations from Africa.

This is important and must continue.
https://twitter.com/gneubig/status/1654053264965369857

Evening before the last day of @iclr_conf #ICLR2023. We are almost there. What a busy conference. Great reconnecting, meeting new people and forging new collaborations. 24/n