I struggle to understand Silicon Valley libertarians' allergic reaction to discussing the problems caused by the extreme homogeneity of the research community. The same is true for many in the AI research community, of which I am a part. As this thoughtfully written letter to SCOTUS from physicists explains,
The implication that physics or “hard sciences” are somehow divorced from the social realities of racism in our society is completely fallacious.
The exclusion of people from physics solely on the basis of the color of their skin is an outrageous outcome that ought to be a top priority for rectification.
The rhetorical pretense that including everyone in physics class is somehow irrelevant to the practice of physics ignores the fact that we have learned and discovered all the amazing facts about the universe through working together in a community.
The benefits of inclusivity and equity are the same for physics as they are for every other aspect of our world.
The statement holds true for AI and any type of “ism”. One would think that the people trying to “stop AI from harming society” would pay attention to this sort of stuff.
No American company would call this a production-ready person-detection system.
At this point in these types of conversations people usually mention that there aren’t qualified this-or-that group of people who are deep learning researchers. I can name at least 10 extremely qualified female researchers in my sleep--including one who left the field due to exclusion--and swathes of them can be found here (LINK).
I am very concerned about the future of AI. Not because of the risk of rogue machines taking over, but because of the homogeneous, one-dimensional group of men who are currently involved in advancing the technology.
Concerned AI researcher
@timnitGebru My brain hurts from that list, but not just because of the gender balance. There's exactly 1 psych professor on the panel. And the whole robot-go-conscious-brrrrr thing probably wasn't pushed by him.
Deep learning isn't brains and was never meant to be brains. Ever heard of a metaphor?
I am concerned for both reasons, b/c one promotes the other re/ bias, etc.
But what also has me worried is that even though some #AI luminaries have seen the light and called for a moratorium, it is for the wrong reasons.
Legislation will never catch up with the exponential evolution of #GAI, not even in the #eu
Concerned observer and commentator
« Legislation will never catch up with the exponential evolution of … »
Probably not if things go on like they went so far.
Isn’t that (forecasted) exponential evolution dependent on (/directly related to) an exponential growth in primary energy demand?
🤔
(1/3)
Our exchange gave me pause. This is why I have just finished writing a thread regarding the current state of #AI regulations in the West which you might enjoy reading:
https://mastodon.social/@HistoPol/110528310717257043
Re/ energy demand and exponential growth--what are you referring to?
Here are some preliminary thoughts of mine on the topic:
#AI, though needing a lot of energy, is not the same as #CryptoMining...
(2/3)
... regarding energy consumption (a hypothesis).
Furthermore, putting data centers in countries like #Greenland with freely available energy gets around this limitation to some degree.
Also, another hypothesis of mine is that the additional energy consumption will now chiefly be proportional to the number of requests to #ChatGPT....
(3/3)
...
One of my chief hypotheses (see the end of the linked thread above) is that the next evolutionary quantum leap for #AI will be #embodiment. This can be explained by #SystemsTheory (#Luhmann et al) and other social sciences:
https://mastodon.social/@HistoPol/110485223389507009
Last but not least, #PrivateGPT is on the rise, and that requires only a gaming PC, not data centers.
@HistoPol
It's not the same for Bitcoin precisely *because* Bitcoin doesn't need to expend copious amounts of energy, whereas AI basically does.
In a future where we have less energy, bitcoin wins.
Truth will always win out in the end. Just like fiat currencies over the millennia always reach their final price, zero.
🙏
By "exponential growth of our primary energy demand", I refer to a curve like this one https://ourworldindata.org/grapher/global-primary-energy.
Maybe I should have started by asking for a link that explains in more detail what you call "the exponential evolution of AI"?
🤓
@dalias @HistoPol @timnitGebru
propaganda is certainly (a large part of) the Silly-Con-Valley marketing mechanism, shamelessly called self-fulfilling prophecies 😏
Yet it has kind of worked for many local gurus, who have become globally influential enough to redirect a large part of the global primary energy demand, and all the forms of capital relying on it (finance, intelligence/skills, infrastructure), towards their own personal goals/profits.
And it looks like they're up for a new try.
🤞everyone
@timnitGebru But the data that goes in can eventually turn them into rogue machines.
Just look at what Trump did to his vapid followers.
@timnitGebru
Rogue machines have already taken over, though.
Australia allowed the doomsday machine we call Facebook to pay off our media.
One solace is that #AI will not survive in a future with less abundant #energy to waste.
Interestingly, #Bitcoin can survive when energy is less abundant. Bitcoin can scale down; AI won't really be able to do that.
What I see is a group totally engaged with AI research.
Isn't it time for other groups to engage with the impact AI will have on society?
In short: let us take a much wider perspective on what is (going) to happen.
"... (many speak as though they belong to a different species with superior intelligence and rationality) ..."
Written in 2015, still accurate in 2023 - maybe more so than ever, to hear chuckleheaded knobgoblins like Musk talk about themselves. 😒 And their self-aggrandizing rhetoric is only getting worse.
Meanwhile, they fret openly about the "existential risk" of AI without acting on their handwringing pleas for a pause on training new models ... 🙄