1) based on stealing labor from creators,
2) are catastrophic from an environmental point of view due to their high energy consumption, and
3) are burning billions of dollars on speculative investment, to
4) produce little value? #AI
@ojala I didn't address all issues you mention but IMHO the genie is out of the bottle by now
https://deprogrammaticaipsum.com/banning-adopting-reckoning/
Last month, OpenAI, the company (in)famous for their ChatGPT product, released a course called "ChatGPT Foundations for K-12 Educators", an event that has raised more than a few eyebrows, and even some outrage. We must have a serious conversation about the value of a bullshit generator in the context of teaching programming skills to new generations.
@daviddelven @ojala We can collectively talk about both, and individually choose to talk about either (or both, or even neither).
Choosing as an individual to criticise one independently from the other does not invalidate the arguments being made. If you’re saying ‘Sure, you’re concerned, but **what about** other bad things that already happen?’, then you’re right: those things are also bad. They’re just not what the OP was addressing here.
@daviddelven @jimgar @ojala How about when you look at how (a) search result quality has declined since Google and Bing started incorporating LLMs into their search engines but (b) traditional search algorithms are much cheaper to run?
The other thing is that LLMs and the accompanying AI hype are being actively promoted by billionaires and not challenged nearly as much as, say, something like the oil industry. It's new territory in the fight against exploitation.
@daviddelven @jimgar @ojala Buildings serve useful purposes not easily replaced by other existing technology. Tents, for instance, are just not as good at doing all the things buildings do.
The LLMs that are being pushed everywhere are easily replaced by existing technologies that work better. For instance, rather than using an LLM to write a research paper that cites journal articles that don't exist, my students could write their own papers and cite real articles.
These points also apply to many other products, not only in IT. You can’t rewind history; you can only try to do better in the future.
Students wouldn’t have enough expertise to tell the difference between AI bullshit and actual course material, and would then waste their time trying to decipher the bullshit.
Uni was always hard and time-consuming enough that the course work alone took all my time and left nothing over for working out whether AI slop was legit or not.