RE: https://cloudisland.nz/@daisy/116524986997671719
The software development profession destroyed itself in a matter of months.
Which is quite hilarious, in a dark way.
If you can get out, get out. The sector will never recover from this.
RE: https://techhub.social/@Techmeme/116524670925209298
A and B rounds simultaneously, interesting.
Also, who wants to make games this way? Who wants to make games? People want to buy and play games, not spend their time prompting.
RE: https://techhub.social/@Techmeme/116525180645632039
There’s a future, you’re just not in it.
RE: https://techhub.social/@Techmeme/116525023633643624
There’s a future, but you’re not in it.
RE: https://techhub.social/@Techmeme/116524708796842547
This agent nonsense is so out of control.
RE: https://mastodon.social/@arstechnica/116524142742368295
Everyone! Everyone!
Come quick!
I found a scam!
We've known for well over a decade that over-representation of white people in training data leads to this outcome.
This can be fixed. The algorithms aren't "racist" in the sense of thinking anything, but they reproduce racist outputs through a combination of racist sentiment in the training data and over-representation of particular groups.
That this STILL isn't fixed in 2026 really shows how shoddy these models are, as well as the incompetence of those building them. https://bsky.app/profile/kimberlyeatkins.bsky.social/post/3ml4dbhc3fs2n