I think this one is just more obviously a bubble. If the pop is inevitable, then the sooner the better. That, and AI everywhere, all at once, is tangible and in your face. No one I know likes it.
That guy to all his friends: “AI makes me 10x more productive!”
My company hired a consulting firm to help with a transition period. The consulting firm sent my boss an email that outlined the plans for what we should do and how they are going to help. Without directly giving it away, the email was clearly AI output, and my boss instantly terminated their contract. We aren’t exactly anti-AI, but to the point of the post, it’s just so rude… and my boss is pretty fuckin cool.
How much more will DoD pump into it now?
Yes. Custom layout, so I was forced to learn touch typing. I’m way faster now than I was with qwerty after years of programming, because I would always find myself looking at the keyboard. So, I guess with qwerty, the same could be accomplished by taking the characters off.
I think that little island there is Alcatraz.
Ha. I did a reality check when I couldn’t make sense of the text.
Sotreé Seet*
Snowflakes

I can definitely respect a limited approach. I personally don’t find any benefit from it. Anecdotally, I’ve become much more productive since switching from OOP-style C++ to just straight C. I think a lot of that comes from the boilerplate and ceremony required to make it do the thing, whereas in C, you just do the thing.

I also think even using objects tends to encourage poorer design choices, by framing everything in terms of individual items (and their lifetimes). The constructor/destructor model reinforces this. Thinking in terms of groups of items instead leads to simpler and safer code.