It aligns with my experience and with what I have seen. Looking at this through the lens of writing software: much of "learning" to write software comes down to experience.

When you see an error like, "error: expected ‘,’ or ‘;’ before ‘include’" you know what happened and where to look because you've seen it a hundred times before.

AI takes that away. It's not inherently bad; it's great that it can solve those sorts of things for you. However, the second-order effects are terrible: you end up never developing that experience. Is this simply the evolution of the craft? Is that experience no longer necessary?

I could be wrong, but I believe that experience is necessary and that losing it will be a net negative. Furthermore, this loss of experience will increase dependency on these tools and on the companies that provide them.

Once you use CGO, portability is gone. Your binary is no longer statically compiled.

This can happen subtly, without you knowing it. If you use a function in the standard library that happens to call into a CGO function, your binary is no longer static.

This happens with things like os/user lookups (for example, user.Current) or networking code that performs DNS resolution through the net package.
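A minimal sketch of how this bites you. The program below looks like pure Go, but with CGO enabled (the default on most platforms), both the user lookup and the DNS resolution can route through libc, producing a dynamically linked binary:

```go
// A seemingly "pure Go" program that can still pull in cgo:
// os/user lookups and net-package DNS resolution may call into
// libc when cgo is enabled, making the binary dynamically linked.
package main

import (
	"fmt"
	"net"
	"os/user"
)

// currentUsername may route through libc (getpwuid_r) when cgo is enabled.
func currentUsername() (string, error) {
	u, err := user.Current()
	if err != nil {
		return "", err
	}
	return u.Username, nil
}

func main() {
	if name, err := currentUsername(); err == nil {
		fmt.Println("user:", name)
	}

	// DNS lookups can use libc's resolver (via cgo) instead of Go's own.
	if addrs, err := net.LookupHost("localhost"); err == nil {
		fmt.Println("localhost:", addrs)
	}
}
```

On a typical glibc Linux system, building this with a plain `go build` and then running `ldd` on the result will usually show libc dependencies, even though the source never imports "C".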

You can "force" go to do static compiling by disabling CGO, but that means you can't use _any_ CGO. Which may not work if you require it for certain things like sqlite.