this is something that troubled me a lot in a work context as well. i’d see more inexperienced developers, or those who lacked context for a problem, go to chatgpt first. the opportunity cost here is not just the risk of wrong information, but that it doesn’t forge relationships between colleagues, or build the psychological safety of knowing it’s okay to ask questions or not know things. and experienced folks don’t get to practice the rewarding skill of mentoring others.
https://hci.social/@chrisamaphone/114870977619144528
chris martens (@[email protected])

beyond just the education context… there really is a kind of abuser logic to these products. the goal is to isolate you from social support. I think this is probably a really important thing to pay attention to and raise alarms about; I feel like i have not seen enough discussion of it by comparison with other issues (like not working/being wrong)

@kat I am more concerned that inexperienced developers lack the experience to know the difference between a good solution, a bad solution, or at worst a completely wrong solution. There are no checks and balances.

@gregsbrain @kat

Also, doing the work of finding and evaluating the information yourself builds the pathways in the brain and reinforces actual learning in a way that getting a spoon-fed answer doesn’t.

@gregsbrain @kat or worse, the solution is subtly bad. It doesn't error, it passes tests, and it seems to work, but you have a time bomb of corrupted data or a latent security flaw.

Also, I agree that human connection is extremely important. That, and putting in the hard hours to figure out the problem and the solution yourself, so that the next time is much easier because you have those tools. Then you share those tools and that knowledge with others. I'm afraid that we're pulling up the ladder behind us.