As someone researching AI in my Industrial-Organizational Psychology grad program (and as a blind person), I've pored over studies from psych journals claiming "AI causes mental health issues." They almost always add a qualifier: when used this way. AI isn't the problem; how people use it is.
AI is brand-new tech, yet there's a massive gap in AI literacy and skills training. Here's a real example: upload 25 journal articles to an AI chatbot along with details of your research question, and it instantly flags the top 5 to read. This smartly combines traditional keyword search with AI-powered filtering.
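To make the triage idea concrete, here's a minimal sketch of the "filter before you read" step. It only does naive keyword scoring, not what any particular chatbot actually does internally; the article titles, abstracts, and function names are all illustrative assumptions.

```python
# Illustrative sketch: rank articles by how often research-question
# keywords appear in their abstracts, then keep the top few for full
# reading. A real AI chatbot's filtering is far more sophisticated;
# this just shows the triage concept.

def score(abstract: str, keywords: list[str]) -> int:
    """Count keyword occurrences (naive substring matching)."""
    text = abstract.lower()
    return sum(text.count(k.lower()) for k in keywords)

def triage(articles: dict[str, str], keywords: list[str], top_n: int = 5) -> list[str]:
    """Return the top_n article titles ranked by keyword relevance.

    Python's sort is stable, so ties keep their original order.
    """
    ranked = sorted(articles, key=lambda t: score(articles[t], keywords), reverse=True)
    return ranked[:top_n]

# Hypothetical mini-corpus standing in for 25 uploaded articles.
articles = {
    "Paper A": "AI chatbot use and wellbeing in college students",
    "Paper B": "Soil chemistry of alpine meadows",
    "Paper C": "Mental health outcomes of AI-assisted study tools",
}
print(triage(articles, ["AI", "mental health", "wellbeing"], top_n=2))
# → ['Paper A', 'Paper C']
```

The point isn't the scoring method; it's the shape of the workflow: cheap automated ranking narrows 25 candidates to a handful, and the human still reads and judges those in full.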
For me, a single 15-page journal article takes ~2.5 hours to read via screen reader; most have terrible markup, making skimming impossible. Sifting through 25 articles? That's over 60 hours of grueling, often irrelevant reading.
AI cuts that to roughly 12-15 hours of focused reading on what matters. Crucially, I still read the originals directly (not AI summaries). I log into my university library, run keyword searches, scan titles, judge relevance, and critically evaluate every source. AI helps prioritize; it doesn't replace research skills, critical thinking, or human judgment.
We need AI literacy taught in schools, colleges, universities, and workplaces: not just "AI exists," but how to use it well, amplifying productivity while avoiding the pitfalls. Let's bridge the gap before misinformation spreads.