AI Doesn’t Reduce Work—It Intensifies It https://hbr.org/2026/02/ai-doesnt-reduce-work-it-intensifies-it
“AI introduced a new rhythm in which workers managed several active threads at once: manually writing code while AI generated an alternative version, running multiple agents in parallel, or reviving long-deferred tasks because AI could ‘handle them’ in the background. They did this, in part, because they felt they had a ‘partner’ that could help them move through their workload.
While this sense of having a ‘partner’ enabled a feeling of momentum, the reality was a continual switching of attention, frequent checking of #AI outputs, and a growing number of open tasks. This created #cognitiveload and a sense of always juggling.
… What looks like higher #productivity in the short run can mask silent workload creep and growing cognitive strain as employees juggle multiple AI-enabled workflows
… overwork can impair judgment, increase the likelihood of errors, and make it harder for organizations to distinguish genuine productivity gains from unsustainable intensity
… the cumulative effect is fatigue, #burnout, and a growing sense that work is harder to step away from, especially as organizational expectations for speed and responsiveness rise."
#LaborEcon

One of the promises of AI is that it can reduce workloads so employees can focus on higher-value and more engaging tasks. But according to new research, AI tools don’t reduce work; they consistently intensify it. In the study, employees worked at a faster pace, took on a broader scope of tasks, and extended work into more hours of the day, often without being asked to do so. That may sound like a win, but it’s not so simple. These changes can be unsustainable, leading to workload creep, cognitive fatigue, burnout, and weakened decision-making. The productivity surge enjoyed at the beginning can give way to lower-quality work, turnover, and other problems. To correct for this, companies need to adopt an “AI practice”: a set of norms and standards around AI use that can include intentional pauses, sequencing work, and adding more human grounding.
