Both of my grandfathers worked as physical laborers. If I told them people now pay monthly fees to lift heavy things in climate-controlled rooms, they’d ask: “Don’t they get exercise at work?” For many of us, the answer is no. The economic shift from manual labor to knowledge work eliminated physical exertion from daily routines. We had to intentionally add it back. We’re now at a parallel inflection point with cognitive work. And most of us don’t yet realize it.
When automation eliminated manual labor from many jobs, we didn’t immediately recognize what we’d lost. The health consequences gradually emerged. Eventually, we built an entire industry around fitness because physical fitness was no longer embedded in how we worked.
AI is doing to knowledge work what industrial automation did to physical labor. As these tools handle more cognitive tasks, we risk mental atrophy from disuse. The difference? We are watching it happen in real time.
Recent Harvard and OpenAI research examined over a million ChatGPT conversations. The findings reveal a pattern worth examining.
About 40% of messages prompt AI to do tasks: writing emails, creating documents, generating reports, building timelines. Another 49% ask for information or guidance to inform decisions. The distinction matters. “Doing” messages consistently received lower quality ratings than “Asking” messages (a good-to-bad ratio of 2.76 versus 4.45). More significantly, educated users in professional occupations are substantially more likely to use AI for Asking. And while Asking usage is growing faster overall, work messages still tilt toward Doing (56% vs 35%).
Translation: The users who get the most value from AI aren’t automating tasks; they’re augmenting thinking. Yet at work, we’re still primarily having AI do things rather than help us think. That’s a missed opportunity.
There’s a reason sophisticated users choose differently. They understand something fundamental: learning requires struggle. When we encounter difficulty and work through confusion, our brains form stronger neural pathways. This isn’t motivational rhetoric; it’s a neurobiological necessity.
Cognitive offloading provides real benefits. We should use calculators for arithmetic and GPS apps for routing. But there’s a threshold where offloading becomes atrophy. When we stop engaging in cognitive struggle, we stop building the mental models that enable judgement.
What does taking the cognitive stairs look like?
Taking the Elevator:
“Write a status report summarizing this project.”
“Draft an email responding to this client’s complaint.”
Taking the Stairs:
“What questions should a strong status report answer? Ask me each one, then critique my answers.”
“Before I respond to this complaint, help me think through what the client actually needs. What am I missing?”
The first examples treat AI as a task completion engine. The second treats it as a dialogue partner. The first produces a document. The second produces understanding.
This approach is called Socratic prompting: transforming product-oriented requests into dialogue-oriented questions. Instead of asking AI to create something for you, ask AI to help you think through the problem.
When information becomes freely available through AI, the premium value shifts to judgement development. But judgement can’t be developed through automation. It requires deliberate cognitive engagement.
The labor market is already signaling this shift. Employers increasingly seek “AI-augmented decision makers”: professionals who leverage technology while maintaining independent judgement. The Harvard research confirms this: users in highly paid professional occupations are substantially more likely to use ChatGPT for “Asking” rather than “Doing.”
Here’s the strategic implication: Short-term productivity gains from AI automation are real and immediate. But if your workforce stops thinking deeply because AI handles all cognitive labor, you’re trading today’s efficiency for tomorrow’s judgement capacity.
Universities are witnessing this dynamic in real time. Most students now use AI tools for academic work. Yet employers consistently report graduates lack critical thinking and decision-making skills.
When students use AI to complete assignments rather than support learning, they accumulate credentials without developing capabilities. They can produce work that meets requirements but cannot explain the reasoning or adapt it to novel situations. They’re taking the elevator so consistently that they can no longer climb the stairs.
Organizations hiring these graduates will inherit the consequences of workers fluent in AI task automation but incapable of the independent thinking that justifies premium compensation.
Ten years from now, will “cognitive fitness routine” be as standard as “workout routine?” The answer depends on choices we are making now, mostly without realizing they’re choices at all.
Use AI to amplify your capabilities, not just shrink your effort. Choose dialogue over delegation. Ask questions before you ask for answers. Take the cognitive stairs often enough that you maintain the ability to climb them.
Because when knowledge is free, judgement is premium. But only if you can still exercise it.
The elevator is always there. It is fast, easy, tempting. But the people who build careers that matter will be the ones who regularly choose the stairs. Not because the destination is different, but because the journey builds something the elevator never will: the capacity to think.
That capacity, more than any specific skill or knowledge, will determine who thrives as AI reshapes knowledge work. The question isn’t whether to use AI; it’s whether you’re using it in ways that preserve your ability to think without it.
Start tomorrow with one prompt. Before you ask AI to create something, ask it to help you think about something. Notice the difference. Then ask yourself: am I building capability, or trading it for convenience?
The cognitive muscles you build today determine whether you’re valuable tomorrow.