Every technological revolution has its awkward adolescence. We’re living through AI’s right now. Recent research from Stanford and BetterUp has given this moment a name: “workslop.” It’s the flood of hastily AI-generated content that clogs inboxes, clutters presentations, and quietly erodes productivity. The email that reads like it was written by a committee of robots. The strategy document with oddly formal phrasing and zero original insight. The presentation deck that says nothing new.
If this sounds familiar, you’re not imagining it. And if you’re a manager watching your team’s output simultaneously increase in volume and decrease in quality, you’re not alone.
But here’s what history teaches us: this phase is predictable, necessary, and temporary. The question isn’t whether we’ll move through it. It’s how quickly we can get to the other side.
Why Workslop Happens
When personal computers arrived in offices, workers treated them as expensive typewriters. When the internet became ubiquitous, we spent years learning that you can walk 10 feet to talk to someone instead of firing off another email. Each time, we mistook the tool for the solution.
We’re making the same mistake with AI. Only faster, and at greater scale.
The core problem is one of delegation versus collaboration. AI delivers real gains in speed and efficiency, but most organizations have accidentally encouraged their people to treat it as something to offload work to rather than something to work with. An associate generates a client memo with Claude and sends it along, complete with the telltale “AI can make mistakes, please double-check” footer still attached. A manager asks ChatGPT to write a strategy document and forwards it without adding context, nuance, or judgment.
This isn’t a technology problem. It’s a mindset problem that technology has exposed.
When content creation becomes effortless, the cognitive work of thinking deeply becomes optional. And when it becomes optional, people can opt out. What researchers are calling “cognitive atrophy” is really just a gradual disconnection from the thinking process itself. We’re delegating not just the execution, but the strategy. AI will get you 70% of the way there, but someone still needs to own that final 30%, and right now it seems some people are checking out before the finish line.
The Way Through
The good news? Workslop isn’t a crisis. It’s a phase. Organizations that recognize it as such can compress what might take years into months.
Start by redefining what you measure. The drive to do more with less can create pressure to crank out more work in the same time, with AI as the productivity multiplier. But leaders need to resist the assumption that one person plus AI should equal twice the output. If you’re still evaluating employees primarily on volume, you’re incentivizing exactly the behavior you don’t want. Prose and code generation are now commoditized. What matters is the quality of thinking that directs these tools. In your performance management processes, assess people on their judgment, their ability to steer AI effectively, and their capacity to iterate toward genuinely excellent outcomes.
Draw bright lines. Leaders need to align on the AI vision, the guardrails, and how they’ll hold people accountable. Establish explicit standards for what constitutes acceptable AI-assisted work. Some organizations are implementing simple rules: AI-generated content must be marked during internal review. Client-facing materials must demonstrate clear human value-add. Any work bearing AI watermarks or disclaimers gets automatically returned. These aren’t punitive measures. They’re cultural signals about what professionalism means in an AI-augmented workplace. Without mutual commitment from leaders to embed these standards, the bright lines blur.
Embrace experimentation, but guide it. The workslop phase exists because people need room to learn, and that requires a growth mindset, not a fixed one. Risk aversion kills experimentation. Moving through this phase means reframing failure as data and celebrating what teams learn from missteps. Managers can accelerate the shift by modeling vulnerability about their own learning curves and by tapping into people’s intrinsic motivation for mastery. But experimentation without feedback loops doesn’t create change. So create forums where teams share what’s working and what isn’t, and celebrate both the wins and the lessons of human-AI collaboration.
Learn from unexpected sources. Universities faced the workslop crisis before corporations did. Many have developed sophisticated approaches to maintaining rigor while embracing AI tools. They’ve created assignments that inherently require human judgment, implemented systems that flag low-quality automated work, and redesigned evaluation criteria to emphasize critical thinking over production. These aren’t perfect solutions, but they’re battle-tested ones that can translate to corporate contexts.
Resist the delegation instinct. The most important cultural shift is also the simplest: don’t treat AI as your copilot. Treat it as a student, with you as the teacher. This reframes the entire relationship. You’re not handing off work. You’re responsible for what that student produces, which means staying engaged in the iterative process, using tools to enhance rather than replace human judgment, and taking full ownership of outputs regardless of how they were generated. Organizations that successfully embed this mindset move through the workslop phase measurably faster. The upside? New Stanford research tells us that employees trust AI more when they can see it as a collaborator, not a closed system.
An Unexpected Opportunity
Here’s what makes this moment genuinely unique: the traditional corporate hierarchy of expertise has temporarily inverted. Right now, a brilliant 22-year-old who knows how to work with AI tools can create more value than their manager who doesn’t. This isn’t a threat to experienced leaders. It’s an opportunity. Junior employees have rare insight into what actually works, and smart managers are creating channels for those employees to lead the way forward. You’re not being replaced. You’re being offered a shortcut to expertise that would otherwise take years to develop.
The companies that emerge strongest from the workslop phase won’t be those that restricted AI use or pretended the problems didn’t exist. They’ll be the ones that acknowledged the awkwardness, called it out, learned from it quickly, and built cultures where humans and AI genuinely complement each other. Experience shows us that the most critical cultural factors that will shape the success of AI include the degree of autonomy of teams to shape workflows, the measures and controls put in place, and what gets rewarded and recognized.
We’re in the messy middle of the AI adoption curve. Workslop is almost certainly happening in your organization right now. The only question is whether you’re managing the transition or hoping it resolves itself.
History suggests which approach works better.