The people who made it through the 3-week trough are the ones who expected to manage the AI, not to be managed by it, and who didn't expect magic.
API integrations, RAG architectures, fine-tuning, developer tools
Where does AI fit? How do I verify? When do I trust it?
Tool tours, prompting fundamentals, generic use cases
"How do I use this tool?"
"Where does this tool fit in my workflow and how do I know when to trust it?"
"The best users of AI are good managers. They're good teachers. The skills that make you good at AI are not prompting skills. They're people skills."โ Ethan Mollick
Employees know what's allowed, and the default leans toward "yes" rather than a giant red stop sign.
Not just tool tours and prompting basics: training includes workflow integration, quality judgment, and task decomposition.
The people driving adoption have domain expertise and management skills, not just technical enthusiasm.
Practical applications are surfaced, shared, and celebrated. Social proof drives adoption.
Knowledge about where AI fails spreads across the org, not just where it succeeds.
"This is how we do RFPs now" โ not "try the AI thing when you have time."
The apprentice model isn't collapsing: juniors still build judgment through appropriate work.
If you can't answer these questions, your people are probably stuck at 101, still in the trough, and most won't climb out on their own because the organizational context doesn't support their learning.