In the previous lesson, you explored how Inna Landman redirected Procore's AI transformation from over-engineered frameworks toward readiness-driven starting points and how compressed cycle times and new cost structures ripple across the entire organization. But identifying where to start is only half the challenge. The harder question is what you say to the workforce while transformation is underway — especially when you can't promise that every role stays the same. This lesson unpacks two connected ideas from the conversation: how candid communication and culture change must accompany AI tool adoption, and why modeling vulnerability isn't weakness.
You'll recall the moment when the conversation turned to employee fear. External headlines about AI-driven layoffs are creating real uncertainty inside organizations, and as Landman discussed, Procore is no exception. The instinct for many leaders is to reassure people and tell them their jobs are safe. But the conversation surfaced why that instinct backfires. Landman addressed it directly:
Landman: "We can't say everything's going to be the same and everyone's jobs are safe."
Instead, the principle Landman articulated was deceptively simple:
Landman: "Clarity is kindness in moments of transformation."
What does that look like in practice? It means sharing the direction — this is where we're heading, these are the functions moving first, here's what we know today about how roles and workflows will evolve — without padding it with promises you can't keep. It also means explicitly naming what you don't yet have answers for. For your workforce, that honest framing is far more credible than either false reassurance or evasive silence.
Building on this communication challenge, Kyle Forrest's research framing added a critical structural dimension. Deloitte's Human Capital Trends data found that many organizations were not thinking about culture alongside their AI initiatives — a gap that accumulates when you deploy tools but never update the behavioral norms, incentives, and decision rights that surround them. For people leaders, this means communication alone isn't enough. Clarity must extend beyond the technology itself, to who makes which decisions, what gets rewarded, and how performance is measured in AI-augmented workflows.
