AI Will Change Everything. The Real Question Is: What Won’t You Compromise?
- September 2, 2025
- 10:51 pm
- Theresa Agresta
Technology + culture change, in an instant, at scale.
This fall, schools across the U.S. and beyond are reopening with sweeping bans on student cell phones. In at least twenty-six U.S. states — and counting — laws now significantly limit student phone use during the school day.
In Indiana, after implementing a statewide ban, teachers reported that students were suddenly “present” again — making eye contact, talking to one another, and engaging more deeply in class discussions. Schools in Florida noted sharp drops in cyberbullying and disciplinary incidents within weeks. And in the UK, early studies found that banning phones in schools increased student test scores by as much as 6%, with the largest gains for struggling learners.
It’s a rare chance to watch how a technology shift can transform culture almost overnight.
But the story of cell phones in schools isn’t just about recent bans. It’s about the years before, when educators watched as devices quietly redefined classroom life — and struggled to push back. The cultural costs accumulated until they became impossible to ignore.
We all experienced losses during the years when cell phones reigned supreme in schools. Students’ developing brains were rewired to respond to constant pings and notifications, when they should have had the luxury of uninterrupted learning and reflection. Many endured cyberbullying, carrying fear and shame into classrooms that should have been safe havens.
Families watched as attention spans shortened, dinner-table conversations dwindled, and children grew anxious about missing out. Communities lost a measure of trust in the school environment itself, as respect between students and teachers eroded. And teachers — who dedicated their days and hearts to shaping the next generation — found themselves battling for attention and authority instead of being honored for their role in growth and learning.
If we’re not intentional, AI could trigger similar cultural fractures in the workplace, as unchecked adoption exacts its human cost: the erosion of trust, respect, attention, and dignity.
Just as schools are now clawing back ground they never should have lost, organizations may face the same challenge — trying to rebuild culture after it’s already been frayed by an uncritical rush toward new tools.
That’s the cautionary tale for today’s leaders: Ignore the cultural implications of AI, and you won’t just adopt the tools — you’ll adopt the fractures they create and find yourself in a workplace reshaped by the technology on its own terms.
AI isn’t just changing productivity.
Most organizations are still treating AI adoption as a technology rollout: a set of tools, a training program, a list of productivity targets. But this perspective misses the larger truth: AI is already reshaping corporate culture from the inside out.
Martin Seiler, CHRO of Deutsche Bahn, put it plainly in a recent interview with management consultancy Roland Berger: “AI is changing our corporate culture because it requires us to have the courage to try things out – even when there’s a risk of failure.” This demand for experimentation and imperfection cuts against deeply ingrained norms in many companies — particularly those where risk-aversion, exhaustive piloting, and hierarchical sign-offs have long been rewarded.
Brian Balfour, CEO of Reforge, went even further as a guest on Lenny’s Podcast: “AI is fundamentally a full culture change for every company.”
The reality is that adopting AI is not just about buying or developing technology. It requires new ways of thinking, new behaviors, and a tolerance for vulnerability. Employees are being asked to question not only how they work, but whether the outputs they’ve always produced are even relevant in an AI-enabled world.
Adopting AI requires change at two levels:
- At the organizational level: taking a holistic view that considers technology integration alongside data strategy and governance, talent development, organizational structure, and ethics and risk management
- At the individual level: asking every employee to change significant portions of their approach to work, their decisions, and their daily outputs
Interestingly, McKinsey’s 2025 Superagency in the Workplace report shows that while employees are experimenting and eager to learn, senior leaders are often the biggest laggards in AI adoption. And while employees aren’t resisting AI, they are asking: Where are the guardrails? What’s the story? How will I know what “good” looks like in this new world of work?
What they get instead are fragmented tools, vague assurances, or over-engineered change management programs that treat them as problems to be managed instead of partners in co-creation.
Different Cultures. Different Challenges.
AI experimentation will look different in every organization. Your current culture will shape both your opportunities and your roadblocks.
- A culture grounded in Control and Order may resist the ambiguity of generative tools.
- A culture centered on Innovation and Risk-taking may experiment quickly but struggle to scale responsibly.
- A culture driven by Service and Belonging may hesitate to automate in ways that feel impersonal or at odds with human connection.
Each of these examples represents a clear cultural pattern that can be identified through the 12 Archetypes of CultureTalk, bringing clarity to what you are experiencing and why.
- The Ruler Archetype (Control and Order) values discipline, structure, and predictability. In an AI context, this culture might excel at creating governance frameworks, setting ethical standards, and managing risks. But it may also slow adoption, as leaders hesitate to embrace tools that feel untested or unpredictable. Employees could experience frustration if experimentation feels overly restricted.
- The Revolutionary Archetype (Innovation and Risk-taking) thrives on disruption and bold leaps forward. In these organizations, AI pilots may launch quickly and generate excitement, but without enough oversight, the risks of inconsistency, burnout, or unintended consequences grow. Scaling sustainably may be harder than sparking initial momentum.
- The Caregiver and Everyperson Archetypes (Service and Belonging) prioritize empathy, fairness, and human connection. Cultures like these may view AI through the lens of how it impacts people — protecting jobs, preserving relationships, and ensuring no one feels left behind. This can be a powerful check on adopting technology too quickly, but it may also mean leaders hesitate to automate in areas where efficiency gains are significant.
The same AI tool can either ignite innovation or deepen cynicism—depending on the cultural soil it lands in.
Understanding these cultural narratives doesn’t just explain why adoption looks different from one company to another. It also helps leaders identify where culture can accelerate change, where it will resist, and where intentional guardrails are needed to keep the human story front and center.
Understanding the narratives and Archetypes that underpin your culture gives leaders the insight to:
- See where to invest for early wins.
- Identify how mindsets and behaviors need to shift for new approaches to take hold.
- Spot the roadblocks that will require patience and deliberate change.
This is the difference between organizations that bolt AI onto existing systems and those that integrate it into a sustainable future.
Shaping your culture to support your AI roadmap
Even if your industry, your stakeholders, and your customers demand AI-driven change, your culture may not yet be set up to support it. Unfortunately, it can become something of a chicken-and-egg problem. As many change leaders can attest from experience, culture is often the number one barrier to change.
That’s why leaders must ask:
- What aspects of our culture will accelerate AI adoption?
- Where will our culture slow it down?
- Where do we need to push back — and say “not this” — because the change is incompatible with who we are and what we stand for?
Schools waited years to resist the pull of cell phones, even as they saw the damage to focus, community, and learning. Leaders today don’t have the luxury of waiting until AI’s unintended consequences become overwhelming.
A valuable first step is to uncover where your culture actually stands today. A Baseline Culture Audit can peel back assumptions and reveal the real patterns driving behaviors. With that clarity, leaders can build a Culture Narrative that connects today’s reality to an aspirational future — giving employees a story that makes sense of change.
- For a Ruler culture, this might mean loosening control in small, safe experiments while reinforcing trust in standards and governance.
- For a Revolutionary culture, it could mean tempering speed with systems that protect employees from burnout and ensure wins can be scaled.
- For Caregiver or Everyperson cultures, leaders may need to explicitly honor the value of human connection, showing how AI can remove drudgery while deepening service, fairness, or belonging.
No matter the starting point, the combination of cultural insight and a guiding narrative gives organizations a roadmap that matches both the promise of AI and the people who must live it every day.
The choice in front of leaders
AI will change your business. It will change the pace of your work, the skills your people need, the expectations of your customers, and the very definition of what success looks like. It will also change your culture — whether you’re ready or not.
The real question is whether those changes will reflect your values and aspirations, or whether they’ll unfold by default, shaped by the loudest voices, the fastest adopters, and the path of least resistance.
Leaders have a rare opportunity right now: to step back, understand the deep narratives that already drive behavior in their organizations, and make intentional choices about how AI should fit — and where it should not.
If you’re ready to align your culture with your AI ambitions, we’d love to connect.