Politics and Stakeholder Management
Eng Manager
AI can't run a 1:1. It can't sense that someone is disengaged. People work is 100% human.
Tech Lead
Technical excellence doesn't ship if stakeholders aren't aligned. You own the alignment. AI writes the code.
CTO
Board presentations, investor updates, exec alignment — no AI. Your job is human-to-human.
TL;DR
- Over 95% of AI projects fail, partly because leaders focus on the wrong success metrics. Organizations build the "right" tools, hit milestones, and still fail to deliver value that matters to stakeholders.
- AI can't read a room. It can't build trust. It can't negotiate. Lack of explainable AI creates communication breakdowns; stakeholders who can't understand how AI decides don't trust it. Adoption faces resistance.
- Fear of job loss, lack of understanding of algorithms, sense of exclusion from decision-making—these human factors effectively block technology adoption. AI only works "on paper" without education, reskilling, and genuine stakeholder engagement.
"Politics" sounds dirty. It's really just: getting humans to agree so work can happen. AI has nothing to contribute there. You do. Research is blunt: organizations fail to genuinely engage stakeholders in AI decision-making, prioritizing implementation speed and company profits over stakeholder concerns and expertise.
What AI Can't Do
Read the Room
- "Is the VP actually on board or just nodding?" — You sense it. AI can't.
- "Should I push back now or wait?" — Timing. Relationship. You know. AI doesn't.
- "Is this meeting going off the rails?" — You redirect. AI isn't in the meeting.
Build Trust
- "Will they trust this recommendation?" — Trust is earned over time. AI has no history with your stakeholders. You do. When stakeholders can't understand how AI makes decisions, trust erodes. Legitimacy concerns—job losses, biased decisions, societal impact—aren't algorithm problems. They're human problems. You address them.
- "Do I need to show my work?" — Some audiences need the proof. Some want the headline. You calibrate. AI gives one output.
- "Have I overpromised?" — Relationship management. AI doesn't know what you said last quarter.
Negotiate and Broker
- "Engineering wants 6 months. Product wants 6 weeks." — Someone has to broker. Compromise, scope reduction, phased delivery. AI can't sit in that negotiation.
- "Two teams both want to own this." — Territory. You navigate. AI suggests a rational split. The org might not be rational. AI's benefits are consistently overstated; deploying AI to complex social problems displaces grounded expertise from domain professionals who understand stakeholder needs. You're that professional.
Manage Up, Down, and Sideways
- Up: Translate technical reality for execs. "We can't do that in 3 months" → "Here are the options and trade-offs." AI can draft the slide. You deliver it. You handle the pushback.
- Down: Motivate, unblock, career conversations. 100% human.
- Sideways: Peer alignment. "Will the platform team support this?"—You ask. You build the relationship. AI can't. Coercive deployment—AI pushed without stakeholder buy-in—undermines due process. Benefits accrue to companies, not workers or the public. You're the buffer.
Why This Matters
Projects fail more often because of misalignment than because of bad code. The perfect architecture that no one agreed to is worthless. The messy architecture that everyone bought into ships. You're the one who gets the buy-in. The 95% failure rate isn't a tech problem. It's a stakeholder problem.
How to Use This as a Moat
- Own the stakeholder map. Who cares? Who can block? Who needs to be informed? Document it. AI can't. You can.
- Run the meetings. AI can draft agendas and summaries. It can't facilitate. It can't sense when to table something. You can.
- Be the translator. Technical → business. Business → technical. AI generates content. You adapt it for the audience. That's a skill.
- Build the relationship before you need it. When you need a favor, it's too late to build trust. AI can't do favors. You can. Education and reskilling reduce fear. Inclusion in decision-making reduces resistance. You drive that.
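A stakeholder map doesn't need tooling; even a structured list forces you to answer the three questions above. Here is a minimal sketch in Python (the names, wants, and fields are illustrative assumptions, not from the text) showing the one query that matters: who can block you, so you align with them first.

```python
# Hypothetical stakeholder map: who cares, what they want,
# who can block, who needs to be kept informed.
stakeholders = [
    {"name": "VP Product", "wants": "launch this quarter",
     "can_block": True, "keep_informed": True},
    {"name": "Platform team lead", "wants": "no new on-call load",
     "can_block": True, "keep_informed": True},
    {"name": "Support manager", "wants": "advance notice of changes",
     "can_block": False, "keep_informed": True},
]

def blockers(stakeholder_map):
    """People who can say no. Align with them before the kickoff."""
    return [s["name"] for s in stakeholder_map if s["can_block"]]

print(blockers(stakeholders))
```

The point isn't the code; it's that "who can block?" is a question with a concrete answer you can write down, revisit, and act on before you need the favor.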
Quick Check
"Engineering wants 6 months. Product wants 6 weeks." Who brokers that?
Quick Check
Why do over 95% of AI projects fail?
Do This Next
- List 3 stakeholders for a project you care about. For each: What do they want? What would make them say no? AI doesn't know. You should.
- Identify one alignment gap in your current work. Two people want different things. Write down how you'd broker it. That's the human skill. Practice it. Today: have one conversation where you explicitly address a stakeholder concern instead of assuming it away.