This episode argues that most Microsoft 365 problems are not technical but organizational. Technical experts often design tenants that are logically perfect but fail in real-world use. They focus too much on configuration and not enough on how people actually work. As a result, systems become difficult to manage and quickly lose structure.

The speaker highlights that Microsoft 365 should be treated as an operating system for the business, not just a collection of tools. Many issues, like oversharing and sprawl, come from missing governance, not bad technology. Technical teams often fall into the trap of building complex, elegant solutions without clear ownership. Over time, these systems break down because no one is responsible for maintaining them. Governance is often treated as a one-time project instead of an ongoing process, which leads to long-term risks, especially around security and scalability.

The episode emphasizes intent-based governance rather than just technical settings. Organizations need to define ownership, lifecycle, and accountability from the start. Good architecture is ultimately about supporting people, not just systems, and success should be measured by usability and sustainability over time. The key takeaway is that simplicity, clarity, and governance matter more than technical perfection.
In the world of technology, you often encounter a paradox known as the Architect’s Confession: highly skilled architects and engineers create systems that fail to meet the needs of the people who use them. This disconnect leads to frustration and dissatisfaction across the organization. When teams analyze real usage data and turn those insights into actionable steps, satisfaction and adoption improve; when technical decisions overlook user experience, the results can be detrimental. Understanding these dynamics is crucial for anyone involved in system design.
Key Takeaways
- Simplicity is key. Avoid overengineering systems that complicate user experience.
- Align technical solutions with user needs. Gather feedback to ensure systems are intuitive.
- Involve stakeholders in the design process. Their insights can lead to better system outcomes.
- Recognize the importance of governance. Effective governance aligns technical capabilities with user expectations.
- Foster a culture of accountability. Encourage team members to take ownership of their contributions.
- Utilize collaboration tools to enhance communication. Tools like Microsoft Teams can streamline teamwork.
- Implement continuous feedback loops. Regular feedback helps improve systems and user satisfaction.
- Treat governance as an ongoing process. Regularly review and adapt governance strategies to meet evolving needs.
Technical Systems

Complexity and Overengineering
Allure of Sophisticated Solutions
In the realm of technical systems, complexity often tempts architects and engineers. You might find yourself drawn to sophisticated solutions that promise to enhance functionality. However, this allure can lead to overengineering. Overengineering occurs when you create systems that are more complicated than necessary.
Consider these common examples of overengineering in enterprise environments:
- Building complex custom solutions when standard, out-of-the-box functionality already meets the requirement.
- Chaining Power Automate flows into dependency webs that only one person understands, instead of using simpler standard workflows.
- Assuming that premium licensing or top-tier infrastructure is necessary for every workload, which can lead to unnecessary costs for your organization.
These choices can create systems that are difficult to maintain and understand. You may think that complexity equals sophistication, but it often results in confusion and inefficiency.
Consequences of Unnecessary Complexity
The consequences of unnecessary complexity can be severe. When systems become too intricate, they can hinder performance and frustrate users. You may notice that users struggle to navigate these systems, leading to decreased productivity. Additionally, the time and resources spent on maintaining complex systems can drain your organization’s budget.
To avoid these pitfalls, focus on simplicity. Strive for solutions that meet user needs without unnecessary bells and whistles. A streamlined approach can enhance user satisfaction and improve overall system performance.
User Needs Misalignment
Technical Specs vs. User Experience
Aligning technical solutions with user needs is crucial. Often, technical specifications do not match user experience. You might develop a system based on technical requirements without considering how users will interact with it. This misalignment can lead to frustration and decreased adoption rates.
For example, if you create a system that requires extensive training to use, you may find that users resist adopting it. Instead, prioritize user experience by gathering feedback and understanding their needs. This approach will help you design systems that users find intuitive and easy to navigate.
Neglecting Stakeholder Input
Neglecting stakeholder input can further exacerbate user needs misalignment. When you fail to involve stakeholders in the design process, you risk overlooking critical insights. Engaging with users and stakeholders ensures that you understand their requirements and expectations.
By fostering open communication, you can create systems that truly serve their intended purpose. Remember, the best technical solutions arise from collaboration and a deep understanding of user needs.
The Architect’s Confession: Pitfalls of Technical Excellence
Overconfidence in Abilities
Underestimating Governance Challenges
As a technical expert, you may feel confident in your abilities. However, this overconfidence can lead to significant governance challenges. You might overlook the complexities of organizational behavior and user needs. This oversight often results in ineffective designs that fail to serve their intended purpose.
The Dunning-Kruger effect plays a crucial role here. This psychological phenomenon occurs when individuals with limited knowledge overestimate their competence. In technical environments, this can manifest as:
- Ignoring user needs and organizational behavior.
- Focusing solely on technical specifications rather than human factors.
- Designing systems without considering how users will interact with them.
Successful system design requires a balance between technical prowess and an understanding of human psychology. The best architects prioritize questions about user operation and adaptability over purely technical considerations. By doing so, you can create systems that truly meet user needs.
Consequences for System Design
When you underestimate governance challenges, the consequences can be severe. Poorly designed systems often lead to:
- Increased frustration among users.
- Decreased productivity due to inefficient workflows.
- Higher costs associated with maintenance and support.
To avoid these pitfalls, you must recognize the importance of governance in system design. A well-governed system aligns technical capabilities with user expectations, ensuring a smoother experience for everyone involved.
Accountability Issues
Lack of Ownership in System Design
Accountability issues often arise in technical projects, leading to a lack of ownership in system design. When team members do not feel responsible for their contributions, it creates a culture of blame. This can result in an "accountability death spiral," where:
- Team members shift blame among themselves.
- Vague responsibilities lead to finger-pointing.
- Hidden problems escalate, ultimately resulting in project crises.
The Project Manager often bears the brunt of project failures, even when they lack control over key factors. This situation creates frustration and can lead to project failure, demonstrating the negative impact of accountability issues.
Impact on End-Users
The consequences of accountability gaps extend beyond project teams. They significantly impact end-user satisfaction. Users often face a fragmented ecosystem of devices and applications, leading to inefficiencies and frustration. Here are some specific ways accountability issues affect users:
- Disconnect Between Technologies: Users encounter difficulties navigating a disjointed system.
- Lack of Understanding of User Work Environments: IT teams may overlook the practical realities of users' work environments, resulting in misaligned tools.
- Complexity of Third-Party Integration: Poorly managed interactions with third-party tools can hinder productivity.
- Security and Compliance Constraints: Overly restrictive security measures can frustrate users and lead to risky workarounds.
- Time-to-Access and Usability Issues: Delays in accessing systems contribute to user dissatisfaction.
- Lack of Continuous Feedback: Sporadic feedback collection prevents IT from addressing root causes of dissatisfaction.
- Misalignment with User Expectations: Users expect seamless experiences, and failure to meet these expectations leads to dissatisfaction.
By fostering a culture of accountability, you can enhance user satisfaction and create systems that truly serve their needs.
Governance Zones in Microsoft 365
Governance in the context of Microsoft 365 refers to a framework that includes policies, roles, responsibilities, and processes. This framework governs how your organization manages and utilizes its Microsoft 365 environment. Effective governance ensures that you use the system properly, comply with relevant regulations, and maintain security. Each organization may have different governance needs based on its culture, rules, and workflows. Therefore, governance is not a one-size-fits-all approach; it requires tailored guidelines and practices that promote safety and security without hindering productivity.
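To make tailored guidelines concrete, one option is to express each governance zone as an explicit, machine-readable policy record. A minimal sketch follows; the zone names, sharing levels, and settings are illustrative assumptions, not Microsoft defaults:

```python
# Hypothetical governance zones, from least to most restricted.
ZONES = {
    "open":       {"guest_access": True,  "external_sharing": "anyone",
                   "requires_owner_review": False},
    "controlled": {"guest_access": True,  "external_sharing": "existing-guests",
                   "requires_owner_review": True},
    "restricted": {"guest_access": False, "external_sharing": "none",
                   "requires_owner_review": True},
}

def sharing_allowed(zone: str, link_type: str) -> bool:
    """Check whether a sharing link type is permitted in a zone.
    Link types are ordered from most to least restrictive."""
    policy = ZONES[zone]["external_sharing"]
    order = ["none", "existing-guests", "anyone"]
    return order.index(link_type) <= order.index(policy)

print(sharing_allowed("restricted", "anyone"))           # False
print(sharing_allowed("controlled", "existing-guests"))  # True
```

Expressing zones this way makes the "not one-size-fits-all" principle auditable: every workspace can be mapped to exactly one zone, and deviations become detectable.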
Intent-Based Governance
Importance of Behavioral Focus
Intent-based governance emphasizes the behaviors you want to encourage within your organization. Instead of merely configuring settings, you should focus on how users interact with Microsoft 365. This approach helps you create a culture of accountability and ownership. By establishing clear expectations, you can guide users toward desired behaviors that align with your organization's goals.
Consider implementing strategies that promote a governance-first culture. For example, standardizing workspace creation with naming conventions and templates can ensure accountability and data stewardship. Additionally, establishing rules for archiving or deleting inactive workspaces can improve data quality and compliance.
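As a sketch of what enforcing such rules could look like, the snippet below validates a hypothetical naming convention and flags inactive workspaces for archival review. The `DEPT-TYPE-Name` pattern and the 180-day threshold are assumptions for illustration, not recommendations:

```python
import re
from datetime import date, timedelta

# Hypothetical naming convention: DEPT-TYPE-Name, e.g. "FIN-PRJ-Budget2025".
NAME_PATTERN = re.compile(r"^[A-Z]{2,5}-(PRJ|TEAM|COM)-\w+$")
INACTIVITY_THRESHOLD = timedelta(days=180)  # assumed archival rule

def validate_workspace_name(name: str) -> bool:
    """Return True if the workspace name follows the convention."""
    return bool(NAME_PATTERN.match(name))

def should_archive(last_activity: date, today: date) -> bool:
    """Flag workspaces inactive beyond the threshold for archival review."""
    return (today - last_activity) > INACTIVITY_THRESHOLD

print(validate_workspace_name("FIN-PRJ-Budget2025"))        # True
print(should_archive(date(2024, 1, 1), date(2025, 1, 1)))   # True
```

In practice these checks would run against a workspace inventory exported from admin tooling, but the policy logic itself stays this simple.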
Questions on Ownership and Lifecycle Management
Ownership and lifecycle management are critical components of intent-based governance. You should ask yourself:
- Who is responsible for managing each workspace?
- How will you handle the lifecycle of documents and projects?
- What processes are in place for monitoring and reporting?
By addressing these questions, you can create a structured approach to governance that aligns with your organization's needs. This proactive stance helps prevent governance failures and ensures that your Microsoft 365 environment remains secure and efficient.
Identifying Governance Failures
Signs of Ineffective Governance
Recognizing the signs of ineffective governance is essential for maintaining a healthy Microsoft 365 environment. Common indicators include:
- Lack of continuous governance practices: Many organizations treat governance as a one-time setup rather than an ongoing process. This oversight can lead to stagnation and chaos.
- Oversharing and misconfigured access: Security researchers estimate that over 15% of business-critical files are at risk due to these issues. You must regularly review access permissions to mitigate this risk.
- Organizational design problems: Inappropriate site privacy settings and default sharing options set to 'everyone' can create vulnerabilities.
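A simple audit over exported site settings can surface both of these design problems. The sketch below assumes a simplified site record with `privacy` and `default_sharing` fields; a real tenant would pull these values from admin tooling rather than hard-code them:

```python
# Illustrative site records; values mirror the risk patterns described above.
sites = [
    {"name": "Finance", "privacy": "private", "default_sharing": "specific-people"},
    {"name": "Legal",   "privacy": "public",  "default_sharing": "everyone"},
]

def risky_sites(sites):
    """Flag the two organizational-design problems named above:
    'everyone' default sharing and public site privacy."""
    flagged = []
    for s in sites:
        reasons = []
        if s["default_sharing"] == "everyone":
            reasons.append("default sharing set to 'everyone'")
        if s["privacy"] == "public":
            reasons.append("site privacy is public")
        if reasons:
            flagged.append((s["name"], reasons))
    return flagged

print(risky_sites(sites))
```

Running a scan like this on a schedule is one way to turn the "regularly review access permissions" advice into an enforced habit rather than a resolution.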
Case Studies of Governance Breakdowns
Examining case studies of governance breakdowns can provide valuable insights. For instance, many Microsoft 365 Copilot deployments stall between weeks 6 and 12 due to governance treated as a moment rather than a continuous process. Additionally, nearly 70% of security teams express concerns that AI tools like Copilot could expose sensitive data. These examples highlight the importance of establishing a robust governance architecture that evolves with your organization.
| Governance Failure Type | Description |
|---|---|
| Lack of Continuous Governance Practices | Most Microsoft 365 Copilot deployments stall between weeks 6–12 due to governance treated as a moment rather than a process. |
| Oversharing and Misconfigured Access | Security researchers estimate that over 15% of business-critical files are at risk due to these issues. |
| Organizational Design Problems | Common patterns include inappropriate site privacy settings and default sharing options set to 'everyone'. |
By understanding these failures, you can take proactive steps to enhance your governance strategy. A well-structured governance framework will help you navigate the complexities of Microsoft 365 and ensure that your organization thrives.
Real-World Governance Failures

Case Studies
Examples from Various Industries
Real-world examples illustrate the consequences of poor governance in technical environments. For instance, a global enterprise discovered 6,200 applications and 4,000 flows in a single default environment, which was not intended for production use. Many applications lacked current ownership, with ownership fields pointing to users who had left the organization years ago. This situation arose not from malicious intent but from treating a development platform as a productivity tool.
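Detecting that kind of orphaned ownership is conceptually simple once you have an application inventory and a list of active accounts. A toy sketch, with hypothetical app and account names:

```python
# Toy inventory modelled on the case above: apps whose owner field points
# at accounts that no longer exist in the directory.
active_users = {"alice@contoso.com", "bob@contoso.com"}

apps = [
    {"name": "ExpenseApprovals", "owner": "alice@contoso.com"},
    {"name": "LegacyIntake",     "owner": "carol@contoso.com"},  # left years ago
]

def orphaned_apps(apps, active_users):
    """Return apps whose recorded owner is no longer an active user."""
    return [a["name"] for a in apps if a["owner"] not in active_users]

print(orphaned_apps(apps, active_users))  # ['LegacyIntake']
```

The hard part in the real case was not the check itself but that nobody was running any check at all for years.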
Another notable case involved the Syracuse ASC ransomware attack. This incident highlighted the consequences of inadequate monitoring systems, which failed to detect a breach quickly. Organizations have even lost access to their entire tenant due to misconfigured policies. These examples underscore the critical need for effective governance strategies.
| Issue Identified | Consequence of Poor Governance |
|---|---|
| Lack of transparency | Limits civil society's ability to evaluate projects |
| Difficulties in accessing information | Hinders monitoring and accountability |
| Inadequate political coordination | Leads to negative socio-environmental impacts and increased costs for society |
Lessons Learned
From these case studies, you can draw several lessons. First, always ensure proper ownership and accountability for applications. Second, prioritize monitoring systems to detect breaches promptly. Lastly, treat governance as an ongoing process rather than a one-time setup.
Recurring Issues
Trends in Governance Failures
Several recurring issues emerge across multiple governance failure case studies. These include:
| Primary Reasons for Governance Failure | Examples from Case Studies |
|---|---|
| Lack of proper oversight and accountability mechanisms | Enron |
| Conflicts of interest | Theranos |
| Inadequate risk assessment strategies | 2008 financial crisis |
These trends reveal that governance failures often stem from systemic issues within organizations.
Role of Organizational Culture
Organizational culture plays a significant role in governance failures. A flawed culture can lead to weak leadership and unethical behavior. For example, companies like Wells Fargo and Parmalat illustrate how poor culture and groupthink can result in corporate scandals. The board's role is essential in shaping and monitoring culture to ensure ethical behavior and accountability.
By fostering a strong governance culture, you can mitigate risks and enhance the effectiveness of your governance strategies.
Shifts in Governance Thinking
Holistic Governance Approach
Integrating Technical and Governance Perspectives
A holistic governance approach is essential for effective management in technical environments. This approach emphasizes the need for tailored solutions that consider the unique needs of market players and consumer contexts. By integrating technical and governance perspectives, you can improve project outcomes significantly. Organizations that actively maintain governance through continuous feedback loops adapt better to changes. This proactive stance allows you to catch issues early, ensuring that projects align with organizational goals and sustain operational efficiency.
Here are some benefits of integrating these perspectives:
- Fosters collaboration among teams.
- Ensures compliance with regulations and standards.
- Supports innovation while effectively managing risks.
- Helps organizations adapt governance frameworks to evolving needs.
Cross-Functional Team Importance
Cross-functional teams play a vital role in governance. These teams bring together diverse skills and perspectives, enhancing problem-solving capabilities. For instance, WCF Insurance's cross-functional teams focused on Wildly Important Goals (WIGs), leading to significant achievements such as retaining $200,000 in premium within the first month. Similarly, Google’s '20% time' policy led to innovations like Gmail, showcasing how collaboration fosters creativity and new product development. Harvard Business Review emphasizes that feedback mechanisms in cross-functional management enhance continuous improvement and agile decision-making.
Fostering Accountability
Encouraging Ownership of Systems
Fostering accountability within your organization is crucial for governance success. Encouraging ownership of systems leads to better outcomes. When team members feel responsible for their contributions, they are more likely to engage actively in governance processes. Here are some effective strategies for fostering accountability:
- Set clear goals and expectations.
- Encourage open communication.
- Implement regular check-ins.
- Empower employees to own their responsibilities.
- Provide necessary resources.
- Offer continuous learning opportunities.
- Recognize and reward accountability.
Building Trust Among Stakeholders
Building trust among stakeholders is essential for effective governance. Trust fosters collaboration and encourages open dialogue. When stakeholders feel valued, they are more likely to contribute positively to governance efforts. Start meetings by asking, "Who owns this and what's next?" This simple question can clarify roles and responsibilities, enhancing accountability. Additionally, publicly rewarding ownership behaviors can reinforce a culture of accountability.
By adopting a holistic governance approach and fostering accountability, you can create a more resilient organization. This strategy not only reduces technical debt but also enhances organizational readiness for future challenges.
Practical Steps for Improvement
Enhancing Communication
Tools for Better Teamwork
Effective communication is vital for successful governance in technical environments. You can enhance teamwork by utilizing various tools designed for collaboration. Here are some recommended tools:
| Category | Tool | Description |
|---|---|---|
| Project Management and Collaboration | Jira | A versatile tool for issue tracking, agile boards, and customizable workflows. |
| Project Management and Collaboration | Linear | A straightforward project management solution with features like issue tracking and automation. |
| Version Control | Git | The de facto tool for managing and tracking changes to code, facilitating collaboration. |
| Communication and Collaboration | Slack | A hub for team communication with features like channels and direct messages. |
| Communication and Collaboration | Microsoft Teams | A unified platform for communication and collaboration, integrating with Office 365. |
Using these tools can streamline your processes and improve overall governance.
Importance of Feedback Loops
Feedback loops play a crucial role in continuous improvement. They help align decision-making with public priorities and engage community residents. Here are some benefits of implementing feedback loops:
| Benefit of Feedback Loops | Description |
|---|---|
| Align decision-making with public priorities | Ensures policy choices are informed by public opinion, balancing resources against conflicting demands. |
| Engage community residents | Fosters a sense of being heard, builds buy-in for solutions, and enhances trust in local authorities. |
| Build trust | Repeated engagements increase trust in government, leading to higher civic participation and willingness to pay taxes. |
| Make public consultation the norm | Establishes a culture of informed public participation and continuous improvement in decision-making processes. |
By establishing regular feedback mechanisms, you can create a culture of transparency and responsiveness.
Effective Governance Frameworks
Best Practices for Technical Projects
Implementing a governance framework in Microsoft 365 requires a balance between user self-service capabilities and IT security needs. Here are some best practices to consider:
- Data Security: Implement multi-factor authentication and configure Data Loss Prevention policies.
- Compliance Management: Use Microsoft Compliance Manager and enable auditing and eDiscovery.
- User Access Control: Apply role-based access control with least privilege and conduct access reviews.
- Collaboration Governance: Set organization-wide sharing and guest access policies.
- Content Lifecycle Management: Define retention labels and automate content classification.
These practices ensure that your governance framework remains robust and effective.
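For example, the role-based access control practice above can be backed by a recurring check. The sketch below flags excess Global Administrator assignments against an assumed organizational limit; the limit and account names are illustrative, not Microsoft defaults:

```python
# Illustrative role assignments; the anti-pattern seen in the field is dozens
# of accounts holding Global Administrator because it was easier than RBAC design.
assignments = {
    "alice@contoso.com": "Global Administrator",
    "bob@contoso.com":   "Global Administrator",
    "carol@contoso.com": "Global Administrator",
    "dave@contoso.com":  "SharePoint Administrator",
}
MAX_GLOBAL_ADMINS = 2  # assumed organizational limit, not a Microsoft default

def review_global_admins(assignments, limit=MAX_GLOBAL_ADMINS):
    """Flag when Global Administrator assignments exceed a least-privilege limit."""
    admins = sorted(u for u, r in assignments.items()
                    if r == "Global Administrator")
    return {"count": len(admins),
            "within_limit": len(admins) <= limit,
            "admins": admins}

print(review_global_admins(assignments))
```

Feeding a report like this into the access-review cadence keeps privileged-role sprawl visible instead of letting it accumulate silently.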
Continuous Improvement Strategies
To foster continuous improvement in governance, consider these strategies:
- Establish architectural standards to guide development teams.
- Empower decision-making at various levels to reduce bottlenecks.
- Foster a culture of collaboration and shared responsibility among teams.
By implementing these strategies, you can enhance your governance framework and ensure it evolves with your organization’s needs.
In conclusion, effective governance is essential for successful technical projects. You must learn from past governance failures to improve future outcomes. Here are some key takeaways:
- Enhance board accountability through transparent reporting and defined roles.
- Mitigate conflicts of interest with strict policies and open communication.
- Implement robust risk management strategies using stress tests and scenario planning.
By focusing on these areas, you can prevent significant legal, ethical, and operational risks. Remember, continuous monitoring and accountability are vital for maintaining a healthy governance framework. Embrace these lessons to foster a culture of integrity and transparency in your organization.
FAQ
What is intent-based governance in Microsoft 365?
Intent-based governance focuses on the behaviors you want to encourage within your organization. It emphasizes guiding users toward desired actions rather than just configuring settings.
How can I improve user experience in technical systems?
To enhance user experience, gather feedback from users. Understand their needs and design systems that are intuitive and easy to navigate.
What are common signs of ineffective governance?
Common signs include lack of continuous governance practices, oversharing of sensitive information, and organizational design problems that create vulnerabilities.
Why is accountability important in system design?
Accountability fosters ownership among team members. When individuals feel responsible, they actively engage in governance processes, leading to better outcomes.
How can I ensure effective communication in my team?
Utilize collaboration tools like Microsoft Teams or Slack. Regular check-ins and open communication channels enhance teamwork and project success.
What role does organizational culture play in governance?
Organizational culture significantly impacts governance. A strong culture promotes ethical behavior and accountability, reducing the risk of governance failures.
How can I implement continuous improvement strategies?
Establish regular feedback loops and empower decision-making at various levels. Encourage collaboration and shared responsibility among teams to foster a culture of improvement.
What best practices should I follow for Microsoft 365 governance?
Implement data security measures, manage user access control, and define content lifecycle management policies. These practices ensure a robust governance framework.
1
00:00:00,000 --> 00:00:01,960
I'm not the most technical person in the room.
2
00:00:01,960 --> 00:00:03,280
I don't write production code.
3
00:00:03,280 --> 00:00:04,840
I don't tune infrastructure.
4
00:00:04,840 --> 00:00:07,640
But I spend time inside large Microsoft environments,
5
00:00:07,640 --> 00:00:10,440
the kind where thousands of people collaborate daily,
6
00:00:10,440 --> 00:00:12,600
where financial transactions flow through teams,
7
00:00:12,600 --> 00:00:14,680
where compliance auditors show up asking questions
8
00:00:14,680 --> 00:00:16,120
about who can access what.
9
00:00:16,120 --> 00:00:17,240
And here's what I've learned.
10
00:00:17,240 --> 00:00:20,160
The biggest problems I see are never technical problems.
11
00:00:20,160 --> 00:00:21,560
They're governance problems.
12
00:00:21,560 --> 00:00:23,200
Technology rarely fails.
13
00:00:23,200 --> 00:00:25,120
Organizations fail to structure it.
14
00:00:25,120 --> 00:00:27,880
Most people think Microsoft 365 is email, teams,
15
00:00:27,880 --> 00:00:30,080
SharePoint and OneDrive stitch together.
16
00:00:30,080 --> 00:00:31,360
It's not. At enterprise scale,
17
00:00:31,360 --> 00:00:34,480
it's the operating system for how your organization actually works.
18
00:00:34,480 --> 00:00:37,120
And when you treat an operating system like a tool collection,
19
00:00:37,120 --> 00:00:39,240
things break in ways that are hard to see coming.
20
00:00:39,240 --> 00:00:41,160
This confession isn't a blame exercise.
21
00:00:41,160 --> 00:00:42,880
The engineers in these stories were brilliant.
22
00:00:42,880 --> 00:00:45,840
The problem is that brilliance applied to the wrong problem
23
00:00:45,840 --> 00:00:48,360
creates systems that are technically perfect
24
00:00:48,360 --> 00:00:50,920
and operationally impossible.
25
00:00:50,920 --> 00:00:54,040
Redefining Microsoft 365 as an operating system.
26
00:00:54,040 --> 00:00:57,840
Let me start by reframing how we think about Microsoft 365 entirely.
27
00:00:57,840 --> 00:00:59,720
Most organizations see it this way.
28
00:00:59,720 --> 00:01:03,160
Email, real-time collaboration, file storage, identity management,
29
00:01:03,160 --> 00:01:06,680
separate tools, separate problems, separate governance models.
30
00:01:06,680 --> 00:01:08,120
But that's not what it is anymore.
31
00:01:08,120 --> 00:01:11,040
Microsoft 365 is not a collection of applications.
32
00:01:11,040 --> 00:01:14,360
At enterprise scale, it is the operating system for how people work.
33
00:01:14,360 --> 00:01:15,960
Think about what an operating system does.
34
00:01:15,960 --> 00:01:17,480
It manages resources.
35
00:01:17,480 --> 00:01:19,360
It enforces access control.
36
00:01:19,360 --> 00:01:22,280
It orchestrates how different applications interact.
37
00:01:22,280 --> 00:01:24,240
It determines what runs, when it runs,
38
00:01:24,240 --> 00:01:27,200
who can see what output and what happens if something fails.
39
00:01:27,200 --> 00:01:29,480
That's not a metaphor for Microsoft 365.
40
00:01:29,480 --> 00:01:30,800
That's what it actually is.
41
00:01:30,800 --> 00:01:33,920
When you deploy teams, you're not just installing a chat application.
42
00:01:33,920 --> 00:01:35,520
You're deploying a communication layer
43
00:01:35,520 --> 00:01:38,240
that determines how information flows through your organization.
44
00:01:38,240 --> 00:01:41,520
When you deploy SharePoint, you're not just building a file repository.
45
00:01:41,520 --> 00:01:43,880
You're building the institutional memory system
46
00:01:43,880 --> 00:01:45,520
where knowledge lives, who can find it,
47
00:01:45,520 --> 00:01:47,920
how long it persists whether it's protected or exposed.
48
00:01:47,920 --> 00:01:50,080
When you deploy Power Platform on top of that,
49
00:01:50,080 --> 00:01:51,840
you're creating automation pathways
50
00:01:51,840 --> 00:01:55,160
that touch financial systems, customer data, compliance processes.
51
00:01:55,160 --> 00:01:56,800
You're no longer just deploying tools.
52
00:01:56,800 --> 00:01:59,240
You're architecting the nervous system of your organization.
53
00:01:59,240 --> 00:01:59,920
Here's the problem.
54
00:01:59,920 --> 00:02:02,480
Most organizations understand this intellectually.
55
00:02:02,480 --> 00:02:04,280
They don't internalize it operationally.
56
00:02:04,280 --> 00:02:06,720
A tenant starts small, a few teams.
57
00:02:06,720 --> 00:02:09,760
Some SharePoint sites, a Power Automate flow or two,
58
00:02:09,760 --> 00:02:12,840
because someone figured out they could automate a manual process.
59
00:02:12,840 --> 00:02:14,960
In year one, everything feels manageable.
60
00:02:14,960 --> 00:02:17,360
The technical team understands what exists.
61
00:02:17,360 --> 00:02:20,160
Governance is informal because the scope is small,
62
00:02:20,160 --> 00:02:21,800
then the organization grows.
63
00:02:21,800 --> 00:02:23,480
Or adoption accelerates.
64
00:02:23,480 --> 00:02:26,840
Or leadership decides everyone needs teams and modern collaboration.
65
00:02:26,840 --> 00:02:29,320
By year three, the tenant is unrecognizable.
66
00:02:29,320 --> 00:02:31,440
Thousands of teams with unclear ownership,
67
00:02:31,440 --> 00:02:33,680
SharePoint sites that nobody remembers creating.
68
00:02:33,680 --> 00:02:37,000
External sharing enabled in ways that were never explicitly decided.
69
00:02:37,000 --> 00:02:39,520
It just happened because defaults are permissive.
70
00:02:39,520 --> 00:02:41,640
Power Automate flows triggering other flows,
71
00:02:41,640 --> 00:02:44,720
independency chains that only one person understood.
72
00:02:44,720 --> 00:02:46,520
Sensitivity labels that nobody applies,
73
00:02:46,520 --> 00:02:48,800
retention policies that conflict with each other,
74
00:02:48,800 --> 00:02:51,560
guest accounts from partnerships that ended two years ago.
75
00:02:51,560 --> 00:02:55,000
Admin roles sprawled across 30 people who all have global admin access
76
00:02:55,000 --> 00:02:57,680
because it was easier than designing role-based controls.
77
00:02:57,680 --> 00:02:59,760
And then something breaks or an audit happens.
78
00:02:59,760 --> 00:03:02,320
Or a competitor data breach makes leadership nervous.
79
00:03:02,320 --> 00:03:04,920
The moment organizations believe they have a technical problem,
80
00:03:04,920 --> 00:03:07,440
is the moment they actually have an architectural problem.
81
00:03:07,440 --> 00:03:10,000
They call a consulting firm, they hire a technical lead,
82
00:03:10,000 --> 00:03:11,800
they invest in advanced security tools,
83
00:03:11,800 --> 00:03:14,800
they implement stricter policies, they build better automation.
84
00:03:14,800 --> 00:03:16,880
None of that fixes what's actually broken.
85
00:03:16,880 --> 00:03:19,200
The real challenge isn't understanding the technology.
86
00:03:19,200 --> 00:03:22,120
Microsoft 365 is well engineered, the controls exist,
87
00:03:22,120 --> 00:03:23,400
the documentation is thorough.
88
00:03:23,400 --> 00:03:26,440
The problem is designing systems that humans can operate sustainably.
89
00:03:26,440 --> 00:03:28,480
That distinction matters because here's what happens.
90
00:03:28,480 --> 00:03:32,320
Technical solutions applied to governance problems create more governance problems.
91
00:03:32,320 --> 00:03:35,240
You implement aggressive conditional access policies.
92
00:03:35,240 --> 00:03:38,640
Users find workarounds. You block Power Platform usage.
93
00:03:38,640 --> 00:03:40,560
Citizen developers move to personal clouds.
94
00:03:40,560 --> 00:03:42,800
You require sensitivity labels on everything.
95
00:03:42,800 --> 00:03:44,200
Nobody applies them correctly,
96
00:03:44,200 --> 00:03:46,560
so you get false positives and alert fatigue.
97
00:03:46,560 --> 00:03:49,760
You've treated an architectural problem as if it were a technical problem.
98
00:03:49,760 --> 00:03:52,440
And when you do that, you push the friction somewhere else.
99
00:03:52,440 --> 00:03:57,160
The goal of Microsoft 365 architecture isn't to build systems that work on day one.
100
00:03:57,160 --> 00:04:01,440
It's to design systems that organizations can still operate five years from now.
101
00:04:01,440 --> 00:04:04,200
That requires thinking differently about what success means.
102
00:04:04,200 --> 00:04:07,400
It means asking whether every technical decision can be maintained
103
00:04:07,400 --> 00:04:09,360
when the technical person who built it leaves,
104
00:04:09,360 --> 00:04:11,960
it means building governance into the architecture from the beginning,
105
00:04:11,960 --> 00:04:13,600
not layering it on after the chaos.
106
00:04:13,600 --> 00:04:16,160
That's what we're going to explore in this conversation.
107
00:04:16,160 --> 00:04:18,800
Why technical excellence becomes a liability.
108
00:04:18,800 --> 00:04:20,760
This is where the confession gets uncomfortable.
109
00:04:20,760 --> 00:04:23,760
The best engineers are often the worst architects for enterprise governance.
110
00:04:23,760 --> 00:04:25,160
And I say that without judgment.
111
00:04:25,160 --> 00:04:26,600
I've worked with brilliant people,
112
00:04:26,600 --> 00:04:28,760
people who understand architecture deeply,
113
00:04:28,760 --> 00:04:32,760
who can design role-based access control models that are mathematically elegant,
114
00:04:32,760 --> 00:04:36,760
who build Power Platform solutions that solve real problems elegantly.
115
00:04:36,760 --> 00:04:38,160
These are not mediocre people.
116
00:04:38,160 --> 00:04:39,760
These are people who could work anywhere.
117
00:04:39,760 --> 00:04:43,760
The problem is that technical depth creates a blindness to systemic durability.
118
00:04:43,760 --> 00:04:46,360
A brilliant engineer solves for, "Can we do this?"
119
00:04:46,360 --> 00:04:49,360
A governance architect has to solve for, "Should we do this?
120
00:04:49,360 --> 00:04:51,360
And who manages it in three years?"
121
00:04:51,360 --> 00:04:52,960
Those are different problems.
122
00:04:52,960 --> 00:04:54,560
And they require different thinking.
123
00:04:54,560 --> 00:04:57,560
Technical people optimize for capability. They ask,
124
00:04:57,560 --> 00:05:00,160
what's the most advanced thing we can build with this platform?
125
00:05:00,160 --> 00:05:02,960
How can we automate the maximum number of processes?
126
00:05:02,960 --> 00:05:06,160
How do we design identity controls that are theoretically perfect?
127
00:05:06,160 --> 00:05:09,160
The answer to each of those questions is often technically correct.
128
00:05:09,160 --> 00:05:13,160
But technical correctness and operational sustainability are not the same thing.
129
00:05:13,160 --> 00:05:14,960
Here's the pattern I've seen over and over.
130
00:05:14,960 --> 00:05:18,960
A brilliant architect designs a Microsoft 365 environment.
131
00:05:18,960 --> 00:05:20,560
The tenant is perfectly configured.
132
00:05:20,560 --> 00:05:23,360
Conditional access policies are granular and risk-aware.
133
00:05:23,360 --> 00:05:26,960
Role-based access control is implemented down to the SharePoint site level.
134
00:05:26,960 --> 00:05:29,960
Power Platform governance includes environment separation,
135
00:05:29,960 --> 00:05:32,760
flow ownership tracking and connector restrictions.
136
00:05:32,760 --> 00:05:35,760
Azure AD is clean, retention policies are consistent.
137
00:05:35,760 --> 00:05:38,160
External sharing is controlled. It's a masterpiece.
138
00:05:38,160 --> 00:05:40,360
On day one, it works beautifully.
139
00:05:40,360 --> 00:05:42,360
By year three, it's collapsing silently.
140
00:05:42,360 --> 00:05:45,160
Why? Because the organization didn't internalize the governance model.
141
00:05:45,160 --> 00:05:48,760
Nobody remembers why the conditional access policies were written that way.
142
00:05:48,760 --> 00:05:52,360
The architect who understood the entire role-based access control model
143
00:05:52,360 --> 00:05:53,560
left for another job.
144
00:05:53,560 --> 00:05:56,360
The power platform governance structure created such friction
145
00:05:56,360 --> 00:05:59,960
that citizen developers started building flows in personal environments.
146
00:05:59,960 --> 00:06:01,960
The retention policies make sense on paper,
147
00:06:01,960 --> 00:06:04,360
but conflict with how business units actually work.
148
00:06:04,360 --> 00:06:05,160
So they're ignored.
149
00:06:05,160 --> 00:06:08,960
The external sharing controls are so strict that legitimate partners can't collaborate.
150
00:06:08,960 --> 00:06:10,760
So users find workarounds.
151
00:06:10,760 --> 00:06:13,560
The system didn't fail because it was poorly designed.
152
00:06:13,560 --> 00:06:17,960
It failed because it was designed for a different organization than the one actually operating it.
153
00:06:17,960 --> 00:06:22,160
This is the fundamental split, configuration thinking versus intent-based design.
154
00:06:22,160 --> 00:06:25,960
Configuration thinking asks, "What settings should we configure?"
155
00:06:25,960 --> 00:06:27,560
It's tactical. It's specific.
156
00:06:27,560 --> 00:06:29,160
You set conditional access policies.
157
00:06:29,160 --> 00:06:30,760
You define role assignments.
158
00:06:30,760 --> 00:06:33,760
You configure retention labels. You implement DLP rules.
159
00:06:33,760 --> 00:06:36,360
All of it is technically sound. All of it works on day one.
160
00:06:36,360 --> 00:06:39,760
But configuration thinking treats governance as a problem you solve once.
161
00:06:39,760 --> 00:06:42,160
You configure the controls. You document the policies.
162
00:06:42,160 --> 00:06:44,560
You hand it off. Done.
163
00:06:44,560 --> 00:06:49,760
Intent-based design asks, "What behavior do we want the system to enforce over time?"
164
00:06:49,760 --> 00:06:53,960
That's architectural. A brilliant technical architect can design perfect configurations.
165
00:06:53,960 --> 00:06:57,360
A governance architect has to ask whether those configurations will survive
166
00:06:57,360 --> 00:06:59,360
when the technical person isn't in the room anymore.
167
00:06:59,360 --> 00:07:01,760
Will they survive when business requirements change?
168
00:07:01,760 --> 00:07:04,560
Will they survive when a new technology gets integrated?
169
00:07:04,560 --> 00:07:08,760
Will they survive when the organization acquires another company and needs to merge tenants?
170
00:07:08,760 --> 00:07:12,160
The most dangerous architecture is the one that works perfectly on day one
171
00:07:12,160 --> 00:07:13,960
and collapses silently by year three.
172
00:07:13,960 --> 00:07:16,560
It collapses quietly because there is no dramatic failure.
173
00:07:16,560 --> 00:07:19,360
Things still run, but the system has become unmaintainable.
174
00:07:19,360 --> 00:07:20,560
The governance has drifted.
175
00:07:20,560 --> 00:07:23,560
The configurations no longer align with organizational reality.
176
00:07:23,560 --> 00:07:25,960
The controls require constant manual intervention.
177
00:07:25,960 --> 00:07:28,560
The compliance audit uncovers dozens of policy exceptions
178
00:07:28,560 --> 00:07:30,160
that nobody documents anymore.
179
00:07:30,160 --> 00:07:31,960
And then you discover the real problem.
180
00:07:31,960 --> 00:07:36,960
The original architects solved for technical perfection instead of organizational sustainability.
181
00:07:36,960 --> 00:07:38,360
This isn't a blame exercise.
182
00:07:38,360 --> 00:07:40,160
The engineers in those stories were brilliant.
183
00:07:40,160 --> 00:07:43,160
That's the point. The problem is that brilliance applied to the wrong problem
184
00:07:43,160 --> 00:07:46,560
creates systems that are technically perfect and operationally impossible.
185
00:07:46,560 --> 00:07:49,960
An organization can't operate a perfectly optimized technical solution
186
00:07:49,960 --> 00:07:52,560
if nobody understands why the optimization exists.
187
00:07:52,560 --> 00:07:55,760
That's the confession. The best technical minds in the room often create
188
00:07:55,760 --> 00:07:58,560
the worst governance outcomes, not because they're incompetent,
189
00:07:58,560 --> 00:08:02,760
but because they're answering a different question than the organization needs answered.
190
00:08:02,760 --> 00:08:04,560
The three governance zones framework.
191
00:08:04,560 --> 00:08:07,960
Before we dive into the failures, let me introduce the framework that prevents them.
192
00:08:07,960 --> 00:08:10,160
Most organizations treat all data the same way.
193
00:08:10,160 --> 00:08:12,960
They apply one set of governance rules across everything.
194
00:08:12,960 --> 00:08:14,160
That's the fundamental mistake.
195
00:08:14,160 --> 00:08:16,360
Microsoft 365 isn't a single system.
196
00:08:16,360 --> 00:08:18,960
It's three different systems operating at different scales,
197
00:08:18,960 --> 00:08:22,160
serving different purposes, requiring different governance models.
198
00:08:22,160 --> 00:08:23,560
Once you see those three zones,
199
00:08:23,560 --> 00:08:25,960
the entire governance problem becomes simpler.
200
00:08:25,960 --> 00:08:28,960
Zone one is personal work: OneDrive, personal Teams chats,
201
00:08:28,960 --> 00:08:32,360
the stuff you're working on that doesn't need to be shared with anyone.
202
00:08:32,360 --> 00:08:34,960
This zone should be user controlled with minimal governance.
203
00:08:34,960 --> 00:08:36,360
The person owns their content.
204
00:08:36,360 --> 00:08:38,160
They can organize it however they want.
205
00:08:38,160 --> 00:08:39,960
They can share it with whoever they want.
206
00:08:39,960 --> 00:08:43,160
Nobody needs to label it or review it or enforce retention.
207
00:08:43,160 --> 00:08:46,560
The organizational overhead on zone one should be almost zero.
208
00:08:46,560 --> 00:08:48,160
Zone two is collaborative work.
209
00:08:48,160 --> 00:08:53,160
Teams channels, project sites, marketing campaigns, product launches, customer engagements.
210
00:08:53,160 --> 00:08:56,560
This is where teams collaborate on things that matter but aren't regulated.
211
00:08:56,560 --> 00:08:59,360
This zone should be team managed with moderate governance.
212
00:08:59,360 --> 00:09:00,760
You need owner accountability.
213
00:09:00,760 --> 00:09:02,360
You need lifecycle rules.
214
00:09:02,360 --> 00:09:05,960
If a project ends, the site gets archived, not left open forever.
215
00:09:05,960 --> 00:09:07,760
You need clear sharing policies.
216
00:09:07,760 --> 00:09:11,760
You need someone to own the space and be responsible for whether the right people have access.
217
00:09:11,760 --> 00:09:15,560
But you don't need the operational burden of treating every team's channel
218
00:09:15,560 --> 00:09:17,160
like it contains financial records.
219
00:09:17,160 --> 00:09:18,760
Zone three is enterprise records.
220
00:09:18,760 --> 00:09:21,760
HR data, finance, regulated content,
221
00:09:21,760 --> 00:09:25,160
anything that has compliance implications or legal hold requirements.
222
00:09:25,160 --> 00:09:27,360
This is where you need strict governance.
223
00:09:27,360 --> 00:09:29,360
Sensitivity labels are mandatory.
224
00:09:29,360 --> 00:09:30,960
Retention policies are enforced.
225
00:09:30,960 --> 00:09:32,560
Access is reviewed regularly.
226
00:09:32,560 --> 00:09:34,160
External sharing is restricted.
227
00:09:34,160 --> 00:09:35,160
Everything is logged.
228
00:09:35,160 --> 00:09:36,360
Everything is auditable.
229
00:09:36,360 --> 00:09:36,960
That's it.
230
00:09:36,960 --> 00:09:38,560
Three zones, three governance models.
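The zone model is concrete enough to write down. Here is an illustrative encoding, where the three zones come from the framework just described, but the specific policy values (review cadence, sharing modes) are assumptions for the sake of the sketch, not settings the episode prescribes:

```python
# Illustrative encoding of the three-zone governance model.
# Zone names follow the framework; policy values are assumptions.
ZONES = {
    "personal": {       # Zone 1: OneDrive, personal Teams chats
        "owner": "user",
        "labels_required": False,
        "external_sharing": "user-controlled",
        "access_reviews": None,
    },
    "collaborative": {  # Zone 2: Teams channels, project sites
        "owner": "team owner",
        "labels_required": False,
        "external_sharing": "policy-scoped",
        "access_reviews": "on project close",
    },
    "records": {        # Zone 3: HR, finance, regulated content
        "owner": "compliance",
        "labels_required": True,
        "external_sharing": "restricted",
        "access_reviews": "quarterly",
    },
}

def governance_for(zone):
    """Look up the governance intent for a workload's zone."""
    return ZONES[zone]

print(governance_for("records")["labels_required"])
```

The point of writing it down is exactly the point of the framework: the rules for a zone are explicit, so nobody has to infer them from a pile of tenant settings.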
231
00:09:38,560 --> 00:09:40,960
This model instantly simplifies governance design.
232
00:09:40,960 --> 00:09:42,160
It prevents scope creep.
233
00:09:42,160 --> 00:09:46,160
It prevents the situation where you build a governance framework designed for enterprise records.
234
00:09:46,160 --> 00:09:49,960
Then try to apply it to personal work and suddenly your organization can't function
235
00:09:49,960 --> 00:09:52,960
because people can't organize their own files without approval chains.
236
00:09:52,960 --> 00:09:57,560
Most failures occur when organizations treat all three zones with identical governance.
237
00:09:57,560 --> 00:09:59,760
They build a perfect governance model for zone three.
238
00:09:59,760 --> 00:10:02,760
Strict controls, careful oversight, compliance driven.
239
00:10:02,760 --> 00:10:04,560
Then they apply it to zone two.
240
00:10:04,560 --> 00:10:06,160
Suddenly Teams becomes friction.
241
00:10:06,160 --> 00:10:07,360
Collaboration slows down.
242
00:10:07,360 --> 00:10:08,560
People find workarounds.
243
00:10:08,560 --> 00:10:09,960
They apply it to zone one.
244
00:10:09,960 --> 00:10:14,360
People start using personal cloud storage because organizational OneDrive is too controlled.
245
00:10:14,360 --> 00:10:16,160
You've just created shadow IT.
246
00:10:16,160 --> 00:10:17,960
not solved the governance problem.
247
00:10:17,960 --> 00:10:21,760
The framework works because it separates three things that most organizations conflate.
248
00:10:21,760 --> 00:10:23,560
Presentation, logic and data.
249
00:10:23,560 --> 00:10:25,160
Presentation is what users see.
250
00:10:25,160 --> 00:10:28,360
The interface, the experience. That should adapt to the zone.
251
00:10:28,360 --> 00:10:29,960
Zone one should feel frictionless.
252
00:10:29,960 --> 00:10:32,160
Zone three should feel controlled and auditable.
253
00:10:32,160 --> 00:10:33,560
Logic is governance.
254
00:10:33,560 --> 00:10:36,960
The rules, the policies, the controls, that should be appropriate to the zone,
255
00:10:36,960 --> 00:10:38,560
not uniform across all zones.
256
00:10:38,560 --> 00:10:40,160
Data is the actual content.
257
00:10:40,160 --> 00:10:42,760
That should be protected according to its sensitivity,
258
00:10:42,760 --> 00:10:45,160
not according to which zone it happens to sit in.
259
00:10:45,160 --> 00:10:47,160
Intent-based governance asks,
260
00:10:47,160 --> 00:10:50,360
"What behavior do we want the system to enforce over time?"
261
00:10:50,360 --> 00:10:54,760
In zone one, we want to enable personal productivity without organizational overhead.
262
00:10:54,760 --> 00:10:57,760
In zone two, we want to enable collaboration with clear ownership.
263
00:10:57,760 --> 00:11:00,760
In zone three, we want compliance and auditability.
264
00:11:00,760 --> 00:11:04,360
Configuration thinking asks, "What settings should we configure?"
265
00:11:04,360 --> 00:11:05,560
That's where chaos begins.
266
00:11:05,560 --> 00:11:09,560
You configure policies, you layer them, you create exceptions, you document them.
267
00:11:09,560 --> 00:11:14,160
Six months later, the actual behavior of the system no longer matches any documentation.
268
00:11:14,160 --> 00:11:15,960
Intent-based governance survives.
269
00:11:15,960 --> 00:11:17,760
Configuration thinking collapses.
270
00:11:17,760 --> 00:11:19,560
This framework is the foundation.
271
00:11:19,560 --> 00:11:21,560
Every governance decision flows from it,
272
00:11:21,560 --> 00:11:23,960
which is why most organizations get governance so wrong.
273
00:11:23,960 --> 00:11:25,160
They never define the zones.
274
00:11:25,160 --> 00:11:28,160
They never make explicit what behavior they actually want.
275
00:11:28,160 --> 00:11:31,160
So they end up with a configuration mess that nobody can maintain.
276
00:11:31,160 --> 00:11:34,760
Case study one, the automation hydra.
277
00:11:34,760 --> 00:11:37,560
Let's start with the first failure pattern I've seen repeatedly.
278
00:11:37,560 --> 00:11:39,160
I call it the automation hydra.
279
00:11:39,160 --> 00:11:41,360
This one starts with a legitimate business problem.
280
00:11:41,360 --> 00:11:44,360
A mid-market organization has a manual process that's slow.
281
00:11:44,360 --> 00:11:46,360
Sales collateral needs to go through approval.
282
00:11:46,360 --> 00:11:48,960
Invoices need to be routed to the right cost center.
283
00:11:48,960 --> 00:11:51,560
Customer data needs to sync between systems.
284
00:11:51,560 --> 00:11:53,560
Someone says we could automate this.
285
00:11:53,560 --> 00:11:55,360
A brilliant automation engineer,
286
00:11:55,360 --> 00:11:56,960
or, in modern organizations,
287
00:11:56,960 --> 00:12:00,760
a citizen developer with Power Platform training, builds a Power Automate flow.
288
00:12:00,760 --> 00:12:03,560
It works. It solves the problem. Processing time drops.
289
00:12:03,560 --> 00:12:06,160
Manual errors drop. It's a success.
290
00:12:06,160 --> 00:12:07,960
Success breeds expansion.
291
00:12:07,960 --> 00:12:09,560
Can we automate this other process?
292
00:12:09,560 --> 00:12:10,360
Yes, we can.
293
00:12:10,360 --> 00:12:11,960
Another flow. This one's more complex.
294
00:12:11,960 --> 00:12:12,960
It touches more systems.
295
00:12:12,960 --> 00:12:16,760
It reads data from SharePoint, writes to Dynamics, sends approvals through Teams.
296
00:12:16,760 --> 00:12:19,360
Still works beautifully.
297
00:12:19,360 --> 00:12:22,360
Six months in, the automation program is seen as a win.
298
00:12:22,360 --> 00:12:25,960
Leadership wants more. Business units request flows for their own processes.
299
00:12:25,960 --> 00:12:27,960
The citizen developer ecosystem explodes.
300
00:12:27,960 --> 00:12:30,160
30 flows, 50 flows, 100 flows.
301
00:12:30,160 --> 00:12:32,560
By this point, something has changed fundamentally,
302
00:12:32,560 --> 00:12:35,160
but nobody notices yet because the flows still work.
303
00:12:35,160 --> 00:12:37,160
The problem isn't that the flows are broken.
304
00:12:37,160 --> 00:12:41,160
The problem is that they've become interdependent in ways nobody explicitly designed.
305
00:12:41,160 --> 00:12:42,560
Flow A triggers flow B.
306
00:12:42,560 --> 00:12:45,160
Flow B modifies data that flow C depends on.
307
00:12:45,160 --> 00:12:47,760
Flow C writes to a list that flow D reads from.
308
00:12:47,760 --> 00:12:50,560
These dependencies accumulate invisibly
309
00:12:50,560 --> 00:12:54,160
because they emerge from individual business problems being solved locally,
310
00:12:54,160 --> 00:12:56,360
not from a system being designed holistically.
311
00:12:56,360 --> 00:12:57,760
Then something changes.
312
00:12:57,760 --> 00:12:59,360
A business requirement shifts.
313
00:12:59,360 --> 00:13:01,960
Someone updates flow A to handle a new case type.
314
00:13:01,960 --> 00:13:03,960
That update ripples through the dependency chain.
315
00:13:03,960 --> 00:13:06,760
Flow B breaks because it wasn't expecting the new data format.
316
00:13:06,760 --> 00:13:08,560
Flow C starts generating errors.
317
00:13:08,560 --> 00:13:11,960
Three hours later, business processes are failing across the organization.
318
00:13:11,960 --> 00:13:16,360
The person who built flow A is trying to debug why downstream systems are broken.
319
00:13:16,360 --> 00:13:20,360
But they never explicitly documented that flow A was connected to flow B.
320
00:13:20,360 --> 00:13:21,760
They just knew it in their head.
321
00:13:21,760 --> 00:13:24,760
This is the moment most organizations realize they have a problem.
322
00:13:24,760 --> 00:13:25,960
The symptoms are chaos.
323
00:13:25,960 --> 00:13:28,360
Business processes breaking unexpectedly.
324
00:13:28,360 --> 00:13:29,960
IT scrambling to restore service.
325
00:13:29,960 --> 00:13:31,960
Nobody knowing the actual ownership chain.
326
00:13:31,960 --> 00:13:35,160
Rollbacks that take hours because you can't just revert flow A.
327
00:13:35,160 --> 00:13:38,960
You have to understand how it cascades through the entire automation ecosystem.
328
00:13:38,960 --> 00:13:42,960
I've seen organizations report hundreds of flows with no clear ownership.
329
00:13:42,960 --> 00:13:45,560
Thousands of flows with undocumented dependencies.
330
00:13:45,560 --> 00:13:50,160
A single update that cascaded through dozens of flows in ways nobody predicted.
331
00:13:50,160 --> 00:13:54,960
The core problem is simple: automation without lifecycle governance creates a system that nobody can maintain.
332
00:13:54,960 --> 00:13:56,360
This isn't a technical problem.
333
00:13:56,360 --> 00:13:57,560
The flows are well written.
334
00:13:57,560 --> 00:13:58,960
The platform works perfectly.
335
00:13:58,960 --> 00:14:00,160
The issue is architectural.
336
00:14:00,160 --> 00:14:02,760
Citizen developers were empowered to build solutions.
337
00:14:02,760 --> 00:14:03,360
That's good.
338
00:14:03,360 --> 00:14:04,960
But nobody owned the system as a whole.
339
00:14:04,960 --> 00:14:06,360
There was no lifecycle governance.
340
00:14:06,360 --> 00:14:08,360
No one was tracking what flows existed,
341
00:14:08,360 --> 00:14:08,960
who built them,
342
00:14:08,960 --> 00:14:09,960
what they depended on,
343
00:14:09,960 --> 00:14:11,360
or whether they were still needed.
344
00:14:11,360 --> 00:14:15,760
There was no continuous monitoring of the automation ecosystem as a living system.
345
00:14:15,760 --> 00:14:17,960
Power Platform governance requires three things:
346
00:14:17,960 --> 00:14:18,760
ownership,
347
00:14:18,760 --> 00:14:21,560
lifecycle management, and continuous monitoring.
348
00:14:21,560 --> 00:14:27,960
Ownership means every flow has a documented owner who understands what it does and what it depends on.
349
00:14:27,960 --> 00:14:29,960
Not a generic IT owns this.
350
00:14:29,960 --> 00:14:31,960
A person, someone accountable.
351
00:14:31,960 --> 00:14:34,960
Lifecycle management means flows have a defined lifespan.
352
00:14:34,960 --> 00:14:38,160
When the business process changes, the flow gets updated or archived.
353
00:14:38,160 --> 00:14:41,360
When the person who built it leaves, someone else becomes responsible.
354
00:14:41,360 --> 00:14:44,560
You don't accumulate orphaned flows that run forever.
355
00:14:44,560 --> 00:14:46,560
Because nobody remembers what they do.
356
00:14:46,560 --> 00:14:50,360
Continuous monitoring means you're watching the automation ecosystem as a system.
357
00:14:50,360 --> 00:14:51,560
You're tracking dependencies.
358
00:14:51,560 --> 00:14:52,960
You're measuring failure rates.
359
00:14:52,960 --> 00:14:55,560
You're understanding how changes in one flow affect others.
360
00:14:55,560 --> 00:14:58,960
You're treating automation as infrastructure that requires oversight.
361
00:14:58,960 --> 00:15:02,360
Not just as a collection of individual problems solved independently.
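Ownership is also the easiest of the three to check mechanically. As a minimal sketch, assuming a flow inventory exists (in practice it might come from the Power Platform admin APIs; the records and names below are invented), you can flag flows whose documented owner has left the organization:

```python
# A minimal flow-registry sketch. Records and names are hypothetical.
active_employees = {"alice@contoso.com", "bob@contoso.com"}

flows = [
    {"name": "invoice-routing", "owner": "alice@contoso.com"},
    {"name": "legacy-sync",     "owner": "carol@contoso.com"},  # carol left
]

def orphaned(flow_inventory, employees):
    """Flows whose documented owner is no longer in the organization."""
    return [f["name"] for f in flow_inventory if f["owner"] not in employees]

print(orphaned(flows, active_employees))
```

Run on a schedule, a check like this turns "orphaned flows that run forever" from a discovery made during an outage into a routine report.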
362
00:15:02,360 --> 00:15:05,360
Without these three things, automation becomes entropy.
363
00:15:05,360 --> 00:15:06,760
The system still works.
364
00:15:06,760 --> 00:15:08,560
Technically, the flows keep running.
365
00:15:08,560 --> 00:15:09,960
But nobody controls it anymore.
366
00:15:09,960 --> 00:15:13,360
You've created a black box of automation that the organization is now dependent on
367
00:15:13,360 --> 00:15:14,560
but can't actually manage.
368
00:15:14,560 --> 00:15:16,880
And here's the uncomfortable part: the technical solution,
369
00:15:16,880 --> 00:15:20,360
building more automation, adding more flows, deploying more sophisticated logic,
370
00:15:20,360 --> 00:15:21,760
makes the problem worse.
371
00:15:21,760 --> 00:15:25,760
Each new flow increases the complexity and interdependency of the system.
372
00:15:25,760 --> 00:15:27,760
You haven't solved the governance problem.
373
00:15:27,760 --> 00:15:28,760
You've expanded it.
374
00:15:28,760 --> 00:15:32,760
This is why brilliant automation architects can create the worst governance outcomes.
375
00:15:32,760 --> 00:15:34,760
They solve the technical problem beautifully.
376
00:15:34,760 --> 00:15:38,160
They just never ask, "Who's going to maintain the system when I'm not here?"
377
00:15:38,160 --> 00:15:40,160
Why automation entropy happens.
378
00:15:40,160 --> 00:15:42,560
The automation hydra doesn't appear overnight.
379
00:15:42,560 --> 00:15:44,160
Let me walk you through how it develops.
380
00:15:44,160 --> 00:15:45,960
It starts with the legitimate business need.
381
00:15:45,960 --> 00:15:47,560
Some manual process is slow.
382
00:15:47,560 --> 00:15:50,360
Invoices sit in inboxes waiting for approval.
383
00:15:50,360 --> 00:15:52,360
Customer data doesn't sync between systems.
384
00:15:52,360 --> 00:15:55,760
Someone notices the inefficiency and says, we could automate this.
385
00:15:55,760 --> 00:15:59,560
A citizen developer, or sometimes a brilliant technical architect who's been tasked
386
00:15:59,560 --> 00:16:03,160
with modernizing legacy processes, builds a Power Automate flow.
387
00:16:03,160 --> 00:16:04,160
It's straightforward.
388
00:16:04,160 --> 00:16:07,960
Read data from one system, apply business logic, write to another system,
389
00:16:07,960 --> 00:16:08,960
send notifications.
390
00:16:08,960 --> 00:16:09,460
It works.
391
00:16:09,460 --> 00:16:10,760
It solves the actual problem.
392
00:16:10,760 --> 00:16:13,760
Processing time drops, manual errors disappear.
393
00:16:13,760 --> 00:16:15,560
The organization sees immediate value.
394
00:16:15,560 --> 00:16:17,060
That's not where the failure begins.
395
00:16:17,060 --> 00:16:20,360
That's where success begins and success creates the conditions for failure.
396
00:16:20,360 --> 00:16:23,560
Because once leadership sees that automation works, the question changes.
397
00:16:23,560 --> 00:16:25,360
It's no longer should we automate this.
398
00:16:25,360 --> 00:16:28,160
It becomes how many more processes can we automate?
399
00:16:28,160 --> 00:16:30,160
Citizen developers start getting requests.
400
00:16:30,160 --> 00:16:32,160
Finance wants to automate invoice routing.
401
00:16:32,160 --> 00:16:34,160
Sales wants to automate collateral approvals.
402
00:16:34,160 --> 00:16:36,160
HR wants to automate onboarding.
403
00:16:36,160 --> 00:16:37,760
Each request is legitimate.
404
00:16:37,760 --> 00:16:39,560
Each flow solves a real problem.
405
00:16:39,560 --> 00:16:42,160
But here's what's happening underneath that success.
406
00:16:42,160 --> 00:16:44,160
Dependencies are accumulating invisibly.
407
00:16:44,160 --> 00:16:45,560
The first few flows are simple.
408
00:16:45,560 --> 00:16:46,560
Independent.
409
00:16:46,560 --> 00:16:47,560
They don't interact.
410
00:16:47,560 --> 00:16:49,760
But by the 10th flow, something has shifted.
411
00:16:49,760 --> 00:16:51,760
Flow A reads from a SharePoint list.
412
00:16:51,760 --> 00:16:53,360
Flow B writes to the same list.
413
00:16:53,360 --> 00:16:55,560
Nobody explicitly decided they should interact.
414
00:16:55,560 --> 00:16:59,160
It just happened because both flows needed access to the same data.
415
00:16:59,160 --> 00:17:01,960
But now Flow B's output is becoming Flow A's input.
416
00:17:01,960 --> 00:17:03,560
Flow C joins the ecosystem.
417
00:17:03,560 --> 00:17:07,760
It monitors an email inbox and creates tasks in a list that Flow D depends on.
418
00:17:07,760 --> 00:17:11,760
Flow D triggers Flow E, which modifies records that Flow A reads.
419
00:17:11,760 --> 00:17:14,360
You now have five flows, and they're connected in ways
420
00:17:14,360 --> 00:17:16,760
that only the developers who built them understand.
421
00:17:16,760 --> 00:17:17,960
And only in their heads.
422
00:17:17,960 --> 00:17:19,760
Nobody has mapped these dependencies.
423
00:17:19,760 --> 00:17:20,560
There's no diagram.
424
00:17:20,560 --> 00:17:22,960
There's no documentation that says if you change Flow B,
425
00:17:22,960 --> 00:17:25,560
you need to check whether Flow A will still work.
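The dependency chain just described can be made explicit instead of living in someone's head. A sketch, using the hypothetical flows A through E from the example, where an edge means "a change here can affect that flow downstream":

```python
from collections import deque

# The dependency chain described above, encoded explicitly. An edge X -> Y
# means a change to flow X can affect flow Y (names are hypothetical).
depends_on_me = {
    "B": ["A"],  # Flow B writes the list Flow A reads
    "C": ["D"],  # Flow C creates tasks Flow D depends on
    "D": ["E"],  # Flow D triggers Flow E
    "E": ["A"],  # Flow E modifies records Flow A reads
}

def downstream(flow, graph):
    """Every flow that might break if `flow` changes (breadth-first walk)."""
    seen, queue = set(), deque(graph.get(flow, []))
    while queue:
        nxt = queue.popleft()
        if nxt not in seen:
            seen.add(nxt)
            queue.extend(graph.get(nxt, []))
    return seen

print(sorted(downstream("C", depends_on_me)))  # D, E, and transitively A
```

Even a toy graph like this answers the question nobody could answer in the story: change Flow C, and the walk tells you Flow D, Flow E, and, transitively, Flow A all need checking.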
426
00:17:25,560 --> 00:17:28,960
The connections emerged organically from individual problem solving.
427
00:17:28,960 --> 00:17:32,560
Each flow was designed to solve a specific business problem in isolation.
428
00:17:32,560 --> 00:17:34,960
Nobody designed how they interact as a system.
429
00:17:34,960 --> 00:17:37,560
This is the definition of technical debt in the automation space.
430
00:17:37,560 --> 00:17:38,760
It's not a broken flow.
431
00:17:38,760 --> 00:17:41,160
It's invisible coupling.
432
00:17:41,160 --> 00:17:44,360
Most organizations lack visibility into what flows exist,
433
00:17:44,360 --> 00:17:45,360
who owns them,
434
00:17:45,360 --> 00:17:46,760
or what data they touch.
435
00:17:46,760 --> 00:17:49,160
They know flows are running, they see the business value.
436
00:17:49,160 --> 00:17:50,760
But they don't know the actual architecture.
437
00:17:50,760 --> 00:17:52,360
If a citizen developer leaves,
438
00:17:52,360 --> 00:17:54,560
that person's flows become orphaned knowledge.
439
00:17:54,560 --> 00:17:58,160
If a business requirement changes and someone needs to update a flow,
440
00:17:58,160 --> 00:18:00,360
they don't know what else might break downstream.
441
00:18:00,360 --> 00:18:02,560
The governance failure isn't building the flows.
442
00:18:02,560 --> 00:18:04,960
It's failing to design a system that can sustain them.
443
00:18:04,960 --> 00:18:07,760
Technical people, whether citizen developers or architects,
444
00:18:07,760 --> 00:18:09,560
were solving the right problem locally.
445
00:18:09,560 --> 00:18:11,360
They were automating manual processes.
446
00:18:11,360 --> 00:18:12,560
They were reducing errors.
447
00:18:12,560 --> 00:18:13,960
They were delivering business value.
448
00:18:13,960 --> 00:18:16,160
But they were solving the wrong problem globally.
449
00:18:16,160 --> 00:18:17,160
They weren't asking,
450
00:18:17,160 --> 00:18:20,360
how will the organization maintain this automation ecosystem in two years?
451
00:18:20,360 --> 00:18:24,160
What happens when 500 flows exist and nobody can trace the dependencies?
452
00:18:24,160 --> 00:18:26,960
What happens when a critical flow breaks at two in the morning
453
00:18:26,960 --> 00:18:28,960
and the person who built it is unreachable?
454
00:18:28,960 --> 00:18:32,160
Most organizations discover they have an automation entropy problem
455
00:18:32,160 --> 00:18:33,560
only when something breaks.
456
00:18:33,560 --> 00:18:37,160
And by that point, the system is complex enough that fixing it requires
457
00:18:37,160 --> 00:18:41,760
either reverse engineering the entire flow ecosystem or rebuilding it from scratch.
458
00:18:41,760 --> 00:18:44,760
The path to the automation hydra is paved with good intentions
459
00:18:44,760 --> 00:18:46,760
and incremental local decisions.
460
00:18:46,760 --> 00:18:47,960
Each flow was the right choice.
461
00:18:47,960 --> 00:18:50,360
None of them individually created the problem.
462
00:18:50,360 --> 00:18:54,360
The problem emerged from treating automation as a collection of independent solutions
463
00:18:54,360 --> 00:18:57,360
instead of treating it as infrastructure that needs governance.
464
00:18:57,360 --> 00:19:00,360
That's why the most dangerous automation programs are the successful ones.
465
00:19:00,360 --> 00:19:03,760
Success hides the structural problem until it's too late to fix it easily.
466
00:19:03,760 --> 00:19:06,360
Case study 2, the security fortress.
467
00:19:06,360 --> 00:19:08,760
Now let's look at the second failure pattern.
468
00:19:08,760 --> 00:19:11,560
When security architecture becomes operational friction.
469
00:19:11,560 --> 00:19:13,760
This one starts differently. There's no gradual drift.
470
00:19:13,760 --> 00:19:15,360
There's no accumulated legacy.
471
00:19:15,360 --> 00:19:17,560
This one is designed from the ground up to be secure.
472
00:19:17,560 --> 00:19:19,960
A technically brilliant security architect.
473
00:19:19,960 --> 00:19:22,560
Someone who understands zero trust principles deeply.
474
00:19:22,560 --> 00:19:26,560
Who can explain conditional access policies and device compliance in detail.
475
00:19:26,560 --> 00:19:29,960
Who knows the NIST Cybersecurity Framework cold. Gets tasked
476
00:19:29,960 --> 00:19:31,960
with designing a modern security model.
477
00:19:31,960 --> 00:19:33,560
And they design something perfect.
478
00:19:33,560 --> 00:19:37,960
Aggressive conditional access policies that evaluate risk on every authentication attempt.
479
00:19:37,960 --> 00:19:39,960
Strict device compliance requirements
480
00:19:39,960 --> 00:19:43,360
that only allow managed devices to access corporate resources.
481
00:19:43,360 --> 00:19:46,160
MFA required for everything, enforced consistently.
482
00:19:46,160 --> 00:19:49,360
External sharing restricted to pre-approved domains.
483
00:19:49,360 --> 00:19:51,760
Heavy controls on what applications can be installed.
484
00:19:51,760 --> 00:19:53,760
Careful monitoring of data movement.
485
00:19:53,760 --> 00:19:56,160
All technically sound. All best practice aligned.
486
00:19:56,160 --> 00:19:58,760
All defensible in an audit. It's a fortress.
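The fortress just described maps to familiar policy settings. Here's a sketch of what such a policy looks like, modeled loosely on the Microsoft Graph conditionalAccessPolicy shape — treat the exact field names as an approximation for illustration:

```python
# Loosely modeled on the Graph API's conditionalAccessPolicy resource;
# field names are an approximation, not verified schema.
fortress_policy = {
    "displayName": "Block all unmanaged access",
    "state": "enabled",
    "conditions": {
        "users": {"includeUsers": ["All"]},
        "applications": {"includeApplications": ["All"]},
    },
    "grantControls": {
        # "AND" means every control must be satisfied on every sign-in.
        "operator": "AND",
        "builtInControls": ["mfa", "compliantDevice"],
    },
}

def locks_out_unmanaged(policy):
    """True if the policy demands a compliant (managed) device for every app."""
    g = policy["grantControls"]
    return (policy["conditions"]["applications"]["includeApplications"] == ["All"]
            and g["operator"] == "AND"
            and "compliantDevice" in g["builtInControls"])

# Contractors and travelers on personal devices fall into this bucket.
print(locks_out_unmanaged(fortress_policy))
```

Every field here is defensible in isolation. The failure mode the case study describes is the combination: All users, All apps, AND-ed controls, no carve-outs for how the organization actually works.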
487
00:19:58,760 --> 00:20:00,360
On day one, it works beautifully.
488
00:20:00,360 --> 00:20:03,160
The security baseline is tighter than it's ever been.
489
00:20:03,160 --> 00:20:05,360
Threat intelligence systems flag fewer threats
490
00:20:05,360 --> 00:20:07,360
because the attack surface is reduced.
491
00:20:07,360 --> 00:20:08,960
The security team feels confident.
492
00:20:08,960 --> 00:20:10,560
Leadership feels protected.
493
00:20:10,560 --> 00:20:13,560
The architecture is exactly what Microsoft and every security framework
494
00:20:13,560 --> 00:20:14,760
recommends.
495
00:20:14,760 --> 00:20:16,560
By month three, the organization has quietly disabled it.
496
00:20:16,560 --> 00:20:19,560
Not officially. Nobody announces that the fortress is coming down.
497
00:20:19,560 --> 00:20:22,160
Instead, users start finding workarounds.
498
00:20:22,160 --> 00:20:23,560
Personal Dropbox.
499
00:20:23,560 --> 00:20:27,160
WhatsApp file transfers to send documents outside the secure system.
500
00:20:27,160 --> 00:20:30,160
Shadow SaaS tools where collaborative work actually happens.
501
00:20:30,160 --> 00:20:32,960
Employees access work email from personal devices.
502
00:20:32,960 --> 00:20:36,560
Because the compliance requirements on corporate devices are too onerous.
503
00:20:36,560 --> 00:20:38,760
Project teams maintain Google Drive folders
504
00:20:38,760 --> 00:20:41,360
because SharePoint access is too restricted.
505
00:20:41,360 --> 00:20:43,560
Sales uses a personal Slack workspace
506
00:20:43,560 --> 00:20:45,360
because the corporate team's implementation
507
00:20:45,360 --> 00:20:47,960
requires too many approvals for external guest access.
508
00:20:47,960 --> 00:20:49,760
The security architecture was perfect.
509
00:20:49,760 --> 00:20:51,360
The human behavior defeated it.
510
00:20:51,360 --> 00:20:52,360
Here's the paradox.
511
00:20:52,360 --> 00:20:54,360
Security that ignores human behavior
512
00:20:54,360 --> 00:20:56,160
eventually weakens security.
513
00:20:56,160 --> 00:20:57,960
You've designed a system so restrictive
514
00:20:57,960 --> 00:20:59,560
that it makes people's jobs harder.
515
00:20:59,560 --> 00:21:00,960
They don't become more secure.
516
00:21:00,960 --> 00:21:02,160
They become creative.
517
00:21:02,160 --> 00:21:05,160
And their creativity bypasses your controls entirely.
518
00:21:05,160 --> 00:21:06,560
The research is clear on this.
519
00:21:06,560 --> 00:21:11,160
Around 80% of SaaS breaches originate from misconfiguration or user behavior,
520
00:21:11,160 --> 00:21:14,160
not from platform vulnerabilities or sophisticated attacks.
521
00:21:14,160 --> 00:21:16,360
Users circumventing security controls.
522
00:21:16,360 --> 00:21:17,760
Misconfigured permissions.
523
00:21:17,760 --> 00:21:20,560
Oversharing because approval processes are too slow.
524
00:21:20,560 --> 00:21:22,960
Shadow IT because official channels can't keep up.
525
00:21:22,960 --> 00:21:24,560
But here's what's important to understand.
526
00:21:24,560 --> 00:21:26,560
Users don't circumvent security
527
00:21:26,560 --> 00:21:28,360
because they're reckless or incompetent.
528
00:21:28,360 --> 00:21:31,360
They circumvent it because the system makes their job impossible.
529
00:21:31,360 --> 00:21:33,760
A sales team that can't share documents with a customer
530
00:21:33,760 --> 00:21:35,360
without a three day approval process
531
00:21:35,360 --> 00:21:38,160
will email the files to personal accounts instead.
532
00:21:38,160 --> 00:21:41,560
A finance team that can't get access to a critical tool
533
00:21:41,560 --> 00:21:43,960
in time for month-end close will find a workaround.
534
00:21:43,960 --> 00:21:45,960
A project team that can't invite a contractor
535
00:21:45,960 --> 00:21:48,760
without IT involvement will use personal cloud storage.
536
00:21:48,760 --> 00:21:50,560
The architect who designed the fortress
537
00:21:50,560 --> 00:21:52,960
designed it for a theoretical organization.
538
00:21:52,960 --> 00:21:54,960
The organization that actually operates
539
00:21:54,960 --> 00:21:57,560
requires flexibility that the fortress doesn't allow.
540
00:21:57,560 --> 00:21:59,760
Shadow IT isn't a security problem.
541
00:21:59,760 --> 00:22:00,960
It's a governance problem.
542
00:22:00,960 --> 00:22:03,560
It's the symptom of a system that's optimized for control
543
00:22:03,560 --> 00:22:05,760
instead of optimized for sustainable operation.
544
00:22:05,760 --> 00:22:07,560
You can make security tighter and tighter.
545
00:22:07,560 --> 00:22:09,360
You can reduce the attack surface.
546
00:22:09,360 --> 00:22:10,760
You can enforce more policies.
547
00:22:10,760 --> 00:22:14,160
But if the cost of compliance is higher than the cost of circumvention
548
00:22:14,160 --> 00:22:15,560
people will circumvent.
549
00:22:15,560 --> 00:22:17,560
The most dangerous security architecture is the one
550
00:22:17,560 --> 00:22:19,960
that's technically perfect but operationally impossible.
551
00:22:19,960 --> 00:22:21,360
Because it forces a choice.
552
00:22:21,360 --> 00:22:23,160
Follow the rules and don't work, or work
553
00:22:23,160 --> 00:22:24,360
and don't follow the rules.
554
00:22:24,360 --> 00:22:25,560
Most people choose to work.
555
00:22:25,560 --> 00:22:27,560
This is where the confession gets uncomfortable.
556
00:22:27,560 --> 00:22:30,360
The brilliant security architect designed a perfect fortress.
557
00:22:30,360 --> 00:22:32,160
But the organization didn't need a fortress.
558
00:22:32,160 --> 00:22:33,160
It needed a city.
559
00:22:33,160 --> 00:22:36,560
It needed a system where security existed within the actual workflow
560
00:22:36,560 --> 00:22:39,160
of how people work, not as friction layered on top of it.
561
00:22:39,160 --> 00:22:41,960
It needed conditional access policies that evaluated risk
562
00:22:41,960 --> 00:22:44,160
but didn't require users to re-authenticate
563
00:22:44,160 --> 00:22:46,560
every time they switched between applications.
564
00:22:46,560 --> 00:22:49,360
It needed device compliance that was enforced intelligently,
565
00:22:49,360 --> 00:22:50,760
not with blanket restrictions
566
00:22:50,760 --> 00:22:52,360
that made remote work impossible.
567
00:22:52,360 --> 00:22:54,560
It needed external sharing controls
568
00:22:54,560 --> 00:22:55,960
that protected sensitive data
569
00:22:55,960 --> 00:22:58,560
without making legitimate collaboration impossible.
570
00:22:58,560 --> 00:23:00,160
The technical solution wasn't wrong.
571
00:23:00,160 --> 00:23:02,360
The problem was solving for the wrong organization.
572
00:23:02,360 --> 00:23:03,960
The architect solved for an organization
573
00:23:03,960 --> 00:23:05,960
that valued security above all else.
574
00:23:05,960 --> 00:23:09,360
But the actual organization valued security and productivity.
575
00:23:09,360 --> 00:23:10,960
And when those two things conflict,
576
00:23:10,960 --> 00:23:13,560
governance has to make a choice about which one wins.
577
00:23:13,560 --> 00:23:16,560
That choice can't be made by a perfect technical control.
578
00:23:16,560 --> 00:23:18,760
It has to be made by an architect who understands
579
00:23:18,760 --> 00:23:22,160
both the security requirement and the organizational reality.
580
00:23:22,160 --> 00:23:24,560
Why security friction creates shadow IT.
581
00:23:24,560 --> 00:23:27,360
This pattern repeats in organizations across industries.
582
00:23:27,360 --> 00:23:28,760
Let me explain why it happens.
583
00:23:28,760 --> 00:23:32,360
Technical security architects optimize for control and risk reduction.
584
00:23:32,360 --> 00:23:33,360
That's their job.
585
00:23:33,360 --> 00:23:35,360
They're asked to make the environment more secure.
586
00:23:35,360 --> 00:23:37,360
So they ask the right question from a security perspective.
587
00:23:37,360 --> 00:23:39,960
What's the most restrictive policy we can enforce
588
00:23:39,960 --> 00:23:41,560
while still allowing work?
589
00:23:41,560 --> 00:23:43,160
But they don't ask the follow-up question
590
00:23:43,160 --> 00:23:45,360
that governance architects need to ask.
591
00:23:45,360 --> 00:23:48,160
What will users do when they can't work within these restrictions?
592
00:23:48,160 --> 00:23:51,160
Those are different questions and they lead to different answers.
593
00:23:51,160 --> 00:23:53,960
Conditional access policies that require specific devices
594
00:23:53,960 --> 00:23:54,960
like a corporate laptop,
595
00:23:54,960 --> 00:23:56,160
managed through Intune
596
00:23:56,160 --> 00:23:58,160
with specific security configurations,
597
00:23:58,160 --> 00:24:00,360
sound reasonable from a security standpoint.
598
00:24:00,360 --> 00:24:03,360
You're controlling which devices can access corporate resources.
599
00:24:03,360 --> 00:24:05,760
But if you exclude remote workers on home networks
600
00:24:05,760 --> 00:24:08,560
or contractors who don't have access to corporate hardware
601
00:24:08,560 --> 00:24:10,160
or employees traveling internationally,
602
00:24:10,160 --> 00:24:11,360
you've created a problem.
603
00:24:11,360 --> 00:24:12,960
Those people still need to work.
604
00:24:12,960 --> 00:24:14,360
So they find workarounds.
605
00:24:14,360 --> 00:24:17,360
MFA prompts that trigger too frequently create alert fatigue.
606
00:24:17,360 --> 00:24:18,960
The user sees the MFA notification.
607
00:24:18,960 --> 00:24:20,560
They approve it without thinking.
608
00:24:20,560 --> 00:24:22,560
Then they see another one and another one.
609
00:24:22,560 --> 00:24:25,560
Pretty soon they're approving MFA prompts on autopilot
610
00:24:25,560 --> 00:24:28,760
without actually verifying that they initiated the request.
611
00:24:28,760 --> 00:24:31,760
An attacker sends an MFA push notification
612
00:28:31,760 --> 00:28:34,760
during a phishing attack. The user approves it from muscle memory,
613
00:24:34,760 --> 00:24:36,960
and you've just defeated your own security control.
614
00:24:36,960 --> 00:24:40,160
External sharing restrictions that prevent legitimate collaboration
615
00:24:40,160 --> 00:24:43,360
sound protective until you realize that a sales team actually needs
616
00:24:43,360 --> 00:24:45,160
to share documents with customers.
617
00:24:45,160 --> 00:24:47,960
A consultant needs to collaborate with a partner firm.
618
00:24:47,960 --> 00:24:49,960
A nonprofit needs to work with their board.
619
00:24:49,960 --> 00:24:51,960
You've restricted sharing so heavily
620
00:24:51,960 --> 00:24:53,960
that the business can't actually operate
621
00:24:53,960 --> 00:24:57,360
so people use personal email, Dropbox, Google Drive shared folders,
622
00:24:57,360 --> 00:25:00,960
or OneDrive links that they email from their personal Gmail account
623
00:25:00,960 --> 00:25:02,960
because corporate email has restrictions.
624
00:25:02,960 --> 00:25:06,560
The core insight is this, security that ignores operational reality
625
00:25:06,560 --> 00:25:08,360
creates operational insecurity.
626
00:25:08,360 --> 00:25:10,960
You're designing controls for a theoretical organization
627
00:25:10,960 --> 00:25:12,960
that values security above all else.
628
00:25:12,960 --> 00:25:14,560
The actual organization needs to work.
629
00:25:14,560 --> 00:25:16,560
And when your security controls prevent work,
630
00:25:16,560 --> 00:25:18,560
you haven't made the organization more secure.
631
00:25:18,560 --> 00:25:20,560
You've forced it to operate outside your controls.
632
00:25:20,560 --> 00:25:22,360
Research tells us something interesting.
633
00:25:22,360 --> 00:25:24,560
Organizations estimate their employees use
634
00:25:24,560 --> 00:25:26,960
about 37 approved applications.
635
00:25:26,960 --> 00:25:30,760
The actual number they use is around 625 different apps.
636
00:25:30,760 --> 00:25:32,760
That's a 17-fold discrepancy.
637
00:25:32,760 --> 00:25:33,960
That's not a user problem.
638
00:25:33,960 --> 00:25:35,960
That's a governance architecture problem.
639
00:25:35,960 --> 00:25:37,960
The technical solution of restricting access
640
00:25:37,960 --> 00:25:39,360
and implementing tighter controls
641
00:25:39,360 --> 00:25:40,960
doesn't address the underlying problem.
642
00:25:40,960 --> 00:25:44,160
It makes it worse because when approval processes are too slow
643
00:25:44,160 --> 00:25:46,960
or technically rigid or require expertise
644
00:25:46,960 --> 00:25:48,960
users don't have, users solve their own problems.
645
00:25:48,960 --> 00:25:50,560
They find tools that work faster.
646
00:25:50,560 --> 00:25:53,560
They use SAS applications that don't require IT approval.
647
00:25:53,560 --> 00:25:55,960
They adopt AI tools that make their job easier.
648
00:25:55,960 --> 00:25:57,160
They bypass the system.
649
00:25:57,160 --> 00:25:58,360
And here's the uncomfortable part.
650
00:25:58,360 --> 00:26:00,360
The technical solution, more restrictions,
651
00:26:00,360 --> 00:26:03,360
actually increases the attack surface you're trying to protect
652
00:26:03,360 --> 00:26:05,760
because now you have unknown applications
653
00:26:05,760 --> 00:26:06,760
accessing corporate data.
654
00:26:06,760 --> 00:26:08,960
You have unapproved integrations you can't monitor.
655
00:26:08,960 --> 00:26:11,760
You have file storage systems you didn't choose and can't control.
656
00:26:11,760 --> 00:26:14,960
You have communication channels you didn't architect and can't secure.
657
00:26:14,960 --> 00:26:17,160
An employee is using their personal Gmail,
658
00:26:17,160 --> 00:26:19,760
their personal Dropbox, their personal Slack workspace,
659
00:26:19,760 --> 00:26:21,560
their personal ChatGPT account.
660
00:26:21,560 --> 00:26:23,760
If their personal account gets compromised,
661
00:26:23,760 --> 00:26:26,160
the attacker has access to corporate information
662
00:26:26,160 --> 00:26:28,560
and you have no visibility into where that data went
663
00:26:28,560 --> 00:26:30,160
or how it's being used.
664
00:26:30,160 --> 00:26:32,760
The fortress didn't make the organization more secure.
665
00:26:32,760 --> 00:26:34,560
It fragmented the security architecture.
666
00:26:34,560 --> 00:26:36,760
It pushed sensitive information outside the systems
667
00:26:36,760 --> 00:26:38,160
you can monitor and control.
668
00:26:38,160 --> 00:26:39,960
It created a shadow IT ecosystem
669
00:26:39,960 --> 00:26:41,760
that your security tools can't see.
670
00:26:41,760 --> 00:26:43,560
That's the paradox of security friction.
671
00:26:43,560 --> 00:26:45,560
The tighter you make legitimate workflows,
672
00:26:45,560 --> 00:26:47,560
the more you drive work outside those workflows.
673
00:26:47,560 --> 00:26:49,560
And the more work happens outside your controls,
674
00:26:49,560 --> 00:26:51,360
the more risk you've actually introduced.
675
00:26:51,360 --> 00:26:53,960
This is where intent-based governance becomes essential.
676
00:26:53,960 --> 00:26:56,960
Instead of asking what's the most restrictive policy we can enforce,
677
00:26:56,960 --> 00:26:59,360
ask how do we protect sensitive data
678
00:26:59,360 --> 00:27:01,360
while enabling the organization to work?
679
00:27:01,360 --> 00:27:04,360
Those questions lead to completely different architectures.
680
00:27:04,360 --> 00:27:05,960
Case study 3.
681
00:27:05,960 --> 00:27:07,160
The co-pilot stall.
682
00:27:07,160 --> 00:27:09,160
The third failure pattern is newer,
683
00:27:09,160 --> 00:27:10,360
but it's becoming the most common.
684
00:27:10,360 --> 00:27:11,760
I call it the co-pilot stall.
685
00:27:11,760 --> 00:27:13,960
Organizations pilot co-pilot with enthusiasm.
686
00:27:13,960 --> 00:27:15,760
Week 1, adoption metrics are impressive.
687
00:27:15,760 --> 00:27:18,160
Employees are excited about an AI assistant
688
00:27:18,160 --> 00:27:19,960
that understands their organization.
689
00:27:19,960 --> 00:27:21,560
Leadership approves the investment.
690
00:27:21,560 --> 00:27:22,560
The pilot expands.
691
00:27:22,560 --> 00:27:23,760
More users get licenses.
692
00:27:23,760 --> 00:27:24,960
More use cases emerge.
693
00:27:24,960 --> 00:27:27,760
Meeting recaps, email drafting, document generation.
694
00:27:27,760 --> 00:27:28,960
It all works.
695
00:27:28,960 --> 00:27:30,760
And then, between week 6 and 12,
696
00:27:30,760 --> 00:27:31,760
the rollout stops.
697
00:27:31,760 --> 00:27:33,160
Not because co-pilot is broken.
698
00:27:33,160 --> 00:27:34,760
Not because the technology failed.
699
00:27:34,760 --> 00:27:38,560
The rollout stops because AI instantly surfaces governance debt
700
00:27:38,560 --> 00:27:40,960
that the organization has been accumulating for years.
701
00:27:40,960 --> 00:27:42,960
Co-pilot doesn't just answer questions.
702
00:27:42,960 --> 00:27:44,160
It aggregates data.
703
00:27:44,160 --> 00:27:45,160
It reads emails.
704
00:27:45,160 --> 00:27:46,560
It searches teams.
705
00:27:46,560 --> 00:27:48,160
It accesses SharePoint.
706
00:27:48,160 --> 00:27:49,560
It crawls OneDrive.
707
00:27:49,560 --> 00:27:53,760
It understands context across the entire Microsoft 365 environment.
708
00:27:53,760 --> 00:27:55,560
And when it starts surfacing information,
709
00:27:55,560 --> 00:27:58,360
it reveals things that were previously hidden by obscurity.
710
00:27:58,360 --> 00:27:59,760
Broken permissions.
711
00:27:59,760 --> 00:28:03,560
SharePoint sites where access is set to everyone in the organization.
712
00:28:03,560 --> 00:28:05,960
Documents shared via anyone with the link
713
00:28:05,960 --> 00:28:07,960
that have been circulating for two years.
714
00:28:07,960 --> 00:28:09,960
External sharing that was supposed to be restricted
715
00:28:09,960 --> 00:28:11,760
but never was because the defaults were permissive
716
00:28:11,760 --> 00:28:12,960
and no one enforced it.
717
00:28:12,960 --> 00:28:15,160
Guest accounts from partnerships that ended years ago
718
00:28:15,160 --> 00:28:17,560
still retaining access to sensitive files.
719
00:28:17,560 --> 00:28:19,760
Sensitivity labels that were never applied,
720
00:28:19,760 --> 00:28:21,760
retention policies that conflict with each other,
721
00:28:21,760 --> 00:28:23,760
all of this existed before co-pilot.
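Each of those findings is detectable before any AI rollout. Here's a hedged sketch of the kind of sweep a governance team might run over exported sharing records — the record shape and scope labels are assumptions for illustration; a real audit would pull from Graph API or admin reports:

```python
from datetime import date

# Hypothetical export of sharing grants across SharePoint and OneDrive.
grants = [
    {"item": "Q3 forecast.xlsx", "scope": "anyone-with-link", "created": date(2023, 2, 1)},
    {"item": "HR policies.docx", "scope": "everyone-in-org", "created": date(2024, 5, 1)},
    {"item": "Board deck.pptx", "scope": "specific-people", "created": date(2025, 3, 1)},
    {"item": "Old NDA.pdf", "scope": "guest:partner@ex-vendor.com", "created": date(2021, 9, 1)},
]

RISKY_SCOPES = {"anyone-with-link", "everyone-in-org"}

def oversharing_report(grants, today=date(2025, 6, 1), stale_days=365):
    """Flag broad grants, plus any guest grant older than stale_days."""
    findings = []
    for g in grants:
        age = (today - g["created"]).days
        if g["scope"] in RISKY_SCOPES:
            findings.append((g["item"], g["scope"]))
        elif g["scope"].startswith("guest:") and age > stale_days:
            findings.append((g["item"], "stale " + g["scope"]))
    return findings

for item, reason in oversharing_report(grants):
    print(item, "->", reason)
```

Running a report like this during the pilot, rather than after the stall, is the difference between a cleanup plan and a crisis.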
722
00:28:23,760 --> 00:28:25,160
The governance debt was real.
723
00:28:25,160 --> 00:28:27,560
But it was invisible because finding the information was hard.
724
00:28:27,560 --> 00:28:29,560
A document shared with everyone in the organization
725
00:28:29,560 --> 00:28:31,760
wasn't a problem if nobody thought to search for it.
726
00:28:31,760 --> 00:28:33,960
An external link that exposed customer data
727
00:28:33,960 --> 00:28:35,760
wasn't visible to the security team
728
00:28:35,760 --> 00:28:38,360
if they didn't audit SharePoint links specifically.
729
00:28:38,360 --> 00:28:39,560
Permission sprawl existed,
730
00:28:39,560 --> 00:28:41,760
but it didn't manifest as an operational problem
731
00:28:41,760 --> 00:28:43,660
because accessing that data at scale
732
00:28:43,660 --> 00:28:45,060
required deliberate effort.
733
00:28:45,060 --> 00:28:47,060
Co-pilot changes that calculus.
734
00:28:47,060 --> 00:28:50,060
An AI system can access and surface all of that information
735
00:28:50,060 --> 00:28:50,660
instantly.
736
00:28:50,660 --> 00:28:54,060
If an account is compromised and an attacker has co-pilot access,
737
00:28:54,060 --> 00:28:55,860
they can accelerate internal reconnaissance
738
00:28:55,860 --> 00:28:57,860
and sensitive data harvesting at scale.
739
00:28:57,860 --> 00:28:59,460
Ask co-pilot for customer contracts.
740
00:28:59,460 --> 00:29:01,760
Co-pilot searches SharePoint, OneDrive, and Teams
741
00:29:01,760 --> 00:29:04,860
and finds them because access controls haven't been enforced.
742
00:29:04,860 --> 00:29:06,660
Ask co-pilot for financial forecasts.
743
00:29:06,660 --> 00:29:10,060
Co-pilot surfaces them from shared drives and email attachments
744
00:29:10,060 --> 00:29:11,760
because they've been shared too broadly.
745
00:29:11,760 --> 00:29:14,660
Most organizations don't discover they have permissions sprawl
746
00:29:14,660 --> 00:29:16,560
until co-pilot exposes it.
747
00:29:16,560 --> 00:29:19,860
Research tells us that around 73% of AI agent deployments
748
00:29:19,860 --> 00:29:22,360
fail to scale beyond pilots due to governance failures.
749
00:29:22,360 --> 00:29:24,660
Not technical failures, governance failures.
750
00:29:24,660 --> 00:29:27,060
The organizations running the pilots realize
751
00:29:27,060 --> 00:29:29,560
they have deeper infrastructure problems than they thought.
752
00:29:29,560 --> 00:29:30,760
Permissions sprawl,
753
00:29:30,760 --> 00:29:32,360
unmanaged external sharing,
754
00:29:32,360 --> 00:29:34,660
data that's sensitive but classified as public.
755
00:29:34,660 --> 00:29:36,060
Files that should be restricted
756
00:29:36,060 --> 00:29:38,060
but are accessible to contractors or partners
757
00:29:38,060 --> 00:29:39,760
from years-old partnerships.
758
00:29:39,760 --> 00:29:41,660
And at that point, leadership gets nervous.
759
00:29:41,660 --> 00:29:43,060
Security teams get nervous.
760
00:29:43,060 --> 00:29:44,660
If co-pilot can see this information,
761
00:29:44,660 --> 00:29:45,760
what can attackers see?
762
00:29:45,760 --> 00:29:47,560
What happens if an account gets compromised?
763
00:29:47,560 --> 00:29:50,160
What happens if external partners retain access to data
764
00:29:50,160 --> 00:29:51,660
we forgot about?
765
00:29:51,660 --> 00:29:52,660
The rollout stalls
766
00:29:52,660 --> 00:29:55,660
while the organization tries to fix underlying governance problems
767
00:29:55,660 --> 00:29:57,660
but those problems are bigger and more pervasive
768
00:29:57,660 --> 00:29:58,660
than anyone realized.
769
00:29:58,660 --> 00:30:01,660
Cleaning up permissions sprawl across thousands of SharePoint sites
770
00:30:01,660 --> 00:30:02,760
takes months.
771
00:30:02,760 --> 00:30:06,160
Auditing and removing inappropriate external sharing takes months.
772
00:30:06,160 --> 00:30:09,560
Enforcing sensitivity labels on existing content takes months.
773
00:30:09,560 --> 00:30:11,960
By the time the organization is ready to scale co-pilot,
774
00:30:11,960 --> 00:30:13,260
the pilot momentum is gone.
775
00:30:13,260 --> 00:30:15,660
Leadership has moved on to other priorities.
776
00:30:15,660 --> 00:30:17,060
The core insight is this.
777
00:30:17,060 --> 00:30:18,960
AI doesn't create governance problems.
778
00:30:18,960 --> 00:30:20,460
It reveals them instantly.
779
00:30:20,460 --> 00:30:22,760
Most organizations have years of permissions sprawl
780
00:30:22,760 --> 00:30:24,160
that technical people ignored
781
00:30:24,160 --> 00:30:25,860
because the system was functioning.
782
00:30:25,860 --> 00:30:26,960
From a technical perspective,
783
00:30:26,960 --> 00:30:28,960
if people could access the files they needed,
784
00:30:28,960 --> 00:30:30,160
the system was working,
785
00:30:30,160 --> 00:30:32,560
permission management seemed like a compliance problem,
786
00:30:32,560 --> 00:30:33,660
not an operational one.
787
00:30:33,660 --> 00:30:35,760
So it was deferred, pushed to the next year.
788
00:30:35,760 --> 00:30:37,660
Treated as something to address eventually.
789
00:30:37,660 --> 00:30:38,660
Then co-pilot arrives
790
00:30:38,660 --> 00:30:40,360
and makes the deferred problem urgent.
791
00:30:40,360 --> 00:30:42,260
The technical teams build co-pilot perfectly.
792
00:30:42,260 --> 00:30:43,460
The AI works beautifully.
793
00:30:43,460 --> 00:30:45,860
The integration with Microsoft 365 is seamless.
794
00:30:45,860 --> 00:30:47,060
The capability is real.
795
00:30:47,060 --> 00:30:49,760
But the organization wasn't ready to operate it safely.
796
00:30:49,760 --> 00:30:51,560
The governance foundation wasn't solid enough
797
00:30:51,560 --> 00:30:54,060
to support an AI system that could access everything.
798
00:30:54,060 --> 00:30:56,960
This is why the most dangerous moment in technology adoption
799
00:30:56,960 --> 00:30:59,260
is right before the capability gets deployed.
800
00:30:59,260 --> 00:31:01,960
That's when your governance debt becomes visible.
801
00:31:01,960 --> 00:31:03,460
The permissions sprawl reality.
802
00:31:03,460 --> 00:31:04,860
To understand the co-pilot stall,
803
00:31:04,860 --> 00:31:06,560
you need to understand permissions sprawl
804
00:31:06,560 --> 00:31:07,960
and to understand permissions sprawl
805
00:31:07,960 --> 00:31:09,860
you have to understand how it accumulates.
806
00:31:09,860 --> 00:31:11,760
It's not the result of a single bad decision.
807
00:31:11,760 --> 00:31:14,660
It's the result of years of small, reasonable decisions
808
00:31:14,660 --> 00:31:16,460
that all point in the same direction.
809
00:31:16,460 --> 00:31:19,560
Someone creates a SharePoint site for a project.
810
00:31:19,560 --> 00:31:21,760
The site privacy settings default to
811
00:31:21,760 --> 00:31:24,260
everyone in the organization can access this.
812
00:31:24,260 --> 00:31:25,760
That seems reasonable at the time.
813
00:31:25,760 --> 00:31:27,760
The project is organization wide.
814
00:31:27,760 --> 00:31:29,260
Everyone should be able to find it.
815
00:31:29,260 --> 00:31:30,660
So nobody changes the setting.
816
00:31:30,660 --> 00:31:32,260
Years later, the project is done.
817
00:31:32,260 --> 00:31:33,360
The site gets archived.
818
00:31:33,360 --> 00:31:34,560
But before it gets archived,
819
00:31:34,560 --> 00:31:37,060
someone uploads the latest customer contract there.
820
00:31:37,060 --> 00:31:38,960
Someone else stores the project budget.
821
00:31:38,960 --> 00:31:41,760
Another person archives client communications.
822
00:31:41,760 --> 00:31:44,960
All in a site set to everyone in the organization.
823
00:31:44,960 --> 00:31:46,760
Meanwhile, in one drive,
824
00:31:46,760 --> 00:31:49,560
a user shares a folder with anyone with the link.
825
00:31:49,560 --> 00:31:51,860
They're collaborating with a consultant on a proposal.
826
00:31:51,860 --> 00:31:53,260
The link makes it easy to share.
827
00:31:53,260 --> 00:31:54,360
Both of them can edit.
828
00:31:54,360 --> 00:31:55,560
It works beautifully.
829
00:31:55,560 --> 00:31:57,460
The consultant finishes the engagement.
830
00:31:57,460 --> 00:31:59,760
The link gets forwarded to a friend at another company
831
00:31:59,760 --> 00:32:01,060
who's starting a similar project.
832
00:32:01,060 --> 00:32:02,260
Nobody revokes the link
833
00:32:02,260 --> 00:32:04,060
because nobody remembers it exists.
834
00:32:04,060 --> 00:32:05,660
It's still active, still accessible,
835
00:32:05,660 --> 00:32:09,060
still sharing whatever files the user added to that folder.
836
00:32:09,060 --> 00:32:11,860
In Teams, a channel is set to public by default.
837
00:32:11,860 --> 00:32:14,660
Team owners don't have to explicitly allow access.
838
00:32:14,660 --> 00:32:16,860
Everyone in the organization can join.
839
00:32:16,860 --> 00:32:19,060
Someone posts a sensitive internal analysis.
840
00:32:19,060 --> 00:32:21,460
Someone else shares competitive intelligence.
841
00:32:21,460 --> 00:32:23,660
Someone posts salary information by accident.
842
00:32:23,660 --> 00:32:25,860
The default permissions don't prevent any of this.
843
00:32:25,860 --> 00:32:27,560
Nobody has enforced a policy about
844
00:32:27,560 --> 00:32:30,560
what should be shared in public channels versus private channels.
845
00:32:30,560 --> 00:32:32,260
Permission inheritance in SharePoint
846
00:32:32,260 --> 00:32:34,860
gets broken through routine administrative tasks.
847
00:32:34,860 --> 00:32:36,660
The IT team creates a site template.
848
00:32:36,660 --> 00:32:38,460
They set permissions at the site level.
849
00:32:38,460 --> 00:32:40,960
A team adds a subfolder for sensitive documents.
850
00:32:40,960 --> 00:32:43,560
They think they've restricted access to that subfolder.
851
00:32:43,560 --> 00:32:45,360
But the permission inheritance was never actually broken.
852
00:32:45,360 --> 00:32:47,560
The folder inherits permissions from the parent site
853
00:32:47,560 --> 00:32:48,960
which is set to broad access.
854
00:32:48,960 --> 00:32:51,260
Nobody notices because the system is functioning.
855
00:32:51,260 --> 00:32:52,560
Files are accessible.
856
00:32:52,560 --> 00:32:53,660
People can do their jobs.
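The inheritance trap just described is easy to model. Here's a minimal sketch — deliberately simplified, since real SharePoint permissions involve role assignments and unique-permission flags rather than this toy structure:

```python
# Toy model of SharePoint-style permission inheritance.
site = {"name": "ProjectSite", "access": "everyone-in-org",
        "parent": None, "unique_permissions": True}

folder = {"name": "Sensitive", "parent": site, "access": None,
          # The team *intended* to restrict this folder, but inheritance
          # was never actually broken, so no access level was ever set here.
          "unique_permissions": False}

def effective_access(obj):
    """Walk up the hierarchy to the nearest object with its own permissions."""
    while not obj["unique_permissions"]:
        obj = obj["parent"]
    return obj["access"]

# The "restricted" subfolder silently resolves to the parent's broad grant.
print(effective_access(folder))
```

The dangerous part is that nothing errors: the system resolves an answer, files open, and the gap only becomes visible when something — an audit, or an AI assistant — walks the whole tree.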
857
00:32:53,660 --> 00:32:54,660
These aren't failures.
858
00:32:54,660 --> 00:32:56,860
Each decision made sense locally.
859
00:32:56,860 --> 00:32:58,760
Opening the site to everyone in the organization
860
00:32:58,760 --> 00:33:00,460
was the right call for that project.
861
00:33:00,460 --> 00:33:01,860
Using anyone with the link sharing
862
00:33:01,860 --> 00:33:03,760
was the right choice for that collaboration.
863
00:33:03,760 --> 00:33:05,660
Public teams channels are useful.
864
00:33:05,660 --> 00:33:08,960
Site level permissions are simpler than granular folder permissions.
865
00:33:08,960 --> 00:33:11,160
But over time, something that made sense
866
00:33:11,160 --> 00:33:13,660
individually becomes a problem systemically.
867
00:33:13,660 --> 00:33:15,660
Permissions sprawl is the accumulated result
868
00:33:15,660 --> 00:33:18,460
of years of just share it with everyone decisions
869
00:33:18,460 --> 00:33:19,960
layered on top of each other.
870
00:33:19,960 --> 00:33:22,160
The research is sobering.
871
00:33:22,160 --> 00:33:25,260
15% or more of business-critical files are at risk
872
00:33:25,260 --> 00:33:27,660
due to oversharing or misconfigured access.
873
00:33:27,660 --> 00:33:28,760
That's not a small number.
874
00:33:28,760 --> 00:33:29,860
That's a structural problem.
875
00:33:29,860 --> 00:33:32,060
It means that across most large organizations
876
00:33:32,060 --> 00:33:33,960
a significant portion of sensitive data
877
00:33:33,960 --> 00:33:36,860
is accessible to people who shouldn't have access to it.
878
00:33:36,860 --> 00:33:39,960
Not because of a breach, not because of a security vulnerability,
879
00:33:39,960 --> 00:33:41,260
because of permissions sprawl.
880
00:33:41,260 --> 00:33:43,660
Here's what makes permissions sprawl invisible.
881
00:33:43,660 --> 00:33:45,460
When data is difficult to find,
882
00:33:45,460 --> 00:33:47,060
access doesn't become a problem.
883
00:33:47,060 --> 00:33:49,960
A customer contract shared with everyone in the organization
884
00:33:49,960 --> 00:33:52,660
isn't dangerous if finding that contract requires
885
00:33:52,660 --> 00:33:54,660
knowing the exact SharePoint site,
886
00:33:54,660 --> 00:33:56,160
navigating to the right folder
887
00:33:56,160 --> 00:33:58,260
and searching through dozens of files.
888
00:33:58,260 --> 00:34:01,260
The access exists, but the information is practically hidden.
889
00:34:01,260 --> 00:34:03,460
So the security problem remains theoretical.
890
00:34:03,460 --> 00:34:06,060
But Copilot makes all of it discoverable instantly.
891
00:34:06,060 --> 00:34:08,860
Ask Copilot to summarize all customer contracts.
892
00:34:08,860 --> 00:34:10,660
Copilot searches, Copilot finds them,
893
00:34:10,660 --> 00:34:12,460
Copilot surfaces the information.
894
00:34:12,460 --> 00:34:14,060
The permission sprawl that was invisible
895
00:34:14,060 --> 00:34:15,760
becomes operationally obvious.
896
00:34:15,760 --> 00:34:18,160
Technical people often ignored permission sprawl
897
00:34:18,160 --> 00:34:19,460
for exactly this reason.
898
00:34:19,460 --> 00:34:20,660
The system was functioning.
899
00:34:20,660 --> 00:34:22,260
Users could access what they needed.
900
00:34:22,260 --> 00:34:23,460
Performance was fine.
901
00:34:23,460 --> 00:34:25,760
From a technical standpoint, everything was working.
902
00:34:25,760 --> 00:34:27,260
From a governance standpoint,
903
00:34:27,260 --> 00:34:29,360
the system had catastrophically failed.
904
00:34:29,360 --> 00:34:30,560
That distinction is crucial.
905
00:34:30,560 --> 00:34:32,460
A system that functions but can't be controlled
906
00:34:32,460 --> 00:34:33,460
is not actually functioning.
907
00:34:33,460 --> 00:34:35,460
Permission sprawl isn't a technical problem.
908
00:34:35,460 --> 00:34:36,960
It's an architectural failure.
909
00:34:36,960 --> 00:34:38,860
The failure to design access controls
910
00:34:38,860 --> 00:34:41,660
that remain maintainable as the organization evolves.
911
00:34:41,660 --> 00:34:42,760
Case study 4.
912
00:34:42,760 --> 00:34:44,160
The identity collapse.
913
00:34:44,160 --> 00:34:46,660
The fourth failure pattern is perhaps the most insidious
914
00:34:46,660 --> 00:34:49,160
because it's invisible until it becomes catastrophic.
915
00:34:49,160 --> 00:34:50,660
I call it the identity collapse.
916
00:34:50,660 --> 00:34:52,660
This one starts with good intentions.
917
00:34:52,660 --> 00:34:55,760
An organization builds out their Azure AD instance,
918
00:34:55,760 --> 00:34:57,960
now called Entra ID, with care.
919
00:34:57,960 --> 00:35:00,860
A few applications get integrated, a few hundred user accounts.
920
00:35:00,860 --> 00:35:02,560
Some basic administrative structure.
921
00:35:02,560 --> 00:35:03,560
Everything is clean.
922
00:35:03,560 --> 00:35:04,960
Everything is documented.
923
00:35:04,960 --> 00:35:06,560
Then the organization grows.
924
00:35:06,560 --> 00:35:08,160
Or they acquire another company.
925
00:35:08,160 --> 00:35:10,360
Or they integrate a new application ecosystem.
926
00:35:10,360 --> 00:35:11,860
And Azure AD grows with it.
927
00:35:11,860 --> 00:35:12,660
Five years later,
928
00:35:12,660 --> 00:35:15,460
the tenant has hundreds of applications integrated.
929
00:35:15,460 --> 00:35:18,360
Guest accounts from partnerships, contractors, vendors,
930
00:35:18,360 --> 00:35:19,460
thousands of them.
931
00:35:19,460 --> 00:35:21,960
Many from partnerships that ended years ago.
932
00:35:21,960 --> 00:35:24,360
Global Admin roles are scattered across the organization.
933
00:35:24,360 --> 00:35:25,360
The CIO has one.
934
00:35:25,360 --> 00:35:26,560
The IT director has one.
935
00:35:26,560 --> 00:35:28,060
The infrastructure lead has one.
936
00:35:28,060 --> 00:35:30,660
A couple of contractors who build custom integrations have one.
937
00:35:30,660 --> 00:35:32,960
Someone in finance who manages integrations has one.
938
00:35:32,960 --> 00:35:36,160
Someone in HR who manages employee lifecycle processes has one.
939
00:35:36,160 --> 00:35:37,260
Why so many global admins?
940
00:35:37,260 --> 00:35:41,360
Because Azure AD and Microsoft 365 use a centralized admin model.
941
00:35:41,360 --> 00:35:42,460
It's all or nothing.
942
00:35:42,460 --> 00:35:45,660
You either have global admin rights and can do anything to anyone
943
00:35:45,660 --> 00:35:48,060
or you have limited permissions and can't do your job.
944
00:35:48,060 --> 00:35:50,260
There's no middle ground without significant effort.
945
00:35:50,260 --> 00:35:52,560
So when someone needs the ability to create security groups,
946
00:35:52,560 --> 00:35:54,860
the quickest solution is to give them global admin.
947
00:35:54,860 --> 00:35:57,760
When someone needs to manage app permissions, global admin.
948
00:35:57,760 --> 00:36:01,460
When someone needs to reset a password for a VIP, global admin.
949
00:36:01,460 --> 00:36:04,360
Pretty soon you have 10 or 15 people with global admin access.
950
00:36:04,360 --> 00:36:07,160
And the problem isn't that these people are incompetent or malicious.
951
00:36:07,160 --> 00:36:11,260
The problem is that identity complexity has become invisible technical debt.
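The delegation shortcut described above can be contrasted with a least-privilege lookup. The role names below are real Entra ID built-in roles, but the task-to-role mapping is an illustrative assumption for this sketch, not official guidance:

```python
# Sketch: route routine admin tasks to least-privileged Entra built-in
# roles instead of defaulting to Global Administrator. The mapping is
# an assumption for illustration; validate role scopes before adopting.

LEAST_PRIVILEGE = {
    "create security groups": "Groups Administrator",
    "manage app permissions": "Application Administrator",
    "reset user passwords": "Helpdesk Administrator",
    "manage user lifecycle": "User Administrator",
}

def role_for(task: str) -> str:
    # Anything without a defined delegated role gets flagged for review
    # rather than silently granted Global Administrator.
    return LEAST_PRIVILEGE.get(task, "REVIEW: no delegated role defined")

print(role_for("create security groups"))  # Groups Administrator
print(role_for("run the whole tenant"))    # falls through to review
```

The design choice is the fallback: when no delegated role exists, the request becomes a visible governance decision instead of another invisible Global Admin grant.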
952
00:36:11,260 --> 00:36:12,560
Here's the core issue.
953
00:36:12,560 --> 00:36:15,060
Technical people often grant global admin rights,
954
00:36:15,060 --> 00:36:16,360
not because it's the right choice,
955
00:36:16,360 --> 00:36:21,060
but because the alternative, designing granular role-based access control,
956
00:36:21,060 --> 00:36:22,260
is too complex.
957
00:36:22,260 --> 00:36:25,160
A brilliant architect could sit down and design a perfect model.
958
00:36:25,160 --> 00:36:26,960
They could use Azure AD's built-in roles.
959
00:36:26,960 --> 00:36:29,860
They could create custom roles with granular permissions.
960
00:36:29,860 --> 00:36:31,460
They could design a delegation model
961
00:36:31,460 --> 00:36:34,660
where admins have just enough privilege to do their jobs and nothing more.
962
00:36:34,660 --> 00:36:36,860
That model would be theoretically perfect,
963
00:36:36,860 --> 00:36:40,460
but it would take weeks to design and months to implement and maintain.
964
00:36:40,460 --> 00:36:44,160
And the person who designed it would have to stay around to explain it to everyone else.
965
00:36:44,160 --> 00:36:46,760
It's easier to just give people global admin and move on.
966
00:36:46,760 --> 00:36:50,460
That decision creates a system where anyone with global admin can do anything to anyone.
967
00:36:50,460 --> 00:36:53,460
They can read anyone's email, they can access anyone's files,
968
00:36:53,460 --> 00:36:56,260
they can change password policies, they can disable MFA,
969
00:36:56,260 --> 00:36:59,660
they can create new admin accounts, they can integrate malicious applications,
970
00:36:59,660 --> 00:37:02,160
they can modify conditional access policies.
971
00:37:02,160 --> 00:37:06,560
A single compromised GA account means the entire organization is compromised.
972
00:37:06,560 --> 00:37:07,660
And here's what happens next.
973
00:37:07,660 --> 00:37:09,460
When admin complexity increases,
974
00:37:09,460 --> 00:37:11,760
organizations don't grant less privilege.
975
00:37:11,760 --> 00:37:16,860
They grant more because the only way to solve the problem of too many admins with too much power is to
976
00:37:16,860 --> 00:37:19,860
give more people too much power so you have redundancy.
977
00:37:20,860 --> 00:37:23,060
Research tells us something striking.
978
00:37:23,060 --> 00:37:27,660
90% of organizations grant excessive administrative privileges in Microsoft 365.
979
00:37:27,660 --> 00:37:29,260
90%. That's not an outlier.
980
00:37:29,260 --> 00:37:30,060
That's the norm.
981
00:37:30,060 --> 00:37:34,760
That's the natural outcome of the centralized admin model combined with the complexity of the platform.
982
00:37:34,760 --> 00:37:36,960
The identity collapse isn't a technical failure.
983
00:37:36,960 --> 00:37:40,060
It's an architectural failure to design sustainable delegation.
984
00:37:40,060 --> 00:37:42,860
A brilliant architect can design role-based access control.
985
00:37:42,860 --> 00:37:44,560
That's theoretically perfect.
986
00:37:44,560 --> 00:37:47,060
But if the organization can't sustain that design,
987
00:37:47,060 --> 00:37:51,260
if it requires constant maintenance, if it's too complex for other admins to understand,
988
00:37:51,260 --> 00:37:54,460
if it creates bottlenecks that force people to work around it,
989
00:37:54,460 --> 00:37:55,860
then the design has failed.
990
00:37:55,860 --> 00:37:57,960
The collapsed identity architecture is the symptom.
991
00:37:57,960 --> 00:38:02,260
The root cause is an admin model that forces a choice between perfect controls
992
00:38:02,260 --> 00:38:06,060
that are unmaintainable or simple controls that are dangerously permissive.
993
00:38:06,060 --> 00:38:07,960
Most organizations choose permissive.
994
00:38:07,960 --> 00:38:10,060
And they pay for it with identity sprawl.
995
00:38:10,060 --> 00:38:15,260
This is different from the other failures because it's not about a specific capability implemented poorly.
996
00:38:15,260 --> 00:38:20,260
It's about a fundamental architectural assumption that didn't survive contact with organizational reality.
997
00:38:20,260 --> 00:38:23,060
Microsoft 365 assumes you'll design granular delegation.
998
00:38:23,060 --> 00:38:28,260
It gives you the tools, but it doesn't acknowledge that designing those controls requires expertise and ongoing effort.
999
00:38:28,260 --> 00:38:31,460
So most organizations just hand out global admin and hope nothing goes wrong.
1000
00:38:31,460 --> 00:38:33,660
By the time the organization realizes they have a problem,
1001
00:38:33,660 --> 00:38:37,960
usually because an admin account gets compromised or an audit reveals excessive privilege,
1002
00:38:37,960 --> 00:38:42,060
the identity architecture has collapsed so far that fixing it requires months of work.
1003
00:38:42,060 --> 00:38:44,560
And the worst part, the original architects weren't wrong.
1004
00:38:44,560 --> 00:38:46,160
They were solving the problem they were given.
1005
00:38:46,160 --> 00:38:50,060
They were just solving it in a way that created a bigger problem down the line.
1006
00:38:50,060 --> 00:38:52,560
Why technical people misread governance problems.
1007
00:38:52,560 --> 00:38:56,560
Now let's step back and understand why brilliant technical people create these failures.
1008
00:38:56,560 --> 00:38:58,760
It's not malice, it's not incompetence.
1009
00:38:58,760 --> 00:39:02,760
It's a difference in how technical minds and governance minds frame problems.
1010
00:39:02,760 --> 00:39:06,760
Technical thinking is optimized for solving defined problems with measurable solutions.
1011
00:39:06,760 --> 00:39:11,260
You have a problem. You analyze it, you design a solution, you implement it, you measure whether it works.
1012
00:39:11,260 --> 00:39:16,260
Success is binary: either it works or it doesn't. Either the flow processes invoices correctly or it doesn't.
1013
00:39:16,260 --> 00:39:20,260
Either the conditional access policy blocks unauthorized access or it doesn't.
1014
00:39:20,260 --> 00:39:24,260
Either the role-based access control model grants the right permissions or it doesn't.
1015
00:39:24,260 --> 00:39:26,160
Governance is fundamentally different.
1016
00:39:26,160 --> 00:39:30,360
Governance is about preventing undefined future problems through architectural design.
1017
00:39:30,360 --> 00:39:34,660
The problem isn't defined yet, you're trying to anticipate what might go wrong three years from now.
1018
00:39:34,660 --> 00:39:38,560
You're trying to design structures that will survive when requirements change.
1019
00:39:38,560 --> 00:39:43,460
You're trying to create systems that people you've never met will be able to understand and maintain.
1020
00:39:43,460 --> 00:39:47,260
These are fundamentally different cognitive tasks and they require different thinking.
1021
00:39:47,260 --> 00:39:51,460
Technical excellence creates a particular blindness. When you've designed something perfectly,
1022
00:39:51,460 --> 00:39:55,960
when the configuration is elegant, when the logic is sound, when the system works beautifully on day one,
1023
00:39:55,960 --> 00:39:57,660
you have confidence in that solution.
1024
00:39:57,660 --> 00:40:03,460
That confidence is justified. You've done your job well, but that same confidence can mask governance blindness
1025
00:40:03,460 --> 00:40:06,360
because technically the solution is correct. Performance is optimal.
1026
00:40:06,360 --> 00:40:10,160
The controls work as designed. It's easy to conclude that the problem is solved.
1027
00:40:10,160 --> 00:40:12,960
But here's the truth. A perfectly configured tenant is seductive.
1028
00:40:12,960 --> 00:40:17,260
It creates the impression that you've solved a governance problem when you've only solved a technical problem.
1029
00:40:17,260 --> 00:40:20,060
Governance isn't solved. It's continuously maintained.
1030
00:40:20,060 --> 00:40:22,560
You don't build a governance model and declare it complete.
1031
00:40:22,560 --> 00:40:27,960
You build a governance model and then you spend the next five years adapting it as the organization evolves.
1032
00:40:27,960 --> 00:40:30,160
Requirements change. Technology changes.
1033
00:40:30,160 --> 00:40:33,460
People leave and new people arrive. Business units reorganize.
1034
00:40:33,460 --> 00:40:35,260
New applications get integrated.
1035
00:40:35,260 --> 00:40:40,560
New compliance requirements emerge. A governance architecture that doesn't evolve is a governance architecture that's failing.
1036
00:40:40,560 --> 00:40:46,060
Technical people often treat governance as a checklist: create policies, document them, move on.
1037
00:40:46,060 --> 00:40:50,860
You've done your job. Now governance is someone else's responsibility, but that's not how governance works.
1038
00:40:50,860 --> 00:40:55,360
Governance is a living system. It requires continuous oversight. It requires regular adaptation.
1039
00:40:55,360 --> 00:40:57,860
It requires someone to ask, is this policy still working?
1040
00:40:57,860 --> 00:41:00,460
Not just was this policy created?
1041
00:41:00,460 --> 00:41:03,860
The best technical architects I've worked with are the ones who learn to think in systems.
1042
00:41:03,860 --> 00:41:07,460
They didn't stop thinking technically. They didn't become less competent engineers.
1043
00:41:07,460 --> 00:41:09,260
But they learned to ask a different question.
1044
00:41:09,260 --> 00:41:12,260
A brilliant architect asks, does this work today?
1045
00:41:12,260 --> 00:41:15,760
A governance architect asks, what will this decision look like in three years?
1046
00:41:15,760 --> 00:41:19,060
That's the shift. Will this be maintainable when the person who built it leaves?
1047
00:41:19,060 --> 00:41:22,260
Will this make sense to someone reading the documentation six months from now?
1048
00:41:22,260 --> 00:41:26,660
Will this decision scale to 200 SharePoint sites or 200 Power Automate flows?
1049
00:41:26,660 --> 00:41:28,860
What happens when the business requirement changes?
1050
00:41:28,860 --> 00:41:30,760
What happens when the technology gets updated?
1051
00:41:30,760 --> 00:41:34,960
What happens when the organization grows? Those are governance questions. They're not sexy.
1052
00:41:34,960 --> 00:41:37,160
They don't require deep technical knowledge.
1053
00:41:37,160 --> 00:41:39,360
They require thinking about operational reality.
1054
00:41:39,360 --> 00:41:43,960
They require humble acknowledgement that perfect technical solutions often create imperfect human problems.
1055
00:41:43,960 --> 00:41:48,560
The shift from technical thinking to architectural thinking is the shift from, can we do this?
1056
00:41:48,560 --> 00:41:51,860
To, should we do this? And who manages it in three years?
1057
00:41:51,860 --> 00:41:55,060
Most technical people never make that shift. They're not taught to.
1058
00:41:55,060 --> 00:41:58,660
The incentive systems reward technical excellence, not governance durability.
1059
00:41:58,660 --> 00:42:04,160
A brilliant architecture that works beautifully for three years and collapses in year four doesn't get credit for the three years.
1060
00:42:04,160 --> 00:42:05,960
It gets blamed for the collapse.
1061
00:42:05,960 --> 00:42:13,460
But the architects who learn to think about organizational reality instead of just technical perfection are the ones who create systems that actually survive.
1062
00:42:13,460 --> 00:42:17,060
They design for the organization that exists, not the one they wish existed.
1063
00:42:17,060 --> 00:42:19,960
They ask uncomfortable questions about maintainability.
1064
00:42:19,960 --> 00:42:24,260
They accept that perfect technical solutions sometimes need to be compromised to be sustainable.
1065
00:42:24,260 --> 00:42:33,160
That distinction between technical excellence and governance durability is the line between creating great systems and creating systems that organizations can actually operate.
1066
00:42:33,160 --> 00:42:34,860
The intent-based governance shift.
1067
00:42:34,860 --> 00:42:37,860
This is where we move from diagnosing failures to preventing them.
1068
00:42:37,860 --> 00:42:41,460
Most organizations approach governance backwards. They start with implementation.
1069
00:42:41,460 --> 00:42:44,860
They ask, what settings should we configure? What policies should we write?
1070
00:42:44,860 --> 00:42:46,360
What controls should we deploy?
1071
00:42:46,360 --> 00:42:52,860
And they end up with a document that describes the configuration along with a set of controls that will drift from that documentation within six months
1072
00:42:52,860 --> 00:42:55,660
because the world changed and the documentation didn't.
1073
00:42:55,660 --> 00:42:58,060
Intent-based governance inverts that question.
1074
00:42:58,060 --> 00:43:00,660
Instead of starting with implementation, you start with intent.
1075
00:43:00,660 --> 00:43:03,660
You ask, what behavior do we want the system to enforce?
1076
00:43:03,660 --> 00:43:05,460
Not forever, because forever is a lie.
1077
00:43:05,460 --> 00:43:08,660
But over the time horizon that matters for your organization.
1078
00:43:08,660 --> 00:43:11,860
Over the next three years, over the next business cycle.
1079
00:43:11,860 --> 00:43:14,860
This shift changes everything about how architecture is designed.
1080
00:43:14,860 --> 00:43:18,460
Configuration thinking leads to policy documents that drift from reality.
1081
00:43:18,460 --> 00:43:22,860
You write a policy, you say external sharing is restricted to approved domains.
1082
00:43:22,860 --> 00:43:24,660
That's specific. That's measurable.
1083
00:43:24,660 --> 00:43:26,260
That's a configuration you can implement.
1084
00:43:26,260 --> 00:43:30,860
But six months later, a business unit needs to share with a partner that's not on the approved list.
1085
00:43:30,860 --> 00:43:32,860
They request an exception. You add the exception.
1086
00:43:32,860 --> 00:43:35,460
Six months later, another request, another exception.
1087
00:43:35,460 --> 00:43:41,160
After a year, your approved domains list has grown to include so many domains that it's functionally unrestricted.
1088
00:43:41,160 --> 00:43:42,660
The policy document says one thing.
1089
00:43:42,660 --> 00:43:44,260
The configuration does something else.
1090
00:43:44,260 --> 00:43:46,660
The actual behavior is a third thing entirely.
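The drift just described can even be measured. Below is a sketch with assumed thresholds (the 50-domain and 12-exceptions-per-year limits are arbitrary illustrations) of detecting when an approved-domains allowlist has become functionally unrestricted:

```python
# Sketch (assumed thresholds): detect when an "approved domains"
# allowlist has absorbed so many exceptions that the written policy
# no longer describes real behavior.

def is_functionally_unrestricted(approved_domains: list,
                                 exceptions_per_year: int,
                                 max_domains: int = 50,
                                 max_exceptions: int = 12) -> bool:
    """If the list keeps growing by exception, the control still exists
    on paper but has stopped restricting anything in practice."""
    return (len(approved_domains) > max_domains
            or exceptions_per_year > max_exceptions)

year_one = ["partner-a.com", "partner-b.com"]
year_three = [f"partner-{i}.com" for i in range(80)]  # exception after exception

print(is_functionally_unrestricted(year_one, exceptions_per_year=2))     # False
print(is_functionally_unrestricted(year_three, exceptions_per_year=30))  # True
```

A check like this turns silent policy drift into a reviewable signal: the question shifts from "is the allowlist configured?" to "is the allowlist still doing its job?"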
1091
00:43:46,660 --> 00:43:50,660
Intent-based thinking leads to principles that guide decision making over time.
1092
00:43:50,660 --> 00:43:54,260
Instead of saying external sharing is restricted to approved domains,
1093
00:43:54,260 --> 00:43:59,260
ask, how do we enable legitimate collaboration while protecting sensitive data?
1094
00:43:59,260 --> 00:44:01,560
That's a principle. It's not a specific configuration.
1095
00:44:01,560 --> 00:44:04,460
It's a question that should guide every external sharing decision.
1096
00:44:04,460 --> 00:44:08,560
How do we balance the business need for collaboration with the security need for protection?
1097
00:44:08,560 --> 00:44:10,660
That principle can be implemented in multiple ways.
1098
00:44:10,660 --> 00:44:15,860
You could use approved domains. You could use sensitivity labels to restrict what data can be shared externally.
1099
00:44:15,860 --> 00:44:19,560
You could use conditional access to verify the external user's identity.
1100
00:44:19,560 --> 00:44:24,360
You could use DLP policies to prevent sensitive data from leaving the organization.
1101
00:44:24,360 --> 00:44:26,260
You could use a combination of all of these.
1102
00:44:26,260 --> 00:44:29,360
The principle doesn't prescribe the implementation. It guides it.
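The separation between principle and implementation can be sketched in code. The control mix below mirrors the episode's examples (approved domains, sensitivity labels, DLP), but the evaluation logic and field names are illustrative assumptions:

```python
# Sketch: one intent ("enable collaboration while protecting sensitive
# data") checked against several interchangeable controls. The specific
# control mix is an assumption and can change without changing the intent.

CONTROLS = {
    "approved_domain": lambda ctx: ctx["domain"] in ctx["approved"],
    "sensitivity_label": lambda ctx: ctx["label"] != "Highly Confidential",
    "dlp_scan": lambda ctx: not ctx["contains_pii"],
}

def intent_satisfied(ctx: dict) -> bool:
    """Any single control can block the share; controls can be swapped
    or re-weighted later while the guiding principle stays fixed."""
    return all(check(ctx) for check in CONTROLS.values())

share = {"domain": "partner-a.com", "approved": {"partner-a.com"},
         "label": "Internal", "contains_pii": False}
print(intent_satisfied(share))  # True: legitimate, protected collaboration
```

The design point is that `CONTROLS` is the replaceable part: dropping the domain check in favor of stricter labels changes the configuration, not the governance model.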
1103
00:44:29,360 --> 00:44:33,460
The first statement, external sharing is restricted to approved domains,
1104
00:44:33,460 --> 00:44:37,760
is a configuration that will be circumvented the moment it conflicts with a business need.
1105
00:44:37,760 --> 00:44:41,860
The second statement, we enable collaboration while protecting sensitive data,
1106
00:44:41,860 --> 00:44:44,660
is an intent that can guide multiple configurations.
1107
00:44:44,660 --> 00:44:48,060
And when you need to change the configuration because a business requirement evolved,
1108
00:44:48,060 --> 00:44:50,460
you can adapt it without changing the principle.
1109
00:44:50,460 --> 00:44:55,460
Intent-based governance is more durable than configuration thinking because it survives configuration changes.
1110
00:44:55,460 --> 00:44:59,860
When you know the intent, you can adapt the implementation as the organization evolves.
1111
00:44:59,860 --> 00:45:01,960
A new business partner joins your organization.
1112
00:45:01,960 --> 00:45:05,660
They're not on the approved domains list, but they're a legitimate collaborator.
1113
00:45:05,660 --> 00:45:09,560
The principle says enable collaboration while protecting sensitive data.
1114
00:45:09,560 --> 00:45:12,060
So you add the domain. You haven't violated the principle.
1115
00:45:12,060 --> 00:45:16,460
You've applied it to a new circumstance. Configuration thinking can't do that without creating policy drift.
1116
00:45:16,460 --> 00:45:20,260
Intent-based thinking can do it without abandoning the governance model.
1117
00:45:20,260 --> 00:45:24,460
This is the fundamental shift from technical architecture to governance architecture.
1118
00:45:24,460 --> 00:45:26,960
Technical architecture optimizes for capability.
1119
00:45:26,960 --> 00:45:29,060
How do we build the most sophisticated system?
1120
00:45:29,060 --> 00:45:30,860
How do we automate the most processes?
1121
00:45:30,860 --> 00:45:33,460
How do we create the most granular access controls?
1122
00:45:33,460 --> 00:45:35,860
Governance architecture optimizes for durability.
1123
00:45:35,860 --> 00:45:36,860
How do we build a system
1124
00:47:36,860 --> 00:47:39,060
the organization can still operate in three years?
1125
00:45:39,060 --> 00:45:42,260
How do we design governance that survives when requirements change?
1126
00:45:42,260 --> 00:45:46,860
How do we create principles that guide decision making when we can't predict every future scenario?
1127
00:45:46,860 --> 00:45:49,060
Most organizations never make this shift.
1128
00:45:49,060 --> 00:45:53,060
They hire a brilliant technical architect. That architect designs perfect configurations.
1129
00:45:53,060 --> 00:45:55,360
They document them meticulously. They hand it off.
1130
00:45:55,360 --> 00:45:59,260
And the organization tries to operate that perfect system in the real world
1131
00:45:59,260 --> 00:46:03,460
where business requirements change every quarter and technology evolves every month.
1132
00:46:03,460 --> 00:46:07,260
The system gradually becomes unmaintainable, not because the original design was flawed,
1133
00:46:07,260 --> 00:46:11,460
but because the design was optimized for a fictional organization, not the real one.
1134
00:46:11,460 --> 00:46:15,460
Intent-based governance works because it acknowledges a fundamental truth.
1135
00:46:15,460 --> 00:46:19,860
You can't predict the future. You don't know what business requirements will emerge three years from now.
1136
00:46:19,860 --> 00:46:23,660
You don't know what new technology will need to integrate with Microsoft 365.
1137
00:46:23,660 --> 00:46:26,260
You don't know what compliance requirement will be imposed.
1138
00:46:26,260 --> 00:46:30,460
But you can define principles that will guide good decisions even when circumstances change.
1139
00:46:30,460 --> 00:46:31,460
That's the shift.
1140
00:46:31,460 --> 00:46:37,660
From how do we configure this perfectly today to how do we design principles that will guide this organization sustainably?
1141
00:46:37,660 --> 00:46:38,660
Hmm.
1142
00:46:38,660 --> 00:46:41,060
Designing for durability, not just capability.
1143
00:46:41,060 --> 00:46:44,060
Durability is the metric that technical people often miss.
1144
00:46:44,060 --> 00:46:48,460
And it's the metric that determines whether a system survives or whether it quietly collapses.
1145
00:46:48,460 --> 00:46:51,860
A system is durable if the organization can still operate it in three years.
1146
00:46:51,860 --> 00:46:54,060
Not on day one, not with perfect conditions.
1147
00:46:54,060 --> 00:46:57,860
Three years from now, with different people, with changed requirements, with evolved technology,
1148
00:46:57,860 --> 00:47:03,060
can the organization still maintain this system? Can someone who didn't build it understand how it works?
1149
00:47:03,060 --> 00:47:06,260
Can it adapt when circumstances change? That's durability.
1150
00:47:06,260 --> 00:47:09,060
Durability requires clarity about three things.
1151
00:47:09,060 --> 00:47:09,860
Ownership.
1152
00:47:09,860 --> 00:47:13,660
Someone must be accountable for this system, not just initially but continuously.
1153
00:47:13,660 --> 00:47:17,460
Life cycle management, the system must have mechanisms to age gracefully,
1154
00:47:17,460 --> 00:47:21,060
to archive what's no longer needed, to retire what's obsolete.
1155
00:47:21,060 --> 00:47:22,460
And continuous monitoring.
1156
00:47:22,460 --> 00:47:25,860
Someone must be watching whether the system is still achieving its intent,
1157
00:47:25,860 --> 00:47:29,260
not just whether it's technically functioning. Here's where the tension emerges.
1158
00:47:29,260 --> 00:47:32,460
Technical excellence and durability are often in opposition.
1159
00:47:32,460 --> 00:47:34,860
The most capable system is often the least durable.
1160
00:47:34,860 --> 00:47:37,460
Why? Because it requires constant technical intervention.
1161
00:47:37,460 --> 00:47:40,060
It requires the architect who designed it to stay engaged.
1162
00:47:40,060 --> 00:47:41,860
It requires specialized expertise.
1163
00:47:41,860 --> 00:47:46,060
It requires people to understand subtle dependencies and complex configurations.
1164
00:47:46,060 --> 00:47:50,860
A perfectly optimized system that only the original architect understands is brilliant on day one
1165
00:47:50,860 --> 00:47:53,660
and unmaintainable on day 431.
1166
00:47:53,660 --> 00:47:58,260
The most durable system is often less capable. It trades some functionality for simplicity.
1167
00:47:58,260 --> 00:48:00,860
It accepts good enough instead of perfect. It asks,
1168
00:48:00,860 --> 00:48:03,660
"Do we need this feature if it doubles the complexity?"
1169
00:48:03,660 --> 00:48:06,460
It designs for the lowest common denominator of expertise.
1170
00:48:06,460 --> 00:48:09,460
It documents extensively because it knows the person maintaining it
1171
00:48:09,460 --> 00:48:11,060
won't have the designer's context.
1172
00:48:11,060 --> 00:48:14,460
Governance architecture finds the balance between those two extremes.
1173
00:48:14,460 --> 00:48:17,060
Not maximum capability, not maximum simplicity,
1174
00:48:17,060 --> 00:48:18,860
the right capability for the organization,
1175
00:48:18,860 --> 00:48:21,060
the right complexity the organization can sustain.
1176
00:48:21,060 --> 00:48:23,660
And that balance is different for every organization.
1177
00:48:23,660 --> 00:48:28,660
A small organization that's growing fast might need more capability and can tolerate less durability
1178
00:48:28,660 --> 00:48:31,860
because the person who built the system is still there, they can adapt quickly.
1179
00:48:31,860 --> 00:48:35,460
A large enterprise organization that moves slowly might need less capability
1180
00:48:35,460 --> 00:48:38,860
but much more durability because the person who built the system left years ago
1181
00:48:38,860 --> 00:48:42,660
and the system is operated by people who have 800 other responsibilities.
1182
00:48:42,660 --> 00:48:45,660
Technical people often push toward maximum capability
1183
00:48:45,660 --> 00:48:47,660
without considering the durability cost.
1184
00:48:47,660 --> 00:48:51,460
They ask, "What's possible? What's the most sophisticated thing we can build?
1185
00:48:51,460 --> 00:48:53,060
What's the most elegant solution?"
1186
00:48:53,060 --> 00:48:55,860
Those are good questions, but they're not governance questions.
1187
00:48:55,860 --> 00:48:57,860
Governance architects ask a different question.
1188
00:48:57,860 --> 00:49:00,060
They ask, "What's the minimum capability we need?
1189
00:49:00,060 --> 00:49:01,660
Not what's possible? What's necessary?"
1190
00:49:01,660 --> 00:49:05,260
And then they ask, "What's the maximum complexity we can sustain?"
1191
00:49:05,260 --> 00:49:06,060
Not forever.
1192
00:49:06,060 --> 00:49:08,660
Over the next three to five years with the people we have,
1193
00:49:08,660 --> 00:49:10,860
with the expertise we can realistically maintain,
1194
00:49:10,860 --> 00:49:12,860
how much complexity can we actually manage?
1195
00:49:12,860 --> 00:49:13,660
Those two questions,
1196
00:49:13,660 --> 00:49:16,460
minimum capability and maximum sustainable complexity
1197
00:49:16,460 --> 00:49:18,060
completely reframe the architecture.
1198
00:49:18,060 --> 00:49:20,860
Instead of asking, "How sophisticated can we make this?"
1199
00:49:20,860 --> 00:49:24,660
you ask, "How simple can we make this while still solving the problem?"
1200
00:49:24,660 --> 00:49:29,060
Instead of designing for an ideal state where everything is perfectly configured and optimized,
1201
00:49:29,060 --> 00:49:32,660
you design for the realistic state where things degrade over time
1202
00:49:32,660 --> 00:49:36,860
and someone needs to be able to fix them without calling the original architect.
1203
00:49:36,860 --> 00:49:39,260
This reframing prevents the failures we've discussed.
1204
00:49:39,260 --> 00:49:41,260
The automation hydra doesn't emerge if you ask,
1205
00:49:41,260 --> 00:49:44,460
"What's the maximum number of flows we can sustain and actually monitor?"
1206
00:49:44,460 --> 00:49:47,260
Not unlimited, not as many as we want.
1207
00:49:47,260 --> 00:49:49,660
A specific number that the organization can manage,
1208
00:49:49,660 --> 00:49:51,260
500 flows? Maybe.
1209
00:49:51,260 --> 00:49:52,960
5,000 flows with no ownership?
1210
00:49:52,960 --> 00:49:56,060
No, that exceeds the organization's capacity to maintain it.
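The capacity question above can be made explicit. The numbers in this sketch (50 flows per accountable owner) are assumed for illustration, not a benchmark:

```python
# Sketch (assumed numbers): a "maximum sustainable complexity" check.
# How many Power Automate flows can this team actually own and monitor?

def sustainable_flow_count(owners: int, flows_per_owner: int = 50) -> int:
    """A flow without an accountable owner is invisible technical debt,
    so capacity is owners times what each can realistically review."""
    return owners * flows_per_owner

capacity = sustainable_flow_count(owners=10)
print(capacity)           # 500 flows: maybe, with ten accountable owners
print(5000 <= capacity)   # 5,000 flows: exceeds what this team can sustain
```

The exact ratio matters less than having one at all: a stated capacity turns "as many flows as we want" into a number someone must defend when it's exceeded.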
1211
00:49:56,060 --> 00:49:58,460
The security fortress doesn't get built if you ask,
1212
00:49:58,460 --> 00:50:02,060
"What's the minimum control we need while still protecting sensitive data?"
1213
00:50:02,060 --> 00:50:03,260
Not maximum control.
1214
00:50:03,260 --> 00:50:07,260
Minimum. What's the simplest policy that achieves the security goal while staying operable?
1215
00:50:07,260 --> 00:50:09,060
The copilot stall doesn't happen if you ask,
1216
00:50:09,060 --> 00:50:11,860
"What's our governance capacity before we deploy AI at scale?"
1217
00:50:11,860 --> 00:50:14,660
Not what's technically possible. What can we actually sustain?
1218
00:50:14,660 --> 00:50:16,660
Do we have permission governance under control?
1219
00:50:16,660 --> 00:50:18,660
Do we have sensitivity labels applied?
1220
00:50:18,660 --> 00:50:21,060
Do we have a process for managing external access?
1221
00:50:21,060 --> 00:50:24,660
If not, scale the AI slower while you build the governance foundation.
1222
00:50:24,660 --> 00:50:27,060
The identity collapse doesn't occur if you ask,
1223
00:50:27,060 --> 00:50:29,860
"What delegation model can we actually maintain?"
1224
00:50:29,860 --> 00:50:32,260
Not the theoretically perfect RBAC model,
1225
00:50:32,260 --> 00:50:35,860
a model that's simple enough that other people can understand it and sustain it.
1226
00:50:35,860 --> 00:50:37,460
Durability changes everything.
1227
00:50:37,460 --> 00:50:39,060
It converts the question from,
1228
00:50:39,060 --> 00:50:42,660
"How good can we make this to how good can we make this and still manage it?"
1229
00:50:42,660 --> 00:50:46,660
And that's the question that separates architects who build systems from architects
1230
00:50:46,660 --> 00:50:49,460
who build systems that organizations can actually operate.
1231
00:50:49,460 --> 00:50:51,460
The role of organizational readiness.
1232
00:50:51,460 --> 00:50:53,860
Technical failures often mask organizational failures.
1233
00:50:53,860 --> 00:50:54,660
Let me explain.
1234
00:50:54,660 --> 00:50:58,860
When a Microsoft 365 deployment goes sideways,
1235
00:50:58,860 --> 00:51:01,460
the first instinct is to blame the technology.
1236
00:51:01,460 --> 00:51:03,060
The system is too complex.
1237
00:51:03,060 --> 00:51:04,660
The platform has too many features.
1238
00:51:04,660 --> 00:51:06,060
The configuration is difficult.
1239
00:51:06,060 --> 00:51:07,460
Those are real frustrations.
1240
00:51:07,460 --> 00:51:09,060
But they're often not the actual problem.
1241
00:51:09,060 --> 00:51:13,060
The actual problem is that the organization wasn't ready to operate the capability.
1242
00:51:13,060 --> 00:51:16,260
An organization is ready for a capability when it can sustain it over time.
1243
00:51:16,260 --> 00:51:18,260
Not on day one with perfect conditions
1244
00:51:18,260 --> 00:51:21,060
and the person who designed it standing by to explain everything.
1245
00:51:21,060 --> 00:51:23,460
Three months in, six months in, a year in.
1246
00:51:23,460 --> 00:51:26,660
When the original project team has moved on to other priorities
1247
00:51:26,660 --> 00:51:29,060
and someone new is trying to understand how it works.
1248
00:51:29,060 --> 00:51:30,260
That's when readiness matters.
1249
00:51:30,260 --> 00:51:32,260
Readiness includes five specific things.
1250
00:51:32,260 --> 00:51:33,460
First, clear ownership.
1251
00:51:33,460 --> 00:51:36,260
Someone is explicitly responsible for this system.
1252
00:51:36,260 --> 00:51:37,860
Not IT generally.
1253
00:51:37,860 --> 00:51:39,860
Not a team that has eight other responsibilities.
1254
00:51:39,860 --> 00:51:43,860
A person or a small team that owns this capability
1255
00:51:43,860 --> 00:51:45,860
and knows they'll be held accountable if it fails.
1256
00:51:45,860 --> 00:51:47,860
Second, documented processes.
1257
00:51:47,860 --> 00:51:49,060
Not just the configuration.
1258
00:51:49,060 --> 00:51:51,460
But the process, how do people request access?
1259
00:51:51,460 --> 00:51:53,060
How do you onboard a new user?
1260
00:51:53,060 --> 00:51:54,260
What's the approval workflow?
1261
00:51:54,260 --> 00:51:56,660
Who decides whether something is working or not?
1262
00:51:56,660 --> 00:51:57,860
Third, training.
1263
00:51:57,860 --> 00:52:00,260
People who operate the system understand it.
1264
00:52:00,260 --> 00:52:01,460
They know what it's supposed to do.
1265
00:52:01,460 --> 00:52:03,460
They know how to troubleshoot basic problems.
1266
00:52:03,460 --> 00:52:05,460
They know who to call when something breaks.
1267
00:52:05,460 --> 00:52:06,660
Fourth, monitoring.
1268
00:52:06,660 --> 00:52:08,660
Someone is continuously watching the system.
1269
00:52:08,660 --> 00:52:09,460
Usage metrics.
1270
00:52:09,460 --> 00:52:10,660
Configuration drift.
1271
00:52:10,660 --> 00:52:11,860
Policy exceptions.
1272
00:52:11,860 --> 00:52:12,660
Performance.
1273
00:52:12,660 --> 00:52:15,060
Someone's job includes keeping an eye on this.
1274
00:52:15,060 --> 00:52:16,260
Fifth, governance.
1275
00:52:16,260 --> 00:52:18,660
There's a governance model that guides decisions.
1276
00:52:18,660 --> 00:52:20,260
It's not just policy documents.
1277
00:52:20,260 --> 00:52:22,660
It's a living framework that helps people
1278
00:52:22,660 --> 00:52:25,460
make the right decision when circumstances change.
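The five readiness criteria can be expressed as a simple pre-deployment gate. A sketch with hypothetical field names, capturing the rule that you deploy only when all five hold:

```python
from dataclasses import dataclass

@dataclass
class Readiness:
    # The five readiness criteria from this episode; field names are illustrative.
    clear_owner: bool           # a named person or small team, not "IT generally"
    documented_processes: bool  # access requests, onboarding, approvals
    trained_operators: bool     # people who can troubleshoot and escalate
    monitoring_assigned: bool   # usage, drift, exceptions, performance
    governance_model: bool      # a living framework for decisions

    def ready_to_deploy(self) -> bool:
        """Deploy only when every readiness criterion is satisfied."""
        return all([self.clear_owner, self.documented_processes,
                    self.trained_operators, self.monitoring_assigned,
                    self.governance_model])

# Technology ready, organization not: deployment should wait.
print(Readiness(True, False, False, False, False).ready_to_deploy())  # False
```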
1279
00:52:25,460 --> 00:52:27,460
Most organizations deploy capabilities
1280
00:52:27,460 --> 00:52:29,860
before they're ready for any of those five things.
1281
00:52:29,860 --> 00:52:32,660
Technical people often deploy because the technology is ready.
1282
00:52:32,660 --> 00:52:34,260
The configuration is perfect.
1283
00:52:34,260 --> 00:52:35,460
The feature works beautifully.
1284
00:52:35,460 --> 00:52:37,460
From a technical standpoint, you can deploy.
1285
00:52:37,460 --> 00:52:38,260
So you do.
1286
00:52:38,260 --> 00:52:39,860
But the organization isn't ready.
1287
00:52:39,860 --> 00:52:41,060
Nobody's been assigned to own it.
1288
00:52:41,060 --> 00:52:43,860
There's no documented process for how it's going to be used.
1289
00:52:43,860 --> 00:52:45,860
Training happened in a two-hour demo.
1290
00:52:45,860 --> 00:52:47,860
Nobody's monitoring whether it's actually working.
1291
00:52:47,860 --> 00:52:50,660
And there's no governance model for how decisions will be made
1292
00:52:50,660 --> 00:52:52,260
when the configuration needs to change.
1293
00:52:52,260 --> 00:52:55,460
This creates a gap, a gap between deployment and adoption.
1294
00:52:55,460 --> 00:52:57,060
Between the technical capability existing
1295
00:52:57,060 --> 00:52:59,860
and the organization being able to actually operate it.
1296
00:52:59,860 --> 00:53:01,060
That gap doesn't stay empty.
1297
00:53:01,060 --> 00:53:03,460
It gets filled with shadow IT, workarounds,
1298
00:53:03,460 --> 00:53:04,660
and technical debt.
1299
00:53:04,660 --> 00:53:06,260
Users can't figure out how to use the system.
1300
00:53:06,260 --> 00:53:07,660
So they use something else.
1301
00:53:07,660 --> 00:53:10,660
They can't get access because the approval process is undefined.
1302
00:53:10,660 --> 00:53:12,260
So they find a workaround.
1303
00:53:12,260 --> 00:53:14,660
They can't maintain the system because there's no owner.
1304
00:53:14,660 --> 00:53:16,060
So they let it degrade.
1305
00:53:16,060 --> 00:53:17,860
They can't make decisions about how to adapt it
1306
00:53:17,860 --> 00:53:19,460
because there's no governance model.
1307
00:53:19,460 --> 00:53:20,660
So they just let it drift.
1308
00:53:20,660 --> 00:53:22,660
The gap between readiness and deployment
1309
00:53:22,660 --> 00:53:24,660
becomes invisible technical debt.
1310
00:53:24,660 --> 00:53:25,860
Here's the research on this.
1311
00:53:25,860 --> 00:53:29,060
82% of IT leaders describe managing Microsoft 365
1312
00:53:29,060 --> 00:53:30,660
as a severe operational burden.
1313
00:53:30,660 --> 00:53:32,260
That statistic isn't surprising
1314
00:53:32,260 --> 00:53:35,060
if you understand organizational readiness.
1315
00:53:35,060 --> 00:53:36,660
They're trying to operate a capability
1316
00:53:36,660 --> 00:53:38,260
the organization was never ready for.
1317
00:53:38,260 --> 00:53:40,460
The burden isn't because the technology is complex.
1318
00:53:40,460 --> 00:53:42,460
The burden is because the organization is trying
1319
00:53:42,460 --> 00:53:44,660
to maintain something without clear ownership,
1320
00:53:44,660 --> 00:53:46,660
without documented processes,
1321
00:53:46,660 --> 00:53:49,460
without dedicated monitoring, without a governance model.
1322
00:53:49,460 --> 00:53:51,060
This is why technical excellence
1323
00:53:51,060 --> 00:53:54,860
without governance creates the worst Microsoft 365 tenants.
1324
00:53:54,860 --> 00:53:57,860
A brilliant architect can deploy the most sophisticated capability.
1325
00:53:57,860 --> 00:53:59,660
The configuration can be perfect.
1326
00:53:59,660 --> 00:54:01,460
The system can be elegantly designed.
1327
00:54:01,460 --> 00:54:03,660
But if the organization isn't ready to own it,
1328
00:54:03,660 --> 00:54:05,260
to maintain it, to govern it,
1329
00:54:05,260 --> 00:54:07,260
then that beautiful technical system
1330
00:54:07,260 --> 00:54:08,660
becomes an operational burden.
1331
00:54:08,660 --> 00:54:11,260
Governance architecture includes organizational readiness
1332
00:54:11,260 --> 00:54:13,660
as a prerequisite for capability deployment.
1333
00:54:13,660 --> 00:54:15,860
You don't deploy until the organization is ready.
1334
00:54:15,860 --> 00:54:18,660
You don't deploy until you have someone assigned to own it.
1335
00:54:18,660 --> 00:54:21,260
You don't deploy until you have documented processes.
1336
00:54:21,260 --> 00:54:23,060
You don't deploy until you have a governance model.
1337
00:54:23,060 --> 00:54:25,260
You deploy when the organization has the capacity
1338
00:54:25,260 --> 00:54:26,460
to sustain the capability.
1339
00:54:26,460 --> 00:54:28,660
That's a completely different deployment model
1340
00:54:28,660 --> 00:54:30,460
than most organizations use.
1341
00:54:30,460 --> 00:54:33,260
But it's the only model that prevents the failures we've discussed.
1342
00:54:33,260 --> 00:54:35,460
It's the only model where technical excellence
1343
00:54:35,460 --> 00:54:37,660
actually translates into organizational value
1344
00:54:37,660 --> 00:54:39,460
instead of organizational burden.
1345
00:54:39,460 --> 00:54:42,060
Identifying governance debt in your tenant.
1346
00:54:42,060 --> 00:54:44,060
If you want to know whether your Microsoft tenant
1347
00:54:44,060 --> 00:54:45,660
will survive the next five years,
1348
00:54:45,660 --> 00:54:47,060
start with this assessment.
1349
00:54:47,060 --> 00:54:49,260
Governance debt is different from technical debt.
1350
00:54:49,260 --> 00:54:50,860
Technical debt is something you see.
1351
00:54:50,860 --> 00:54:53,060
A broken feature, a performance issue,
1352
00:54:53,060 --> 00:54:54,260
a security vulnerability.
1353
00:54:54,260 --> 00:54:55,260
You can point to it.
1354
00:54:55,260 --> 00:54:56,260
You can measure it.
1355
00:54:56,260 --> 00:54:59,460
Governance debt is invisible until it manifests as a crisis.
1356
00:54:59,460 --> 00:55:01,460
Governance debt accumulates silently.
1357
00:55:01,460 --> 00:55:03,660
A team is created without a documented owner.
1358
00:55:03,660 --> 00:55:05,060
That's fine. It's one team.
1359
00:55:05,060 --> 00:55:07,260
But six months later, the owner leaves the organization
1360
00:55:07,260 --> 00:55:08,260
and nobody notices.
1361
00:55:08,260 --> 00:55:09,460
The team is now orphaned.
1362
00:55:09,460 --> 00:55:11,260
No one knows who should be maintaining it.
1363
00:55:11,260 --> 00:55:12,660
No one knows what data lives there.
1364
00:55:12,660 --> 00:55:14,860
No one knows if external users still have access.
1365
00:55:14,860 --> 00:55:15,860
That's governance debt.
1366
00:55:15,860 --> 00:55:16,660
It's not visible.
1367
00:55:16,660 --> 00:55:17,860
The team still exists.
1368
00:55:17,860 --> 00:55:19,460
It's still taking up storage.
1369
00:55:19,460 --> 00:55:20,860
But it's no longer being governed.
1370
00:55:20,860 --> 00:55:22,660
Multiply that across hundreds of teams.
1371
00:55:22,660 --> 00:55:24,260
Thousands of SharePoint sites.
1372
00:55:24,260 --> 00:55:25,060
Millions of files.
1373
00:55:25,060 --> 00:55:27,260
Governance debt becomes invisible technical debt
1374
00:55:27,260 --> 00:55:29,660
that's slowly consuming organizational resources
1375
00:55:29,660 --> 00:55:30,660
and creating risk.
1376
00:55:30,660 --> 00:55:33,460
Here are the key categories of governance debt to look for.
1377
00:55:33,460 --> 00:55:35,060
Identity debt is the most dangerous.
1378
00:55:35,060 --> 00:55:36,460
How many global admins do you have?
1379
00:55:36,460 --> 00:55:38,460
Microsoft recommends fewer than five.
1380
00:55:38,460 --> 00:55:40,460
If you have 20, you have identity debt.
1381
00:55:40,460 --> 00:55:43,060
How many guest accounts exist in your organization?
1382
00:55:43,060 --> 00:55:44,460
Are you actively managing them?
1383
00:55:44,460 --> 00:55:46,260
Or are they accumulating from partnerships
1384
00:55:46,260 --> 00:55:47,260
that ended years ago?
1385
00:55:47,260 --> 00:55:48,260
That's identity debt.
1386
00:55:48,260 --> 00:55:49,860
How many app registrations do you have?
1387
00:55:49,860 --> 00:55:51,260
Do you know what permissions they have?
1388
00:55:51,260 --> 00:55:52,860
Do you know if they're still being used?
1389
00:55:52,860 --> 00:55:54,460
That's identity debt.
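Those identity questions can be checked mechanically. A sketch over a hypothetical directory snapshot (in practice the data would come from an Entra ID export; the staleness threshold and record fields are assumptions):

```python
from datetime import date

# Hypothetical directory snapshot; real data would come from an Entra ID export.
accounts = [
    {"name": "admin1", "role": "GlobalAdmin", "type": "member"},
    {"name": "admin2", "role": "GlobalAdmin", "type": "member"},
    {"name": "partner_old", "type": "guest", "last_sign_in": date(2021, 3, 1)},
    {"name": "partner_new", "type": "guest", "last_sign_in": date(2025, 5, 1)},
]

def identity_debt(accounts, today=date(2025, 6, 1),
                  stale_days=365, admin_limit=5):
    """Flag too many global admins and guests idle past the threshold."""
    admins = [a for a in accounts if a.get("role") == "GlobalAdmin"]
    stale_guests = [a for a in accounts
                    if a["type"] == "guest"
                    and (today - a["last_sign_in"]).days > stale_days]
    return {
        "global_admins": len(admins),
        "exceeds_admin_limit": len(admins) > admin_limit,  # recommended: fewer than 5
        "stale_guests": [a["name"] for a in stale_guests],
    }

print(identity_debt(accounts))
```

The guest who last signed in years ago surfaces immediately, which is exactly the "partnership that ended years ago" case described above.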
1390
00:55:54,460 --> 00:55:56,260
Collaboration debt is pervasive.
1391
00:55:56,260 --> 00:55:58,260
Do all your teams have documented owners?
1392
00:55:58,260 --> 00:56:01,260
If you have 500 teams and only 80% have documented owners,
1393
00:56:01,260 --> 00:56:03,260
that's 100 teams with unclear ownership.
1394
00:56:03,260 --> 00:56:04,460
That's governance debt.
1395
00:56:04,460 --> 00:56:06,660
Do you have inactive teams that should be archived?
1396
00:56:06,660 --> 00:56:08,460
A team that hasn't had a message in a year
1397
00:56:08,460 --> 00:56:10,060
still consumes storage.
1398
00:56:10,060 --> 00:56:11,460
It still shows up in search results.
1399
00:56:11,460 --> 00:56:14,060
It's still a place where external access might be lingering.
1400
00:56:14,060 --> 00:56:15,060
That's debt.
1401
00:56:15,060 --> 00:56:17,060
Do you have SharePoint sites that nobody can explain?
1402
00:56:17,060 --> 00:56:19,660
Sites that were created for a project that ended years ago
1403
00:56:19,660 --> 00:56:20,860
but never got archived?
1404
00:56:20,860 --> 00:56:21,660
That's debt.
1405
00:56:21,660 --> 00:56:23,660
Do you have clear policies about external sharing?
1406
00:56:23,660 --> 00:56:26,060
Or have external sharing restrictions drifted
1407
00:56:26,060 --> 00:56:27,860
so far from the original policy
1408
00:56:27,860 --> 00:56:30,060
that you don't even recognize the configuration anymore?
1409
00:56:30,060 --> 00:56:31,060
That's debt.
1410
00:56:31,060 --> 00:56:32,860
Automation debt accumulates fast.
1411
00:56:32,860 --> 00:56:35,260
How many Power Automate flows exist in your organization?
1412
00:56:35,260 --> 00:56:36,260
Can you list them?
1413
00:56:36,260 --> 00:56:37,260
Do you know who owns each one?
1414
00:56:37,260 --> 00:56:38,660
Do you know what data they access?
1415
00:56:38,660 --> 00:56:40,660
Do you know the dependencies between flows?
1416
00:56:40,660 --> 00:56:43,060
If you have 500 flows and nobody can explain them,
1417
00:56:43,060 --> 00:56:44,660
you have massive automation debt.
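The flow-ownership questions reduce to an inventory audit. A sketch over a hypothetical flow list; in practice the inventory would come from a Power Platform admin export, and the field names here are illustrative:

```python
# Hypothetical flow inventory; field names are illustrative assumptions.
flows = [
    {"name": "InvoiceApproval", "owner": "finance-team", "last_run_ok": True},
    {"name": "LegacySync",      "owner": None,           "last_run_ok": False},
    {"name": "HROnboarding",    "owner": "hr-ops",       "last_run_ok": True},
]

def automation_debt(flows):
    """Surface unowned flows and failures nobody is watching."""
    unowned = [f["name"] for f in flows if not f["owner"]]
    failing = [f["name"] for f in flows if not f["last_run_ok"]]
    return {
        "total": len(flows),
        "unowned": unowned,            # every flow should have a documented owner
        "failing_unmonitored": failing,
        "ownership_pct": 100 * (len(flows) - len(unowned)) // len(flows),
    }

print(automation_debt(flows))
```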
1418
00:56:44,660 --> 00:56:46,660
Data debt is often the most consequential.
1419
00:56:46,660 --> 00:56:48,860
What percentage of your files have sensitivity labels?
1420
00:56:48,860 --> 00:56:51,460
If most of your sensitive data is unlabeled,
1421
00:56:51,460 --> 00:56:52,660
you have data debt.
1422
00:56:52,660 --> 00:56:54,460
Do you have retention policies applied?
1423
00:56:54,460 --> 00:56:56,260
Or are files living indefinitely
1424
00:56:56,260 --> 00:56:58,660
because nobody defined how long they should be kept?
1425
00:56:58,660 --> 00:56:59,460
That's debt.
1426
00:56:59,460 --> 00:57:01,460
Do you have uncontrolled external sharing?
1427
00:57:01,460 --> 00:57:04,260
Links that say "anyone with this link can access it"?
1428
00:57:04,260 --> 00:57:07,060
Shared folders that have been forwarded to third parties?
1429
00:57:07,060 --> 00:57:08,060
That's data debt.
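Label coverage and anonymous-link exposure are both measurable. A sketch with hypothetical file metadata; real numbers would come from Purview and SharePoint sharing reports:

```python
# Hypothetical file metadata; "anyone_link" marks the
# "anyone with this link" sharing setting. All records are illustrative.
files = [
    {"path": "contracts/q1.docx", "label": "Confidential", "anyone_link": False},
    {"path": "notes/todo.txt",    "label": None,           "anyone_link": True},
    {"path": "hr/salaries.xlsx",  "label": None,           "anyone_link": False},
    {"path": "public/faq.md",     "label": "General",      "anyone_link": True},
]

def data_debt(files):
    """Measure sensitivity-label coverage and anonymous-link exposure."""
    labeled = sum(1 for f in files if f["label"])
    return {
        "label_coverage_pct": 100 * labeled // len(files),  # should trend upward
        "unlabeled": [f["path"] for f in files if not f["label"]],
        "anyone_links": [f["path"] for f in files if f["anyone_link"]],
    }

print(data_debt(files))
```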
1430
00:57:08,060 --> 00:57:10,060
Here's what the research tells us.
1431
00:57:10,060 --> 00:57:12,260
45% of large organizations experience
1432
00:57:12,260 --> 00:57:14,060
a security or compliance incident
1433
00:57:14,060 --> 00:57:16,660
caused by misconfiguration in the past 12 months.
1434
00:57:16,660 --> 00:57:18,060
That's not a technical incident.
1435
00:57:18,060 --> 00:57:20,260
That's governance debt manifesting as a crisis.
1436
00:57:20,260 --> 00:57:22,860
Someone deployed Copilot and it exposed oversharing.
1437
00:57:22,860 --> 00:57:25,660
Someone was compromised and had excessive permissions.
1438
00:57:25,660 --> 00:57:27,060
Someone tried to do an access review
1439
00:57:27,060 --> 00:57:29,860
and discovered they had no idea what permissions actually existed.
1440
00:57:29,860 --> 00:57:32,860
That's governance debt that became operationally visible.
1441
00:57:32,860 --> 00:57:34,660
The earlier you identify governance debt,
1442
00:57:34,660 --> 00:57:36,060
the easier it is to address.
1443
00:57:36,060 --> 00:57:38,660
Governance debt discovered in a pilot program is fixable.
1444
00:57:38,660 --> 00:57:41,060
Governance debt discovered during a breach investigation
1445
00:57:41,060 --> 00:57:42,260
is catastrophic.
1446
00:57:42,260 --> 00:57:44,460
Most organizations don't measure governance debt
1447
00:57:44,460 --> 00:57:45,660
until it becomes a crisis.
1448
00:57:45,660 --> 00:57:48,060
They're operating in darkness, hoping nothing breaks,
1449
00:57:48,060 --> 00:57:49,060
until something does.
1450
00:57:49,060 --> 00:57:51,460
This is where the non-technical perspective becomes essential.
1451
00:57:51,460 --> 00:57:54,660
You don't need to be a technical expert to identify governance debt.
1452
00:57:54,660 --> 00:57:56,260
You need to ask simple questions.
1453
00:57:56,260 --> 00:57:57,060
Who owns this?
1454
00:57:57,060 --> 00:57:58,060
Is there a process?
1455
00:57:58,060 --> 00:57:59,060
Is someone monitoring it?
1456
00:57:59,060 --> 00:58:00,660
Can someone explain how it works?
1457
00:58:00,660 --> 00:58:03,460
If you can't answer those questions, you have governance debt.
1458
00:58:03,460 --> 00:58:04,860
That's your baseline assessment.
1459
00:58:04,860 --> 00:58:06,260
Honest answers to those questions
1460
00:58:06,260 --> 00:58:08,060
tell you whether your tenant is managed
1461
00:58:08,060 --> 00:58:09,860
or whether it's quietly accumulating debt
1462
00:58:09,860 --> 00:58:12,460
that will eventually become a crisis.
1463
00:58:12,460 --> 00:58:14,260
The tenant durability checklist.
1464
00:58:14,260 --> 00:58:16,260
Here's a practical tool you can use immediately
1465
00:58:16,260 --> 00:58:18,060
to assess your governance architecture.
1466
00:58:18,060 --> 00:58:19,660
This isn't a compliance checklist.
1467
00:58:19,660 --> 00:58:21,860
This isn't something you're doing to pass an audit.
1468
00:58:21,860 --> 00:58:23,460
This is a durability assessment.
1469
00:58:23,460 --> 00:58:26,060
It tells you whether your tenant will survive the next five years.
1470
00:58:26,060 --> 00:58:27,060
Start with identity.
1471
00:58:27,060 --> 00:58:28,460
How many global admins do you have?
1472
00:58:28,460 --> 00:58:30,260
Microsoft recommends fewer than five.
1473
00:58:30,260 --> 00:58:32,260
If you can't answer that question with certainty,
1474
00:58:32,260 --> 00:58:33,260
you have a problem.
1475
00:58:33,260 --> 00:58:34,260
If the answer is more than ten,
1476
00:58:34,260 --> 00:58:35,860
you have a serious problem.
1477
00:58:35,860 --> 00:58:38,060
Do you have a privileged access management strategy?
1478
00:58:38,060 --> 00:58:39,460
If the answer is "What's that?"
1479
00:58:39,460 --> 00:58:40,460
then you don't.
1480
00:58:40,460 --> 00:58:41,460
And you need one.
1481
00:58:41,460 --> 00:58:44,860
How many unmanaged guest accounts exist in your environment?
1482
00:58:44,860 --> 00:58:46,660
The number should be declining, not growing.
1483
00:58:46,660 --> 00:58:48,660
If you have guest accounts from partnerships
1484
00:58:48,660 --> 00:58:49,860
that ended three years ago,
1485
00:58:49,860 --> 00:58:51,260
still sitting in your directory,
1486
00:58:51,260 --> 00:58:52,860
that's identity debt you need to address.
1487
00:58:52,860 --> 00:58:54,060
Move to collaboration.
1488
00:58:54,060 --> 00:58:55,860
Do all your teams have documented owners?
1489
00:58:55,860 --> 00:58:57,660
The answer should be 100%.
1490
00:58:57,660 --> 00:58:59,860
Not 95%, not most of them.
1491
00:58:59,860 --> 00:59:00,860
All of them.
1492
00:59:00,860 --> 00:59:03,060
If you have teams without clear ownership,
1493
00:59:03,060 --> 00:59:04,460
you have orphaned spaces.
1494
00:59:04,460 --> 00:59:07,260
Do you have life cycle policies for inactive teams?
1495
00:59:07,260 --> 00:59:07,860
You should.
1496
00:59:07,860 --> 00:59:09,460
Teams that haven't had activity in 90 days
1497
00:59:09,460 --> 00:59:10,660
should trigger a review.
1498
00:59:10,660 --> 00:59:12,660
They should either be renewed or archived.
1499
00:59:12,660 --> 00:59:15,060
What percentage of your SharePoint sites are orphaned?
1500
00:59:15,060 --> 00:59:16,060
The answer should be zero.
1501
00:59:16,060 --> 00:59:18,460
If you have orphaned sites, you're carrying dead weight.
1502
00:59:18,460 --> 00:59:19,460
You're storing data
1503
00:59:19,460 --> 00:59:20,660
you're not actively managing.
1504
00:59:20,660 --> 00:59:23,460
You're potentially exposing information you've forgotten about.
1505
00:59:23,460 --> 00:59:24,460
Look at automation.
1506
00:59:24,460 --> 00:59:27,660
Does every power automate flow have documented ownership?
1507
00:59:27,660 --> 00:59:29,260
Again, the answer should be 100%.
1508
00:59:29,260 --> 00:59:31,860
If you have flows floating around with no clear owner,
1509
00:59:31,860 --> 00:59:33,060
you have automation debt.
1510
00:59:33,060 --> 00:59:35,260
Do you monitor flow failures and performance?
1511
00:59:35,260 --> 00:59:36,860
You should know when a flow fails.
1512
00:59:36,860 --> 00:59:38,860
You should know if performance is degrading.
1513
00:59:38,860 --> 00:59:41,860
If you're running hundreds of flows and you have no visibility
1514
00:59:41,860 --> 00:59:43,260
into whether they're actually working,
1515
00:59:43,260 --> 00:59:45,460
you've lost control of your automation platform.
1516
00:59:45,460 --> 00:59:46,860
Assess your data.
1517
00:59:46,860 --> 00:59:49,260
What percentage of files have sensitivity labels?
1518
00:59:49,260 --> 00:59:50,860
The number should be increasing over time.
1519
00:59:50,860 --> 00:59:53,860
If it's declining or if it's stuck at a low percentage,
1520
00:59:53,860 --> 00:59:55,860
you haven't built a data governance practice.
1521
00:59:55,860 --> 00:59:57,460
Do you have external sharing policies?
1522
00:59:57,460 --> 00:59:59,260
Not "Do you allow external sharing?"
1523
00:59:59,260 --> 01:00:01,260
Do you have clear documented policies
1524
01:00:01,260 --> 01:00:03,060
about how external sharing works?
1525
01:00:03,060 --> 01:00:04,060
When is it allowed?
1526
01:00:04,060 --> 01:00:05,660
What can be shared with whom?
1527
01:00:05,660 --> 01:00:08,460
If you can't articulate your external sharing policy clearly,
1528
01:00:08,460 --> 01:00:10,260
your external sharing is out of control.
1529
01:00:10,260 --> 01:00:11,260
Look at AI readiness.
1530
01:00:11,260 --> 01:00:13,460
Have you assessed permission sprawl in your environment?
1531
01:00:13,460 --> 01:00:14,660
You don't need to have fixed it.
1532
01:00:14,660 --> 01:00:16,660
But you should at least know what you're dealing with.
1533
01:00:16,660 --> 01:00:18,460
Do you have a data classification strategy?
1534
01:00:18,460 --> 01:00:20,060
Before you deploy AI at scale,
1535
01:00:20,060 --> 01:00:23,260
you need to know what data is sensitive and what data is not.
1536
01:00:23,260 --> 01:00:25,060
If you don't have a classification strategy,
1537
01:00:25,060 --> 01:00:26,460
you're not ready for Copilot.
1538
01:00:26,460 --> 01:00:27,460
You're not ready for agents.
1539
01:00:27,460 --> 01:00:29,460
You're not ready for AI-driven analytics.
1540
01:00:29,460 --> 01:00:30,860
You need that foundation first.
1541
01:00:30,860 --> 01:00:32,460
This checklist isn't about compliance.
1542
01:00:32,460 --> 01:00:34,460
You're not doing this to satisfy an auditor
1543
01:00:34,460 --> 01:00:36,260
or pass a security assessment.
1544
01:00:36,260 --> 01:00:38,660
You're doing this because every question on this list
1545
01:00:38,660 --> 01:00:41,060
determines whether your organization can sustain
1546
01:00:41,060 --> 01:00:43,060
its Microsoft 365 environment.
1547
01:00:43,060 --> 01:00:46,060
If you can answer every question clearly with the "should be" answer,
1548
01:00:46,060 --> 01:00:47,860
your governance architecture is solid.
1549
01:00:47,860 --> 01:00:48,860
You have durability.
1550
01:00:48,860 --> 01:00:50,260
You have clarity about ownership.
1551
01:00:50,260 --> 01:00:51,660
You have lifecycle management.
1552
01:00:51,660 --> 01:00:52,460
You have monitoring.
1553
01:00:52,460 --> 01:00:54,460
You have policies that are actually being followed.
1554
01:00:54,460 --> 01:00:57,260
You have a foundation that can sustain the platform as it evolves.
1555
01:00:57,260 --> 01:00:59,860
If you can't answer most of these questions clearly
1556
01:00:59,860 --> 01:01:02,460
or if your answers don't match the "should be" standard,
1557
01:01:02,460 --> 01:01:04,260
then your governance architecture is weak.
1558
01:01:04,260 --> 01:01:06,260
That doesn't mean your technical setup is broken.
1559
01:01:06,260 --> 01:01:07,860
It means you're carrying governance debt.
1560
01:01:07,860 --> 01:01:10,060
You're operating in a system where clarity is missing.
1561
01:01:10,060 --> 01:01:11,660
Where ownership is ambiguous.
1562
01:01:11,660 --> 01:01:14,460
Where monitoring is happening by accident instead of by design.
1563
01:01:14,460 --> 01:01:16,460
Where policies are drifting from reality.
1564
01:01:16,460 --> 01:01:19,460
That's the difference between a managed tenant and an unmanaged one.
1565
01:01:19,460 --> 01:01:20,860
Not whether the technology works,
1566
01:01:20,860 --> 01:01:23,060
whether the organization can actually operate it.
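The whole durability checklist can be collapsed into one assessment record. A sketch, with the "should be" targets from this section encoded as simple checks; the tenant snapshot and thresholds are hypothetical:

```python
# Durability checklist from this section; targets are the "should be" answers.
# The tenant snapshot below is hypothetical.

def durability_assessment(t):
    """Return whether the tenant meets each durability target."""
    checks = {
        "few_global_admins": t["global_admins"] < 5,
        "guests_declining":  t["guest_trend"] == "declining",
        "all_teams_owned":   t["team_ownership_pct"] == 100,
        "no_orphaned_sites": t["orphaned_sites"] == 0,
        "all_flows_owned":   t["flow_ownership_pct"] == 100,
        "labels_increasing": t["label_trend"] == "increasing",
    }
    failed = [name for name, ok in checks.items() if not ok]
    return {"durable": not failed, "governance_debt": failed}

tenant = {  # hypothetical tenant snapshot
    "global_admins": 12, "guest_trend": "growing",
    "team_ownership_pct": 80, "orphaned_sites": 37,
    "flow_ownership_pct": 100, "label_trend": "increasing",
}
print(durability_assessment(tenant))
```

The output names the debt categories directly, which turns "operating in darkness" into a short, concrete work list.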
1567
01:01:23,060 --> 01:01:25,460
Moving from configuration to intent.
1568
01:01:25,460 --> 01:01:27,460
Let me walk you through how to make the shift
1569
01:01:27,460 --> 01:01:30,260
from configuration thinking to intent-based governance.
1570
01:01:30,260 --> 01:01:32,260
This is where the abstraction becomes practical.
1571
01:01:32,260 --> 01:01:34,060
Most organizations never do this shift.
1572
01:01:34,060 --> 01:01:36,460
They follow their current approach and call it governance.
1573
01:01:36,460 --> 01:01:38,460
But there's a five-step framework
1574
01:01:38,460 --> 01:01:40,460
that changes everything about how you design
1575
01:01:40,460 --> 01:01:41,860
sustainable architecture.
1576
01:01:41,860 --> 01:01:43,860
Step one is deceptively simple.
1577
01:01:43,860 --> 01:01:46,860
Identify the intent behind each major governance area.
1578
01:01:46,860 --> 01:01:48,660
Not the configuration, the intent.
1579
01:01:48,660 --> 01:01:50,060
What are we actually trying to achieve?
1580
01:01:50,060 --> 01:01:52,260
Instead of starting with "enforce MFA,"
1581
01:01:52,260 --> 01:01:55,260
ask, "What behavior do we want to enforce around authentication?"
1582
01:01:55,260 --> 01:01:57,660
That's different. MFA is an implementation detail.
1583
01:01:57,660 --> 01:01:58,660
It's a control.
1584
01:01:58,660 --> 01:02:01,060
But the behavior you're trying to enforce might be
1585
01:02:01,060 --> 01:02:04,260
users are authenticated based on risk, not just identity.
1586
01:02:04,260 --> 01:02:07,660
Or critical accounts require stronger authentication
1587
01:02:07,660 --> 01:02:08,860
than standard accounts.
1588
01:02:08,860 --> 01:02:11,260
Or systems should adapt authentication requirements
1589
01:02:11,260 --> 01:02:12,260
based on context.
1590
01:02:12,260 --> 01:02:14,660
Those are intents. MFA supports those intents.
1591
01:02:14,660 --> 01:02:16,060
But MFA isn't the intent itself.
1592
01:02:16,060 --> 01:02:18,660
This distinction matters because intents are durable.
1593
01:02:18,660 --> 01:02:20,060
Controls are temporary.
1594
01:02:20,060 --> 01:02:22,460
Step two is designing principles that express intent
1595
01:02:22,460 --> 01:02:24,460
without prescribing implementation.
1596
01:02:24,460 --> 01:02:26,660
A principle is a statement that guides decisions
1597
01:02:26,660 --> 01:02:29,660
without dictating exactly how you implement it.
1598
01:02:29,660 --> 01:02:32,260
For authentication, a principle might be
1599
01:02:32,260 --> 01:02:34,460
we authenticate users based on risk,
1600
01:02:34,460 --> 01:02:35,860
not just identity.
1601
01:02:35,860 --> 01:02:38,260
That's a principle. It's not a specific configuration.
1602
01:02:38,260 --> 01:02:39,860
It acknowledges that we're not treating
1603
01:02:39,860 --> 01:02:42,060
all authentication requests the same.
1604
01:02:42,060 --> 01:02:43,860
A user logging in from their home computer
1605
01:02:43,860 --> 01:02:47,260
on the corporate network at 9 a.m. gets one level of scrutiny.
1606
01:02:47,260 --> 01:02:50,260
A user logging in from a VPN from a new device at 3 a.m. gets
1607
01:02:50,260 --> 01:02:52,660
different scrutiny. Same principle.
1608
01:02:52,660 --> 01:02:54,460
Different implementation.
1609
01:02:54,460 --> 01:02:55,660
Another principle.
1610
01:02:55,660 --> 01:02:58,860
We balance security friction with operational usability.
1611
01:02:58,860 --> 01:03:00,260
That one acknowledges the tension.
1612
01:03:00,260 --> 01:03:02,860
You're not maximizing security at the expense of work.
1613
01:03:02,860 --> 01:03:03,860
You're finding the balance.
1614
01:03:03,860 --> 01:03:06,860
What that balance is depends on the user, the application,
1615
01:03:06,860 --> 01:03:09,060
and the risk. The principle guides the decision.
1616
01:03:09,060 --> 01:03:10,260
It doesn't prescribe it.
1617
01:03:10,260 --> 01:03:12,260
Step three is defining configurations
1618
01:03:12,260 --> 01:03:13,460
that support the principle.
1619
01:03:13,460 --> 01:03:16,060
This is where you move from principle to implementation.
1620
01:03:16,060 --> 01:03:18,260
What specific controls, policies, and settings
1621
01:03:18,260 --> 01:03:20,260
will help you achieve this principle?
1622
01:03:20,260 --> 01:03:22,460
For we authenticate users based on risk,
1623
01:03:22,460 --> 01:03:25,460
your configurations might include conditional access policies
1624
01:03:25,460 --> 01:03:27,460
that increase authentication requirements
1625
01:03:27,460 --> 01:03:29,260
for high-risk scenarios.
1626
01:03:29,260 --> 01:03:32,460
MFA requirements for users accessing sensitive data,
1627
01:03:32,460 --> 01:03:34,260
device compliance checks that verify the device
1628
01:03:34,260 --> 01:03:35,860
meets security baselines.
1629
01:03:35,860 --> 01:03:38,260
Location-based access rules that adjust requirements
1630
01:03:38,260 --> 01:03:39,860
for unfamiliar locations.
1631
01:03:39,860 --> 01:03:42,660
Usage analytics that identify anomalous behavior
1632
01:03:42,660 --> 01:03:44,460
and trigger additional authentication.
1633
01:03:44,460 --> 01:03:47,860
You're not saying we require MFA for everyone always.
1634
01:03:47,860 --> 01:03:51,660
You're saying we require MFA contextually based on risk.
1635
01:03:51,660 --> 01:03:53,060
The configuration is flexible.
1636
01:03:53,060 --> 01:03:54,260
It adapts.
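The "authenticate based on risk, not just identity" principle can be sketched as a rule that maps sign-in context to a required assurance level. Every signal, weight, and threshold here is an illustrative assumption, not an actual Entra conditional access policy:

```python
# Illustrative risk-based authentication decision; the principle as code,
# not a real conditional access policy. All weights are assumptions.

def required_auth(context):
    """Map sign-in context to the authentication level it requires."""
    risk = 0
    if context["new_device"]:
        risk += 2                       # unfamiliar device raises risk
    if context["location"] not in context["familiar_locations"]:
        risk += 2                       # unfamiliar location raises risk
    if context["sensitive_data"]:
        risk += 1                       # sensitive target raises the bar
    if risk == 0:
        return "password"               # low risk: minimal friction
    if risk <= 2:
        return "mfa"                    # moderate risk: step-up authentication
    return "mfa+compliant_device"       # high risk: strongest requirement

office = {"new_device": False, "location": "HQ",
          "familiar_locations": {"HQ"}, "sensitive_data": False}
hotel = {"new_device": True, "location": "hotel-wifi",
         "familiar_locations": {"HQ"}, "sensitive_data": True}
print(required_auth(office))  # low scrutiny for the familiar 9 a.m. sign-in
print(required_auth(hotel))   # high scrutiny for the unfamiliar sign-in
```

Same principle, different implementation per context, and the weights can be tuned later without touching the principle itself.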
1637
01:03:54,260 --> 01:03:56,460
Step four is building feedback loops that measure
1638
01:03:56,460 --> 01:03:58,260
whether the principle is being achieved.
1639
01:03:58,260 --> 01:04:01,260
You're not measuring whether the configuration is in place.
1640
01:04:01,260 --> 01:04:04,060
You're measuring whether the actual behavior is what you intended.
1641
01:04:04,060 --> 01:04:06,260
Are users authenticated appropriately?
1642
01:04:06,260 --> 01:04:09,060
Are legitimate users able to work without excessive friction?
1643
01:04:09,060 --> 01:04:10,060
Are attacks being blocked?
1644
01:04:10,060 --> 01:04:12,660
Is the system still usable or have you overengineered it?
1645
01:04:12,660 --> 01:04:14,060
Those are the questions you're answering.
1646
01:04:14,060 --> 01:04:17,660
The answers tell you whether your principle is being achieved in practice.
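A feedback loop like this can be reduced to a couple of numbers. The event fields below are an invented schema for the sketch, not a real audit-log format; the two ratios stand in for "legitimate users without excessive friction" and "attacks being blocked."

```python
def feedback_metrics(events):
    """Summarize whether 'authenticate based on risk' is holding up.

    `events` is a hypothetical sign-in log: each entry has
    legitimate (bool), blocked (bool), extra_prompts (int).
    """
    legit = [e for e in events if e["legitimate"]]
    attacks = [e for e in events if not e["legitimate"]]
    return {
        # Friction: how often legitimate users faced step-up prompts.
        "friction_rate": sum(e["extra_prompts"] > 0 for e in legit) / max(len(legit), 1),
        # Protection: how often attacks were actually blocked.
        "attack_block_rate": sum(e["blocked"] for e in attacks) / max(len(attacks), 1),
    }
```

A rising friction rate with a flat block rate is the signal that the configuration, not the principle, needs adjusting.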
1647
01:04:17,660 --> 01:04:20,460
Step five is evolving the configuration based on feedback
1648
01:04:20,460 --> 01:04:21,860
without changing the principle.
1649
01:04:21,860 --> 01:04:23,260
This is where durability lives.
1650
01:04:23,260 --> 01:04:24,460
A year into deployment,
1651
01:04:24,460 --> 01:04:26,660
you discover that your conditional access policies
1652
01:04:26,660 --> 01:04:28,460
are blocking legitimate business travel.
1653
01:04:28,460 --> 01:04:30,860
Remote workers signing in from hotel networks
1654
01:04:30,860 --> 01:04:32,260
are getting too much scrutiny.
1655
01:04:32,260 --> 01:04:33,860
They're having trouble accessing files
1656
01:04:33,860 --> 01:04:35,060
that they need. The configuration isn't working,
1657
01:04:35,060 --> 01:04:38,460
but the principle authenticate based on risk is still valid.
1658
01:04:38,460 --> 01:04:40,260
You don't abandon the principle.
1659
01:04:40,260 --> 01:04:41,460
You adjust the configuration.
1660
01:04:41,460 --> 01:04:43,060
You refine the risk calculations.
1661
01:04:43,060 --> 01:04:46,060
You add exceptions for legitimate remote work scenarios.
1662
01:04:46,060 --> 01:04:49,260
You evolve the implementation while keeping the principle intact.
1663
01:04:49,260 --> 01:04:51,660
Technical people often skip steps one and two.
1664
01:04:51,660 --> 01:04:53,060
They jump straight to step three.
1665
01:04:53,060 --> 01:04:55,260
They ask, "What configurations should we deploy?"
1666
01:04:55,260 --> 01:04:56,660
Without understanding intent,
1667
01:04:56,660 --> 01:04:57,860
without designing principles,
1668
01:04:57,860 --> 01:05:00,460
and that's why they create technically perfect configurations
1669
01:05:00,460 --> 01:05:03,260
that don't align with what the organization actually needs.
1670
01:05:03,260 --> 01:05:06,660
Intent-based governance is harder to design initially.
1671
01:05:06,660 --> 01:05:09,260
It requires thinking about principles before controls,
1672
01:05:09,260 --> 01:05:10,660
but it's easier to maintain
1673
01:05:10,660 --> 01:05:12,460
because when you know the intent,
1674
01:05:12,460 --> 01:05:15,660
you can adapt the implementation as reality changes.
1675
01:05:15,660 --> 01:05:16,460
That's the shift.
1676
01:05:16,460 --> 01:05:19,460
That's how you move from configuration thinking to governance thinking.
1677
01:05:19,460 --> 01:05:21,460
The role of continuous governance.
1678
01:05:21,460 --> 01:05:23,860
This is perhaps the most important point I'm going to make
1679
01:05:23,860 --> 01:05:25,460
in this entire conversation.
1680
01:05:25,460 --> 01:05:27,060
Governance is not a project.
1681
01:05:27,060 --> 01:05:28,260
It's a continuous practice.
1682
01:05:28,260 --> 01:05:29,860
Most organizations approach governance
1683
01:05:29,860 --> 01:05:31,660
like they approach an office renovation.
1684
01:05:31,660 --> 01:05:34,260
They hire consultants. The consultants interview people.
1685
01:05:34,260 --> 01:05:35,260
They design a system.
1686
01:05:35,260 --> 01:05:36,860
They document it meticulously.
1687
01:05:36,860 --> 01:05:37,860
They create policies.
1688
01:05:37,860 --> 01:05:38,860
They create workflows.
1689
01:05:38,860 --> 01:05:40,260
They create role definitions.
1690
01:05:40,260 --> 01:05:41,060
They hand it off.
1691
01:05:41,060 --> 01:05:43,060
And then leadership declares governance complete.
1692
01:05:43,060 --> 01:05:44,060
We have policies.
1693
01:05:44,060 --> 01:05:45,260
We have documentation.
1694
01:05:45,260 --> 01:05:46,660
Governance is solved.
1695
01:05:46,660 --> 01:05:48,660
Now we can move on to the next initiative.
1696
01:05:48,660 --> 01:05:50,260
But that's not how governance works.
1697
01:05:50,260 --> 01:05:51,660
Governance is never solved.
1698
01:05:51,660 --> 01:05:53,060
It's continuously maintained.
1699
01:05:53,060 --> 01:05:55,260
You don't build a governance model and declare victory.
1700
01:05:55,260 --> 01:05:56,660
You build a governance model
1701
01:05:56,660 --> 01:05:58,460
and then you spend the next five years
1702
01:05:58,460 --> 01:06:00,460
adapting it as the organization evolves
1703
01:06:00,460 --> 01:06:02,260
and as circumstances change.
1704
01:06:02,260 --> 01:06:03,660
Here's what the data shows.
1705
01:06:03,660 --> 01:06:06,460
Most Copilot rollouts stall between weeks six and 12.
1706
01:06:06,460 --> 01:06:07,060
Why?
1707
01:06:07,060 --> 01:06:09,060
Because governance is treated as an event
1708
01:06:09,060 --> 01:06:10,060
rather than a process.
1709
01:06:10,060 --> 01:06:12,860
Organizations go through their governance checklist.
1710
01:06:12,860 --> 01:06:13,860
They create policies.
1711
01:06:13,860 --> 01:06:15,060
They configure DLP.
1712
01:06:15,060 --> 01:06:16,860
They set up sensitivity labels.
1713
01:06:16,860 --> 01:06:18,060
They declare governance ready.
1714
01:06:18,060 --> 01:06:19,060
Then they roll out Copilot.
1715
01:06:19,060 --> 01:06:20,060
Week one is exciting.
1716
01:06:20,060 --> 01:06:21,460
Week two people are using it.
1717
01:06:21,460 --> 01:06:22,660
Week six something breaks.
1718
01:06:22,660 --> 01:06:24,060
Maybe it's a compliance issue.
1719
01:06:24,060 --> 01:06:25,460
Maybe it's an oversharing problem.
1720
01:06:25,460 --> 01:06:27,060
Maybe it's a performance issue.
1721
01:06:27,060 --> 01:06:28,860
By week 12, the rollout has stalled
1722
01:06:28,860 --> 01:06:31,660
while the organization tries to fix the underlying problem.
1723
01:06:31,660 --> 01:06:34,460
That stall happens because governance was treated as a gate.
1724
01:06:34,460 --> 01:06:36,660
Something you pass through before deployment.
1725
01:06:36,660 --> 01:06:38,660
Not as something you maintain continuously
1726
01:06:38,660 --> 01:06:39,660
throughout the deployment.
1727
01:06:39,660 --> 01:06:42,060
Continuous governance is fundamentally different
1728
01:06:42,060 --> 01:06:43,660
from configuration management.
1729
01:06:43,660 --> 01:06:46,060
Configuration management asks the technical question.
1730
01:06:46,060 --> 01:06:47,460
Are the settings correct?
1731
01:06:47,460 --> 01:06:49,860
Is the DLP policy configured correctly?
1732
01:06:49,860 --> 01:06:52,660
Is the conditional access policy configured correctly?
1733
01:06:52,660 --> 01:06:54,460
Is the sensitivity label applied correctly?
1734
01:06:54,460 --> 01:06:55,860
Those are yes or no questions.
1735
01:06:55,860 --> 01:06:58,660
The settings either match the desired state or they don't.
1736
01:06:58,660 --> 01:07:00,460
Continuous governance asks a different question.
1737
01:07:00,460 --> 01:07:01,460
It's not about settings.
1738
01:07:01,460 --> 01:07:02,460
It's about outcomes.
1739
01:07:02,460 --> 01:07:04,860
Is the system still achieving its intent?
1740
01:07:04,860 --> 01:07:06,260
Not is the policy configured?
1741
01:07:06,260 --> 01:07:07,660
Is the policy actually working?
1742
01:07:07,660 --> 01:07:10,460
Are users able to work while still being protected?
1743
01:07:10,460 --> 01:07:13,260
Are we catching oversharing before it becomes a problem?
1744
01:07:13,260 --> 01:07:15,660
Are we identifying risks before they materialize?
1745
01:07:15,660 --> 01:07:19,060
Is the organization adapting faster than the configuration is drifting?
1746
01:07:19,060 --> 01:07:19,660
That's the difference.
1747
01:07:19,660 --> 01:07:21,460
Configuration management is reactive.
1748
01:07:21,460 --> 01:07:22,460
You deploy a policy.
1749
01:07:22,460 --> 01:07:24,260
You monitor whether the configuration drifts.
1750
01:07:24,260 --> 01:07:25,260
You fix it if it does.
1751
01:07:25,260 --> 01:07:27,060
Continuous governance is proactive.
1752
01:07:27,060 --> 01:07:30,460
You monitor whether the system is still achieving its intent.
1753
01:07:30,460 --> 01:07:32,860
You adapt the configuration before it becomes a problem.
1754
01:07:32,860 --> 01:07:36,260
You evolve the governance model as the organization evolves.
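The two questions can be put side by side in code. This is a sketch under assumed names: the first function is the reactive yes/no settings comparison, the second is the proactive outcomes check, with illustrative metric names and thresholds.

```python
def configuration_in_place(settings: dict, desired: dict) -> bool:
    """Configuration management: do the settings match the desired state?"""
    return all(settings.get(k) == v for k, v in desired.items())

def drifting_intents(observed: dict, targets: dict) -> list[str]:
    """Continuous governance: which intents are no longer being achieved?

    `targets` maps an intent's metric name to its acceptable (lo, hi)
    range; anything outside the range needs attention before it
    becomes a problem. Names and thresholds are illustrative.
    """
    return [
        name for name, (lo, hi) in targets.items()
        if not lo <= observed.get(name, lo) <= hi
    ]
```

Note that the first check can pass while the second fails: the policy can be exactly as configured and still not be achieving the intent, which is the gap continuous governance exists to close.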
1755
01:07:36,260 --> 01:07:40,260
Organizations that treat governance as continuous see dramatically better outcomes.
1756
01:07:40,260 --> 01:07:41,460
They catch drift earlier.
1757
01:07:41,460 --> 01:07:42,660
They adapt faster.
1758
01:07:42,660 --> 01:07:44,260
They maintain durability longer.
1759
01:07:44,260 --> 01:07:44,860
Why?
1760
01:07:44,860 --> 01:07:46,060
Because they have feedback loops.
1761
01:07:46,060 --> 01:07:47,660
Someone is continuously asking,
1762
01:07:47,660 --> 01:07:48,660
is this still working?
1763
01:07:48,660 --> 01:07:50,660
Is this still aligned with our intent?
1764
01:07:50,660 --> 01:07:51,660
Do we need to adapt?
1765
01:07:51,660 --> 01:07:53,460
And they're equipped to answer those questions
1766
01:07:53,460 --> 01:07:55,660
because governance isn't something that happened once.
1767
01:07:55,660 --> 01:07:57,460
It's something that's actively maintained.
1768
01:07:57,460 --> 01:07:59,460
The technical people who understand this shift
1769
01:07:59,460 --> 01:08:02,060
are the ones who move from being engineers to being architects.
1770
01:08:02,060 --> 01:08:03,460
They still have deep technical expertise.
1771
01:08:03,460 --> 01:08:05,660
But they've added a layer of systems thinking.
1772
01:08:05,660 --> 01:08:08,460
They're thinking about not just whether something works technically
1773
01:08:08,460 --> 01:08:11,260
but whether an organization can sustain it operationally.
1774
01:08:11,260 --> 01:08:12,860
They're thinking about feedback loops.
1775
01:08:12,860 --> 01:08:15,460
They're thinking about how to detect when something is drifting.
1776
01:08:15,460 --> 01:08:18,860
They're thinking about how to adapt without abandoning the underlying principles.
1777
01:08:18,860 --> 01:08:21,260
Here's what continuous governance looks like in practice.
1778
01:08:21,260 --> 01:08:23,060
You deploy a governance policy.
1779
01:08:23,060 --> 01:08:24,660
Three months later, you review it.
1780
01:08:24,660 --> 01:08:26,860
Did it work? Are there unintended consequences?
1781
01:08:26,860 --> 01:08:29,860
Is it creating friction that's driving users toward workarounds?
1782
01:08:29,860 --> 01:08:30,860
You make adjustments.
1783
01:08:30,860 --> 01:08:32,260
Three months later, you review again.
1784
01:08:32,260 --> 01:08:33,660
The organization has changed.
1785
01:08:33,660 --> 01:08:35,260
You've added new applications.
1786
01:08:35,260 --> 01:08:36,660
You've added new user roles.
1787
01:08:36,660 --> 01:08:38,060
You've added new data categories.
1788
01:08:38,060 --> 01:08:39,860
Does the governance model still fit?
1789
01:08:39,860 --> 01:08:41,460
Do the policies still make sense?
1790
01:08:41,460 --> 01:08:42,260
You adapt them.
1791
01:08:42,260 --> 01:08:44,460
A year in, you conduct a comprehensive review.
1792
01:08:44,460 --> 01:08:46,060
Is the principle still valid?
1793
01:08:46,060 --> 01:08:48,260
Is the configuration still achieving the intent?
1794
01:08:48,260 --> 01:08:50,460
Or have circumstances changed so much
1795
01:08:50,460 --> 01:08:52,660
that you need to rethink the entire framework?
1796
01:08:52,660 --> 01:08:54,260
That's continuous governance.
1797
01:08:54,260 --> 01:08:56,060
It's not a project with a completion date.
1798
01:08:56,060 --> 01:08:58,060
It's an ongoing architectural practice.
1799
01:08:58,060 --> 01:09:00,460
And it's the only practice that produces durable systems.
1800
01:09:00,460 --> 01:09:02,260
The organizations that understand this,
1801
01:09:02,260 --> 01:09:04,660
that governance is continuous, not a project.
1802
01:09:04,660 --> 01:09:08,460
are the ones that build Microsoft 365 tenants that actually survive.
1803
01:09:08,460 --> 01:09:10,060
Not because the technology is better,
1804
01:09:10,060 --> 01:09:12,460
not because they're more technically sophisticated,
1805
01:09:12,460 --> 01:09:15,860
but because they understand that architecture is a living discipline
1806
01:09:15,860 --> 01:09:18,260
that requires constant attention and adaptation.
1807
01:09:18,260 --> 01:09:20,260
Building a governance-first culture.
1808
01:09:20,260 --> 01:09:22,860
Technical excellence alone will not solve these problems.
1809
01:09:22,860 --> 01:09:24,660
You can have the smartest architects.
1810
01:09:24,660 --> 01:09:26,660
You can have the most sophisticated policies.
1811
01:09:26,660 --> 01:09:28,060
You can have perfect documentation.
1812
01:09:28,060 --> 01:09:30,260
But if your organization doesn't value governance,
1813
01:09:30,260 --> 01:09:32,260
none of it will survive contact with real work.
1814
01:09:32,260 --> 01:09:35,260
Building a governance-first culture is the hardest part.
1815
01:09:35,260 --> 01:09:37,460
It's harder than designing the architecture.
1816
01:09:37,460 --> 01:09:39,260
It's harder than implementing the controls
1817
01:09:39,260 --> 01:09:41,660
because it requires changing how your organization thinks
1818
01:09:41,660 --> 01:09:42,660
about technical work.
1819
01:09:42,660 --> 01:09:44,460
It requires changing what you reward.
1820
01:09:44,460 --> 01:09:47,260
It requires changing what leadership cares about.
1821
01:09:47,260 --> 01:09:51,060
A governance-first culture means governance is a leadership priority.
1822
01:09:51,060 --> 01:09:52,660
Not an IT checkbox.
1823
01:09:52,660 --> 01:09:54,860
It means that when your CIO talks about strategy,
1824
01:09:54,860 --> 01:09:56,660
governance is part of that conversation.
1825
01:09:56,660 --> 01:09:58,060
Not something to bolt on later.
1826
01:09:58,060 --> 01:09:59,660
Not something to handle in isolation.
1827
01:09:59,660 --> 01:10:01,060
A part of the core strategy.
1828
01:10:01,060 --> 01:10:03,860
It means architects are valued for designing durable systems,
1829
01:10:03,860 --> 01:10:05,260
not just capable ones.
1830
01:10:05,260 --> 01:10:07,860
Right now, the technical culture rewards capability.
1831
01:10:07,860 --> 01:10:08,860
How much can we automate?
1832
01:10:08,860 --> 01:10:10,460
How many applications can we integrate?
1833
01:10:10,460 --> 01:10:12,460
How sophisticated can we make the system?
1834
01:10:12,460 --> 01:10:13,460
That's where the respect goes.
1835
01:10:13,460 --> 01:10:16,460
A governance-first culture values durability equally.
1836
01:10:16,460 --> 01:10:18,260
How will someone maintain this in three years?
1837
01:10:18,260 --> 01:10:19,860
What happens when the architect leaves?
1838
01:10:19,860 --> 01:10:21,460
Can the organization still operate this?
1839
01:10:21,460 --> 01:10:22,860
Those questions get respect.
1840
01:10:22,860 --> 01:10:24,060
Those answers get rewarded.
1841
01:10:24,060 --> 01:10:25,860
It means technical people are rewarded
1842
01:10:25,860 --> 01:10:27,460
for thinking about sustainability,
1843
01:10:27,460 --> 01:10:28,460
not just innovation.
1844
01:10:28,460 --> 01:10:29,460
Innovation is flashy.
1845
01:10:29,460 --> 01:10:30,660
You deploy a new feature.
1846
01:10:30,660 --> 01:10:31,860
You automate a process.
1847
01:10:31,860 --> 01:10:33,460
You integrate a new application.
1848
01:10:33,460 --> 01:10:34,260
That's visible.
1849
01:10:34,260 --> 01:10:35,060
That's impressive.
1850
01:10:35,060 --> 01:10:36,460
Sustainability is invisible.
1851
01:10:36,460 --> 01:10:38,260
You design a system that's maintainable.
1852
01:10:38,260 --> 01:10:40,060
You create governance that scales.
1853
01:10:40,060 --> 01:10:42,260
You build processes that don't degrade over time.
1854
01:10:42,260 --> 01:10:43,860
Nobody sees it. Nobody applauds it.
1855
01:10:43,860 --> 01:10:46,060
But a governance-first culture recognizes it.
1856
01:10:46,060 --> 01:10:47,060
Values it.
1857
01:10:47,060 --> 01:10:47,860
Rewards it.
1858
01:10:47,860 --> 01:10:50,460
It means organizations measure governance health
1859
01:10:50,460 --> 01:10:52,060
alongside technical health.
1860
01:10:52,060 --> 01:10:55,060
Right now most organizations measure technical metrics.
1861
01:10:55,060 --> 01:10:56,860
Uptime, performance, feature adoption,
1862
01:10:56,860 --> 01:10:58,060
those are valuable metrics.
1863
01:10:58,060 --> 01:10:59,660
But governance health is unmeasured.
1864
01:10:59,660 --> 01:11:01,060
How many orphaned teams exist?
1865
01:11:01,060 --> 01:11:03,060
How many flows don't have documented owners?
1866
01:11:03,060 --> 01:11:04,460
How many global admins do you have?
1867
01:11:04,460 --> 01:11:06,260
How much governance debt are you accumulating?
1868
01:11:06,260 --> 01:11:09,660
Those metrics should be measured, reported, and tracked over time.
1869
01:11:09,660 --> 01:11:12,060
If you're not measuring it, you're not managing it.
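Those governance-health questions can be rolled up into numbers you track over time. The snapshot shape below is an assumption for the sketch; in practice you would assemble something like it from tenant reports rather than by hand.

```python
def governance_health(snapshot: dict) -> dict:
    """Turn the governance-health questions into tracked numbers.

    `snapshot` is a hypothetical inventory:
      teams: each {"name": str, "owners": list of names}
      flows: each {"name": str, "owner": name or None}
      global_admins: list of names
    """
    teams = snapshot["teams"]
    flows = snapshot["flows"]
    return {
        # Teams with no owner left to answer for them.
        "orphaned_teams": sum(1 for t in teams if not t["owners"]),
        # Flows running with no documented owner.
        "unowned_flows": sum(1 for f in flows if f["owner"] is None),
        # Privileged-access surface.
        "global_admins": len(snapshot["global_admins"]),
    }
```

Reporting these alongside uptime and adoption is what makes governance debt visible instead of something you discover during an incident.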
1870
01:11:12,060 --> 01:11:13,460
Here's what the research tells us.
1871
01:11:13,460 --> 01:11:15,660
Organizations with mature governance frameworks
1872
01:11:15,660 --> 01:11:17,860
experience 23% fewer incidents.
1873
01:11:17,860 --> 01:11:18,860
That's not coincidental.
1874
01:11:18,860 --> 01:11:21,660
That's the outcome of building a culture where governance matters.
1875
01:11:21,660 --> 01:11:23,260
Where durability is valued,
1876
01:11:23,260 --> 01:11:26,060
where architects are empowered to slow down capability deployment
1877
01:11:26,060 --> 01:11:27,860
when governance readiness is lacking.
1878
01:11:27,860 --> 01:11:30,860
Building that culture requires specific things from leadership.
1879
01:11:30,860 --> 01:11:33,860
The CIO needs to communicate that governance is strategic,
1880
01:11:33,860 --> 01:11:34,860
not tactical.
1881
01:11:34,860 --> 01:11:37,060
Not we need to comply with this regulation.
1882
01:11:37,060 --> 01:11:38,060
Strategic.
1883
01:11:38,060 --> 01:11:39,260
Governance is how we scale.
1884
01:11:39,260 --> 01:11:40,460
It's how we reduce risk.
1885
01:11:40,460 --> 01:11:42,860
It's how we maintain control as the organization grows.
1886
01:11:42,860 --> 01:11:44,260
It's core to our strategy.
1887
01:11:44,260 --> 01:11:48,060
Technical teams need to understand that durability is as important as capability.
1888
01:11:48,060 --> 01:11:49,060
Not more important.
1889
01:11:49,060 --> 01:11:52,260
As important. You're not sacrificing innovation for governance.
1890
01:11:52,260 --> 01:11:53,860
You're making innovation sustainable.
1891
01:11:53,860 --> 01:11:55,860
You're asking, what's the capability we need?
1892
01:11:55,860 --> 01:11:58,860
And how do we build it so the organization can actually maintain it?
1893
01:11:58,860 --> 01:12:02,060
Architects need to be empowered to slow down capability deployment
1894
01:12:02,060 --> 01:12:03,860
when governance readiness is lacking.
1895
01:12:03,860 --> 01:12:05,860
Right now, the pressure flows one direction.
1896
01:12:05,860 --> 01:12:08,460
Deploy faster, get to market faster, innovate faster.
1897
01:12:08,460 --> 01:12:11,860
But a governance-first culture empowers architects to say, "Not yet."
1898
01:12:11,860 --> 01:12:13,260
The organization isn't ready.
1899
01:12:13,260 --> 01:12:15,260
We need to build the governance foundation first.
1900
01:12:15,260 --> 01:12:17,060
And that should be respected, not overruled.
1901
01:12:17,060 --> 01:12:18,860
This requires changing incentive structures.
1902
01:12:18,860 --> 01:12:22,260
If architects are measured on speed of deployment, they'll optimize for speed.
1903
01:12:22,260 --> 01:12:25,860
If they're measured on durability of deployment, they'll optimize for durability.
1904
01:12:25,860 --> 01:12:29,660
If technical excellence is rewarded and governance durability is ignored,
1905
01:12:29,660 --> 01:12:31,060
technical excellence will win.
1906
01:12:31,060 --> 01:12:33,660
If both are valued equally, both will happen.
1907
01:12:33,660 --> 01:12:35,660
It requires changing conversations.
1908
01:12:35,660 --> 01:12:39,460
When someone proposes a new capability, the conversation can't just be,
1909
01:12:39,460 --> 01:12:40,860
is it technically possible?
1910
01:12:40,860 --> 01:12:43,660
It has to include, is the organization ready to operate it?
1911
01:12:43,660 --> 01:12:45,260
Do we have governance readiness?
1912
01:12:45,260 --> 01:12:46,460
Do we have clear ownership?
1913
01:12:46,460 --> 01:12:47,860
Do we have monitoring in place?
1914
01:12:47,860 --> 01:12:49,860
If not, what do we need to do first?
1915
01:12:49,860 --> 01:12:51,660
That conversation is uncomfortable.
1916
01:12:51,660 --> 01:12:52,860
It slows things down.
1917
01:12:52,860 --> 01:12:54,460
It requires saying no sometimes.
1918
01:12:54,460 --> 01:12:56,660
But it's the conversation that builds sustainable systems.
1919
01:12:56,660 --> 01:12:59,260
This is the hardest part because it requires changing culture.
1920
01:12:59,260 --> 01:13:00,460
And culture changes slowly.
1921
01:13:00,460 --> 01:13:02,060
It requires leadership commitment.
1922
01:13:02,060 --> 01:13:03,860
It requires consistent messaging.
1923
01:13:03,860 --> 01:13:07,660
It requires new people to see the pattern and understand that durability matters.
1924
01:13:07,660 --> 01:13:09,860
It requires architects to feel safe
1925
01:13:09,860 --> 01:13:13,060
prioritizing governance even when other pressures push towards speed.
1926
01:13:13,060 --> 01:13:16,460
But organizations that make that shift, that build a governance-first culture,
1927
01:13:16,460 --> 01:13:19,460
are the ones that actually scale Microsoft 365 sustainably.
1928
01:13:19,460 --> 01:13:20,460
They're not the fastest,
1929
01:13:20,460 --> 01:13:23,660
but they're the ones still operating successfully five years later.
1930
01:13:23,660 --> 01:13:25,260
The architect's responsibility.
1931
01:13:25,260 --> 01:13:28,860
As architects, we have a responsibility that goes beyond technical excellence.
1932
01:13:28,860 --> 01:13:31,260
And this is the part that doesn't get talked about enough.
1933
01:13:31,260 --> 01:13:33,860
When you design a Microsoft 365 architecture,
1934
01:13:33,860 --> 01:13:35,860
you're not designing an abstract system.
1935
01:13:35,860 --> 01:13:39,660
You're designing something that thousands of people will live inside for years.
1936
01:13:39,660 --> 01:13:42,460
You're making decisions that will be felt by them every day.
1937
01:13:42,460 --> 01:13:44,260
Whether they can find what they need,
1938
01:13:44,260 --> 01:13:46,260
whether they can collaborate effectively,
1939
01:13:46,260 --> 01:13:47,460
whether they can work safely,
1940
01:13:47,460 --> 01:13:48,660
whether they can move fast,
1941
01:13:48,660 --> 01:13:50,460
or whether they're constantly hitting friction.
1942
01:13:50,460 --> 01:13:54,460
You're creating the conditions for either sustainable collaboration or operational chaos.
1943
01:13:54,460 --> 01:13:56,260
Most technical people don't think about it that way.
1944
01:13:56,260 --> 01:13:57,660
They think about the system,
1945
01:13:57,660 --> 01:14:00,060
the configuration, the elegance of the design,
1946
01:14:00,060 --> 01:14:02,060
the optimization. Those are important,
1947
01:14:02,060 --> 01:14:03,860
but they're not the most important thing.
1948
01:14:03,860 --> 01:14:08,060
The most important thing is whether actual human beings can actually operate the system.
1949
01:14:08,060 --> 01:14:11,660
This responsibility requires thinking beyond technical capability.
1950
01:14:11,660 --> 01:14:13,860
It requires understanding organizational behavior.
1951
01:14:13,860 --> 01:14:16,060
It requires understanding human psychology.
1952
01:14:16,060 --> 01:14:20,660
It requires understanding what happens when a policy conflicts with how people actually work.
1953
01:14:20,660 --> 01:14:23,260
It requires understanding that people will find workarounds,
1954
01:14:23,260 --> 01:14:24,460
that they'll bend rules,
1955
01:14:24,460 --> 01:14:28,260
that they'll circumvent controls if those controls make it impossible to do their job.
1956
01:14:28,260 --> 01:14:31,660
You can't design sustainable systems without understanding those realities.
1957
01:14:31,660 --> 01:14:35,060
The best architects I've worked with are not the most technical people I've known.
1958
01:14:35,060 --> 01:14:40,260
They're not the ones who can write the most sophisticated code or design the most elegant infrastructure.
1959
01:14:40,260 --> 01:14:41,860
They're the most thoughtful people.
1960
01:14:41,860 --> 01:14:44,260
They ask questions that technical people often skip.
1961
01:14:44,260 --> 01:14:47,060
They ask, "Who will operate this? Can they actually do it?
1962
01:14:47,060 --> 01:14:49,860
Will they be able to maintain it when circumstances change?"
1963
01:14:49,860 --> 01:14:52,060
What happens when this assumption turns out to be wrong?
1964
01:14:52,060 --> 01:14:53,460
Those are uncomfortable questions.
1965
01:14:53,460 --> 01:14:54,460
They slow things down.
1966
01:14:54,460 --> 01:14:58,660
They require admitting that your elegant technical solution might not be operationally viable.
1967
01:14:58,660 --> 01:15:03,860
But those are the questions that separate systems that survive from systems that silently collapse.
1968
01:15:03,860 --> 01:15:06,060
They design for the organization that exists,
1969
01:15:06,060 --> 01:15:08,060
not the organization they wish existed.
1970
01:15:08,060 --> 01:15:09,460
This is the critical distinction.
1971
01:15:09,460 --> 01:15:12,660
A lot of architects design for a fictional version of the organization.
1972
01:15:12,660 --> 01:15:14,660
They assume perfect execution.
1973
01:15:14,660 --> 01:15:16,060
They assume clear processes.
1974
01:15:16,060 --> 01:15:19,060
They assume that everyone will follow the policies exactly as documented.
1975
01:15:19,060 --> 01:15:22,260
They assume that change will be managed carefully and thoughtfully.
1976
01:15:22,260 --> 01:15:24,460
But that's not the organization you're designing for.
1977
01:15:24,460 --> 01:15:27,060
You're designing for an organization with budget constraints,
1978
01:15:27,060 --> 01:15:28,660
with competing priorities,
1979
01:15:28,660 --> 01:15:30,860
with people who are overworked and overwhelmed,
1980
01:15:30,860 --> 01:15:33,860
with processes that are more bureaucratic than strategic,
1981
01:15:33,860 --> 01:15:36,860
with change that happens by accident as much as by plan.
1982
01:15:36,860 --> 01:15:41,260
An architect who designs for that reality builds systems that actually work.
1983
01:15:41,260 --> 01:15:46,060
An architect who designs for the fictional organization builds systems that immediately start to fail.
1984
01:15:46,060 --> 01:15:48,460
They build governance into architecture from the beginning,
1985
01:15:48,460 --> 01:15:49,460
not as an afterthought.
1986
01:15:49,460 --> 01:15:50,660
This is where the shift happens.
1987
01:15:50,660 --> 01:15:54,060
Most technical architects think about technical architecture first.
1988
01:15:54,060 --> 01:15:55,260
How do I design the system?
1989
01:15:55,260 --> 01:15:56,460
How do I make it scalable?
1990
01:15:56,460 --> 01:15:57,460
How do I make it performant?
1991
01:15:57,460 --> 01:15:58,860
How do I make it reliable?
1992
01:15:58,860 --> 01:16:00,460
Then at the end, they think about governance.
1993
01:16:00,460 --> 01:16:01,860
Okay, now how do we manage this?
1994
01:16:01,860 --> 01:16:02,860
How do we control it?
1995
01:16:02,860 --> 01:16:04,460
Governance becomes a layer on top.
1996
01:16:04,460 --> 01:16:05,860
Separate from the technical design.
1997
01:16:05,860 --> 01:16:10,060
But governance-first architects think about it together from the beginning.
1998
01:16:10,060 --> 01:16:13,860
How do I design the system so that the organization can actually govern it?
1999
01:16:13,860 --> 01:16:15,260
What does governance look like?
2000
01:16:15,260 --> 01:16:15,860
Who owns it?
2001
01:16:15,860 --> 01:16:16,860
How do we prevent drift?
2002
01:16:16,860 --> 01:16:19,460
Those questions shape the architecture, not the other way around.
2003
01:16:19,460 --> 01:16:22,660
They measure success by durability, not just capability.
2004
01:16:22,660 --> 01:16:25,660
A capability-focused architect measures success by features.
2005
01:16:25,660 --> 01:16:27,460
How many processes did we automate?
2006
01:16:27,460 --> 01:16:29,460
How many applications did we integrate?
2007
01:16:29,460 --> 01:16:31,060
How sophisticated did we make the system?
2008
01:16:31,060 --> 01:16:34,260
A durability-focused architect measures success differently.
2009
01:16:34,260 --> 01:16:36,660
Can the organization still operate this in three years?
2010
01:16:36,660 --> 01:16:39,060
Can someone who didn't build it understand how it works?
2011
01:16:39,060 --> 01:16:41,060
Can it adapt when circumstances change?
2012
01:16:41,060 --> 01:16:43,060
Can it survive without the original architect?
2013
01:16:43,060 --> 01:16:44,660
Those are the measures that matter.
2014
01:16:44,660 --> 01:16:47,660
This is the shift from technical leadership to architectural leadership.
2015
01:16:47,660 --> 01:16:51,460
Technical leadership is about mastering the platform, learning all the features.
2016
01:16:51,460 --> 01:16:53,860
Understanding how to configure everything optimally.
2017
01:16:53,860 --> 01:16:56,660
Architectural leadership is about understanding systems.
2018
01:16:56,660 --> 01:17:01,860
Understanding organizations, understanding the tension between technical purity and operational reality.
2019
01:17:01,860 --> 01:17:05,260
Understanding that the goal isn't to build the most sophisticated system.
2020
01:17:05,260 --> 01:17:08,660
The goal is to build a system that the organization can actually sustain.
2021
01:17:08,660 --> 01:17:09,860
That's the responsibility.
2022
01:17:09,860 --> 01:17:11,460
That's what we are accountable for.
2023
01:17:11,460 --> 01:17:13,460
Not just whether the system works technically,
2024
01:17:13,460 --> 01:17:15,460
whether the organization can operate it,
2025
01:17:15,460 --> 01:17:17,860
whether it survives, whether it creates value,
2026
01:17:17,860 --> 01:17:20,260
that's the architect's responsibility.
2027
01:17:20,260 --> 01:17:22,860
Practical steps to implement governance architecture.
2028
01:17:22,860 --> 01:17:26,260
If you want to start implementing governance architecture immediately,
2029
01:17:26,260 --> 01:17:27,860
here are the practical steps.
2030
01:17:27,860 --> 01:17:29,860
Not theoretical, not aspirational.
2031
01:17:29,860 --> 01:17:33,460
Concrete actions you can take this week that will begin shifting your organization
2032
01:17:33,460 --> 01:17:35,060
toward durable governance.
2033
01:17:35,060 --> 01:17:36,660
Step one is audit your current state.
2034
01:17:36,660 --> 01:17:39,660
Use the tenant durability checklist we discussed earlier.
2035
01:17:39,660 --> 01:17:42,260
Walk through every category. Be honest about what you find.
2036
01:17:42,260 --> 01:17:43,460
This isn't for an auditor.
2037
01:17:43,460 --> 01:17:44,460
This is for you.
2038
01:17:44,460 --> 01:17:45,460
You're establishing a baseline,
2039
01:17:45,460 --> 01:17:48,260
so you know where you actually are before you start moving.
2040
01:17:48,260 --> 01:17:51,460
Most organizations skip this step because it's uncomfortable.
2041
01:17:51,460 --> 01:17:54,060
You don't want to admit how many global admin roles you have.
2042
01:17:54,060 --> 01:17:57,660
You don't want to acknowledge that you have hundreds of flows with no documented owners.
2043
01:17:57,660 --> 01:17:59,260
You don't want to measure the governance debt.
2044
01:17:59,260 --> 01:18:00,260
But you have to.
2045
01:18:00,260 --> 01:18:03,060
You can't move from where you're not toward where you want to be.
2046
01:18:03,060 --> 01:18:05,660
You have to start from the truth about where you are.
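The baseline step above can be sketched as a small helper that records what the checklist walk actually found. The category names, the 1-to-5 scoring scale, and the field layout are illustrative assumptions, not from the episode; the point is simply to capture an honest starting position before you move.

```python
# A minimal sketch of recording a governance baseline after walking the
# durability checklist. Categories and the 1-5 scale are hypothetical.
from dataclasses import dataclass

@dataclass
class CategoryScore:
    category: str   # e.g. "identity", "collaboration", "automation"
    score: int      # 1 = ungoverned, 5 = durable (assumed scale)
    notes: str      # what you actually found, in plain language

def baseline_summary(scores: list[CategoryScore]) -> dict:
    """Summarize the honest baseline: average maturity and weakest areas."""
    avg = sum(s.score for s in scores) / len(scores)
    weakest = sorted(scores, key=lambda s: s.score)[:2]
    return {
        "average_maturity": round(avg, 2),
        "weakest_areas": [s.category for s in weakest],
    }

baseline = [
    CategoryScore("identity", 2, "20 global admins, no privileged access management"),
    CategoryScore("collaboration", 3, "guest accounts never reviewed"),
    CategoryScore("automation", 1, "hundreds of flows, no documented owners"),
]
print(baseline_summary(baseline))
```

The output is the baseline you measure later reviews against, which is exactly what the step asks for: knowing where you actually are before you start moving.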
2047
01:18:05,660 --> 01:18:08,060
Step two is define intent for each governance area.
2048
01:18:08,060 --> 01:18:09,460
Don't start with implementation.
2049
01:18:09,460 --> 01:18:10,460
Start with intent.
2050
01:18:10,460 --> 01:18:12,460
What are you actually trying to achieve?
2051
01:18:12,460 --> 01:18:15,460
For identity, ask: how do we balance access and security?
2052
01:18:15,460 --> 01:18:18,060
That's different from how do we enforce MFA?
2053
01:18:18,060 --> 01:18:20,460
That's asking what behavior you're trying to enable.
2054
01:18:20,460 --> 01:18:24,460
For collaboration, ask: how do we enable teamwork while protecting data?
2055
01:18:24,460 --> 01:18:26,660
Not how do we restrict external sharing?
2056
01:18:26,660 --> 01:18:28,660
What's the balance you're trying to achieve?
2057
01:18:28,660 --> 01:18:32,460
For automation, ask: how do we empower innovation while maintaining control?
2058
01:18:32,460 --> 01:18:34,460
Not how do we lock down Power Automate?
2059
01:18:34,460 --> 01:18:36,060
What's the outcome you're trying to create?
2060
01:18:36,060 --> 01:18:38,660
These three questions reshape your entire governance conversation.
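Those three intent questions can be kept as plain data, so the intent travels alongside whatever implementation eventually answers it. The question wording comes from the episode; the structure itself is just an illustrative sketch.

```python
# Intent-first governance framing captured as data. Wording follows the
# episode; the dictionary structure is an illustrative assumption.
GOVERNANCE_INTENT = {
    "identity": {
        "intent": "How do we balance access and security?",
        "not_this": "How do we enforce MFA?",
    },
    "collaboration": {
        "intent": "How do we enable teamwork while protecting data?",
        "not_this": "How do we restrict external sharing?",
    },
    "automation": {
        "intent": "How do we empower innovation while maintaining control?",
        "not_this": "How do we lock down Power Automate?",
    },
}

def intent_for(area: str) -> str:
    """Return the intent question to open a governance conversation with."""
    return GOVERNANCE_INTENT[area]["intent"]

print(intent_for("identity"))
```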
2061
01:18:38,660 --> 01:18:40,660
Step three is design governance zones.
2062
01:18:40,660 --> 01:18:42,660
This is the framework that simplifies everything.
2063
01:18:42,660 --> 01:18:43,660
You have three zones.
2064
01:18:43,660 --> 01:18:45,060
Zone one is personal work.
2065
01:18:45,060 --> 01:18:48,260
OneDrive, personal Teams chats, stuff that belongs to individuals,
2066
01:18:48,260 --> 01:18:49,060
minimal governance.
2067
01:18:49,060 --> 01:18:50,860
You don't need to control this aggressively.
2068
01:18:50,860 --> 01:18:52,260
Zone two is collaborative work.
2069
01:18:52,260 --> 01:18:56,060
Teams channels, project SharePoint sites, places where teams work together.
2070
01:18:56,060 --> 01:18:59,860
Moderate governance, clear ownership, life cycle policies, guest management.
2071
01:18:59,860 --> 01:19:04,660
Zone three is enterprise records, HR systems, financial data, regulated content,
2072
01:19:04,660 --> 01:19:08,660
strict governance, retention policies, sensitivity labels, access reviews.
2073
01:19:08,660 --> 01:19:11,660
This three zone model instantly makes governance decisions clearer.
2074
01:19:11,660 --> 01:19:15,260
You're not asking, how do we apply governance to all Microsoft 365?
2075
01:19:15,260 --> 01:19:16,460
That's unanswerable.
2076
01:19:16,460 --> 01:19:18,860
You're asking, how do we govern this specific zone?
2077
01:19:18,860 --> 01:19:20,060
That's answerable.
2078
01:19:20,060 --> 01:19:22,460
Apply different governance models to each zone.
2079
01:19:22,460 --> 01:19:24,660
You don't need the same level of control everywhere.
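The three-zone model above can be sketched as a lookup from workload type to governance profile. The zone contents follow the episode; the profile fields (ownership, life cycle policy, labels, access reviews) and workload keys are illustrative assumptions about how you might encode them.

```python
# Three-zone governance model as a workload-to-profile lookup.
# Zone contents follow the episode; profile fields are assumptions.
ZONES = {
    "personal": {          # Zone 1: OneDrive, personal Teams chats
        "examples": ["onedrive", "personal_chat"],
        "profile": {"ownership_required": False, "lifecycle_policy": None,
                    "sensitivity_labels": False, "access_reviews": False},
    },
    "collaborative": {     # Zone 2: Teams channels, project SharePoint sites
        "examples": ["team_channel", "project_site"],
        "profile": {"ownership_required": True, "lifecycle_policy": "expire_inactive",
                    "sensitivity_labels": False, "access_reviews": False},
    },
    "enterprise": {        # Zone 3: HR systems, financial data, regulated content
        "examples": ["hr_system", "finance_data"],
        "profile": {"ownership_required": True, "lifecycle_policy": "retention",
                    "sensitivity_labels": True, "access_reviews": True},
    },
}

def profile_for(workload: str) -> dict:
    """Find the governance profile for a workload via its zone."""
    for zone in ZONES.values():
        if workload in zone["examples"]:
            return zone["profile"]
    raise KeyError(f"unmapped workload: {workload}")

print(profile_for("project_site")["lifecycle_policy"])
```

The design point the episode makes falls out of the structure: you answer "how do we govern this zone?" once per zone, instead of once per workload.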
2080
01:19:24,660 --> 01:19:26,460
Step four is identify quick wins.
2081
01:19:26,460 --> 01:19:29,460
Where can you improve governance with minimal disruption?
2082
01:19:29,460 --> 01:19:30,660
Start with identity.
2083
01:19:30,660 --> 01:19:32,060
Reduce your global admin count.
2084
01:19:32,060 --> 01:19:35,660
If you have 20 global admins and the recommendation is fewer than five,
2085
01:19:35,660 --> 01:19:37,260
start working toward that number.
2086
01:19:37,260 --> 01:19:39,060
Not by removing people's access suddenly,
2087
01:19:39,060 --> 01:19:42,460
by implementing privileged access management, by delegating specific roles,
2088
01:19:42,460 --> 01:19:45,060
by creating a plan to get to a reasonable number.
2089
01:19:45,060 --> 01:19:48,460
This is a quick win because it's high impact and relatively straightforward.
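The admin-reduction plan described above can be sketched as a split of the current global admin list into a small kept set (to move under privileged access management) and the rest (to delegate to scoped roles). The input shape, the names, and the target of five are illustrative; the real list would come from your own admin tooling.

```python
# Sketch of planning a global-admin reduction: keep a small set, mark
# the rest for delegation to scoped roles. Inputs are hypothetical.
def plan_admin_reduction(global_admins: list[str],
                         keep: set[str],
                         target: int = 5) -> dict:
    """Split current global admins into those kept and those to delegate."""
    kept = [a for a in global_admins if a in keep]
    to_delegate = [a for a in global_admins if a not in keep]
    return {
        "kept": kept,
        "delegate_to_scoped_roles": to_delegate,
        "meets_target": len(kept) <= target,
    }

plan = plan_admin_reduction(
    global_admins=["alice", "bob", "carol", "dan", "erin", "frank"],
    keep={"alice", "bob"},
)
print(plan["meets_target"], len(plan["delegate_to_scoped_roles"]))
```

Nobody loses access suddenly; the delegate list is the work queue for moving people to specific roles, which matches the episode's "create a plan, don't just revoke" framing.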
2090
01:19:48,460 --> 01:19:49,660
Other quick wins.
2091
01:19:49,660 --> 01:19:52,660
Implement Teams life cycle policies for inactive teams.
2092
01:19:52,660 --> 01:19:55,860
Clean up guest accounts from partnerships that ended years ago.
2093
01:19:55,860 --> 01:19:59,660
These are wins you can achieve without restructuring your entire governance model.
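Both quick wins above can be sketched as local checks over exported data. The field names, the 180-day inactivity threshold, and the sample values are assumptions; in practice the activity and guest data would come from your own usage and directory reports.

```python
# Two quick-win checks over exported data: inactive teams and guest
# accounts from ended partnerships. Thresholds and fields are assumed.
from datetime import date, timedelta

def find_inactive_teams(teams: dict[str, date],
                        today: date,
                        days: int = 180) -> list[str]:
    """Teams whose last activity is older than the threshold."""
    cutoff = today - timedelta(days=days)
    return [name for name, last_active in teams.items() if last_active < cutoff]

def find_stale_guests(guests: dict[str, date], today: date) -> list[str]:
    """Guest accounts whose partnership end date has passed."""
    return [g for g, ends in guests.items() if ends < today]

today = date(2026, 1, 15)
teams = {"Project Apollo": date(2024, 3, 1), "Ops Weekly": date(2026, 1, 10)}
guests = {"ex-vendor@partner.example": date(2023, 6, 30)}
print(find_inactive_teams(teams, today), find_stale_guests(guests, today))
```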
2094
01:19:59,660 --> 01:20:01,860
Step five is build continuous monitoring.
2095
01:20:01,860 --> 01:20:03,660
This is where most organizations fail.
2096
01:20:03,660 --> 01:20:05,260
They implement governance and then stop.
2097
01:20:05,260 --> 01:20:06,260
Nothing changes.
2098
01:20:06,260 --> 01:20:07,260
Policies drift.
2099
01:20:07,260 --> 01:20:09,060
Ownership becomes unclear again.
2100
01:20:09,060 --> 01:20:10,060
Don't let that happen.
2101
01:20:10,060 --> 01:20:12,260
Establish regular reviews of governance health.
2102
01:20:12,260 --> 01:20:14,660
Quarterly reviews of major governance areas.
2103
01:20:14,660 --> 01:20:16,260
Are our policies still working?
2104
01:20:16,260 --> 01:20:17,060
Is drift happening?
2105
01:20:17,060 --> 01:20:18,260
Do we need to adapt?
2106
01:20:18,260 --> 01:20:21,260
Create feedback loops that inform policy evolution.
2107
01:20:21,260 --> 01:20:25,860
Automation debt discovered in month one should inform your automation governance by month three.
2108
01:20:25,860 --> 01:20:27,860
Permissions sprawl identified in a review
2109
01:20:27,860 --> 01:20:30,060
should trigger updates to your sharing policy.
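Continuous monitoring can start as a simple drift check: each quarterly review compares a snapshot of current settings against the intended baseline and flags anything that moved. The setting names and values here are illustrative; real snapshots would come from your own tenant configuration.

```python
# A minimal drift detector for quarterly governance reviews.
# Setting names and values are illustrative assumptions.
def detect_drift(intended: dict, current: dict) -> dict:
    """Return settings that have drifted from the intended baseline."""
    return {
        key: {"intended": intended[key], "current": current.get(key)}
        for key in intended
        if current.get(key) != intended[key]
    }

intended = {"external_sharing": "existing_guests_only", "global_admin_count": 5}
current = {"external_sharing": "anyone", "global_admin_count": 5}
drift = detect_drift(intended, current)
print(sorted(drift))
```

The drift report is the feedback loop the episode describes: each flagged setting either gets corrected or becomes a deliberate policy update, so policies evolve instead of silently decaying.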
2110
01:20:30,060 --> 01:20:31,860
Step six is communicate the shift.
2111
01:20:31,860 --> 01:20:35,460
This is the most important step because culture change is the hardest part.
2112
01:20:35,460 --> 01:20:38,460
Help your organization understand that governance enables innovation,
2113
01:20:38,460 --> 01:20:39,860
not just restricts it.
2114
01:20:39,860 --> 01:20:41,860
Right now people think governance is limiting.
2115
01:20:41,860 --> 01:20:42,860
It's bureaucracy.
2116
01:20:42,860 --> 01:20:43,860
It's saying no.
2117
01:20:43,860 --> 01:20:44,860
Shift that conversation.
2118
01:20:44,860 --> 01:20:47,860
Governance is the foundation that lets you move fast safely.
2119
01:20:47,860 --> 01:20:50,860
Governance is what allows you to scale without losing control.
2120
01:20:50,860 --> 01:20:52,260
That's a different conversation.
2121
01:20:52,260 --> 01:20:53,460
That's what gets adoption.
2122
01:20:53,460 --> 01:20:54,660
Communicate what you're doing,
2123
01:20:54,660 --> 01:20:55,460
why you're doing it,
2124
01:20:55,460 --> 01:20:56,660
what the outcome will be.
2125
01:20:56,660 --> 01:20:57,860
Do that continuously.
2126
01:20:57,860 --> 01:20:59,660
Culture change doesn't happen from a memo.
2127
01:20:59,660 --> 01:21:01,460
It happens from consistent messaging.
2128
01:21:01,460 --> 01:21:04,860
From showing results, from demonstrating that governance makes work better,
2129
01:21:04,860 --> 01:21:06,060
not just safer.
2130
01:21:06,060 --> 01:21:07,260
That's the implementation path.
2131
01:21:07,260 --> 01:21:08,660
Start small, start honest.
2132
01:21:08,660 --> 01:21:10,460
Move incrementally, build momentum.
2133
01:21:10,460 --> 01:21:12,860
Create the conditions for sustainable architecture.
2134
01:21:12,860 --> 01:21:15,860
That's how you go from governance debt to governance durability.
2135
01:21:15,860 --> 01:21:17,660
The long view.
2136
01:21:17,660 --> 01:21:19,060
Why this matters now.
2137
01:21:19,060 --> 01:21:22,660
Let me finish with the strategic context for why this matters right now.
2138
01:21:22,660 --> 01:21:24,460
Not in five years, right now.
2139
01:21:24,460 --> 01:21:25,660
In 2026.
2140
01:21:25,660 --> 01:21:28,660
Microsoft 365 is becoming more complex, not less.
2141
01:21:28,660 --> 01:21:30,860
Every quarter new capabilities are added.
2142
01:21:30,860 --> 01:21:33,060
AI is being integrated into every application.
2143
01:21:33,060 --> 01:21:35,860
Copilot is becoming the default way people work.
2144
01:21:35,860 --> 01:21:36,860
Not an add-on.
2145
01:21:36,860 --> 01:21:37,860
The platform is expanding.
2146
01:21:37,860 --> 01:21:39,260
The surface area is growing.
2147
01:21:39,260 --> 01:21:41,660
The number of decision points is multiplying.
2148
01:21:41,660 --> 01:21:45,260
If governance is difficult now, it's about to become exponentially more difficult.
2149
01:21:45,260 --> 01:21:48,460
The organizations that haven't built governance architecture are running out of time.
2150
01:21:48,460 --> 01:21:52,060
At the same time, AI capabilities are surfacing governance debt instantly.
2151
01:21:52,060 --> 01:21:57,460
Copilot can access any data available to users across SharePoint, Teams, OneDrive and Exchange.
2152
01:21:57,460 --> 01:21:59,860
That means Copilot can surface overshared data.
2153
01:21:59,860 --> 01:22:01,460
It can expose broken permissions.
2154
01:22:01,460 --> 01:22:04,460
It can make visible what was previously hidden by obscurity.
2155
01:22:04,460 --> 01:22:09,060
73% of organizations in regulated industries have paused Copilot rollouts
2156
01:22:09,060 --> 01:22:10,460
due to data exposure concerns.
2157
01:22:10,460 --> 01:22:14,060
Not because Copilot is unsafe, because their own data governance is unsafe.
2158
01:22:14,060 --> 01:22:17,060
Because they have years of permissions sprawl that they've never fixed.
2159
01:22:17,060 --> 01:22:18,660
Copilot didn't create that problem.
2160
01:22:18,660 --> 01:22:19,660
It revealed it.
2161
01:22:19,660 --> 01:22:22,260
The regulatory environment is tightening simultaneously.
2162
01:22:22,260 --> 01:22:24,060
The EU AI Act is now in effect.
2163
01:22:24,060 --> 01:22:29,060
Organizations operating in Europe face legally enforceable obligations around AI governance.
2164
01:22:29,060 --> 01:22:30,260
Not best practices.
2165
01:22:30,260 --> 01:22:31,260
Legal requirements.
2166
01:22:31,260 --> 01:22:33,660
They must demonstrate control over AI systems.
2167
01:22:33,660 --> 01:22:35,060
They must maintain audit trails.
2168
01:22:35,060 --> 01:22:36,860
They must ensure fairness and transparency.
2169
01:22:36,860 --> 01:22:38,460
Those are not optional suggestions.
2170
01:22:38,460 --> 01:22:39,860
Those are compliance requirements.
2171
01:22:39,860 --> 01:22:43,660
Organizations that deploy Copilot without governance are deploying a regulated system
2172
01:22:43,660 --> 01:22:44,860
without compliance controls.
2173
01:22:44,860 --> 01:22:46,060
That's not a technical problem.
2174
01:22:46,060 --> 01:22:47,060
That's a legal problem.
2175
01:22:47,060 --> 01:22:48,660
Licensing costs are increasing.
2176
01:22:48,660 --> 01:22:52,860
Microsoft has announced significant pricing increases effective July 2026.
2177
01:22:52,860 --> 01:22:58,060
Organizations are facing double digit percentage increases on their Microsoft 365 licenses.
2178
01:22:58,060 --> 01:22:59,860
That budget pressure forces a conversation.
2179
01:22:59,860 --> 01:23:01,460
What are we actually getting value from?
2180
01:23:01,460 --> 01:23:02,660
Where is licensing waste?
2181
01:23:02,660 --> 01:23:05,260
Where is governance debt creating unnecessary cost?
2182
01:23:05,260 --> 01:23:08,860
Organizations that haven't measured governance debt suddenly need to justify
2183
01:23:08,860 --> 01:23:10,660
continued investment in capabilities
2184
01:23:10,660 --> 01:23:12,660
they're not actually using effectively.
2185
01:23:12,660 --> 01:23:14,060
Here's the inflection point.
2186
01:23:14,060 --> 01:23:17,860
Organizations with strong governance architecture are moving forward confidently.
2187
01:23:17,860 --> 01:23:19,060
They are deploying Copilot.
2188
01:23:19,060 --> 01:23:20,460
They're implementing agents.
2189
01:23:20,460 --> 01:23:22,060
They're scaling AI capabilities.
2190
01:23:22,060 --> 01:23:23,860
They understand their data landscape.
2191
01:23:23,860 --> 01:23:25,460
They have clarity about permissions.
2192
01:23:25,460 --> 01:23:27,060
They have life cycle management.
2193
01:23:27,060 --> 01:23:28,060
They can deploy safely.
2194
01:23:28,060 --> 01:23:29,260
They can scale sustainably.
2195
01:23:29,260 --> 01:23:32,860
They can take advantage of new capabilities without creating new risk.
2196
01:23:32,860 --> 01:23:35,060
Organizations without governance are stalling.
2197
01:23:35,060 --> 01:23:37,060
Pilots are pausing, rollouts are halting.
2198
01:23:37,060 --> 01:23:40,260
Why? Because when you try to deploy Copilot at scale and you discover
2199
01:23:40,260 --> 01:23:42,860
you don't have governance readiness, you have to choose.
2200
01:23:42,860 --> 01:23:46,660
Slow down the deployment and fix governance or deploy anyway and accept the risk.
2201
01:23:46,660 --> 01:23:50,060
Most organizations are choosing to slow down and that slow down is expensive.
2202
01:23:50,060 --> 01:23:51,660
The license costs are accumulating.
2203
01:23:51,660 --> 01:23:53,660
The capability isn't delivering value.
2204
01:23:53,660 --> 01:23:57,060
The organization is standing still while competitors move forward.
2205
01:23:57,060 --> 01:23:59,460
The technical people who understand governance architecture
2206
01:23:59,460 --> 01:24:02,660
are about to become the most valuable leaders in enterprise technology.
2207
01:24:02,660 --> 01:24:05,460
They're the ones who can deploy AI safely and sustainably.
2208
01:24:05,460 --> 01:24:07,860
They're the ones who can navigate the regulatory landscape.
2209
01:24:07,860 --> 01:24:11,260
They're the ones who can extract value from the platform without creating chaos.
2210
01:24:11,260 --> 01:24:14,060
They're the ones who understand that technical excellence is not enough.
2211
01:24:14,060 --> 01:24:15,060
That durability matters.
2212
01:24:15,060 --> 01:24:17,060
That governance is architecture, not compliance.
2213
01:24:17,060 --> 01:24:21,060
Organizations are going to be scrambling to find architects who understand this.
2214
01:24:21,060 --> 01:24:24,060
Architects who can balance innovation with sustainability.
2215
01:24:24,060 --> 01:24:26,460
Architects who can build systems that actually survive.
2216
01:24:26,460 --> 01:24:31,060
The technical people who don't understand governance are about to become increasingly frustrated.
2217
01:24:31,060 --> 01:24:35,260
They'll build brilliant systems, elegant configurations, sophisticated automation
2218
01:24:35,260 --> 01:24:38,060
and then they'll watch those systems become unmaintainable.
2219
01:24:38,060 --> 01:24:41,060
They'll watch organizations struggle to operate what they design.
2220
01:24:41,060 --> 01:24:43,860
They'll watch Copilot deployments stall because governance wasn't ready.
2221
01:24:43,860 --> 01:24:46,860
They'll watch automation platforms collapse under complexity.
2222
01:24:46,860 --> 01:24:50,060
They'll be frustrated because they did their job, they built great systems.
2223
01:24:50,060 --> 01:24:52,060
But the organization couldn't sustain them.
2224
01:24:52,060 --> 01:24:52,860
This is the moment.
2225
01:24:52,860 --> 01:24:56,860
The next 18 months are going to determine which organizations scale AI sustainably
2226
01:24:56,860 --> 01:24:57,860
and which ones get stuck.
2227
01:24:57,860 --> 01:25:01,460
Which organizations deploy Copilot confidently and which ones pause in panic?
2228
01:25:01,460 --> 01:25:04,860
Which organizations extract value from their Microsoft investment
2229
01:25:04,860 --> 01:25:06,860
and which ones accumulate licensing debt?
2230
01:25:06,860 --> 01:25:08,860
The difference won't be technical capability.
2231
01:25:08,860 --> 01:25:10,060
Microsoft will handle that.
2232
01:25:10,060 --> 01:25:12,060
The difference will be governance maturity.
2233
01:25:12,060 --> 01:25:15,260
Organizations that build governance architecture now will have the foundation.
2234
01:25:15,260 --> 01:25:18,460
Organizations that wait will be playing catch-up and catch-up is expensive.
2235
01:25:18,460 --> 01:25:22,060
It's disruptive, it's painful, you're better off doing the work now.
2236
01:25:22,060 --> 01:25:23,260
The confession completes.
2237
01:25:23,260 --> 01:25:25,860
I'm still not the most technical person in the room.
2238
01:25:25,860 --> 01:25:27,260
I don't write production code.
2239
01:25:27,260 --> 01:25:28,660
I don't tune infrastructure.
2240
01:25:28,660 --> 01:25:31,660
But I've learned something that the brilliant technical people in the room
2241
01:25:31,660 --> 01:25:33,260
don't always understand.
2242
01:25:33,260 --> 01:25:37,260
Technical excellence without governance creates the worst Microsoft 365 tenants.
2243
01:25:37,260 --> 01:25:41,060
The engineers in the failures we discussed were talented, really talented.
2244
01:25:41,060 --> 01:25:42,660
The problem wasn't their expertise.
2245
01:25:42,660 --> 01:25:45,260
The problem was that brilliance applied to the wrong problem
2246
01:25:45,260 --> 01:25:48,660
creates systems that are technically perfect and operationally impossible.
2247
01:25:48,660 --> 01:25:52,060
The goal of architecture is not to build systems that work today.
2248
01:25:52,060 --> 01:25:55,860
It's to design systems that organizations can still operate five years from now.
2249
01:25:55,860 --> 01:25:58,860
That requires thinking beyond technical capability.
2250
01:25:58,860 --> 01:26:02,460
That requires understanding governance, understanding organizational behavior,
2251
01:26:02,460 --> 01:26:04,860
understanding that people will find workarounds
2252
01:26:04,860 --> 01:26:07,260
if your architecture makes their job impossible.
2253
01:26:07,260 --> 01:26:10,260
Understanding that durability matters more than perfection.
2254
01:26:10,260 --> 01:26:14,860
If you're building Microsoft 365 architecture, start with governance intent,
2255
01:26:14,860 --> 01:26:16,060
not configuration.
2256
01:26:16,060 --> 01:26:19,660
Design for the organization that exists, not the one you wish existed.
2257
01:26:19,660 --> 01:26:22,260
Measure success by durability, not just capability.
2258
01:26:22,260 --> 01:26:25,260
And remember, the most dangerous architecture is the one that works perfectly
2259
01:26:25,260 --> 01:26:27,660
on day one and collapses silently in year three.
2260
01:26:27,660 --> 01:26:32,060
Subscribe to the M365FM podcast for more insights on Microsoft 365,
2261
01:26:32,060 --> 01:26:35,860
governance, architecture and the human systems that make technology sustainable.
2262
01:26:35,860 --> 01:26:37,060
Connect with me on LinkedIn.
2263
01:26:37,060 --> 01:26:40,860
I'm always looking for the next story of technical excellence that created governance failure
2264
01:26:40,860 --> 01:26:43,860
because those stories are how we learn to build better systems.
