In this episode of the M365 FM Podcast, Åsne Holtklimpen joins Mirko Peters to discuss the real challenges behind Microsoft Copilot adoption and AI readiness in Microsoft 365 environments. The core message is clear: Copilot does not create security problems — it exposes the governance and security gaps that already exist inside organizations.
The conversation focuses on common issues such as overshared SharePoint sites, outdated permissions, forgotten Teams channels, uncontrolled data sprawl, and missing governance strategies. Åsne explains how many organizations rushed into cloud collaboration during the pandemic without proper structure, and AI tools now make these weaknesses far more visible.
A major part of the episode highlights the importance of Microsoft Purview, sensitivity labels, Data Loss Prevention (DLP), Conditional Access, and Zero Trust principles. These tools help organizations classify sensitive information, secure access, and prevent Copilot from exposing confidential data. The discussion also emphasizes that successful AI adoption requires strong governance, employee education, lifecycle management, and executive support — not just turning Copilot on.
Åsne shares practical insights from real-world projects across the Nordic region, showing how organizations often underestimate the amount of sensitive data stored in Microsoft 365. The episode provides actionable guidance for IT leaders, security professionals, consultants, and business decision-makers looking to balance innovation, productivity, compliance, and security while preparing for AI-powered workplaces.
You may wonder if Microsoft Security stands ready for Copilot. The answer is yes—Microsoft Security provides strong protection. Copilot does not create new risks. Instead, it reveals existing problems with governance and permissions. Many organizations face issues like legacy permission sprawl, unreviewed historical data, and pilot rollouts without proper security.
| Issue Type | Description |
|---|---|
| Legacy Permission Sprawl | Broad access permissions in Microsoft 365 can lead to unintended exposure of sensitive files during Copilot rollout. |
| Unreviewed Historical Data Exposure | Old and unclassified data can be retrieved by Copilot, increasing the risk of sensitive information being exposed. |
| Missing Copilot Risk Baselines | Lack of baseline measurements for sensitive data access can lead to untracked risks during Copilot deployment. |
| Pilot Rollouts Without Security | Early pilot programs may expose sensitive content if permissions and monitoring are not properly scoped. |
You need strong data governance, clear permissions, and ongoing user education to succeed with Copilot.
Key Takeaways
- Microsoft Security provides strong protection for Copilot, but existing governance issues can expose sensitive data.
- Conduct a Copilot readiness assessment to review permissions, licensing, and data governance before deployment.
- Establish clear permissions and strong access controls to prevent unintentional exposure of sensitive information.
- Regularly audit and clean your data to remove outdated files and fix broken permissions for better security.
- User education is crucial; train employees on permissions and data handling to avoid accidental sharing of sensitive information.
- Implement multi-factor authentication and sensitivity labels to enhance data protection and compliance.
- Create a governance framework that includes regular reviews and updates to adapt to new risks and regulations.
- Stay proactive by reporting suspicious activity and encouraging a culture of security awareness among users.
Copilot Readiness Explained
What Readiness Means
You need more than a technical setup to prepare for Microsoft Copilot. Readiness means you have strong governance, clear permissions, and ongoing user education. Industry experts agree that you must secure, govern, and structure your Microsoft 365 environment before you enable Copilot. This step helps you prevent unintentional exposure of sensitive data and ensures Copilot delivers value without adding security or compliance risks.
A Copilot readiness assessment reviews your environment. You check permissions, licensing, and data governance. This process helps you avoid accidental exposure of sensitive files or unintended access when you use Copilot’s advanced discovery tools. True readiness goes beyond technical requirements. You must create a clean, secure, and well-structured data environment for effective deployment.
Security and Compliance
You must focus on security and compliance as you prepare for Copilot. Review who can access your data and how you protect it. Set up strong access controls and monitor permissions. Make sure your compliance policies match your organization’s needs. When you do this, you reduce the risk of exposing sensitive data and meet regulatory requirements.
User Impact
Copilot changes how users interact with data. You must help users understand how permissions work. When users know how Copilot uses data, they can avoid sharing sensitive information by mistake. Training and clear guidelines help everyone use Copilot safely and effectively.
Governance Foundations
Strong governance forms the foundation for Copilot readiness. You need to manage your data, users, and policies with care. Good governance helps you control risks and get the most value from Copilot.
Data Hygiene
Keep your data clean and organized. Remove outdated files and fix broken permissions. Classify sensitive data and make sure only the right people can access it. A clean data environment helps Copilot work better and keeps your information safe.
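As a rough illustration of stale-content triage (not a Microsoft tool), the review could be modeled like this; the two-year threshold, file paths, and dates are assumptions for the sketch, and a real inventory would come from an admin usage report:

```python
from datetime import date, timedelta

def find_stale_files(files, today, max_age_days=730):
    """Return files not modified within max_age_days (default: two years).

    `files` is a list of (path, last_modified_date) tuples -- in a real
    tenant this inventory would come from a SharePoint/OneDrive report,
    not from code like this.
    """
    cutoff = today - timedelta(days=max_age_days)
    return [path for path, modified in files if modified < cutoff]

# Hypothetical inventory for illustration only.
inventory = [
    ("/sites/hr/salaries-2018.xlsx", date(2018, 3, 1)),
    ("/sites/hr/handbook.docx", date(2025, 1, 10)),
    ("/teams/projx/old-contract.pdf", date(2019, 7, 4)),
]

stale = find_stale_files(inventory, today=date(2025, 6, 1))
```

Files flagged this way are candidates for archiving or deletion before Copilot is enabled, so the assistant never draws on content no one has reviewed in years.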
Lifecycle Management
You must manage the entire lifecycle of your data and Copilot assets. This includes tracking users, licenses, and custom agents. Use regular reports to monitor adoption, costs, and compliance. Gather user feedback to improve your policies. Stay alert for new risks and update your governance as needed.
| Practice | Description |
|---|---|
| Maintain a complete inventory | Build and maintain a 360° view of all Copilot-related assets, including users, licenses, and custom agents. |
| Track key metrics | Analyze usage, adoption rates, and policy violations regularly to provide evidence of value and highlight areas for improvement. |
| Gather user feedback | Provide structured feedback channels for employees to share their Copilot experiences, ensuring policies remain practical and widely adopted. |
| Leverage scheduled and dynamic reporting | Automate reporting to keep stakeholders informed about adoption, costs, and compliance, turning governance into proactive risk management. |
| Stay ahead of emerging risks | Continuously monitor new AI capabilities and evolving regulations to proactively adjust governance frameworks. |
| Define clear governance policies | Establish rules before granting Copilot licenses, creating a foundational document for the AI strategy that sets expectations and accountability. |
Tip: Start with a full review of your data and permissions before you enable Copilot. This step helps you find and fix issues early.
Copilot readiness is not just about technology. You must build strong governance, keep your data clean, and educate your users. When you do this, you create a safe and effective environment for Copilot.
Microsoft Security in Copilot Deployments

When you deploy Copilot, you rely on a strong foundation of Microsoft security. The platform provides advanced tools and controls to help you protect your data, manage compliance, and reduce security risks. However, your organization’s governance and permission management play a key role in how effective these features are.
Security Measures
Encryption and Access Controls
You need to secure your data at every stage. Microsoft security uses encryption to protect information both at rest and in transit across networks. This means that only authorized users can access sensitive files, even if someone intercepts the data. You also benefit from access controls that let you decide who can see or edit information. These controls help you enforce the principle of least privilege, so users only get the access they need.
- Multi-factor authentication adds another layer of security by requiring users to verify their identity in more than one way.
- App protection policies separate your organization’s data from personal information on devices.
- Device management ensures that only secure and compliant devices can use Copilot features.
Note: You should review and update access permissions regularly to prevent unwanted exposure of sensitive data.
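A least-privilege review can be sketched in a few lines. This is an illustrative simulation, not a real Microsoft 365 API: the access lists, group names, and the idea of a "broad access" watchlist are assumptions made for the example.

```python
def flag_broad_access(acl, broad_groups=frozenset({"Everyone", "All Employees"})):
    """Return the entries in each file's access list that grant broad access.

    `acl` maps a file path to the set of users/groups with access; the
    group names here are placeholders, not real directory objects. In a
    real tenant this data would come from a permissions report.
    """
    return {path: sorted(grants & broad_groups)
            for path, grants in acl.items()
            if grants & broad_groups}

# Hypothetical access-control lists for illustration.
acl = {
    "/sites/finance/budget.xlsx": {"cfo@contoso.com", "Everyone"},
    "/sites/finance/memo.docx": {"cfo@contoso.com"},
}
flagged = flag_broad_access(acl)
```

Anything flagged this way is a candidate for tightening before Copilot can surface it to the whole organization.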
Identity Protections
Identity protection stands at the core of Microsoft security. You must implement strong identity and access management. This includes enforcing multi-factor authentication and monitoring for suspicious sign-in attempts. Microsoft security tools help you detect and block threats like phishing or unauthorized access. You can also use threat protection services to guard against cyber attacks that target user identities.
A secure identity system helps you control who can use Copilot and what actions they can perform. This reduces the risk of data leaks and keeps your environment safe.
Compliance Tools
Microsoft Purview
Microsoft Purview gives you powerful tools to monitor and analyze how users interact with content. You can track sensitive data usage and spot incidents quickly. Purview also helps you meet regulatory requirements by providing detailed audit logs and eDiscovery tools. These features record all Copilot interactions, so you have transparency and traceability for AI-assisted actions.
- Communication Compliance in Purview lets you monitor Copilot prompts and responses. This helps you identify risky behavior and respond before it becomes a problem.
- You can set up retention policies to keep important data and ensure eDiscovery captures all relevant Copilot content.
Sensitivity Labels and DLP
Sensitivity labels help you classify and protect data across Microsoft 365. You can apply labels to documents, emails, and other items to control who can access them. Data Loss Prevention (DLP) policies work with these labels to detect and block the sharing of sensitive information. DLP recognizes Copilot as a unique policy location, so you can create rules that fit your needs.
- Sensitivity labels empower users to protect their own data, but you should verify that these labels meet your organization’s DLP standards.
- DLP policies automatically monitor and control sensitive content, reducing the chance of accidental leaks.
| Compliance Tool | What It Does |
|---|---|
| Microsoft Purview | Monitors content activity, provides audit logs, and supports eDiscovery for Copilot actions. |
| Sensitivity Labels | Classifies and restricts access to sensitive data across Microsoft 365. |
| Data Loss Prevention | Detects and controls sharing of sensitive information, with rules tailored for Copilot. |
Tip: Before you deploy Copilot, audit your permissions, review sensitivity labeling, and develop clear usage policies. This helps you stay compliant and secure.
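The DLP idea above can be sketched as simple pattern matching. Note this is only a toy model: real DLP uses Microsoft's built-in sensitive information types, not hand-rolled regexes, and the two patterns below are illustrative assumptions.

```python
import re

# Illustrative patterns only -- a real DLP policy would reference
# Microsoft's managed sensitive information types instead.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "national_id": re.compile(r"\b\d{11}\b"),
}

def dlp_scan(text):
    """Return the names of sensitive-info patterns found in `text`.

    A matching result would block or warn on sharing, mirroring how a
    DLP rule intercepts content before it leaves the organization.
    """
    return sorted(name for name, rx in PATTERNS.items() if rx.search(text))
```

The point of the sketch is the flow, not the patterns: content is scanned against classification rules before it can be shared or fed into an AI response.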
Why Governance Still Matters
Microsoft security gives you advanced tools, but your organization’s governance and permission management determine how well these tools work. If you do not review permissions or keep your data organized, even the best security features cannot prevent exposure. You must combine technology with strong policies and regular reviews to create a secure environment for Copilot.
You can trust Microsoft security to provide the foundation, but your actions make the difference.
Risks and Gaps with Copilot
Microsoft Copilot brings many benefits, but you need to understand the risks and gaps that can appear during deployment. These risks often come from existing weaknesses in your environment, not from Copilot itself. By knowing where problems can arise, you can take steps to protect your data and stay compliant.
Permission Issues
Oversharing in Teams and SharePoint
You may face risks when users share files or folders too broadly in Teams or SharePoint. Legacy permission sprawl can give more people access to sensitive files than you expect. Over time, shared folders and sites can collect many users who no longer need access. When you enable Copilot, it can surface this information to anyone with permission, even if you forgot they had it.
- Broad access in SharePoint and OneDrive can lead to unintended data exposure.
- Sensitive information in Teams chats and files can be summarized and shared with larger groups.
Tip: Review who has access to shared folders and channels. Remove users who no longer need access.
Outdated Files and Broken Permissions
Old files and broken permissions create another risk. You may have documents that contain sensitive information but have not been reviewed in years. Copilot can find and use these files if permissions allow. Even one incorrect permission can expose critical data.
Even a single incorrect permission assignment can create significant security risk. Do not ignore these small gaps; they can have a big impact.
Data Exposure Risks
AI Follows Permissions
Copilot always respects your existing permissions. If a user has access to a file, Copilot can use that file to answer questions or generate content. This means that any gaps in your permission settings can lead to data exposure. For example, if SharePoint permissions are not set up correctly, sensitive documents could become visible to all employees.
| Exposure Path | Description of Risk | Impact |
|---|---|---|
| SharePoint and OneDrive Permission Sprawl | Broad access from legacy sharing can lead to unintended data exposure. | Increased risk of sensitive data exposure |
| Microsoft Teams Chats and Files | Summaries and shared content can reach unintended audiences. | Redistribution of sensitive content |
| Exchange Email and Calendar Access | Context from emails can leak confidential discussions. | Higher chance of leaking information |
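The "Copilot follows permissions" rule can be illustrated with a small simulation. All names, documents, and the ACL structure are invented for the sketch; the point is that the assistant filters retrieved content down to what the user's existing permissions already allow, and never widens access:

```python
def copilot_answerable(query_hits, user, permissions):
    """Filter retrieved documents down to those `user` can already open.

    `permissions` maps document id -> set of principals with read access.
    Gaps in that map are exactly the gaps Copilot will faithfully expose.
    """
    return [doc for doc in query_hits if user in permissions.get(doc, set())]

# Hypothetical tenant state for illustration.
permissions = {
    "salary-review.docx": {"hr@contoso.com"},
    "travel-policy.docx": {"hr@contoso.com", "anna@contoso.com"},
}
hits = ["salary-review.docx", "travel-policy.docx"]
visible = copilot_answerable(hits, "anna@contoso.com", permissions)
```

If "salary-review.docx" had been overshared with Anna years ago, the filter would pass it through: the model enforces the ACL you have, not the ACL you intended.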
Unintended Access
You may not realize when Copilot combines information from different sources. Prompts can pull together data from emails, chats, and files, sometimes creating new risks. Users might accidentally share sensitive data in AI-generated outputs, especially if they do not check the content before sending.
- Misconfigured permissions can lead to unauthorized access.
- Data leakage can happen if users share AI-generated content without reviewing it.
Compliance Concerns
Data Residency
You must know where Copilot processes your data. Microsoft offers data residency commitments, especially for enterprise customers in the EU. Copilot keeps user prompts and responses within the EU data boundary, supporting your compliance needs. You should check your contracts to make sure Copilot workloads are covered and document your privacy controls for audits.
User Consent
User consent is a key part of compliance. Copilot includes features for GDPR compliance, such as data minimization and options for data access or deletion requests. You need to include Copilot interactions in your privacy notices and Data Protection Impact Assessments. Microsoft 365 Copilot supports compliance with standards like GDPR, ISO 27001, and HIPAA, but you must keep your documentation up to date.
Note: Always inform users about how Copilot uses their data and give them clear choices about consent.
By understanding these risks and gaps, you can build a safer and more compliant environment for Copilot.
Addressing Security and Governance
Microsoft’s Actions
Security Audits
You can trust that Microsoft takes security audits seriously for Copilot deployments. Microsoft 365 Copilot agents follow strict tenant policies and admin configurations. These controls prevent unrestricted access to your organization’s data. You benefit from agent policies that manage access, sharing, and publishing settings through the Copilot Control System. Microsoft also uses lifecycle management features like versioning, staged deployments, and rollback processes. These features help you roll out Copilot in a controlled way and respond quickly if you need to make changes.
- Admins can restrict user access and set sharing controls for Copilot.
- Microsoft Purview supports compliance by providing audit logs, retention, and data loss prevention policies.
- Security governance for connectors is enforced with advanced DLP policies.
Tip: Regular audits help you spot gaps and keep your Microsoft 365 environment secure.
Transparency and Roadmaps
Microsoft values transparency. You get clear roadmaps and updates about Copilot’s security and governance features. This openness helps you plan your deployments and stay informed about new tools. Microsoft shares best practices and guidance so you can align your policies with the latest standards. You can use these resources to improve your own security posture and make informed decisions for your Microsoft 365 environment.
Organizational Steps
Access Reviews
You play a key role in keeping your Microsoft 365 environment safe. Start by validating your security foundations. Make sure your configurations limit access to only those who need it. Review permissions in SharePoint and Teams to prevent oversharing. Enforce device trust with Microsoft Intune so only compliant devices can access sensitive data. Control external sharing by checking guest access and setting sharing expiration policies.
- Classify data with sensitivity labels.
- Align retention policies with your business needs.
- Define content ownership and manage the lifecycle of Teams and groups.
Note: Regular access reviews help you catch permission issues before they become risks.
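One concrete lifecycle check is finding Teams or groups with no owner, because ownerless groups have no one accountable for access reviews. This sketch uses made-up group names and a simplified owner list; a real review would pull this data from the admin center:

```python
def find_ownerless_groups(groups):
    """Flag Teams/groups with no owner, i.e. no one accountable for
    reviewing membership and content.

    `groups` maps group name -> list of owners; the names below are
    invented for illustration.
    """
    return sorted(name for name, owners in groups.items() if not owners)

groups = {
    "Project Falcon": ["lead@contoso.com"],
    "Old Marketing Team": [],
    "2020 Pandemic Response": [],
}
orphaned = find_ownerless_groups(groups)
```

Groups flagged this way need an owner assigned or an archival decision before Copilot starts surfacing their content.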
Policy Updates
You need to keep your policies up to date as your organization grows. Provide user training on responsible Copilot use. Use Microsoft Purview, Entra ID, and Microsoft Defender to monitor and manage Copilot usage. These tools help you detect threats and respond quickly. Establish clear governance by defining rules for content, sharing, and user responsibilities.
| Metric | Before Implementation | After Implementation | Improvement |
|---|---|---|---|
| Breach Risk | High | Low | Significant reduction in risk |
| Efficiency of SecOps Teams | Low | High | Amplified efficiency |
| Cost Savings from Centralization | Minimal | Substantial | Cost efficiencies achieved |
| Threat Detection Capabilities | Reactive | Proactive | Enhanced detection and response |
You see real benefits when you follow these steps. Your breach risk drops, your security teams work more efficiently, and you save money by centralizing controls in Microsoft 365. You also move from reacting to threats to stopping them before they cause harm.
User Education and Adoption

Training for Employees
Understanding Permissions
You need to understand how permissions work in Microsoft 365 before you use Copilot. Permissions control who can access files, chats, and emails. If you know how to set and review permissions, you can protect sensitive information and prevent accidental sharing. Microsoft offers training resources that explain data protection and secure collaboration. These materials help you learn best practices for interacting with AI tools.
When you have protected time to experiment and learn, Copilot becomes an empowering tool. You can discover new possibilities and build confidence in using AI. This approach encourages collaboration and helps you embrace Copilot in your daily work.
Data Handling Best Practices
You must follow data handling best practices to keep your environment safe. Always classify sensitive data and use sensitivity labels. Review files regularly and remove outdated or unnecessary information. Make sure you only share data with people who need it. Microsoft provides guidance on secure data handling, so you can learn how to use Copilot without risking exposure.
| Training Strategy | Description |
|---|---|
| Comprehensive Training Materials | Quick-start guides, FAQs, tutorial videos, and role-specific sessions help you learn Copilot basics. |
| Internal Champions | Colleagues with Copilot expertise support you and build a knowledge base for everyone. |
| Continuous Engagement | Weekly tips and open office hours reinforce learning and encourage sustained adoption. |
Ongoing access reviews, user education, and governance audits are essential for safe, compliant Copilot use.
- Contextual, real-time guidance works better than traditional training.
- In-app assistance reduces frustration and helps you learn faster.
- Peer-driven advocacy and in-context learning tools support a comprehensive enablement strategy.
Localized training sessions, such as Power Hours in different languages and time zones, demonstrate key Copilot scenarios. This approach accommodates different learning styles and ensures you receive support during the rollout.
Adoption Strategies
Change Management
You need a clear plan for Copilot adoption. Start with technical readiness, but also consider cultural factors and continuous improvement. A cross-functional task force can coordinate your strategy and align technical goals with business needs. Communication is key. Role-specific enablement helps you understand how Copilot fits into your workflow.
- Holistic adoption includes technical, cultural, and operational readiness.
- Dedicated teams ensure alignment between business and IT objectives.
- Clear communication helps you embrace AI confidently.
Ongoing Support
You benefit from ongoing support as you use Copilot. Regular updates, feedback channels, and help-desk structures keep you informed and engaged. Internal champions answer questions and share tips. You can measure business impact by linking Copilot usage to performance indicators. AI governance stays embedded in your enterprise structure, ensuring compliance and security.
Microsoft offers extensive training resources and user education materials. These cover topics like secure collaboration, data protection, and best practices for interacting with AI.
You build a strong foundation for Copilot adoption when you combine training, change management, and ongoing support. This approach helps you use Copilot safely and effectively.
Preparing for Copilot
Action Plan for Organizations
Governance Checklist
You need a clear action plan to prepare your organization for Microsoft Copilot. Start by understanding the architecture and requirements for Microsoft 365 Copilot. Make sure you have the right licenses and a Microsoft Enterprise ID. Build a governance framework that covers content management and security. Protect sensitive data with strong security measures. Move your content into Microsoft 365 using migration tools.
- Review the architecture and requirements for Microsoft 365 Copilot.
- Confirm you have a Microsoft Enterprise ID and the correct licenses.
- Create a governance framework for content management and security.
- Set up security measures to protect sensitive data.
- Migrate your content into Microsoft 365 using tools like ShareGate.
Tip: A strong governance checklist helps you avoid surprises during deployment.
Lifecycle Management
You must manage the full lifecycle of your Copilot deployment. Track users, licenses, and custom agents from start to finish. Use regular reports to monitor adoption, costs, and compliance. Collect feedback from users to improve your policies and processes. Update your governance as new AI features and regulations appear.
You can measure readiness for Copilot deployment using these criteria:
| Criteria | Description |
|---|---|
| User Readiness and Adoption | Checks if your organization supports AI adoption and if employees feel ready. |
| Metrics and Baselines | Defines how you will measure usage, adoption rates, and task completion times. |
| Measurable Success Criteria | Sets clear goals that match your business objectives. |
| Monitoring Plan | Tracks engagement and satisfaction with analytics and surveys. |
Track adoption rates, feature use, engagement time, task completion, user satisfaction, and cost efficiency to see how well Copilot works for you.
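A minimal sketch of the adoption-rate metric, assuming the license and usage counts come from the Microsoft 365 admin usage reports (the user names and numbers below are invented):

```python
def adoption_metrics(licensed_users, active_users):
    """Compute a simple adoption rate from license and usage sets.

    Only activity by licensed users counts toward adoption; the
    thresholds you set against this rate are a business decision.
    """
    active_licensed = active_users & licensed_users
    rate = len(active_licensed) / len(licensed_users)
    return {
        "licensed": len(licensed_users),
        "active": len(active_licensed),
        "adoption_rate": round(rate, 2),
    }

m = adoption_metrics(licensed_users={"a", "b", "c", "d"},
                     active_users={"a", "b", "e"})
```

Tracking this number over time, alongside satisfaction surveys, shows whether training and governance are actually landing.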
Individual Readiness
Managing Permissions
You play a key role in keeping your data safe when using Copilot. Always follow the principle of least privilege. Give users only the access they need. Audit permissions often to catch changes or mistakes. Use identity management to control who can use Copilot and under what conditions. Set up conditional access policies to secure every session. Add multi-factor authentication for all users. Limit access based on device and location. Use just-in-time elevation for special cases, so no one has permanent high-level access.
- Enforce least privilege for all users.
- Audit permissions regularly.
- Use identity management and conditional access.
- Require multi-factor authentication.
- Restrict access by device and location.
- Apply just-in-time elevation when needed.
Note: Regular permission reviews help you prevent unauthorized access and data leaks.
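The just-in-time elevation idea can be sketched as a time-boxed grant. This is an illustrative model, not Microsoft Entra Privileged Identity Management; the role name, account, and four-hour window are assumptions for the example:

```python
from datetime import datetime, timedelta

class JustInTimeGrant:
    """Time-boxed elevation: access expires automatically, so no one
    keeps standing high-privilege access."""

    def __init__(self, user, role, granted_at, duration=timedelta(hours=4)):
        self.user = user
        self.role = role
        self.expires_at = granted_at + duration

    def is_active(self, now):
        """True only inside the elevation window."""
        return now < self.expires_at

# Hypothetical grant for illustration.
grant = JustInTimeGrant("ops@contoso.com", "SharePoint Admin",
                        granted_at=datetime(2025, 6, 1, 9, 0))
```

The design point is that expiry is the default: forgetting to revoke access no longer leaves a permanent admin account behind.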
Reporting Issues
You should report any issues or suspicious activity right away. If you see something unusual, tell your IT team. Quick reporting helps your organization respond to threats and fix problems before they grow. Stay alert and encourage your team to do the same. This keeps your Copilot environment safe and secure.
- Report suspicious activity immediately.
- Share feedback about Copilot with your IT team.
- Stay informed about best practices and updates.
Staying proactive and engaged helps you and your organization get the most from Microsoft Copilot.
You achieve Copilot readiness by focusing on governance, permissions, and user education. Microsoft Security gives you strong protection, but your results depend on your daily practices.
- Regular audits and continuous training keep your environment secure.
- Clear policies and structured readiness assessments help you avoid risks.
- Ongoing education ensures everyone understands their responsibilities.
Key steps for leaders and users:
- Enforce multi-factor authentication and sensitivity labels.
- Select pilot use cases and define success metrics.
- Provide training and support from day one.
Stay proactive and keep learning to maximize Copilot’s benefits.
FAQ
What does Copilot readiness mean for your organization?
Copilot readiness means you have strong governance, clear permissions, and user education. You review your environment, clean up data, and set up policies. This helps you use Copilot safely and get the most value from AI tools.
How does Microsoft Copilot use data classification?
Microsoft Copilot uses data classification to identify sensitive data and apply protection. You label files and emails, so Copilot respects access rules. This process keeps sensitive content secure and supports compliance.
Why is user education important for Copilot readiness?
User education helps you understand how Copilot interacts with your data. You learn to manage permissions, follow data classification rules, and avoid sharing sensitive information. Training builds confidence and supports AI readiness.
How does Copilot protect sensitive data?
Copilot follows your permissions and uses data classification to protect sensitive data. You set up sensitivity labels and data loss prevention policies. Copilot only accesses information you allow, keeping sensitive content safe.
What steps should you take before deploying Copilot?
You start with a Copilot readiness assessment. Review permissions, clean up old files, and classify data. Set up governance policies and train users. These steps help you prepare for AI adoption and reduce risks.
How does Copilot support compliance?
Copilot supports compliance by using tools like Microsoft Purview and sensitivity labels. You track Copilot activity, monitor sensitive content, and follow regulations. Copilot readiness ensures you meet legal requirements and protect your data.
What is the role of data classification in Copilot readiness?
Data classification helps you organize information for Copilot. You label files based on sensitivity. Copilot uses these labels to control access and protect sensitive data. This process supports governance and AI readiness.
How can you measure Copilot readiness?
You measure Copilot readiness by checking user adoption, reviewing permissions, and tracking AI usage. Set clear goals, monitor engagement, and collect feedback. Regular audits and training help you maintain Copilot readiness.
🚀 Want to be part of m365.fm?
Then stop just listening… and start showing up.
👉 Connect with me on LinkedIn and let’s make something happen:
- 🎙️ Be a podcast guest and share your story
- 🎧 Host your own episode (yes, seriously)
- 💡 Pitch topics the community actually wants to hear
- 🌍 Build your personal brand in the Microsoft 365 space
This isn’t just a podcast — it’s a platform for people who take action.
🔥 Most people wait. The best ones don’t.
👉 Connect with me on LinkedIn and send me a message:
"I want in"
Let’s build something awesome 👊
1
00:00:00,000 --> 00:00:05,600
Hello everyone, welcome to another edition of the M365 FM Podcast.
2
00:00:05,600 --> 00:00:13,280
Today I have a special guest, Åsne Holtklimpen, and we talk a little bit about Copilot,
3
00:00:13,280 --> 00:00:19,040
security, governance and so on and I'm really happy to have you here.
4
00:00:19,040 --> 00:00:24,880
And yeah, as our first question, can you introduce yourself a little bit?
5
00:00:24,880 --> 00:00:29,640
Yes, well my name is Åsne, I live in the southern parts of Norway,
6
00:00:29,640 --> 00:00:33,800
but I work around the Nordic countries basically with Copilot implementation
7
00:00:33,800 --> 00:00:37,400
together with security and governance.
8
00:00:37,400 --> 00:00:42,840
Just briefly, talking a little bit about how you can enable invention and creativity
9
00:00:42,840 --> 00:00:47,240
with agents and Copilot and everything AI and still manage to govern them.
10
00:00:47,240 --> 00:00:50,840
So this is really interesting and hot topic these days.
11
00:00:50,840 --> 00:00:57,640
So yeah, that's, I worked with SharePoint for, I think I've managed to count, 23 years,
12
00:00:57,640 --> 00:01:01,640
so not quite as long as SharePoint has been alive, but not far apart.
13
00:01:01,640 --> 00:01:06,520
And I work with Teams, and then obviously Purview, and then Copilot.
14
00:01:06,520 --> 00:01:09,480
So that's sort of my area of expertise now.
15
00:01:09,480 --> 00:01:11,000
Yeah.
16
00:01:11,000 --> 00:01:17,400
I see also you work with a lot of different productivity tools like Copilot,
17
00:01:17,400 --> 00:01:19,080
SharePoint, Teams and so on.
18
00:01:19,080 --> 00:01:24,680
And also I see Purview, and it's a different kind of tool.
19
00:01:24,680 --> 00:01:34,200
So I'm wondering, how do you see it, how do these tools work together?
20
00:01:34,200 --> 00:01:38,840
Well, it started with back in the days when I did internal IT.
21
00:01:38,840 --> 00:01:41,720
We worked with SharePoint, like SharePoint was back in the days.
22
00:01:41,720 --> 00:01:46,680
It was a bit heavy on dynamics sort of.
23
00:01:46,680 --> 00:01:49,560
It wasn't easy to structure and restructure and do anything,
24
00:01:49,560 --> 00:01:55,400
but I saw really early on, I was working with Active Directory as well,
25
00:01:55,400 --> 00:01:57,160
back really back in the days.
26
00:01:57,160 --> 00:02:01,880
You didn't have that, you didn't have that really flow of information security,
27
00:02:01,880 --> 00:02:06,440
right? You could, you could obviously set up some access points and you could set up SharePoint
28
00:02:06,440 --> 00:02:09,240
with access rights and stuff, but you didn't have that security tool,
29
00:02:09,240 --> 00:02:11,160
securing your data in a proper way.
30
00:02:11,160 --> 00:02:17,320
So when Purview finally came along, it made it so much easier to make sure that your data was safe.
31
00:02:18,440 --> 00:02:21,880
And that just sort of, you know, my heart just swelled.
32
00:02:21,880 --> 00:02:24,360
Finally, something that I could use.
33
00:02:24,360 --> 00:02:26,200
Finally, something that made sense.
34
00:02:26,200 --> 00:02:29,560
And then going from SharePoint and Teams towards
35
00:02:29,560 --> 00:02:33,560
working with that information security, because I've always worked with the information flow.
36
00:02:33,560 --> 00:02:37,640
And the tiny bit of security we had before, so it made it a really,
37
00:02:37,640 --> 00:02:41,400
really easy transition into the Purview world.
38
00:02:41,400 --> 00:02:45,960
Yeah, and I often see, when
39
00:02:46,760 --> 00:02:51,640
we talk about security, governance, compliance, people think it's all the same.
40
00:02:51,640 --> 00:02:54,680
How would you differentiate these topics?
41
00:02:54,680 --> 00:02:58,360
Oh, yeah, I think, I think you're right.
42
00:02:58,360 --> 00:03:03,400
Everything is within the same concept, because we're talking about Zero Trust.
43
00:03:03,400 --> 00:03:05,320
We always talk about Zero Trust.
44
00:03:05,320 --> 00:03:08,840
And it's basically a little bit of the same concept, because we, you know,
45
00:03:08,840 --> 00:03:11,560
we were talking about we need to lock our doors and everything.
46
00:03:12,840 --> 00:03:18,840
But we would also want to make sure that if somebody gets into our house, we need to make sure that the files as well
47
00:03:18,840 --> 00:03:22,360
are locked away in the cabinets or in the safes or whatever.
48
00:03:22,360 --> 00:03:26,920
So it's basically like, yeah, you have the identity security and everything secure
49
00:03:26,920 --> 00:03:30,760
there, but then you need to make sure that the files you get as well are secured.
50
00:03:30,760 --> 00:03:36,200
You can't just send them out of the house, like, however you would like.
51
00:03:36,200 --> 00:03:39,800
So I think everything fits together so well.
52
00:03:39,800 --> 00:03:44,520
And especially now, when you have Entra and you have Defender and you have
53
00:03:44,520 --> 00:03:50,200
Purview, which just work well together, and it's so easy to see how it all works together as well.
54
00:03:50,200 --> 00:03:55,640
So I think everything comes back to having governance, because you have governance over your identity
55
00:03:55,640 --> 00:04:00,360
security, you have governance over your Defender endpoints and everything in between.
56
00:04:00,360 --> 00:04:02,440
And then you have governance over your files.
57
00:04:02,440 --> 00:04:07,480
So I think, I think we can't differentiate between them as much as we did before.
58
00:04:07,480 --> 00:04:12,760
It was more siloed earlier on and now we're talking about having them all fit together.
59
00:04:12,760 --> 00:04:15,720
You can use Conditional Access towards Purview.
60
00:04:15,720 --> 00:04:19,160
So you can make sure that files are not accessible if you're not
61
00:04:19,160 --> 00:04:25,720
on the right set of approved offices or countries.
62
00:04:25,720 --> 00:04:31,240
So I think everything makes for more of a whole,
63
00:04:31,240 --> 00:04:36,360
a direct and holistic sort of approach to everything now, compared to how it was.
64
00:04:37,160 --> 00:04:39,800
I get a little bit, you know, when I start talking about this.
65
00:04:39,800 --> 00:04:42,440
I get warmed up.
66
00:04:42,440 --> 00:04:44,760
That is nice.
67
00:04:44,760 --> 00:04:45,720
Yeah.
68
00:04:45,720 --> 00:04:52,840
So we also have, yeah, I think we have a little change in people.
69
00:04:52,840 --> 00:04:57,480
Since we have AI, Copilot, the Microsoft product,
70
00:04:57,480 --> 00:05:00,360
people have a different view of it.
71
00:05:00,360 --> 00:05:04,840
And I read so much on LinkedIn, and they say there's
72
00:05:04,840 --> 00:05:07,720
absolutely no risk if you use Copilot.
73
00:05:07,720 --> 00:05:13,160
How do you see this, or what could be the biggest risk for organizations when they're
74
00:05:13,160 --> 00:05:14,280
adopting Copilot?
75
00:05:14,280 --> 00:05:22,520
Basically, what we see is that Copilot and AI don't create new problems.
76
00:05:22,520 --> 00:05:26,280
It basically just shines a bright, big yellow light on them.
77
00:05:26,280 --> 00:05:31,000
Because the problems are already there: you have overexposed data.
78
00:05:32,040 --> 00:05:34,600
You have data that's outdated.
79
00:05:34,600 --> 00:05:39,000
Of course, people have so much data that's outdated, but they don't realize it themselves.
80
00:05:39,000 --> 00:05:46,280
So I think that's the issue when you start using Copilot, because we have a lot of people
81
00:05:46,280 --> 00:05:50,920
I talk to who are like, yeah, the AI train, you have to jump aboard the AI train, it's leaving now,
82
00:05:50,920 --> 00:05:53,000
it's leaving the station now, hurry up, hurry up.
83
00:05:53,000 --> 00:05:55,320
Like, yeah, but shouldn't you at least know where you're going?
84
00:05:55,320 --> 00:06:00,280
You know, you can easily jump on a train, but you should know the destination of that train.
85
00:06:01,160 --> 00:06:05,400
You should maybe have packed a bag and you should maybe have bought a ticket sort of.
86
00:06:05,400 --> 00:06:07,640
Sometimes you might need a passport.
87
00:06:07,640 --> 00:06:12,520
I'm just, you know, you need to prepare a little bit and I think that's the thing that people
88
00:06:12,520 --> 00:06:19,400
haven't thought of when they started using AI or Copilot. They just clicked it on, and now they
89
00:06:19,400 --> 00:06:24,520
have overexposed data. They have sensitive information that nobody knows how to handle.
90
00:06:24,520 --> 00:06:29,480
Or they're so afraid of turning it on that they just don't start doing the groundwork.
91
00:06:30,280 --> 00:06:34,840
Because Copilot is just going to see what the user has access to, right?
92
00:06:34,840 --> 00:06:38,600
And when you did the lift and shift from the file shares into Teams in the pandemic,
93
00:06:38,600 --> 00:06:44,040
and you just let loose on everything and shared all the folders because you just needed to
94
00:06:44,040 --> 00:06:47,640
have access to something at the moment. You know, nobody's cleaned that up.
95
00:06:47,640 --> 00:06:55,880
So it's old legacy, and it's basically a shit in, shit out statement from my part, because you
96
00:06:55,880 --> 00:07:01,160
don't have a, you don't have the structure, you don't have the overview of how your environment
97
00:07:01,160 --> 00:07:08,600
should look like, and you really don't have any clue how the permissions are being handled
98
00:07:08,600 --> 00:07:15,720
or not handled, in most cases. So I think having Copilot and just turning it on, it's,
99
00:07:15,720 --> 00:07:23,240
yeah, feel free to do so, but be aware of the risks. Basically, just be very aware of the risks.
100
00:07:25,240 --> 00:07:34,680
Yeah, and then what do you think? How can you help organizations prepare for Copilot or other AI
101
00:07:34,680 --> 00:07:41,800
stuff? Well, I've written this in a blog a couple of times as well because I've
102
00:07:41,800 --> 00:07:49,960
written it with some of my brilliant, not only co-workers, but people who love Microsoft.
103
00:07:51,080 --> 00:07:57,720
So what we do is write up what you should do as a bare minimum. And I think a starting
104
00:07:57,720 --> 00:08:03,240
point is always to find sensitive info types and create those yourself, like recognizing sensitive
105
00:08:03,240 --> 00:08:09,000
data within your tenant and making sure that you label it accordingly. Because when
106
00:08:09,000 --> 00:08:13,240
you have sensitive info types and you have labeled your files with sensitivity
107
00:08:13,240 --> 00:08:19,400
labels, you can then make sure that Copilot doesn't use the most sensitive data. If you have
108
00:08:19,400 --> 00:08:23,240
something that's highly confidential, Copilot can't touch it, and that will help.
109
00:08:23,240 --> 00:08:29,480
You can make sure that SharePoint sites that have a lot of confidential data are not allowed to be
110
00:08:29,480 --> 00:08:35,720
used by Copilot at all. And you can put DLP policies on it, and when you start using
111
00:08:35,720 --> 00:08:41,960
Purview, you have so many control mechanisms to make sure that AI and Copilot can't handle it.
112
00:08:41,960 --> 00:08:45,800
And of course, if you start crossing that
113
00:08:47,800 --> 00:08:53,080
into Entra ID and everything, you can control it even more. Like I said, if you have sensitivity
114
00:08:53,080 --> 00:08:58,280
labels and you connect Conditional Access, saying that you can't access this data if you travel
115
00:08:58,280 --> 00:09:02,760
to Spain, but you can access it if you're in the Nordic countries or something like that.
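The travel scenario described here can be sketched as a toy rule. This is an illustrative sketch only: the label names and the allowed-country set are invented for the example, and real enforcement happens in Entra ID Conditional Access and Purview, not in application code like this.

```python
# Hypothetical Nordic allow-list for this example.
ALLOWED_COUNTRIES = {"NO", "SE", "DK", "FI", "IS"}

def can_access(label: str, sign_in_country: str) -> bool:
    """Deny labelled content when the sign-in comes from outside approved locations."""
    if label in {"Confidential", "Highly Confidential"}:
        return sign_in_country in ALLOWED_COUNTRIES
    # Public/Internal content is not location-restricted in this sketch.
    return True

print(can_access("Highly Confidential", "ES"))  # traveller in Spain: False
print(can_access("Highly Confidential", "NO"))  # at home in Norway: True
```

The point of the pairing is exactly what Åsne describes next: even a stolen laptop with valid credentials fails the location condition.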
116
00:09:02,760 --> 00:09:08,920
Then you're so much more secure, because if somebody gets a hold of your laptop with your login,
117
00:09:08,920 --> 00:09:17,000
they still can't access that data. So I think creating all this, and just as a basis having
118
00:09:17,000 --> 00:09:22,920
sensitivity labels, sensitive info types and DLPs, then you've started, you've laid the foundation,
119
00:09:22,920 --> 00:09:29,000
then you can sort of prepare a bit more for that AI train, then you can maybe board it a little bit,
120
00:09:29,000 --> 00:09:36,440
not maybe the entire group of people, but just a few of them to get started at least, I think.
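The bare-minimum flow described in this answer, detect a sensitive info type, assign a sensitivity label, and keep the strictest label away from Copilot, can be sketched as follows. The regex and label names are simplified stand-ins for real Purview sensitive info type definitions, not the actual classifiers.

```python
import re

# Crude stand-in for a "credit card number" sensitive info type:
# 13-16 digits, optionally separated by spaces or hyphens.
CREDIT_CARD = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def classify(text: str) -> str:
    """Step 1: sensitive info types drive the label choice."""
    return "Highly Confidential" if CREDIT_CARD.search(text) else "Internal"

def copilot_may_use(label: str) -> bool:
    """Step 2: exclude the strictest label from Copilot's reach."""
    return label != "Highly Confidential"

doc = "Customer card: 4111 1111 1111 1111"
label = classify(doc)
print(label, copilot_may_use(label))
```

In a real tenant, step 1 is a Purview sensitive info type plus auto-labeling policy, and step 2 is the label's Copilot exclusion setting; the sketch only mirrors the ordering of the two steps.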
121
00:09:36,440 --> 00:09:43,480
Yeah, in the last two weeks I've had a little look into Entra, and I see
122
00:09:44,760 --> 00:09:53,320
the Copilot stuff is handled more like a person, like an application. So I found this really
123
00:09:53,320 --> 00:10:02,920
interesting. But what do you think, when we talk about Zero Trust, what's the impact or what
124
00:10:02,920 --> 00:10:11,560
problems do we have with Copilot in this area? Well, I think at the moment, when we look at
125
00:10:12,440 --> 00:10:19,880
our customers, for example, we do see a lot of customers still not using MFA to the full extent.
126
00:10:19,880 --> 00:10:27,400
And just going back to the basics, just having people realize that they need to control their
127
00:10:27,400 --> 00:10:34,440
identities. And that's sort of a huge impact just there, if we can get them into
128
00:10:34,440 --> 00:10:41,960
just using the basics of security. Because I don't think we have that many organizations
129
00:10:41,960 --> 00:10:48,040
that have started implementing Zero Trust at full scale. They've just started a little bit here
130
00:10:48,040 --> 00:10:55,000
and a little bit there. And then, you know, yes, some people have maybe added that they're
131
00:10:55,000 --> 00:11:02,440
not allowed to use ChatGPT, Gemini, Grok from their web browsers, but only from Edge, for example.
132
00:11:02,440 --> 00:11:09,240
So it's still fully open in Chrome or anything else. And they have a sort of patchwork security
133
00:11:09,240 --> 00:11:14,920
aspect; they've started in so many different places, but they don't have any real security.
134
00:11:14,920 --> 00:11:23,080
So I think, when it comes to having Zero Trust enabled and built into everything you do,
135
00:11:23,080 --> 00:11:28,360
I think AI and Copilot will be highly efficient to use within the organization.
136
00:11:28,360 --> 00:11:34,120
Because everything AI will be a risk if you don't have that.
137
00:11:35,880 --> 00:11:46,840
Yeah, I think it's a topic we have to teach the CFO and so on, because I often see:
138
00:11:46,840 --> 00:11:53,000
let's just start with AI, it's cool, we need AI for everything.
139
00:11:53,000 --> 00:11:55,320
Absolutely.
140
00:11:55,320 --> 00:12:05,240
And how do you see sensitivity labels helping with security, especially with Copilot?
141
00:12:05,800 --> 00:12:12,280
Well, it depends on how you set them up, obviously, but if you do set them
142
00:12:12,280 --> 00:12:19,080
up as the sort of standardized Public, Internal, Sensitive, Highly Confidential, you can always
143
00:12:19,080 --> 00:12:24,040
set that Highly Confidential is not allowed to be seen by Copilot, like you can't extract
144
00:12:24,040 --> 00:12:31,400
information from it. So that makes it so much easier not to mishandle that type of data. You can't
145
00:12:31,400 --> 00:12:36,040
ask Copilot to find that information, you can't ask Copilot to retrieve that information from
146
00:12:36,040 --> 00:12:43,640
one Word document to another if you have that label on. And you can tell certain SharePoint sites to
147
00:12:43,640 --> 00:12:49,240
have that type of label and the same things apply. You can also set different settings on the SharePoint
148
00:12:49,240 --> 00:12:55,800
site, though. So one SharePoint site can be excluded from Copilot, so you can't find information there.
149
00:12:55,800 --> 00:13:01,240
But then again, using sensitivity labels, if you do it right, you can also add
150
00:13:01,560 --> 00:13:07,560
not only with Copilot, but you could add DLP policies. So a highly confidential document is not
151
00:13:07,560 --> 00:13:13,480
allowed to be sent out of the house, for example, just to maybe a certain set of people or domains.
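The DLP idea mentioned here, blocking highly confidential content from leaving except to an approved set of domains, can be reduced to a toy rule. The domain names are invented for the example; real DLP policies are configured in Purview, and this sketch only mirrors the decision they encode.

```python
# Hypothetical approved recipient domains for this example.
APPROVED_DOMAINS = {"contoso.com", "contoso.no"}

def sharing_allowed(label: str, recipient: str) -> bool:
    """Highly confidential mail may only go to approved domains."""
    if label != "Highly Confidential":
        return True
    domain = recipient.rsplit("@", 1)[-1].lower()
    return domain in APPROVED_DOMAINS

print(sharing_allowed("Highly Confidential", "eva@contoso.com"))  # True
print(sharing_allowed("Highly Confidential", "x@gmail.com"))      # False
```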
152
00:13:13,480 --> 00:13:20,440
So you have so much more security towards those. And when you do a Word document and you have a
153
00:13:20,440 --> 00:13:27,400
public Word document and you retrieve information from a sensitive document, that originally
154
00:13:27,400 --> 00:13:33,560
public document will become sensitive, because it inherits the strictest confidentiality label.
155
00:13:33,560 --> 00:13:40,120
And that will make people a bit more aware of what they're actually typing and doing within that
156
00:13:40,120 --> 00:13:48,360
area of using AI. So when I start, for my understanding, sensitivity
157
00:13:48,360 --> 00:13:58,280
labels are the second part; I have to do data classification first. Yeah. Yeah. So you start
158
00:13:58,280 --> 00:14:02,200
with sensitive info types to classify all your data, and then you can add labels to them.
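The inheritance behaviour Åsne describes, content copied into a more public document drags the stricter label with it, comes down to a strictest-label-wins rule. The four-tier priority order below is an assumed example of the standard scheme, not a fixed Purview default.

```python
# Assumed example priority order: higher number means stricter.
PRIORITY = {"Public": 0, "Internal": 1, "Sensitive": 2, "Highly Confidential": 3}

def inherited_label(destination: str, source: str) -> str:
    """When content moves between documents, the stricter label wins."""
    return max(destination, source, key=PRIORITY.__getitem__)

# A public document pulling in sensitive content becomes sensitive.
print(inherited_label("Public", "Sensitive"))
```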
159
00:14:02,200 --> 00:14:09,480
And how can we balance between, I say, productivity and security when we start,
160
00:14:09,480 --> 00:14:16,840
or when we go for Copilot readiness? Oh, good question, because it depends on where you're at as well,
161
00:14:16,840 --> 00:14:23,960
because if you don't have anything at all going towards security, it's going to be a task, to say
162
00:14:23,960 --> 00:14:31,960
the least, to get started. But sometimes I say that you can have a pilot group who have access
163
00:14:31,960 --> 00:14:40,520
to Copilot and then work on a security track in parallel with that. But you should then have the pilot
164
00:14:40,520 --> 00:14:47,640
group be very aware of the possibility to find data that's a bit too sensitive. Make sure that
165
00:14:47,640 --> 00:14:54,680
they report back to somebody who can remediate that and make sure that you have something set up
166
00:14:54,680 --> 00:15:00,440
for the pilot group to not overshare everything. But then again, if you start with a
167
00:15:00,440 --> 00:15:06,040
Purview track with sensitivity labels and all that, you should also go back and look at SharePoint
168
00:15:06,040 --> 00:15:11,960
and see everything you've overshared, because that's going to be a lot. There's also SharePoint
169
00:15:11,960 --> 00:15:16,680
Advanced Management now. It seems like I'm selling all these solutions, but they're there. They're
170
00:15:16,680 --> 00:15:23,080
in Microsoft 365 now, so you have all those solutions anyway. But using SharePoint Advanced Management,
171
00:15:23,080 --> 00:15:29,160
looking at broken inheritance, sharing risks and all that, it's also something that you really
172
00:15:29,160 --> 00:15:37,960
should do before you can be AI ready. So yeah, classification, oversharing, and then it's a bit
173
00:15:37,960 --> 00:15:45,880
safer. So, I see a lot of people, it's just like a buzzword, Copilot readiness,
174
00:15:45,880 --> 00:15:54,280
so we also have it in the title, but it's a little bit of a buzzword. What do you really mean by
175
00:15:54,280 --> 00:16:01,320
Copilot readiness? What does it actually mean for you, from your perspective? Yeah, I think I would
176
00:16:01,320 --> 00:16:06,760
have basically just removed Copilot from it, because it's more of a sort of tenant or
177
00:16:06,760 --> 00:16:14,760
M365 readiness thing, because these are things you should already have in place. It's not something new,
178
00:16:14,760 --> 00:16:21,800
it's not something that just Copilot needs to have in place to be able to work safely.
179
00:16:21,800 --> 00:16:28,440
It's something you should have done ages ago. I had a customer telling me they've got a saying that
180
00:16:28,440 --> 00:16:35,320
it should have happened last year, so start yesterday. And this is something that everybody
181
00:16:35,320 --> 00:16:41,720
should have in place right now, whether or not you're using Copilot, because it's all about
182
00:16:41,720 --> 00:16:47,640
securing your data and knowing how to handle your data. And I think that having the possibility now,
183
00:16:47,640 --> 00:16:54,040
even without Copilot, of sending emails everywhere with attachments with no security behind them, it's just,
184
00:16:54,040 --> 00:17:00,760
yeah, it's encrypted, but you don't know if the recipient can forward it or print it or copy the
185
00:17:00,760 --> 00:17:06,120
content. You have no clue where it's sent, basically, because you share something and
186
00:17:06,120 --> 00:17:11,640
then you don't know where it goes. So I think having that foundation, it should be like a
187
00:17:11,640 --> 00:17:20,120
Microsoft 365 readiness or foundation, and not necessarily Copilot readiness. But again, it's
188
00:17:20,120 --> 00:17:24,920
buzzwords because that's what people want. They want the buzzwords.
189
00:17:24,920 --> 00:17:33,800
It's like, we've talked for years about governance and so on, and security, and I think,
190
00:17:33,800 --> 00:17:42,120
yeah, this is something we tried, I don't know, when Fabric came: all we have to do, all
191
00:17:42,120 --> 00:17:48,120
this Fabric readiness, we have to do all this stuff. And the companies, I think they actually,
192
00:17:48,120 --> 00:17:56,840
this has changed a little bit since AI is there, because they see more productivity or
193
00:17:56,840 --> 00:18:03,480
faster productivity. When you build up, I don't know, a data warehouse, it takes two years'
194
00:18:03,480 --> 00:18:09,880
time, and then they see, okay, with AI I directly get an email or something else. I think
195
00:18:09,880 --> 00:18:17,240
this brings the governance topic more inside the companies.
196
00:18:17,240 --> 00:18:30,760
But, yeah, when we think about how organizations should prepare their data, and also not just their data,
197
00:18:30,760 --> 00:18:35,480
I think more their employees, how should they prepare their employees for enabling AI?
198
00:18:35,480 --> 00:18:47,480
Yeah, I agree with this, because AI is the new shiny tool, right? Everybody wants it, so that's
199
00:18:47,480 --> 00:18:53,160
why people are more eager and more open to listen now compared to, like I said, with getting
200
00:18:53,160 --> 00:18:59,240
Fabric ready. But it's more of a personal tool, so everybody can benefit from it. It's
201
00:18:59,240 --> 00:19:05,080
easier to see value straight away, so that's why everybody wants it and everybody's more
202
00:19:05,080 --> 00:19:16,520
passionate about getting it, basically. But it's certainly something that we
203
00:19:16,520 --> 00:19:26,040
need to discuss with both the end users and the management: how can we prepare the users for
204
00:19:26,040 --> 00:19:30,760
this? Because everybody talks about user adoption, right, and getting that up and
205
00:19:30,760 --> 00:19:36,040
getting people to understand how important that is. But I think the key value is making sure
206
00:19:36,040 --> 00:19:43,160
that employees know how to use it. Like, they need to have actual use cases, and they need to know
207
00:19:43,160 --> 00:19:49,480
what they're finding. Like I said, Copilot will just show what you have access to,
208
00:19:49,480 --> 00:19:56,760
but they need to understand that if you have 20 files called budget, Copilot will find those 20
209
00:19:56,760 --> 00:20:04,440
files called budget and not always find the one most relevant to you. So you need to be able to
210
00:20:04,440 --> 00:20:10,840
delete old data. You need to trust that having version control in Microsoft 365 is enough,
211
00:20:10,840 --> 00:20:17,640
that you don't need 20 copies of one file; it's all right to have just one. And I think that
212
00:20:18,440 --> 00:20:25,080
we are a bunch of hoarders, right, because we are so afraid of deleting stuff because we might use it
213
00:20:25,080 --> 00:20:34,280
again one day. Both at home and with data, I'm really good at deleting. Like, I've created
214
00:20:34,280 --> 00:20:40,040
presentations that are two years old. They give me no value today because the information
215
00:20:40,040 --> 00:20:46,120
in them is outdated. I might save some of the images I have, but I will delete the PowerPoints because
216
00:20:46,120 --> 00:20:51,960
they give me absolutely no value now. And the same goes for a lot of Word files. If they don't have
217
00:20:51,960 --> 00:21:00,360
any sort of archival, legal or regulatory demand to store them, delete them, because anything you created
218
00:21:00,360 --> 00:21:07,480
two to five years ago will be outdated. And I think that's something that users need to understand
219
00:21:07,480 --> 00:21:16,040
when we do user adoption. One of the key points is have good quality data. That's going back to
220
00:21:16,040 --> 00:21:22,520
shit in, shit out. If you have bad data, and you have data that's so outdated that it won't give
221
00:21:22,520 --> 00:21:29,000
you any value, you know, it's not going to give you anything. What would be your
222
00:21:29,000 --> 00:21:40,920
elevator pitch when you speak to, I don't know, executives? I'm sorry,
223
00:21:41,640 --> 00:21:49,400
when you explain AI governance, but as an elevator pitch? Oh, I'm not sure if I have an
224
00:21:49,400 --> 00:21:57,480
elevator pitch, because I usually get more time. It's a Burj Khalifa elevator.
225
00:21:57,480 --> 00:22:05,480
Oh, I think, when I try to express and talk about what governance is, it's a little bit like:
226
00:22:05,480 --> 00:22:11,480
right now, if you have AI and you point Copilot towards it without having any governance in place,
227
00:22:11,480 --> 00:22:16,840
it will be like taking every file you have, shuffling them around and just throwing them out onto the
228
00:22:16,840 --> 00:22:21,960
street. That's basically what you do. You have no clue what's sensitive, you have no clue what
229
00:22:21,960 --> 00:22:28,600
people can find and send out. So I think that's basically how I tell it. It's just shuffling all
230
00:22:28,600 --> 00:22:33,880
the files and throwing them out, because everybody can find them, everybody can create something out of them,
231
00:22:33,880 --> 00:22:39,000
and everybody can send them wherever they want. And I think that's a little bit of an
232
00:22:39,000 --> 00:22:47,080
eye-opener for a lot of C-level people. And if I'm lucky enough to do a Purview sort of
233
00:22:47,080 --> 00:22:54,600
analysis first and can go and see how many identity numbers, how many passport numbers, how many
234
00:22:54,600 --> 00:22:59,880
credit card numbers and so on they have in their environment before I sort of get to them,
235
00:22:59,880 --> 00:23:06,680
I can showcase: yeah, look here, you have this many personal identity numbers within your
236
00:23:06,680 --> 00:23:11,960
tenant. Do you think this is viable towards all the laws you have to follow? And then they're like,
237
00:23:11,960 --> 00:23:18,600
oh, yeah, we didn't know that. So usually you get attention quite quickly when you talk about how
238
00:23:18,600 --> 00:23:24,680
bad things can look. I've been places where C-level people have
239
00:23:24,680 --> 00:23:31,320
said, yeah, no, we have nothing, because people are not allowed to store information like that
240
00:23:31,320 --> 00:23:36,840
in Microsoft 365. We have systems for that. And then you go into Purview, you do your search,
241
00:23:36,840 --> 00:23:44,680
and then, yeah, you have about 120 here, and then you have 300 there; that's a thousand, not just a hundred. But, yeah.
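A scan like the Purview analysis described here, tallying identity and card numbers across content, can be sketched as a count-by-pattern pass. The regexes are simplified stand-ins for real Purview sensitive info types (the Norwegian fødselsnummer pattern is reduced to "11 digits"), and a real tenant scan would of course run inside Purview, not over in-memory strings.

```python
import re

# Simplified stand-in patterns, not real Purview classifiers.
PATTERNS = {
    "credit card": re.compile(r"\b(?:\d[ -]?){15,16}\b"),
    "NO identity number": re.compile(r"\b\d{11}\b"),  # fødselsnummer: 11 digits
}

def count_hits(documents: list[str]) -> dict[str, int]:
    """Tally matches per sensitive info type across a set of documents."""
    return {
        name: sum(len(rx.findall(doc)) for doc in documents)
        for name, rx in PATTERNS.items()
    }

docs = ["card 4111 1111 1111 1111", "fnr 01019912345"]
print(count_hits(docs))
```

Even a crude count like this is the kind of before-and-after number that, as described above, gets C-level attention quickly.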
242
00:23:44,680 --> 00:23:55,480
So they usually listen. And I had one company where we talked about the concerns
243
00:23:55,480 --> 00:24:01,080
of discovering data that we shouldn't see, and stuff like that. And we were quite serious about it:
244
00:24:01,400 --> 00:24:06,520
we can find so many things if you turn Copilot on, and it can be really bad, and stuff like that.
245
00:24:06,520 --> 00:24:11,320
And the management said, well, that's good because then we will actually see that we have a problem
246
00:24:11,320 --> 00:24:18,040
and then we can fix it. Like, yeah, we can. So it was a great project; I was able to do everything.
247
00:24:18,040 --> 00:24:25,720
But some people also tend to just close their eyes and, yeah, no, don't want to hear about it.
248
00:24:27,320 --> 00:24:36,360
When we think about starting with Copilot and AI adoption, who should, in your perspective, be responsible
249
00:24:36,360 --> 00:24:42,600
for the topic? Is it the security team, is it a stakeholder, or is it the financial team?
250
00:24:42,600 --> 00:24:49,720
I think it's representatives from all departments, depending on the size of the organization.
251
00:24:49,720 --> 00:24:57,080
You need IT there, both in security and just overall management, management for Microsoft
252
00:24:57,080 --> 00:25:03,640
365. But you also need somebody from C-level, because they need to own it.
253
00:25:03,640 --> 00:25:10,120
It's not enough for IT to own it, because coming from in-house IT myself, I know
254
00:25:10,120 --> 00:25:15,640
you can stand on top of a table and scream, but nobody will listen. So you need to have
255
00:25:15,640 --> 00:25:19,880
the management on board. You need to have them with you and say that, yes, we need to do this.
256
00:25:19,880 --> 00:25:26,440
Otherwise, nothing will happen. And you need people from all different areas because you need to
257
00:25:26,440 --> 00:25:33,640
have the ones who can make sure that endpoints are secured. You need identity secured. You need
258
00:25:33,640 --> 00:25:39,480
the data secured. You need SharePoint secured. Then you need to talk with finance because they need
259
00:25:39,480 --> 00:25:44,040
to understand that this might be a cost now, but it will save you in the long run. And you need to
260
00:25:44,040 --> 00:25:48,520
have C-level people on board to say that, yes, this is what we're going to do. This is our strategy.
261
00:25:48,520 --> 00:25:55,880
This is where we're going. So you need a group of all of those people. And usually I can thank
262
00:25:55,880 --> 00:26:03,320
God to say that most of my projects have that sort of AI project team around it. And of course,
263
00:26:03,320 --> 00:26:08,280
C-levels don't always have to be a part of the entire project, like the day-to-day project,
264
00:26:08,280 --> 00:26:13,160
but they need to be a part of it, take the decisions and give their support to the project.
265
00:26:13,160 --> 00:26:23,000
Yeah, like a sponsor or an owner, let's say. Yeah, I think that's a really, really good idea.
266
00:26:23,000 --> 00:26:32,600
When we think a little bit, I have seen the, is it Copilot for Security? I don't know if that's
267
00:26:32,600 --> 00:26:38,680
the right name for it, but I've seen the intro. I wonder,
268
00:26:38,680 --> 00:26:46,840
when we look at the next three years, or I don't know if it's already possible: can I really run my security
269
00:26:46,840 --> 00:26:58,200
with this Copilot? Well, I have tried it a little bit, and it's supposed to, when you have E5
270
00:26:58,200 --> 00:27:03,720
licenses, well, I'm not going to go into the license details, because I don't have a PhD
271
00:27:03,720 --> 00:27:13,480
in Microsoft licensing, and it's too difficult to understand. But if you have subscriptions
272
00:27:13,480 --> 00:27:18,840
to, what is it called, and you have everything connected to Security Copilot, it can do a lot of good
273
00:27:18,840 --> 00:27:25,960
things, but I think it's mostly useful for investigating, like when incidents have happened,
274
00:27:25,960 --> 00:27:31,000
and not necessarily when you need to set up your environment. So it's more of a, how can I,
275
00:27:31,000 --> 00:27:36,600
how can I figure out what's gone wrong now, sort of thing. Okay. Some other people might disagree
276
00:27:36,600 --> 00:27:45,480
with me, but that's how I've looked at it so far. That's awesome. And, yeah, do you think in the next
277
00:27:45,480 --> 00:27:52,200
three years we need to think more about security, or will it be less, because we have all these
278
00:27:52,200 --> 00:28:00,840
nice tools and automations? I think we need more. And that's based on what I've seen
279
00:28:00,840 --> 00:28:06,120
with my customers and other organizations and other people I talk to, because nobody is
280
00:28:06,120 --> 00:28:13,480
100% there yet with security. And when I talk about security in this part, it's everything
281
00:28:13,480 --> 00:28:20,360
security, not just Purview. But we still have a long way to go, and I think that having tools
282
00:28:20,360 --> 00:28:27,400
like Security Copilot and other AI tools still needs the human touch. We still need to
283
00:28:27,400 --> 00:28:32,920
understand everything that goes on. We still need to understand how we can secure stuff. And I think
284
00:28:32,920 --> 00:28:37,880
that if we did everything suggested by Microsoft, for example, in the different tools like
285
00:28:37,880 --> 00:28:42,360
Purview and Defender and all that, if you did everything that's suggested to keep your environment
286
00:28:42,360 --> 00:28:47,960
secure, you wouldn't be able to turn on your computer. Because there are so many features
287
00:28:47,960 --> 00:28:53,640
that are just a bit too strict when you put them all together. So you still need
288
00:28:53,640 --> 00:29:01,000
that understanding of how things work, and you still need to set it up with a strategy
289
00:29:01,000 --> 00:29:06,680
in mind. Because, especially in the Nordics, like I work with all the Nordic countries,
290
00:29:06,680 --> 00:29:13,400
and especially here, we might not have all the large organizational structure that a couple
291
00:29:13,400 --> 00:29:19,320
of the American companies have. So we need to sort of dial it down a little bit, but then we need to
292
00:29:19,320 --> 00:29:25,720
think of other types of security to make sure that we still secure our environment. So I think it's,
293
00:29:26,840 --> 00:29:33,640
I don't think AI will solve our security concerns. I think it might make them worse.
294
00:29:33,640 --> 00:29:38,200
When you see that you have the bad guys as well, also starting to use AI, and they
295
00:29:38,200 --> 00:29:42,360
adopt things way quicker than we are, because they don't have to think about the things we do,
296
00:29:42,360 --> 00:29:48,120
and they don't have to make key decisions on what a sensitivity label should be called.
297
00:29:48,120 --> 00:29:53,720
So they were, they're much quicker than we are. So I think, yeah, yeah, we will still need a lot of
298
00:29:53,720 --> 00:30:01,160
people to handle this. Yeah, when we think that not all companies are as big as, I don't know,
299
00:30:01,160 --> 00:30:07,480
American Express and so on, all these big companies, when we think of small or midsize companies,
300
00:30:07,480 --> 00:30:17,400
what are the three, I think, main tips and tricks you would say you have to do when you start
301
00:30:18,040 --> 00:30:25,000
with Copilot, especially on the security part. Well, I think I still stick to both sort of the
302
00:30:25,000 --> 00:30:33,640
classification of data and sensitivity labels, and the key essentials of going into Entra ID and
303
00:30:33,640 --> 00:30:38,600
having Conditional Access and sort of the simplest but easiest things to set up, just
304
00:30:38,600 --> 00:30:45,880
secure everything. And I work a lot with the public sector, and the public sector, you know, doesn't have
305
00:30:45,880 --> 00:30:52,360
enough money all the time. So a lot of them are still on the E3 license, you know, they, they, so,
306
00:30:52,360 --> 00:30:58,680
so we have a shift there as well. We can't, we can't demand them to take higher licenses because
307
00:30:58,680 --> 00:31:03,400
they don't have the budget for them. So we need to make sure that we can help those as well. And
308
00:31:03,400 --> 00:31:08,360
getting things labeled is still possible; you can do that with E3, you can do that with all
309
00:31:08,360 --> 00:31:15,000
the licenses. So if you start there and make sure that you at least have that covered, I think
310
00:31:15,800 --> 00:31:21,560
you have to just build on that. Because, again, do that and then you can get
311
00:31:21,560 --> 00:31:30,040
Copilot. But yeah, I think it's easy for me to say what you need to have done,
312
00:31:30,040 --> 00:31:38,840
and it's always easy if you have the new E5 bundle. But getting the basics, just having that
313
00:31:38,840 --> 00:31:46,040
basic structure and if you don't know how to do it, talk to a grownup, talk to a grownup who knows
314
00:31:46,040 --> 00:31:54,440
something about it, to get help to get started. Okay. I want to jump a little bit into a
315
00:31:54,440 --> 00:32:00,760
scenario. So when a company comes, I don't know, and says, we need Copilot immediately,
316
00:32:01,640 --> 00:32:12,600
what will you do first? I'm going to, I might ask them why? Why do you need it now? Is there any
317
00:32:12,600 --> 00:32:19,400
reason why you need it now? Maybe because I work as a consultant in a consulting company,
318
00:32:19,400 --> 00:32:24,200
so we have salespeople. And I think some of the salespeople get a little bit scared when I start
319
00:32:24,200 --> 00:32:27,880
talking, because I'm like, yeah, but why do you need it? What are you going to do with it?
320
00:32:27,880 --> 00:32:34,680
Instead of just, yes, yes, you can have it. But it's more like, Copilot is one thing,
321
00:32:34,680 --> 00:32:39,880
Copilot licenses are one thing, because it's productivity help and stuff like that. But then you have
322
00:32:39,880 --> 00:32:45,240
companies who say, ah, we need agents, so we need Copilot Studio. So I'm like, okay, what are
323
00:32:45,240 --> 00:32:50,600
we going to build agents for? What are their tasks? What do you need them for? Why do you need them?
324
00:32:50,600 --> 00:32:55,400
And they say, no, we just want to build agents. I'm like, okay, so you're going to take on a huge cost now,
325
00:32:55,400 --> 00:33:00,920
but you don't have any use cases for it. And I think that's one of the key questions for anything AI:
326
00:33:00,920 --> 00:33:07,720
what's your use case? What are you trying to achieve? And obviously Copilot, Microsoft 365 Copilot,
327
00:33:07,720 --> 00:33:13,720
that is productivity help, really everything you need to make your day smoother. So it's sort of
328
00:33:13,720 --> 00:33:19,160
a given if people want that. I will always suggest that, yeah, but have you looked at just security
329
00:33:19,160 --> 00:33:26,200
before people buy it? But when it comes to talking about getting a license for something, I think
330
00:33:26,200 --> 00:33:30,520
Copilot Studio might be something I always question a little bit, like, what are you trying to
331
00:33:30,520 --> 00:33:38,600
achieve here? Because it is a huge cost, especially getting the license and attaching subscriptions
332
00:33:38,600 --> 00:33:45,160
and all that. But I think having the security basis, fair enough, that's one thing,
333
00:33:45,160 --> 00:33:49,960
but then again, finding the use case, what you are trying to achieve, what your agenda is, is another
334
00:33:49,960 --> 00:34:01,880
key thing before you get into anything AI. And when we think, how do you convince leadership
335
00:34:01,880 --> 00:34:06,920
to invest in governance and compliance? What's your tip?
336
00:34:06,920 --> 00:34:14,520
Getting them to invest in it is always difficult. It's easier when you have private companies than
337
00:34:14,520 --> 00:34:21,160
the public sector. Obviously, again, the public sector doesn't have much money, so it's like,
338
00:34:21,160 --> 00:34:27,080
"what's your best practice" is always their demand. "What are the key features we need to do?"
339
00:34:27,080 --> 00:34:35,480
But getting them to understand that this is a cost now that will save you in the long run and not
340
00:34:35,480 --> 00:34:43,720
only saving, not necessarily saving money, but saving you, you know, from loss of data. Because loss of data
341
00:34:43,720 --> 00:34:49,160
can be a money thing, you know, it can be the actual costs. You can get a fine in Norway,
342
00:34:49,160 --> 00:34:54,280
like if you have sensitive data that you shouldn't have or send that somewhere you shouldn't,
343
00:34:54,280 --> 00:34:59,480
or it's open to everyone, that can give you a huge fine. And that can be crippling for some.
344
00:34:59,480 --> 00:35:05,480
Other than that, it's like having sensitive data that nobody should know about,
345
00:35:05,480 --> 00:35:13,400
and that suddenly is public, that could be a tremendous exposure, and then you can lose
346
00:35:13,400 --> 00:35:20,520
credibility with your customers, or maybe your competition can find something they shouldn't
347
00:35:20,520 --> 00:35:27,400
know, and stuff like that. So when you talk about this to management, you know, they just
348
00:35:27,400 --> 00:35:37,000
see dollar signs going away. So then they really know the importance of doing it. I think it's sometimes
349
00:35:37,000 --> 00:35:44,840
difficult to sort of get projects like this started, because the project doesn't
350
00:35:44,840 --> 00:35:51,800
bring back revenue straight away. So it's like it's a cost, but it doesn't give anything back in their
351
00:35:51,800 --> 00:35:59,800
eyes. Yeah, I see. They're usually keen to have it, but they don't know how to explain it.
352
00:36:01,560 --> 00:36:11,000
I saw this really interesting tool, I'd say redweed, no marketing here, but they have a
353
00:36:11,000 --> 00:36:23,960
tool where they show how much you can save on storage space and on, yeah, I think expired guest access,
354
00:36:23,960 --> 00:36:34,920
and so on in your tenant. So I think that's one part. The other part is, I think, yeah,
355
00:36:34,920 --> 00:36:42,200
I don't know, in Europe, I can say in Europe, you know, it could be really expensive if you
356
00:36:42,200 --> 00:36:51,160
aren't safe, especially with the data. I think it's a minimum of $5 million, and I think the
357
00:36:52,920 --> 00:37:02,120
EU AI Act says it's $10 million, I don't know, minimum, when you, when you have a
358
00:37:02,120 --> 00:37:11,240
security issue. So I think, yeah, those are the two arguments
359
00:37:11,240 --> 00:37:22,600
we can bring. So, uh, today I was reading a little bit, I think, and,
360
00:37:22,600 --> 00:37:27,320
and I read some headlines, and now we come to my favorite part, the hot topics.
361
00:37:27,320 --> 00:37:35,160
So the first headline was, uh, Copilot readiness is actually a data governance problem.
362
00:37:35,160 --> 00:37:40,520
Uh, yeah, so, what, what will you say to this headline?
363
00:37:40,520 --> 00:37:47,720
No, I completely agree. Uh, that's what I've said earlier, as well, this is, it's got nothing to
364
00:37:47,720 --> 00:37:52,840
do with AI. It's got everything to do with your sort of framework and everything in your tenant,
365
00:37:52,840 --> 00:38:00,200
and it's a data readiness project, um, more than a Copilot project. So I think, uh,
366
00:38:00,200 --> 00:38:05,320
saying that Copilot readiness is actually a data governance problem is completely correct.
367
00:38:05,320 --> 00:38:12,920
And another headline I read today was, the truth about SharePoint and Teams oversharing,
368
00:38:12,920 --> 00:38:19,720
that was the headline, and I think the best sentence in it was,
369
00:38:19,720 --> 00:38:25,240
how years of uncontrolled collaboration created a significant exposure risk for AI.
370
00:38:25,240 --> 00:38:29,560
I also found it really interesting. What did you think about it?
371
00:38:29,560 --> 00:38:36,680
Well, I think we've, like, shared so many things over the years now, and, like I said,
372
00:38:36,680 --> 00:38:41,320
with the pandemic and everything, when people suddenly just threw everything from
373
00:38:41,320 --> 00:38:47,320
file sharing into Teams and SharePoint because they just needed access to it. Um, and after that,
374
00:38:47,320 --> 00:38:53,400
we have shared so many files and we, we don't realize that when we share a file or we share a folder,
375
00:38:53,400 --> 00:38:59,240
it stays shared. It's shared basically forever if you don't stop sharing,
376
00:38:59,240 --> 00:39:05,160
because there's the link and sharing lives on forever. And a lot of people, uh, because there was
377
00:39:05,160 --> 00:39:11,240
an option earlier on to share with anyone with the link, not just everyone in my organization,
378
00:39:11,240 --> 00:39:17,800
but anyone with the link, and they don't realize what they share. And that again maybe goes back to,
379
00:39:17,800 --> 00:39:25,640
uh, user adoption, because nobody knows. And I think when we started using Teams as well, a lot of
380
00:39:25,640 --> 00:39:30,280
companies weren't ready before the pandemic to start using Teams, so they hadn't started the journey.
381
00:39:30,280 --> 00:39:37,240
And then the pandemic hit and then everybody started using Teams. And right now it's, uh, sort of,
382
00:39:37,240 --> 00:39:43,160
we still live on that, and people just assume that everybody knows how to use Teams because
383
00:39:43,160 --> 00:39:48,600
you had to use it in the pandemic. But in the pandemic, people created Teams just to have a meeting.
384
00:39:48,600 --> 00:39:53,320
So you also have old legacy there, right? You have old Teams called meetings,
385
00:39:53,320 --> 00:39:58,200
status meetings, planning meetings, something like that. People forgot about those
386
00:39:58,200 --> 00:40:02,760
Teams. Nobody in the company works on that anymore.
387
00:40:02,760 --> 00:40:08,840
And we still have the sharing links there. They still live there. And, and, you know, nobody takes ownership.
388
00:40:08,840 --> 00:40:15,640
And that I think is one of the key issues with data sharing. And the thing is that
389
00:40:15,640 --> 00:40:22,200
we don't delete stuff because nobody takes ownership of the files. We still have that folder from Mike,
390
00:40:22,200 --> 00:40:27,960
who quit 10 years ago, because nobody wants to delete that folder because it might be important.
391
00:40:27,960 --> 00:40:34,040
And the same goes for sharing. So oversharing in Teams and SharePoint is one of the key reasons
392
00:40:34,040 --> 00:40:41,480
why Copilot might find data it shouldn't. I think doing an assessment of that is really important.
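The kind of assessment Åsne describes can be approximated with a small script once you have a list of sharing links. A minimal sketch on mock data, assuming illustrative field names — in practice the records would come from Microsoft Graph or SharePoint admin exports:

```python
from datetime import date

# Mock sharing-link records; the record shape here is an assumption,
# not a real export format.
links = [
    {"file": "budget.xlsx", "scope": "anyone", "created": date(2020, 4, 1)},
    {"file": "roadmap.docx", "scope": "organization", "created": date(2025, 9, 15)},
    {"file": "salaries.xlsx", "scope": "anyone", "created": date(2021, 1, 10)},
]

def flag_risky_links(links, today, max_age_days=365):
    """Flag links shared with 'anyone' or older than max_age_days,
    since links keep working until someone explicitly stops sharing."""
    risky = []
    for link in links:
        age = (today - link["created"]).days
        if link["scope"] == "anyone" or age > max_age_days:
            risky.append(link["file"])
    return risky

print(flag_risky_links(links, date(2026, 1, 1)))
# → ['budget.xlsx', 'salaries.xlsx']
```

Even this crude pass makes the point of the episode concrete: "anyone" links and years-old shares are exactly what Copilot will happily surface.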
393
00:40:41,480 --> 00:40:49,560
Yeah, two or three weeks ago I saw a session, and this was from
394
00:40:49,560 --> 00:40:58,200
a security company. And it was also about Teams, and they have, I call it, I don't know, the
395
00:40:58,200 --> 00:41:06,040
birthday channels and, and so on. And they also share holiday pictures. And there was, uh, one woman,
396
00:41:06,040 --> 00:41:12,920
where someone writes, yeah, every year in the same week I'm on holiday. Then they showed a really
397
00:41:12,920 --> 00:41:21,080
cool process of how they pull that information out, and then how social
398
00:41:21,080 --> 00:41:28,840
engineering works, and how people use this data, especially to hack a company.
399
00:41:28,840 --> 00:41:35,240
So this was a really interesting session. And I think this is a risk a lot of people don't
400
00:41:35,240 --> 00:41:41,800
understand, because there is so much more information in the data. It's not only the Word
401
00:41:41,800 --> 00:41:47,640
file, the text; it's when it was created, where it was created, who created it, what department, all this,
402
00:41:47,640 --> 00:41:52,520
inside this data, about all the people. Yeah, I think it's really, really
403
00:41:52,520 --> 00:41:59,960
interesting. And then I got to the next headline I read, which was Purview versus reality.
404
00:41:59,960 --> 00:42:08,040
What, yeah, what my security tools can really do, and why organizations still fail operationally.
405
00:42:08,920 --> 00:42:18,680
Yeah. Well, again, um, Purview is not perfect. There's still some way to go towards,
406
00:42:18,680 --> 00:42:24,520
you know, making it perfect. Still, with the built-in sensitive info types, the built-
407
00:42:24,520 --> 00:42:30,360
in sensitive info types can show, like if you go into medical terms, it will show you
408
00:42:30,360 --> 00:42:35,400
things that are maybe not medical terms, and there's a lot there that will give you
409
00:42:35,400 --> 00:42:40,840
false positives, but if you create your own and you have a structure on it, it can be really good.
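The idea behind custom sensitive info types — a primary pattern that only counts when corroborating keywords appear nearby — can be sketched in a few lines. This is a loose, illustrative model of that technique, not Purview's actual engine; the pattern (an 11-digit Norwegian national ID) and keyword list are assumptions for the example:

```python
import re

# Primary element: an 11-digit number (illustrative, like a Norwegian
# fødselsnummer). On its own it produces false positives (order numbers,
# phone numbers), so we require a supporting keyword nearby.
PRIMARY = re.compile(r"\b\d{11}\b")
KEYWORDS = ("fødselsnummer", "national id", "personnummer")
WINDOW = 100  # characters of context to scan around each match

def find_sensitive(text):
    """Return primary-pattern matches that have a supporting keyword
    within the proximity window, cutting down false positives."""
    hits = []
    for m in PRIMARY.finditer(text):
        start = max(0, m.start() - WINDOW)
        context = text[start:m.end() + WINDOW].lower()
        if any(k in context for k in KEYWORDS):
            hits.append(m.group())
    return hits

print(find_sensitive("Order ref 12345678901 confirmed."))           # → []
print(find_sensitive("Fødselsnummer: 01019912345 for the patient"))  # → ['01019912345']
```

The same structure (pattern plus supporting evidence plus proximity) is what makes a well-built custom info type so much more precise than a bare regex.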
410
00:42:40,840 --> 00:42:49,960
But I think if you start with the basics and you keep everything sort of easy there,
411
00:42:49,960 --> 00:42:54,680
it's so much better than nothing at all. But usually when I come in to companies, it's, yeah, yeah,
412
00:42:54,680 --> 00:42:59,320
we started using labels, and I'm like, okay, how many documents have you been able to label?
413
00:42:59,320 --> 00:43:06,280
Around 10. Because they haven't enforced it, they just let users decide themselves
414
00:43:06,280 --> 00:43:11,880
um, if they want to use it or not. And then there are other companies you come into, and they have,
415
00:43:11,880 --> 00:43:17,960
yeah, we started using labels and we have a default label saying internal, so all their files are
416
00:43:17,960 --> 00:43:24,600
internal, because the end users don't make that conscious choice themselves when creating
417
00:43:24,600 --> 00:43:30,360
information. They just go in and, yeah, they don't know about the labels,
418
00:43:30,360 --> 00:43:33,960
because everything is created as internal anyway, so they just don't know it exists.
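The gap between "we use labels" and real coverage is easy to quantify once label metadata is exported. A small sketch over mock records — in practice the numbers would come from a Purview content explorer export, and the record shape here is an assumption:

```python
from collections import Counter

# Mock per-document label metadata; real data would be exported from
# Purview. The field names are illustrative.
docs = [
    {"name": "q1.xlsx", "label": "Internal"},
    {"name": "offer.docx", "label": "Internal"},
    {"name": "notes.txt", "label": None},
    {"name": "contract.pdf", "label": "Confidential"},
]

def label_report(docs, default_label="Internal"):
    """Summarize label coverage, and how much of the 'coverage' is
    really just the tenant's default label applied automatically."""
    counts = Counter(d["label"] or "(unlabeled)" for d in docs)
    labeled = sum(1 for d in docs if d["label"])
    return {
        "coverage": labeled / len(docs),
        "default_share": counts[default_label] / max(labeled, 1),
        "by_label": dict(counts),
    }

report = label_report(docs)
print(f"labeled: {report['coverage']:.0%}, of which default: {report['default_share']:.0%}")
```

A high coverage number dominated by the default label is exactly the "everything is internal" anecdote above: technically labeled, but nobody made a conscious classification choice.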
419
00:43:33,960 --> 00:43:41,960
So when you look at everything in Purview that you're capable of doing
420
00:43:41,960 --> 00:43:46,040
and everything and you look at the reality, there's a huge mismatch there,
421
00:43:47,960 --> 00:43:55,960
like huge, um, it's getting better, um, some, some people do read my blog, so some people are starting
422
00:43:55,960 --> 00:43:59,880
to create things and some of my customers are listening to me, so we're starting to get there,
423
00:43:59,880 --> 00:44:09,880
but then again, not all parts of Purview are perfect, but that's not a reason to
424
00:44:09,880 --> 00:44:16,280
not start doing it. Right, you need to start somewhere, so you'd better start with the products you
425
00:44:16,280 --> 00:44:22,920
have already integrated in everything instead of trying to buy third-party tools that you don't know
426
00:44:22,920 --> 00:44:28,120
are perfect either. So start with the simple things you can control and you already have access to
427
00:44:28,120 --> 00:44:34,600
in the Microsoft world. As long as you're in the Microsoft world, you know, it's easy to just get started.
428
00:44:34,600 --> 00:44:43,160
So, yeah, uh, I also want to say, the listeners will find all the links to your
429
00:44:43,160 --> 00:44:49,800
profiles and your blog in the show notes. And then I have, uh, this is my favorite,
430
00:44:49,800 --> 00:44:54,920
well, my favorite headline: AI governance explained without buzzwords.
431
00:44:54,920 --> 00:44:57,960
Governance. Yeah.
432
00:44:57,960 --> 00:45:06,680
Yeah, I totally agree. Basically, what we're talking about has nothing to do with AI;
433
00:45:06,680 --> 00:45:10,600
AI just over-exposes everything and makes it easier for us to talk about now.
434
00:45:11,400 --> 00:45:17,800
I think for the people who have worked in security and Purview for such a long time,
435
00:45:17,800 --> 00:45:24,120
now finally people are starting to listen to us, because now they see that it impacts the day-to-day
436
00:45:24,120 --> 00:45:31,000
work. Um, we just needed AI to sort of expose all of this, so I completely agree. It's just governance.
437
00:45:31,000 --> 00:45:39,400
So awesome. So, yeah, uh, we are running out of time, so then my, my last question is, what message do
438
00:45:39,400 --> 00:45:45,400
you want to leave the listeners with?
439
00:45:45,400 --> 00:45:53,640
I think, um, I usually end a lot of my sessions with: know your data, classify
440
00:45:53,640 --> 00:46:00,920
your data, and protect your data. And I think if you do that, you can do anything. I think, uh,
441
00:46:00,920 --> 00:46:08,680
try to keep the, keep the message simple. Yeah, thank you. This was, uh, also an interesting session,
442
00:46:08,680 --> 00:46:15,240
and I think the people got a good overview, and I hope we made them not so
443
00:46:15,240 --> 00:46:24,760
scared about AI adoption. And, yeah, they can contact you when they have
444
00:46:24,760 --> 00:46:29,400
questions, on LinkedIn or something. And yeah, thank you for being here.
445
00:46:29,400 --> 00:46:34,680
Thank you for having me. This was really fun. Have a great weekend, and I hope a lot of people
446
00:46:34,680 --> 00:46:40,280
will listen and be a bit more secure in what they need to do. Thank you.

Founder of m365.fm, m365.show and m365con.net
Mirko Peters is a Microsoft 365 expert, content creator, and founder of m365.fm, a platform dedicated to sharing practical insights on modern workplace technologies. His work focuses on Microsoft 365 governance, security, collaboration, and real-world implementation strategies.
Through his podcast and written content, Mirko provides hands-on guidance for IT professionals, architects, and business leaders navigating the complexities of Microsoft 365. He is known for translating complex topics into clear, actionable advice, often highlighting common mistakes and overlooked risks in real-world environments.
With a strong emphasis on community contribution and knowledge sharing, Mirko is actively building a platform that connects experts, shares experiences, and helps organizations get the most out of their Microsoft 365 investments.

![The Truth About Microsoft Security and Copilot Readiness with Åsne Holtklimpen [MVP/MCT]](https://img.youtube.com/vi/VzSzfIXZOzI/maxresdefault.jpg)





