Python is NOT the language of AI inside the Microsoft stack—and in this episode, I show you why that belief is quietly wrecking your Power Platform projects, inflating defects, and burning your budget. If you’re cramming Python into Power Automate, Power BI, Fabric, or custom connectors as “glue code,” this is your wake-up call.

We break down why Python is amazing for analytics, ML, and Fabric notebooks—but a terrible choice for everyday orchestration inside Power Automate, Power BI Dataflows Gen2, Dataverse, and Microsoft 365. You’ll learn how Office Scripts (TypeScript-flavoured), Copilot, and agent-style orchestration (like Microsoft’s TypeAgent) can write and run the glue for you, with typed contracts, native connectors, and AI-generated scripts that actually respect your schemas and governance.

Instead of debugging brittle Python in Azure Functions at 2:14 a.m., you’ll see how to:
Keep Python where it shines: Fabric notebooks, advanced analytics, ML.
Use Copilot + Office Scripts to automate Excel, SharePoint, and approvals without custom connectors.
Let agents orchestrate flows with typed tool contracts and pre-run validation.
Cut run costs, cold starts, and defect recurrence by staying inside the Power Platform’s native surfaces.
Collapse build → debug → deploy into a governed conversational loop using prompts as reusable assets.
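To make “typed tool contracts and pre-run validation” concrete, here is a minimal Python sketch. The `ApprovalRequest` shape and `validate` helper are hypothetical illustrations invented for this example, not APIs from the episode:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ApprovalRequest:
    # Hypothetical typed contract for an agent "tool": the payload shape is
    # fixed up front instead of discovered at runtime mid-flow.
    item_id: str
    amount: float
    approver: str

def validate(req: ApprovalRequest) -> list[str]:
    """Pre-run validation: reject a bad payload before the flow executes."""
    errors = []
    if not req.item_id:
        errors.append("item_id is required")
    if req.amount <= 0:
        errors.append("amount must be positive")
    if "@" not in req.approver:
        errors.append("approver must be an email address")
    return errors

ok = ApprovalRequest("INV-42", 125.0, "lee@contoso.com")
assert validate(ok) == []
```

An agent that checks contracts like this surfaces a schema mismatch as a validation error before the run, not as a failure at 2:14 a.m.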

If you’re using Python as duct tape inside Power Automate and Power BI, this episode will show you the hybrid pattern that stops the bleeding—and lets you ship faster with fewer outages.


The claim that "Python is dead" sparks debate among developers. Is it true, or merely a misconception? As you navigate the AI-driven landscape, understanding Python's transformation is vital. Over 51% of Python developers now focus on data exploration and processing, showing Python's growing weight in AI and data science. Yet misconceptions about its capabilities and tooling persist. Weigh Python's relevance and its challenges as you explore its future in this evolving field.

Key Takeaways

  • Python remains a dominant language in AI and data science, with over 51% of developers focusing on data exploration.
  • The language's simple syntax makes it easy to learn, allowing developers to solve problems without complex code.
  • Python's rich ecosystem of libraries, like TensorFlow and PyTorch, accelerates AI development and reduces coding time.
  • Despite performance challenges, Python's community continuously innovates, enhancing its capabilities for modern applications.
  • Emerging trends like ethical AI and edge computing support Python's relevance in the evolving tech landscape.
  • Automation with Python can streamline workflows, freeing up time for more strategic tasks in AI projects.
  • Engaging with the Python community can provide valuable resources and keep you updated on the latest tools and innovations.
  • Python's adaptability ensures it remains a powerful choice for future AI developments, making it essential for your career growth.

Python’s Historical Strengths

Rise in Data Science

Python’s rise in data science did not happen overnight. It grew steadily through key milestones that shaped its popularity and capabilities. You can see how Python evolved from a hobby project into the dominant language in AI and data science by looking at this timeline:

Year | Milestone
1989 | Guido van Rossum begins working on Python as a hobby project.
1991 | Official release of Python.
2000 | Release of Python 2.0, introducing list comprehensions and garbage collection.
Mid-2000s | Rise of big data, machine learning, and social media increases Python's popularity in data science.
2008 | Release of Python 3.0, focusing on language consistency.
2020 | End of life for Python 2; focus shifts to Python 3.
2022 | Python overtakes Java and C in popularity for the first time in 20 years.

This steady progress helped Python become the go-to language for data science tasks. Over the past decade, Python’s adoption rate in data science has surged dramatically. Python’s growth outpaces many other languages, with a 7 percentage point increase from 2024 to 2025 alone—the largest single-year increase for any major programming language in over ten years. Python now leads the US job market with over 64,000 open positions, more than double those for JavaScript and far ahead of Java. These figures underscore Python’s continued dominance in data science today.

Key Features of Python

You will appreciate Python’s simple and readable syntax, which makes it easy to learn and use. This feature allows you to focus on solving AI and data science problems rather than wrestling with complex code. Python’s syntax is more intuitive than many other languages, making it accessible for beginners and efficient for experts. This ease of use accelerates your development process and reduces errors.

Python also supports rapid prototyping, letting you experiment quickly and iterate on AI models without delay. Its cross-platform nature means you can run your AI code on Windows, macOS, or Linux without changes, giving you flexibility in deployment.
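The readability claim is easy to illustrate: a word-frequency count reads almost like the task description itself (the text here is invented):

```python
from collections import Counter

# Split the text into words and tally them in two readable lines
text = "to be or not to be"
counts = Counter(text.split())
assert counts["to"] == 2 and counts["be"] == 2
```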

The rich ecosystem of AI and data science libraries makes Python indispensable. Libraries like TensorFlow, PyTorch, and Scikit-learn provide powerful tools for building and training machine learning models. You can rely on these pre-built components to reduce development time and improve your results. Here are some key libraries that support your AI and data science work:

Library/Framework | Description
TensorFlow | End-to-end machine learning framework widely used in production.
PyTorch | Deep learning framework with a rich ecosystem of tools and libraries.
Keras | User-friendly interface for deep neural networks.
NumPy | Fast numerical computing.
Pandas | Data manipulation and analysis.
Scikit-learn | Consistent APIs for classical machine learning.

Community and Ecosystem

You will find Python’s community one of its greatest strengths. Over half of Python developers work full-time, showing strong professional commitment. The community welcomes newcomers, with 41% of developers having less than two years of experience. This influx keeps Python fresh and innovative.

Python’s popularity also reflects in the job market and salaries. It ranked as the most desirable language to learn in the 2018 Stack Overflow Developer Survey. The average salary for a Python developer in the U.S. reached $115,835 in 2018, highlighting high demand for skilled professionals.

The ecosystem around Python continues to grow, with countless libraries, frameworks, and tools supporting AI and data science. This vibrant community shares knowledge, creates tutorials, and builds open-source projects that help you stay ahead in AI development. By choosing Python, you join a global network of developers pushing the boundaries of what AI and data science can achieve.

Tip: Embrace Python’s ecosystem and community to accelerate your AI projects. Leveraging existing libraries like TensorFlow and engaging with the community can save you time and boost your skills.

AI Challenges for Python

Performance Issues

You might have heard that Python is dead because of its performance limits. Python’s Global Interpreter Lock (GIL) restricts true multithreading, which can slow down CPU-bound tasks. This design choice means Python cannot fully utilize multiple CPU cores within a single process. Also, Python’s lack of compile-time type checking sometimes leads to runtime bugs, which can cause headaches in production environments. Memory management can become a bottleneck when handling large datasets, leading to slower performance and higher resource consumption.
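A small sketch of the GIL’s effect: threads compute correct results for CPU-bound work, but because the interpreter executes one bytecode stream at a time, they yield little wall-clock speedup (the task and sizes here are arbitrary):

```python
from concurrent.futures import ThreadPoolExecutor

def cpu_task(n):
    # CPU-bound work: no I/O, so threads cannot overlap usefully under the GIL
    return sum(i * i for i in range(n))

with ThreadPoolExecutor(max_workers=4) as ex:
    results = list(ex.map(cpu_task, [10_000] * 4))

# All four results are correct, but the GIL ran them one at a time,
# so the wall-clock gain over a plain loop is roughly zero.
assert results == [cpu_task(10_000)] * 4
```

For CPU-bound parallelism, `ProcessPoolExecutor` (separate interpreters, separate GILs) or compiled extensions are the usual escape hatches.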

Aspect | Description
Speed | Python runs slower than compiled languages like C++, but its simplicity often outweighs speed concerns.
Libraries | Libraries like NumPy and TensorFlow use C/C++ under the hood to speed up heavy computations.
Efficiency | Python lets you prototype rapidly and access a vast ecosystem of AI tools.
Scalability | You can scale Python projects using distributed computing frameworks.
JIT compilation | Just-in-time compilation improves runtime speed by compiling code on the fly.
GPU acceleration | TensorFlow and PyTorch support GPUs, drastically cutting training times.
Memory management | Large datasets can push Python's memory use up, affecting performance.
Concurrency | Python struggles to scale across CPU cores due to interpreter overhead and the GIL.

Despite these challenges, Python’s interpreter overhead remains negligible for core AI workloads because most heavy lifting happens in optimized compiled code, such as CUDA kernels and C++ libraries. Some companies use Rust in performance-critical parts, such as data pipelines and custom operators, to boost reliability and speed. This hybrid approach lets you enjoy Python’s ease of use while sidestepping the language’s limits.
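The “heavy lifting in compiled code” point shows up even with the builtin `sum`, whose loop runs in C:

```python
import timeit

data = list(range(100_000))

def py_sum(xs):
    # Every iteration here goes through the bytecode interpreter
    total = 0
    for x in xs:
        total += x
    return total

assert py_sum(data) == sum(data)  # identical answers...

t_interp = timeit.timeit(lambda: py_sum(data), number=20)
t_c_loop = timeit.timeit(lambda: sum(data), number=20)
# ...but the builtin's loop runs in C, typically several times faster.
# NumPy and TensorFlow push far larger workloads into compiled code the same way.
```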

Competition from New Languages

You may notice new languages emerging as "Python killer" contenders in AI development. These languages offer unique advantages that appeal to specific AI tasks. For example, Julia excels at rapid prototyping and handling large datasets with high performance. JavaScript shines in real-time AI applications like chatbots due to its client-side capabilities. C++ supports parallel computing and integrates well with machine learning libraries, making it ideal for advanced AI features.

Programming Language | Advantages in AI Development
Java | Built-in modules speed development and improve performance; great for simulations and robotics.
JavaScript | Excels in real-time AI apps like chatbots and recommendation engines.
Julia | High performance and fast prototyping for large datasets.
Haskell | Functional programming helps manage complex AI data structures.
C++ | Supports parallel computing and integrates with ML libraries.
R | Offers extensive data visualization tools for AI insights.

Industry trends suggest Python may lose its throne as some developers explore these alternatives. However, Python still leads in adoption, GitHub activity, and package availability. The language’s vast ecosystem and community support keep it relevant despite rising competition. You should watch these trends closely, but remember that Python’s flexibility and maturity remain hard to beat.

Misconceptions: 'Python is Dead'

Critics claim Python is dead because it cannot scale or perform well enough for modern AI. They argue that Python’s design choices limit its efficiency, especially as data science demands grow. Some even say Google killed it by developing Swift for TensorFlow (a project since archived), signaling Python’s decline.

These claims miss the bigger picture. Python continues to power automation tasks effectively. You can automate keyword research, backlink monitoring, and on-page SEO optimization with Python, proving its strength in complex workflows. Its powerful AI and machine learning libraries enable you to build autonomous agents that adapt and thrive in changing environments. By integrating Python with APIs, you streamline data collection and analysis, making automation smoother and more efficient.

The rise of AI code-writing tools does not mean Python is dead. Instead, it highlights Python’s role as a foundation for AI innovation. While some say Python is dying, the language evolves through community innovations and integrations with new platforms. You should not dismiss Python just because it faces challenges or competition. Instead, recognize its ongoing transformation and the opportunities it offers.

If you worry that Python may lose its throne, remember that no language is perfect. Each has strengths and weaknesses. Python’s adaptability and vast ecosystem make it a resilient choice for AI and automation. Don’t fall for the myth that Python is dead. Instead, embrace its evolution and use it where it shines best.

Adapting Python for AI

Integration with AI Tools

Python's adaptability shines through its seamless integration with various AI tools. You can leverage a range of powerful frameworks that enhance your AI projects. Here are some of the most widely used tools that work exceptionally well with Python:

  • Cursor: Best for complex projects, offering repository-wide reasoning for Django or FastAPI.
  • GitHub Copilot: The standard tool with massive library support, particularly for Pandas and NumPy.
  • Windsurf: Ideal for agentic flow with a cascade mode for multi-file refactoring.
  • Bito AI: Focused on code reviews, providing deep PR analysis and security scanning.
  • Tabnine: Emphasizes privacy with local model execution.
  • Sourcegraph Cody: Designed for navigating large codebases across many repositories.
  • Amazon Q Dev: Optimizes AWS cloud-first applications, particularly with Boto3 and Lambda functions.

These tools not only enhance your productivity but also streamline your workflow, allowing you to focus on building innovative AI solutions.

Enhancements in Performance

Recent advancements have significantly improved Python's performance for AI applications. You can expect faster execution speeds and better concurrency with the latest versions. Here are some key enhancements:

  • Python 3.11 runs 10-60% faster than 3.10, thanks to the Faster CPython work.
  • Python 3.13 introduced an experimental free-threaded (no-GIL) build, enhancing concurrency.
  • Python 3.14 brought further interpreter speed-ups.
  • Python 3.15 is slated to ship a significantly upgraded JIT compiler.

Additionally, integrating Python with Rust or C++ extensions via tools like PyO3 can further enhance performance. Emerging options like Mojo promise high performance while remaining compatible with Python, making it easier for you to build efficient AI applications.

To address Python's speed limitations, consider using tools like:

  • PyPy: Improves performance for long-running pure-Python code with a Just-in-Time (JIT) compiler.
  • Numba: Accelerates numeric loops by compiling selected functions, making it effective for tight numeric computations.
  • Cython: Allows adding optional type hints and compiling modules, providing predictable performance gains.

These tools can help you overcome Python's inherent limitations, enabling you to tackle demanding AI workloads more effectively.
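As a minimal, hypothetical sketch of the Numba approach: the `@njit` decorator JIT-compiles a numeric function to machine code, and the fallback below keeps the snippet runnable even when Numba isn’t installed:

```python
try:
    from numba import njit  # JIT-compiles numeric Python functions to machine code
except ImportError:
    def njit(func):
        # Fallback so the sketch still runs without Numba installed
        return func

@njit
def sum_squares(n):
    # A tight numeric loop: exactly the shape of code Numba accelerates
    total = 0
    for i in range(n):
        total += i * i
    return total

result = sum_squares(10)
assert result == 285
```

The first call pays a compilation cost; subsequent calls run at near-C speed when Numba is present.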

Community Innovations

Python's global community plays a crucial role in enhancing its capabilities for AI development. Continuous contributions from developers lead to new tools, improvements, and research through open-source collaboration. This collective effort ensures that Python libraries rapidly incorporate advancements in key AI areas such as natural language processing, reinforcement learning, and computer vision.

Moreover, Python has adapted to emerging AI trends, including generative AI, large language models, edge computing, and autonomous systems. This adaptability positions Python as the primary language for most modern AI frameworks, preparing organizations for future innovations.

Open-source collaboration fosters inclusivity in technological advancement. It allows contributions from individuals regardless of their resources, shaping innovation by leveraging collective input from a diverse community. Community-driven tools like Hugging Face and PyPI create collaborative ecosystems that enhance AI progress through shared innovation.

Tip: Engage with the Python community to stay updated on the latest tools and innovations. This engagement can significantly enhance your AI projects and keep you at the forefront of technology.

Future Outlook for Python

Python's Role in AI Development

Python continues to play a pivotal role in AI development. Its libraries serve as the building blocks for today's advanced AI systems. Mastering these libraries is a strategic investment for your long-term career growth in the AI field. As you dive deeper into AI, you will find that proficiency in Python's evolving libraries is crucial for staying competitive. The language powers the explosion of interest in deep learning and AI, with many models being prototyped or executed using Python.

Emerging Trends Supporting Python

Several emerging trends support Python's continued relevance in the AI landscape:

  • Performance Optimization: Enhancements in CPython, JIT compilation via PyPy, and projects like Mojo improve Python's speed for compute-intensive tasks.
  • Ethical AI: Libraries such as AIF360 and Fairlearn help ensure responsible AI development by addressing bias.
  • Edge Computing and IoT: Lightweight frameworks like MicroPython enable Python to run on resource-constrained devices, facilitating real-time analytics.
  • Federated Learning: Tools like PySyft support decentralized training, which is crucial for privacy in sensitive fields.
  • AutoML and Democratization: Automation tools like H2O.ai make machine learning accessible to non-experts, promoting wider adoption.

These trends indicate that Python is not just surviving; it is thriving and adapting to the demands of modern AI applications.

Optimizing Workflows with Python

You can significantly enhance your productivity as an AI developer by leveraging Python's automation and scripting capabilities. Automating tedious tasks like data cleaning, model training, and report generation frees up your time for more strategic work. With Python scripts, you can create efficient pipelines that ensure data is processed, analyzed, and visualized seamlessly.
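A minimal sketch of such a cleaning step using only the standard library; the column names and data are invented for illustration:

```python
import csv
import io

RAW = """name,amount
Alice, 120
Bob,
Carol, 95
"""

def clean_rows(text):
    # Strip whitespace, drop rows with missing amounts, coerce types
    rows = []
    for row in csv.DictReader(io.StringIO(text)):
        amount = row["amount"].strip()
        if not amount:
            continue
        rows.append({"name": row["name"].strip(), "amount": float(amount)})
    return rows

cleaned = clean_rows(RAW)
assert [r["name"] for r in cleaned] == ["Alice", "Carol"]
```

Scheduled as a script, a step like this replaces a recurring manual clean-up with a repeatable pipeline stage.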

In 2026, Python has evolved into a core engine for hyper-automation and autonomous decision systems. It supports seamless integration with modern libraries that enable the creation of autonomous agents capable of reasoning through complex problems. Additionally, Python's asynchronous capabilities allow it to handle numerous concurrent operations efficiently, making it ideal for real-time data processing.
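The asynchronous point can be sketched with `asyncio`: three simulated I/O calls run concurrently, so total wall time tracks the slowest call rather than the sum (the `fetch` helper is a stand-in, not a real API):

```python
import asyncio

async def fetch(name, delay):
    # Stand-in for an I/O-bound call (HTTP request, DB query, ...)
    await asyncio.sleep(delay)
    return name

async def main():
    # The three "calls" overlap: total wall time ~ max(delay), not the sum
    return await asyncio.gather(fetch("a", 0.01), fetch("b", 0.02), fetch("c", 0.01))

results = asyncio.run(main())
assert results == ["a", "b", "c"]
```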

By adopting these practices, you can quickly adapt to new data and insights, allowing you to respond faster to changes and opportunities. Embrace Python's potential to optimize your workflows and elevate your AI projects to new heights.


Python is far from dead. You will find it crucial for handling large datasets, automating tasks, and collaborating in AI projects. It works alongside other tools, boosting your efficiency and learning curve.

Embrace Python’s transformation. It stands ready to power your AI journey with flexibility and strength. Your future projects will benefit from this ever-evolving language.

FAQ

What makes Python suitable for AI development?

Python's simplicity and readability allow you to focus on solving problems rather than struggling with complex syntax. Its extensive libraries, like TensorFlow and PyTorch, provide powerful tools for building AI models efficiently.

Is Python still relevant in data science?

Absolutely! Python remains the leading language in data science, with a vast ecosystem of libraries and frameworks. Its popularity continues to grow, making it essential for data analysis and machine learning tasks.

How does Python compare to newer languages?

While newer languages like Julia and Rust offer performance benefits, Python's extensive libraries and community support keep it relevant. Its ease of use and flexibility make it a preferred choice for many developers.

Can I use Python for automation tasks?

Yes! Python excels in automation. You can automate repetitive tasks, data processing, and even integrate with APIs to streamline workflows, making it a powerful tool for enhancing productivity.

What are the performance limitations of Python?

Python's Global Interpreter Lock (GIL) limits true multithreading, affecting performance in CPU-bound tasks. However, you can mitigate this by using optimized libraries or integrating with languages like C++ for critical components.

How can I improve Python's performance for AI?

You can enhance Python's performance by using tools like PyPy for JIT compilation, Numba for numeric computations, and Cython for compiling modules. These tools help you tackle demanding AI workloads more effectively.

What role does the community play in Python's evolution?

The Python community drives innovation through open-source collaboration. Developers continuously contribute new tools and libraries, ensuring Python adapts to emerging trends and remains a leading choice for AI and data science.

How can I stay updated on Python developments?

Engage with the Python community through forums, social media, and conferences. Follow influential developers and organizations on platforms like GitHub and Twitter to stay informed about the latest tools and trends.

🚀 Want to be part of m365.fm?

Then stop just listening… and start showing up.

👉 Connect with me on LinkedIn and let’s make something happen:

  • 🎙️ Be a podcast guest and share your story
  • 🎧 Host your own episode (yes, seriously)
  • 💡 Pitch topics the community actually wants to hear
  • 🌍 Build your personal brand in the Microsoft 365 space

This isn’t just a podcast — it’s a platform for people who take action.

🔥 Most people wait. The best ones don’t.

👉 Connect with me on LinkedIn and send me a message:
"I want in"

Let’s build something awesome 👊

1
00:00:00,000 --> 00:00:05,600
"Python is the language of AI"? Incorrect. Inside the Microsoft stack, AI writes the glue, not you.

2
00:00:05,600 --> 00:00:09,340
You keep shoving Python into Power Automate and Power BI, then wonder why your flows wobble

3
00:00:09,340 --> 00:00:10,820
like a three-legged chair.

4
00:00:10,820 --> 00:00:11,740
Here's the fix.

5
00:00:11,740 --> 00:00:16,380
I'll show you why TypeAgent-style orchestration and Copilot turn TypeScript-like scripts

6
00:00:16,380 --> 00:00:20,900
into first-class glue, push Python back to contained analytics, and collapse build, debug,

7
00:00:20,900 --> 00:00:23,200
deploy into a single conversational loop.

8
00:00:23,200 --> 00:00:27,280
Stay for the secret step that kills defect inflation, the mistake that silently drains

9
00:00:27,280 --> 00:00:32,840
budgets and the hybrid pattern that lets you keep Python without the pain.

10
00:00:32,840 --> 00:00:35,800
The problem: Python's friction inside the Power Platform.

11
00:00:35,800 --> 00:00:37,960
Let's start with the obvious that somehow isn't obvious.

12
00:00:37,960 --> 00:00:40,720
Power Automate does not run arbitrary Python.

13
00:00:40,720 --> 00:00:44,760
Office Scripts (TypeScript-flavoured) are the native script surface for Microsoft 365;

14
00:00:44,760 --> 00:00:45,760
Excel Online hosts them.

15
00:00:45,760 --> 00:00:46,760
That's the lane.

16
00:00:46,760 --> 00:00:50,560
When you insist on Python, you bolt on Azure Functions, Logic Apps, or custom connectors.

17
00:00:50,560 --> 00:00:54,600
Now you've added external compute, deployment pipelines, authentication, and my favorite, runtime

18
00:00:54,600 --> 00:00:55,600
mystery meat.

19
00:00:55,600 --> 00:00:58,400
The hurt is simple economics plus probability.

20
00:00:58,400 --> 00:01:03,720
External setup means more services to configure, more credentials to rotate and more costs

21
00:01:03,720 --> 00:01:06,080
ticking even while you sleep.

22
00:01:06,080 --> 00:01:07,600
Dynamic typing adds roulette.

23
00:01:07,600 --> 00:01:11,560
The code compiles because there is no compile, then fails mid-flight because the field wasn't

24
00:01:11,560 --> 00:01:12,560
what you assumed.

25
00:01:12,560 --> 00:01:13,720
In a flow, that's not cute.

26
00:01:13,720 --> 00:01:18,140
It's a broken approval, a missed SLA, and another ticket titled urgent typed in lower

27
00:01:18,140 --> 00:01:19,140
case.

28
00:01:19,140 --> 00:01:21,480
Specific scenarios: Power BI Dataflows.

29
00:01:21,480 --> 00:01:22,880
You're transforming data.

30
00:01:22,880 --> 00:01:26,940
Copilot can now generate Power Query M or Python for you, inside Dataflows

31
00:01:26,940 --> 00:01:27,940
Gen2.

32
00:01:27,940 --> 00:01:31,580
Great, until you mix this with a separate Python service for glue tasks that Power Automate

33
00:01:31,580 --> 00:01:35,000
could have handled natively; now your lineage crosses products.

34
00:01:35,000 --> 00:01:39,000
Observability? Enjoy tracing an error from a flow to a function to a dataflow.

35
00:01:39,000 --> 00:01:43,200
In the time it takes your stakeholders to ask, is it done yet for the seventh time?

36
00:01:43,200 --> 00:01:44,200
Power Automate flows.

37
00:01:44,200 --> 00:01:47,000
The whole point is native connectors and Office Scripts.

38
00:01:47,000 --> 00:01:50,760
People still shove Python behind a custom connector to rename columns in Excel.

39
00:01:50,760 --> 00:01:53,700
Just like calling a tow truck to back out of your driveway.

40
00:01:53,700 --> 00:01:55,900
Every hop increases brittleness.

41
00:01:55,900 --> 00:02:00,480
Connector schemas drift, function dependencies age, permissions sprawl across Entra

42
00:02:00,480 --> 00:02:03,800
roles, resource groups, and mystery storage accounts

43
00:02:03,800 --> 00:02:06,000
no one will admit to creating.

44
00:02:06,000 --> 00:02:10,400
Fabric notebooks: yes, Python belongs here for data science, ML, serious compute.

45
00:02:10,400 --> 00:02:14,200
But when you use notebooks as orchestration glue, you weld business logic to a compute

46
00:02:14,200 --> 00:02:15,200
kernel.

47
00:02:15,200 --> 00:02:18,440
Scheduling, validation, and IO become bespoke.

48
00:02:18,440 --> 00:02:22,320
One environment update later, pandas goes up a minor version, and your simple glue becomes

49
00:02:22,320 --> 00:02:24,320
archaeological strata.

50
00:02:24,320 --> 00:02:25,720
Beginners think they're simplifying.

51
00:02:25,720 --> 00:02:30,000
They're actually creating a single point of procedural failure, with bonus YAML.

52
00:02:30,000 --> 00:02:32,600
The thing most people miss: Office Scripts aren't toy TypeScript.

53
00:02:32,600 --> 00:02:36,920
They are the sanctioned edge for automating Excel and related Microsoft 365 tasks without

54
00:02:36,920 --> 00:02:38,240
dragging in infra.

55
00:02:38,240 --> 00:02:41,640
Compare that to full Node TypeScript, which you might use in Azure Functions for robust

56
00:02:41,640 --> 00:02:43,280
APIs when you actually need them.

57
00:02:43,280 --> 00:02:45,760
Python inside Power Platform is indirect by design.

58
00:02:45,760 --> 00:02:46,760
You can do it.

59
00:02:46,760 --> 00:02:47,760
You shouldn't do it for glue.

60
00:02:47,760 --> 00:02:50,360
Let's talk cost math without the fairy dust.

61
00:02:50,360 --> 00:02:53,280
Simple flows are cheap because they stay in platform.

62
00:02:53,280 --> 00:02:56,640
Connectors, triggers, and Office Scripts ride your existing licensing.

63
00:02:56,640 --> 00:02:59,760
Bring in Python via Functions, and you pay in two currencies.

64
00:02:59,760 --> 00:03:00,960
Money and attention.

65
00:03:00,960 --> 00:03:03,360
Money for execution, storage, and networking.

66
00:03:03,360 --> 00:03:07,520
Attention for versioning, image hardening, secrets, retries, cold starts, and the weekly

67
00:03:07,520 --> 00:03:09,280
"why did it fail at 2:14 a.m.?"

68
00:03:09,280 --> 00:03:10,280
saga.

69
00:03:10,280 --> 00:03:11,280
At small scale, fine.

70
00:03:11,280 --> 00:03:14,600
At organizational scale, that's a budget line item with regret baked in.

71
00:03:14,600 --> 00:03:16,920
Common failure modes show up on repeat.

72
00:03:16,920 --> 00:03:20,640
Custom connectors, where your Swagger spec lags reality by one field.

73
00:03:20,640 --> 00:03:21,640
Version drift.

74
00:03:21,640 --> 00:03:24,920
Library updates that aren't pinned because someone loves "latest."

75
00:03:24,920 --> 00:03:25,920
Permissions sprawl.

76
00:03:25,920 --> 00:03:28,080
Service principals with rights "just for now"

77
00:03:28,080 --> 00:03:29,640
that become forever.

78
00:03:29,640 --> 00:03:31,040
Cross-service debugging.

79
00:03:31,040 --> 00:03:34,520
Replaying runs across Power Automate, Azure Functions, and dataflow logs like a true

80
00:03:34,520 --> 00:03:35,520
crime podcast.

81
00:03:35,520 --> 00:03:36,520
The truth?

82
00:03:36,520 --> 00:03:40,800
Dynamic typing's freedom is fun until it's 3 p.m. on quarter close and your flow succeeds

83
00:03:40,800 --> 00:03:42,120
with the wrong shape.

84
00:03:42,120 --> 00:03:45,240
Strongly typed boundaries catch stupidity before it ships.

85
00:03:45,240 --> 00:03:49,720
TypeScript-like scripts and schema connectors give you those guardrails. Python can have

86
00:03:49,720 --> 00:03:50,720
types too.

87
00:03:50,720 --> 00:03:52,600
Yes, most of you don't enforce them in flows.

88
00:03:52,600 --> 00:03:54,840
Don't argue; your log files already did.

89
00:03:54,840 --> 00:03:56,600
And now the transition you were waiting for.

90
00:03:56,600 --> 00:04:00,800
If frictions stack every time you hand-stitch code as glue, what replaces the glue?

91
00:04:00,800 --> 00:04:05,600
Enter AI that writes and runs the glue for you within the lanes the platform optimizes.

92
00:04:05,600 --> 00:04:11,440
Copilot can translate natural language into office scripts for Microsoft 365 tasks.

93
00:04:11,440 --> 00:04:16,760
In Power BI, Copilot uses semantic model metadata to generate M or Python accurately and produce

94
00:04:16,760 --> 00:04:18,360
report scaffolds in seconds.

95
00:04:18,360 --> 00:04:21,920
Agents like TypeAgent coordinate tools with memory and guardrails, reducing the ping-pong

96
00:04:21,920 --> 00:04:24,440
of spec, code, deploy, debug, redeploy.

97
00:04:24,440 --> 00:04:28,840
You constrain the surface area, you get typed edges where it matters, and you keep Python where

98
00:04:28,840 --> 00:04:29,840
it shines.

99
00:04:29,840 --> 00:04:31,640
Contained analytics, not duct tape.

100
00:04:31,640 --> 00:04:34,680
Yes, you can cling to "but Python can do anything."

101
00:04:34,680 --> 00:04:38,200
So can duct tape; that doesn't make it the right fastener for aircraft.

102
00:04:38,200 --> 00:04:42,320
Keep Microsoft orchestration native, let AI write the glue, put Python in notebooks or services

103
00:04:42,320 --> 00:04:43,400
designed for it.

104
00:04:43,400 --> 00:04:48,680
You'll spend less, break less, and, this is the part that stings, ship more.

105
00:04:48,680 --> 00:04:52,480
Why current approaches fail: manual code and static playbooks.

106
00:04:52,480 --> 00:04:57,360
The manual coding loop looks heroic on a whiteboard: requirement, code, deploy, test, fix, redeploy.

107
00:04:57,360 --> 00:05:01,720
In reality, it's slow-motion whack-a-mole: business rules change mid-sprint.

108
00:05:01,720 --> 00:05:06,640
Your final schema shifts by lunch, and the only constant is another redeploy. In the

109
00:05:06,640 --> 00:05:11,520
Power Platform, that loop is even more painful, because every hop, Power Automate to Azure

110
00:05:11,520 --> 00:05:16,200
Functions to Power BI, amplifies the friction and multiplies your failure modes.

111
00:05:16,200 --> 00:05:17,840
Static playbooks are the other trap.

112
00:05:17,840 --> 00:05:22,760
You wrote a golden-path runbook for quarter-end logic. Then sales invents a new discount,

113
00:05:22,760 --> 00:05:28,120
finance tweaks revenue recognition, and legal adds a four-step approval, yesterday.

114
00:05:28,120 --> 00:05:32,560
Your Python-in-Functions glue doesn't bend; it cracks. You update code, bump dependencies,

115
00:05:32,560 --> 00:05:37,040
rebuild containers, and rediscover that cold starts love to appear during executive demos.

116
00:05:37,040 --> 00:05:38,320
Fascinating, right?

117
00:05:38,320 --> 00:05:42,480
The truth most people avoid: dynamic typing's freedom feels fast when you're alone. At scale,

118
00:05:42,480 --> 00:05:47,280
in flows, it's bug roulette. An optional field changes from string to number, and your function

119
00:05:47,280 --> 00:05:52,280
happily sails through until a branch expects text and throws an exception at runtime.

120
00:05:52,280 --> 00:05:57,520
That's not a unit test failing loudly; that's production silently misclassifying transactions.

121
00:05:57,520 --> 00:06:01,680
And yes, your audit trail is basically a novella. Observability is a scavenger hunt: errors

122
00:06:01,680 --> 00:06:05,720
originate in Power Automate, get transformed inside a custom connector, trigger a Python

123
00:06:05,720 --> 00:06:11,280
function, bounce to Dataflow Gen 2, and finally surface as a broken tile in Power BI.

124
00:06:11,280 --> 00:06:15,120
Now you're reading five logs that can't agree on the same timestamp format. You call this

125
00:06:15,120 --> 00:06:20,400
investigation; users call it "we still don't have the numbers." Micro story: a team shaved hours

126
00:06:20,400 --> 00:06:25,320
off monthly reporting by consolidating transforms into a single Python function. It worked until

127
00:06:25,320 --> 00:06:29,680
dependency conflicts resurfaced: a minor pandas version update changed the default, their

128
00:06:29,680 --> 00:06:34,480
function silently coerced nulls, and two weeks later variance reports were off by just enough

129
00:06:34,480 --> 00:06:39,520
to be dangerous. They fixed it; then it recurred, because pinning was on the backlog. Bug recurrence

130
00:06:39,520 --> 00:06:45,120
isn't a mystery, it's a process flaw. And governance? Custom connectors age like milk: OpenAPI specs

131
00:06:45,120 --> 00:06:49,680
drift behind reality, tokens expire, service principals collect excessive rights "temporarily,"

132
00:06:49,680 --> 00:06:55,200
and your once-elegant pipeline is now a Rube Goldberg machine with SOC findings. The manual loop

133
00:06:55,200 --> 00:07:00,480
doesn't just cost time, it accumulates operational debt: every quick fix is a future outage with a

134
00:07:00,480 --> 00:07:05,440
calendar invite. The center cannot hold because the center is brittle glue. Manual code works when the

135
00:07:05,440 --> 00:07:10,960
domain is stable; the Power Platform's business logic is not stable. Therefore, pay attention: your

136
00:07:10,960 --> 00:07:15,920
orchestration must be adaptive, typed at the edges, and generated close to the platform's primitives,

137
00:07:15,920 --> 00:07:20,640
not hand-stitched far away. Stop treating glue like an app; let an agent orchestrate and constrain

138
00:07:21,200 --> 00:07:26,400
it. The better method: TypeAgent plus Copilot as the orchestrator. Enter the model that doesn't fight

139
00:07:26,400 --> 00:07:32,720
the platform. Agents generate, call tools, observe results, and revise; not free-for-all code spew. Tool

140
00:07:32,720 --> 00:07:38,880
calling with guardrails: you give the agent a bounded toolbox. Office Scripts for Microsoft 365

141
00:07:38,880 --> 00:07:44,640
actions, Power BI Dataflow Gen 2 for transformations, connectors for data movement, and strictly

142
00:07:44,640 --> 00:07:49,520
typed interfaces between them. The agent holds context, reasons across steps, and only writes code

143
00:07:49,520 --> 00:07:54,560
where code belongs. Start with Office Scripts and Copilot. You describe the outcome: when a new

144
00:07:54,560 --> 00:07:59,920
row lands in this table, normalize dates, fill blanks, and email approvers with a summary. Copilot

145
00:07:59,920 --> 00:08:04,320
translates that into TypeScript-flavoured Office Scripts and flow steps. There's no external runtime,

146
00:08:04,320 --> 00:08:09,600
no container to harden, no secret store to babysit. The code sits where the data lives: Excel Online,

147
00:08:09,600 --> 00:08:14,960
SharePoint, OneDrive, and your flow stitches native connectors. Fewer moving parts, fewer ways to fail.
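
As a minimal sketch, these are the kinds of typed helpers such a generated script tends to contain; the function names and date formats are illustrative assumptions, not a specific Copilot output (a real Office Script would wrap them in a `main(workbook: ExcelScript.Workbook)` entry point):

```typescript
// Illustrative helpers of the kind a generated Office Script would carry.
// Names and accepted formats are assumptions for this sketch.

// Normalize "M/D/YYYY"-style date strings to ISO "YYYY-MM-DD".
function normalizeDate(raw: string): string {
  const mdy = raw.match(/^(\d{1,2})\/(\d{1,2})\/(\d{4})$/);
  if (mdy) {
    const [, m, d, y] = mdy;
    return `${y}-${m.padStart(2, "0")}-${d.padStart(2, "0")}`;
  }
  return raw; // already ISO, or left untouched for human review
}

// Fill blank or missing cells in a column with a fallback value.
function fillBlanks(column: (string | null)[], fallback: string): string[] {
  return column.map(v => (v === null || v.trim() === "" ? fallback : v));
}
```

Because the shapes are typed, a wrong input (a number where a date string belongs) fails at edit time, not during the month-end run.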

148
00:08:14,960 --> 00:08:19,280
The thing most people miss is that typed edges exist even here: object models,

149
00:08:19,600 --> 00:08:24,720
method signatures, and connector schemas enforce shape before runtime turns on the blender. Move

150
00:08:24,720 --> 00:08:30,160
to Dataflow Gen 2 with Copilot: prompts become Power Query M or Python, but with context. Copilot is

151
00:08:30,160 --> 00:08:35,760
semantic-model aware; it knows your tables, columns, measures, and synonyms, so generation is anchored to

152
00:08:35,760 --> 00:08:41,120
your reality, not a hallucinated schema. Need a quick report scaffold? Copilot can spin up a page with

153
00:08:41,120 --> 00:08:46,960
visuals in seconds, not because of magic but because it leverages metadata you already curated. You validate

154
00:08:46,960 --> 00:08:51,760
the diffs, lock the pattern, and, this is important, keep orchestration out of Python. Use Python inside

155
00:08:51,760 --> 00:08:57,200
the dataflow for analytics kernels, not for renaming columns or pinging approvals. Now bring in

156
00:08:57,200 --> 00:09:02,000
TypeAgent (or a similar agent framework) for multi-step reasoning. The agent isn't replacing your

157
00:09:02,000 --> 00:09:06,880
governance, it's enforcing it. It remembers prior failures, chooses the right tool, and retrieves

158
00:09:06,880 --> 00:09:12,240
intelligently. Tool-calling accuracy and context retention matter here: when the agent can consistently

159
00:09:12,240 --> 00:09:17,600
pick "update worksheet" versus "send email," you stop shipping human wiring errors. Time to resolution

160
00:09:17,600 --> 00:09:22,640
shrinks because the agent handles the lead-up: generating the script, testing against a sample,

161
00:09:22,640 --> 00:09:27,280
validating outputs, and only then promoting changes. Why this works is simple architecture:

162
00:09:27,280 --> 00:09:32,240
constrain the code surface. Put statically typed boundaries at the seams: OpenAPI schemas on

163
00:09:32,240 --> 00:09:38,080
connectors, Office Script object models, semantic models in Power BI. Inside those boundaries, let AI

164
00:09:38,080 --> 00:09:43,120
generate code to spec. The reason this reduces defects is not supernatural, it's pre-validation:

165
00:09:43,120 --> 00:09:48,000
you're catching shape mistakes before they become outages. And because the agent keeps memory,

166
00:09:48,000 --> 00:09:53,120
you kill recurrence, the bug that returns because your fix didn't update the pattern, just the file.

167
00:09:53,120 --> 00:09:58,720
Practical shift: Python remains where it dominates, within Fabric notebooks or dataflows for

168
00:09:58,720 --> 00:10:04,240
advanced analytics, modeling, and ML. You let Copilot help with exploratory data analysis, vectorization

169
00:10:04,240 --> 00:10:10,000
patterns, and visualization code. Around that, the agent handles the perimeter: scheduling, validation,

170
00:10:10,000 --> 00:10:16,320
I/O, and policy. Agents trigger notebooks, validate outputs, and route results without inventing bespoke

171
00:10:16,320 --> 00:10:21,360
orchestration in Python. You do not weld business logic to a kernel; you separate concerns like an adult.

172
00:10:21,360 --> 00:10:27,120
Office Scripts and Copilot give the quick wins; the agent enforces repeatability. Here's the shortcut

173
00:10:27,120 --> 00:10:32,640
nobody teaches: codify prompts as assets. Treat them like templates with variables and acceptance

174
00:10:32,640 --> 00:10:38,960
criteria. If you remember nothing else: capture the successful prompt and the expected shape of outputs.

175
00:10:38,960 --> 00:10:44,000
The agent can reuse it, adapt it, and lint the result against your contracts. That's how you collapse

176
00:10:44,000 --> 00:10:49,200
build-debug-deploy into a conversational loop that's still governed. The game changer nobody talks
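
A prompt-as-asset can be sketched as a small typed record; the field names (`expectedColumns` as the acceptance criterion, the `{placeholder}` syntax) are assumptions for illustration, not any product's schema:

```typescript
// Hypothetical shape for storing a prompt as a reusable, testable asset.
interface PromptAsset {
  name: string;
  template: string;          // text with {placeholder} variables
  variables: string[];       // substitutions the template requires
  expectedColumns: string[]; // acceptance criteria: output shape to validate against
}

// Render the template, failing fast if a required variable is missing.
function renderPrompt(asset: PromptAsset, values: Record<string, string>): string {
  for (const v of asset.variables) {
    if (!(v in values)) throw new Error(`missing variable: ${v}`);
  }
  return asset.template.replace(/\{(\w+)\}/g, (_, key) => values[key] ?? `{${key}}`);
}

// Example asset; table and column names are invented.
const cleanSales: PromptAsset = {
  name: "clean-sales-transactions",
  template: "From {table}, filter to the last {months} months and normalize date formats.",
  variables: ["table", "months"],
  expectedColumns: ["Date", "CustomerId", "GrossMargin"],
};
```

The point of `expectedColumns` is that the agent can diff the generated output's shape against it before promoting anything.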

177
00:10:49,200 --> 00:10:54,720
about: typed contracts for agent tools. You define the input and output schemas; the agent can validate

178
00:10:54,720 --> 00:10:58,480
before it runs a single step in production. Compare that to just running the Python and praying your

179
00:10:58,480 --> 00:11:02,880
JSON didn't sprout an extra property. And yes, Python can be typed; the point is discipline at the
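
A pre-run contract check can be sketched in a few lines; the schema shape and the `worksheet.updateTable` example are illustrative assumptions, not any specific agent framework's API:

```typescript
// Sketch of a typed tool contract with pre-run validation.
type FieldType = "string" | "number" | "boolean";

interface ToolContract {
  tool: string;                     // unambiguous name, e.g. "worksheet.updateTable"
  input: Record<string, FieldType>; // required input fields and their runtime types
}

// Validate a payload against the contract BEFORE the tool runs a single step.
function validateInput(contract: ToolContract, payload: Record<string, unknown>): string[] {
  const errors: string[] = [];
  for (const [field, type] of Object.entries(contract.input)) {
    if (!(field in payload)) errors.push(`missing: ${field}`);
    else if (typeof payload[field] !== type) errors.push(`wrong type: ${field}`);
  }
  // Extra properties that "sprouted" overnight are flagged too.
  for (const key of Object.keys(payload)) {
    if (!(key in contract.input)) errors.push(`unexpected: ${key}`);
  }
  return errors;
}

const updateTable: ToolContract = {
  tool: "worksheet.updateTable",
  input: { tableName: "string", rowCount: "number" },
};
```

An empty error list means the step may run; anything else stops the flow at the edge instead of misclassifying transactions downstream.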

180
00:11:02,880 --> 00:11:08,880
edges, not language loyalty. In the Power Platform, the edges are Microsoft's domains: Office, Power BI,

181
00:11:08,880 --> 00:11:13,440
Dataverse, so use the primitives they optimize. Once you nail that, everything else clicks:

182
00:11:13,440 --> 00:11:19,040
versioning is a smaller surface area, prompts and scripts, not monoliths; observability is centralized,

183
00:11:19,040 --> 00:11:24,480
Power Automate run history, dataflow lineage, and agent traces, not five disconnected log dashboards;

184
00:11:24,480 --> 00:11:28,960
cost goes down because you retire Azure Functions that were just doing make-work chores;

185
00:11:28,960 --> 00:11:34,160
and defect rates drop because your orchestrator is opinionated, not artisanal. Now you might be thinking

186
00:11:34,160 --> 00:11:40,000
this sounds like giving up control. Incorrect: you're moving control up a level. You decide the contracts,

187
00:11:40,000 --> 00:11:44,800
the tools, the guardrails, and the review gates; the agent does the stitching, faster than your human loop

188
00:11:44,800 --> 00:11:50,160
and less error-prone. You reserve Python for high-value analytics where libraries earn their keep.

189
00:11:50,160 --> 00:11:55,440
That's not surrender, that's strategy. Let me show you exactly how this feels in practice. A request lands:

190
00:11:55,440 --> 00:12:01,920
we need a weekly report that cleans CSVs, enriches with Dataverse, pushes to a semantic model, and emails

191
00:12:01,920 --> 00:12:07,520
approvers. Old way: stand up a Python function, wire a custom connector, argue with OAuth, fix types three

192
00:12:07,520 --> 00:12:13,760
times. New way: the agent calls Dataflow Gen 2 via Copilot to generate the transforms, validates columns

193
00:12:13,760 --> 00:12:18,800
against the model, uses Office Scripts to prep the Excel drop, triggers a flow for notifications,

194
00:12:18,800 --> 00:12:24,160
and logs the lineage. No bespoke glue, no midnight redeploy. The reason this works is you stopped using

195
00:12:24,160 --> 00:12:29,120
Python as duct tape in a platform designed with its own fasteners. You let AI write glue near the

196
00:12:29,120 --> 00:12:34,720
joints, not across the gap. And when the business changes, spoiler alert: it will, you update a prompt,

197
00:12:34,720 --> 00:12:39,040
not a container image. That's how you keep shipping while everyone else is still in dependency hell,

198
00:12:39,040 --> 00:12:45,120
explaining to finance why "latest" was a terrible idea. Application one: Power BI dataflows, Python

199
00:12:45,120 --> 00:12:49,680
generated, not handwritten. Here's where the light bulb goes on: in Power BI, you're not paid to write

200
00:12:49,680 --> 00:12:55,120
artisanal M or bespoke Python; you're paid to deliver clean data and working reports. Copilot inside

201
00:12:55,120 --> 00:13:00,640
Dataflow Gen 2 lets you state the transformation in plain English; then it generates Power Query M or,

202
00:13:00,640 --> 00:13:06,400
yes, Python, anchored to your actual semantic model, not fantasy tables: your tables, columns, and measures.

203
00:13:06,400 --> 00:13:11,760
That semantic awareness kills a whole category of "column not found" chaos before it starts.

204
00:13:11,760 --> 00:13:16,960
Why this matters: generation plus context equals fewer dumb bugs. The thing most people miss is the review

205
00:13:16,960 --> 00:13:22,240
loop. You don't trust the black box; you use it to draft fast. Then you review the diff and lock the pattern

206
00:13:22,240 --> 00:13:28,000
once it's right. From there, use Copilot to iterate by prompt, not by endless hand edits. You're treating

207
00:13:28,000 --> 00:13:33,040
code like a template with variables rather than a diary of your keyboard's feelings. What does the

208
00:13:33,040 --> 00:13:38,320
workflow look like? You prompt: from sales transactions, filter to the last 12 months, normalize date formats,

209
00:13:38,320 --> 00:13:44,480
left-join customer master, and compute gross margin. Copilot proposes M or Python; you preview the result

210
00:13:44,480 --> 00:13:50,000
against a sample, validate column names against the semantic model, and pin the step. Need Python for

211
00:13:50,000 --> 00:13:55,600
an analytics kernel, say outlier detection or seasonal decomposition? Fine, let Copilot generate

212
00:13:55,600 --> 00:14:00,800
that inside the dataflow, but keep orchestration out of it. Python computes; the platform orchestrates.

213
00:14:00,800 --> 00:14:06,800
Quick win: report scaffolding. Copilot can spit out a first-pass page with visuals in seconds.

214
00:14:06,800 --> 00:14:11,280
It's not magic, it's metadata: you already named your measures, added synonyms, and curated

215
00:14:11,280 --> 00:14:16,560
relationships. Copilot leverages that to place charts intelligently. You save the groundwork and

216
00:14:16,560 --> 00:14:23,440
spend time refining what matters: business logic and presentation. Common mistakes that inflate defects:

217
00:14:23,440 --> 00:14:28,640
first, letting Copilot's generated Python grow tentacles. If it starts renaming columns, calling

218
00:14:28,640 --> 00:14:33,680
external endpoints, or embedding workflow logic, you've slipped back into brittle glue. Second, skipping

219
00:14:33,680 --> 00:14:39,360
code reviews because AI wrote it. No: add review, standardize prompts, keep a checklist: column shapes

220
00:14:39,360 --> 00:14:45,600
validated, null handling explicit, joins deterministic. Third, mixing orchestration and analytics: if you

221
00:14:45,600 --> 00:14:50,480
blend approval logic into a Python transform, you've guaranteed the next schema change breaks your

222
00:14:50,480 --> 00:14:56,160
workflow. The hybrid rule here is simple: Python for analytics kernels, M or platform-native steps

223
00:14:56,160 --> 00:15:01,200
for plumbing. Use Dataverse or OneLake for inputs and outputs so the rest of the platform

224
00:15:01,200 --> 00:15:06,400
understands. When you need to adjust the transform, change the prompt and validate the diff rather than

225
00:15:06,400 --> 00:15:12,000
spelunking through a 400-line function. Business changes become edits to intent, not surgery on glue.

226
00:15:12,000 --> 00:15:17,840
And yes, observability improves: dataflow lineage shows what fed what, and Copilot's proposed steps form

227
00:15:17,840 --> 00:15:22,080
a readable chain rather than a bucket of hand-rolled scripts. When something fails, you troubleshoot

228
00:15:22,080 --> 00:15:26,880
in one place. Compare that to your old pattern: Power Automate triggers a custom connector, hits a

229
00:15:26,880 --> 00:15:31,920
Python API that mutates columns, then writes to a lake you forgot to tag. Delightful. The payoff is

230
00:15:31,920 --> 00:15:36,800
speed with fewer surprises. You generate, review, lock. You reserve Python for the heavy math where it

231
00:15:36,800 --> 00:15:41,760
earns its keep, and when the CFO invents a new metric on a Tuesday afternoon, you update a prompt and

232
00:15:41,760 --> 00:15:46,640
revalidate, not redeploy a container image. Try pretending that isn't better. Dataflows stabilized?

233
00:15:46,640 --> 00:15:52,800
Good. Now kill the fragile glue living rent-free in Power Automate. Application two: Power Automate,

234
00:15:52,800 --> 00:15:57,920
replace Python glue with Office Scripts plus agents. In Power Automate, the correct move is boring

235
00:15:57,920 --> 00:16:03,040
on purpose: native connectors and Office Scripts. You describe your steps, Copilot drafts the flow

236
00:16:03,040 --> 00:16:08,720
and scripts, and you keep everything inside Microsoft 365. No Azure Functions, no custom connector

237
00:16:08,720 --> 00:16:14,160
scaffolding, no midnight patching of SSL ciphers on a container you forgot existed. Shocking revelation:

238
00:16:14,160 --> 00:16:19,120
when you reduce moving parts, you reduce failures. Why Office Scripts? Because they're TypeScript-

239
00:16:19,120 --> 00:16:24,480
flavoured with a defined object model. That gives you typed edges: method signatures, predictable shapes

240
00:16:24,480 --> 00:16:30,720
before runtime gets a chance to embarrass you. Copilot turns "for each new row in this table, normalize

241
00:16:30,720 --> 00:16:37,920
date strings, compute a status, and email the owner with a summary" into exactly that: a flow plus a script.

242
00:16:37,920 --> 00:16:42,240
Your data stays near Excel, SharePoint, or OneDrive, your governance stays simpler, and your

243
00:16:42,240 --> 00:16:46,880
defect rate drops because you eliminated the ad hoc Python detour that loved to break on Tuesdays.

244
00:16:46,880 --> 00:16:52,800
How to structure it so it scales: adopt an agent-driven pattern. The agent, TypeAgent or comparable,

245
00:16:52,800 --> 00:16:57,840
selects tools from a constrained toolbox: invoke Office Script, call a Graph connector, write to

246
00:16:57,840 --> 00:17:02,640
Dataverse, send an approval. Inputs and outputs are schema'd; the agent validates preconditions, runs

247
00:17:02,640 --> 00:17:08,000
a test against a sample, and only then promotes the change. Tool-calling accuracy matters, so name

248
00:17:08,000 --> 00:17:13,840
tools unambiguously: worksheet.updateTable beats doStuff. When the agent consistently picks the

249
00:17:13,840 --> 00:17:18,560
right tool, you stop wiring the wrong action to the right trigger. Quick wins that retire Python glue:
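
A constrained toolbox can be sketched as a discriminated union, so the agent can only emit calls that exist, with the arguments each tool actually takes; the tool and field names here are illustrative, not a real connector surface:

```typescript
// Sketch of a constrained toolbox: every legal call is enumerated,
// and dispatch is exhaustive at compile time.
type ToolCall =
  | { tool: "worksheet.updateTable"; table: string; values: string[][] }
  | { tool: "dataverse.upsert"; entity: string; record: Record<string, unknown> }
  | { tool: "email.send"; to: string; subject: string };

// An unknown tool name is a compile-time error here,
// not a 2:14 a.m. runtime surprise.
function describe(call: ToolCall): string {
  switch (call.tool) {
    case "worksheet.updateTable":
      return `update ${call.table} with ${call.values.length} rows`;
    case "dataverse.upsert":
      return `upsert into ${call.entity}`;
    case "email.send":
      return `email ${call.to}: ${call.subject}`;
  }
}
```

Adding a fourth tool means extending the union, and the compiler then points at every dispatch site that hasn't handled it.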

250
00:17:18,560 --> 00:17:24,960
Excel operations, cleaning columns, de-duping, splitting fields: Office Scripts handle these natively.

251
00:17:24,960 --> 00:17:30,480
Approvals and notifications: native actions in minutes, zero custom auth. Dataverse and SharePoint

252
00:17:30,480 --> 00:17:35,920
updates: connectors with metadata awareness, not raw HTTP calls. Costs go down because you're not paying

253
00:17:35,920 --> 00:17:40,880
for always-on compute to rename columns. Time to resolution shrinks because your agent can generate,

254
00:17:40,880 --> 00:17:45,680
test, and iterate without a human setting up a dev container, and your security team stops glaring

255
00:17:45,680 --> 00:17:50,560
because you removed the unknown API endpoint that someone labeled "temporary." Mistakes to avoid,

256
00:17:50,560 --> 00:17:55,520
because of course you'll try them: first, forcing Python via a custom connector for trivial tasks.

257
00:17:55,520 --> 00:18:01,040
If the job is to reshape a table, use Office Scripts. Second, ignoring governance: scripts need

258
00:18:01,040 --> 00:18:06,800
versioning, review gates, and naming standards; treat them like code, because they are. Third, spreading

259
00:18:06,800 --> 00:18:11,760
business logic across five flows with no typed contracts. Define your input and output schemas

260
00:18:11,760 --> 00:18:17,200
upfront; the agent can enforce them, the platform will validate them, production will thank you. You

261
00:18:17,200 --> 00:18:22,880
still want Python? Put it where it belongs: analysis. If a flow needs predictions or advanced transforms,

262
00:18:22,880 --> 00:18:29,440
call a proper service, a Fabric notebook job or ML endpoint behind a clean API, not an improvised Flask

263
00:18:29,440 --> 00:18:35,440
app someone left on a free tier. The flow orchestrates, the Python computes, and they interact through a contract,

264
00:18:35,440 --> 00:18:40,560
not through vibes. The subtle advantage of Office Scripts plus agents is maintainability: prompts become

265
00:18:40,560 --> 00:18:46,160
assets, scripts become reusable tools, flows become thin orchestration layers, not logic museums. When

266
00:18:46,160 --> 00:18:51,040
requirements change, you update a prompt and, if needed, a script with a review, rather than revisiting

267
00:18:51,040 --> 00:18:56,560
an entire custom connector stack. The agent remembers the previous fix, reuses the pattern, and prevents

268
00:18:56,560 --> 00:19:02,160
recurrence. That's how you turn "we spent the afternoon debugging types" into "we shipped before lunch."

269
00:19:02,160 --> 00:19:07,120
The truth: in Power Automate, Python-as-glue is performative difficulty. You're proving you can

270
00:19:07,120 --> 00:19:13,280
while the platform quietly offers you a simpler, safer path. Use it. Application three: Fabric notebooks, contain

271
00:19:13,280 --> 00:19:18,560
Python, automate the perimeter. In Fabric, Python finally sits where it's strongest: the compute core,

272
00:19:18,560 --> 00:19:22,880
not the doorbell, not the ductwork, the engine. You keep the business logic at the edges and let

273
00:19:22,880 --> 00:19:29,120
the notebook do analytics, modeling, and heavy transforms. The perimeter, scheduling, validation, I/O, policy,

274
00:19:29,120 --> 00:19:34,240
belongs to agents and TypeScript-flavoured tooling. Why this matters: when you weld orchestration to a

275
00:19:34,240 --> 00:19:39,280
notebook, every environment tweak becomes a production incident. Separate concerns: the agent calls

276
00:19:39,280 --> 00:19:44,400
the notebook through a clean job API, passes typed inputs, and expects typed outputs. If the shape

277
00:19:44,400 --> 00:19:50,080
deviates, it fails fast at the edge, not halfway through a 20-minute run. How this looks in practice:
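
A fail-fast check at the edge of a notebook job can be sketched as a small output contract; the contract fields (`requiredColumns`, `minRows`) are assumptions for illustration, not a Fabric API:

```typescript
// Minimal sketch of a fail-fast output contract for a notebook job result.
interface OutputContract {
  requiredColumns: string[]; // columns the downstream model expects
  minRows: number;           // sanity floor; an empty result is a failure, not a publish
}

type ResultSet = { columns: string[]; rows: unknown[][] };

// Returns null on success, or a reason string so the agent can stop
// before publishing to the lake or semantic model.
function checkOutput(contract: OutputContract, result: ResultSet): string | null {
  for (const col of contract.requiredColumns) {
    if (!result.columns.includes(col)) return `missing column: ${col}`;
  }
  if (result.rows.length < contract.minRows) {
    return `too few rows: ${result.rows.length} < ${contract.minRows}`;
  }
  return null;
}
```

The agent runs this between "notebook finished" and "publish downstream," which is exactly where the twenty-minute-run pain would otherwise land.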

278
00:19:50,080 --> 00:19:55,120
Copilot helps inside the notebook for EDA, vectorization, and visualization scaffolds. You still review

279
00:19:55,120 --> 00:20:00,720
and pin library versions, adults, remember? The agent handles pipelines: triggers on data arrival, runs

280
00:20:00,720 --> 00:20:05,360
a schema check, launches the notebook job, and validates outputs against a contract before publishing

281
00:20:05,360 --> 00:20:10,480
to OneLake or a semantic model. Version-control notebooks like code, store prompts that generated

282
00:20:10,480 --> 00:20:15,680
helper functions, and unit-test outputs, metrics, distributions, row counts, rather than line-by-line

283
00:20:15,680 --> 00:20:21,360
plumbing. Quick win: prompt Copilot to generate a seasonal decomposition step, keep it in the notebook,

284
00:20:21,360 --> 00:20:26,560
and let the agent orchestrate retries and alerts. Business logic lives in declarative contracts;

285
00:20:26,560 --> 00:20:32,080
Python stays computational. Mistakes to avoid: hiding approvals or renames inside the notebook,

286
00:20:32,080 --> 00:20:36,880
coupling flows to ad hoc endpoints, or storing the only definition of a KPI inside a cell comment.

287
00:20:36,880 --> 00:20:43,760
Don't do archaeology; do architecture. Now quantify it so finance stops sighing. Results: time saved,

288
00:20:43,760 --> 00:20:49,680
cost reduced, defects down. Time first: AI-assisted generation collapses the build-debug-deploy loop.

289
00:20:49,680 --> 00:20:54,400
Copilot drafts transforms and scripts; the agent tests against samples before promotion.

290
00:20:54,400 --> 00:20:58,880
you cut rework and context switching because orchestration stays native and code sits where it

291
00:20:58,880 --> 00:21:04,240
executes. Cost next: retire Azure Functions that were renaming columns, keep flows in the platform,

292
00:21:04,240 --> 00:21:09,440
notebooks in Fabric, and glue in Office Scripts. You pay less in compute and far less in attention:

293
00:21:09,440 --> 00:21:14,800
no container patching, fewer secrets, fewer weekend outages. Defects, finally: typed boundaries

294
00:21:14,800 --> 00:21:19,920
on connectors, Office Script object models, and semantic metadata catch shape errors early; agent

295
00:21:19,920 --> 00:21:25,360
guardrails, tool-calling accuracy, context retention, kill the fix-it-Friday, break-it-Monday cycle.

296
00:21:25,360 --> 00:21:30,400
Recurrence drops because you update patterns, not just files. Practical benchmarks to track:

297
00:21:30,400 --> 00:21:36,320
time to resolution per incident, tool-call success rate, and change lead time. Implementation checklist:

298
00:21:36,320 --> 00:21:41,600
pick your orchestrator, define contracts, templatize prompts, add review gates, and monitor outcomes

299
00:21:41,600 --> 00:21:47,120
centrally. Python isn't dead; it's demoted to its specialty. Good. That's how you ship. Counter-argument

300
00:21:47,120 --> 00:21:52,720
and rebuttal: "but Python dominates data science." Yes, Python dominates data science: libraries, community,

301
00:21:52,720 --> 00:21:57,280
notebooks, the works, in Fabric. That's exactly why you keep it inside notebooks and analytics kernels:

302
00:21:57,280 --> 00:22:03,280
it's the engine. The truth: engines don't run traffic lights. Orchestration is roads, signals, and rules,

303
00:22:03,280 --> 00:22:09,120
that's the Power Platform's domain: connectors, Office Scripts, Dataflow Gen 2, and Copilot.

304
00:22:09,760 --> 00:22:14,480
Here's what most people blur: dominance in analytics doesn't equal fitness for glue inside

305
00:22:14,480 --> 00:22:20,720
the Microsoft stack. In Power Automate, Python is indirect, external compute via Azure Functions or custom

306
00:22:20,720 --> 00:22:26,240
connectors; that adds cost, secrets, cold starts, and version drift. Meanwhile, Office Scripts run where

307
00:22:26,240 --> 00:22:31,760
your Microsoft 365 data lives and Copilot drafts them from plain English. You're choosing friction

308
00:22:31,760 --> 00:22:37,120
versus native speed. "But we need custom logic." Good: put it behind typed boundaries. Use a Fabric

309
00:22:37,120 --> 00:22:42,400
notebook job or ML endpoint for predictions and heavy transforms, expose a clean API, and let the flow

310
00:22:42,400 --> 00:22:47,120
orchestrate with contracts, not vibes. You get Python's strengths without hand-wiring every approval,

311
00:22:47,120 --> 00:22:52,640
rename, and file hop through a Flask app you'll forget to patch. Microsoft's trajectory reinforces this

312
00:22:52,640 --> 00:22:58,880
split: Office Scripts are the sanctioned scripting edge for Microsoft 365; Copilot is semantic-model aware in

313
00:22:58,880 --> 00:23:04,720
Power BI and Dataflow Gen 2, generating M or Python anchored to your metadata; agents improve tool

314
00:23:04,720 --> 00:23:10,000
calling accuracy and context retention, so the glue is reliable and auditable. None of that requires you

315
00:23:10,000 --> 00:23:14,880
to embed orchestration in Python. Hybrid is not a compromise; it's the design. TypeScript-flavoured

316
00:23:14,880 --> 00:23:20,320
scripts and agents handle the perimeter, Python handles compute, and they meet at typed APIs or queues.

317
00:23:20,320 --> 00:23:24,560
You reduce time to resolution, shrink run costs, and cut defect recurrence because the seams are

318
00:23:24,560 --> 00:23:30,320
explicit. Use Python where it shines; stop forcing it to be duct tape. Implementation playbook, your

319
00:23:30,320 --> 00:23:36,480
next 30 days. Week one, audit: tag every workflow and dataflow as glue or analytics, highlight Python

320
00:23:36,480 --> 00:23:42,000
in Functions that touch approvals, renames, or file I/O (anti-patterns), and document inputs, outputs, and

321
00:23:42,000 --> 00:23:47,360
failure hotspots. Pull real metrics: incident time to resolution, change lead time, and tool-call success

322
00:23:47,360 --> 00:23:53,600
rate if you have agents; if you don't, fine, establish the baseline now. Week two, migrate trivial glue:

323
00:23:53,600 --> 00:23:59,120
replace Python endpoints used for Excel or SharePoint chores with Office Scripts plus native connectors.

324
00:23:59,120 --> 00:24:05,040
Use Copilot to draft scripts from plain English, then add a lightweight review checklist: column shapes,

325
00:24:05,040 --> 00:24:10,480
null handling, identity. Standardize prompt templates and name scripts like adults: verb-noun with

326
00:24:10,480 --> 00:24:16,960
scope, not "script final two." Version them in your repo; yes, scripts are code. Week three, refactor dataflows:

327
00:24:16,960 --> 00:24:22,160
move handwritten transforms to Copilot-assisted generation in Dataflow Gen 2, keep orchestration out

328
00:24:22,160 --> 00:24:27,840
of Python (reserve it for analytics kernels only), and validate against your semantic model: column names, data

329
00:24:27,840 --> 00:24:32,480
types, measures. Capture the diff you accept and add it to your prompt library so the next change is a

330
00:24:32,480 --> 00:24:38,480
prompt edit, not a spelunking expedition. Week four, introduce an agent for complex orchestration.

331
00:24:38,480 --> 00:24:45,040
Constrain the toolbox: worksheet.updateTable, dataverse.upsert, dataflow.run, email.send.

332
00:24:45,040 --> 00:24:50,560
Define typed contracts for each tool, including sample payloads and acceptance criteria. Turn on

333
00:24:50,560 --> 00:24:55,760
pre-run validation against samples; track tool-calling accuracy and time to resolution. When the agent

334
00:24:55,760 --> 00:25:01,920
fails, fix the pattern: update prompts, schemas, or tool names so recurrence drops. Governance throughout:

335
00:25:01,920 --> 00:25:07,120
enforce typed interfaces at every seam, store prompts alongside code, add review gates for scripts

336
00:25:07,120 --> 00:25:12,640
and dataflow changes, and centralize telemetry, Power Automate run history, dataflow lineage, and agent

337
00:25:12,640 --> 00:25:18,320
traces, in one dashboard. Rollback plans are non-negotiable: previous script versions, last-known-good

338
00:25:18,320 --> 00:25:25,360
dataflow, and notebook job snapshots. Exit criteria at day 30: fewer custom connectors for trivial glue,

339
00:25:25,360 --> 00:25:30,480
a measurable drop in run cost, faster changes with smaller blast radius, and documented interfaces

340
00:25:30,480 --> 00:25:36,480
that let a new teammate ship in a day, not a week. Python remains inside notebooks and analytics kernels;

341
00:25:36,480 --> 00:25:43,680
the glue is generated, typed, and boring. That's the point. Key takeaway: in Microsoft's ecosystem, agents

342
00:25:43,680 --> 00:25:48,560
plus TypeScript-flavoured scripts orchestrate cleanly while Python stays contained in analytics, where

343
00:25:48,560 --> 00:25:54,880
libraries actually earn their keep. If this saved you time, repay the debt: subscribe, watch the follow-

344
00:25:54,880 --> 00:25:59,440
up on building your first TypeAgent playbook with typed tool contracts and review gates. You'll ship

345
00:25:59,440 --> 00:26:05,120
faster with fewer outages, and finance will finally stop sighing at your invoices. Proceed.


Founder of m365.fm, m365.show and m365con.net

Mirko Peters is a Microsoft 365 expert, content creator, and founder of m365.fm, a platform dedicated to sharing practical insights on modern workplace technologies. His work focuses on Microsoft 365 governance, security, collaboration, and real-world implementation strategies.

Through his podcast and written content, Mirko provides hands-on guidance for IT professionals, architects, and business leaders navigating the complexities of Microsoft 365. He is known for translating complex topics into clear, actionable advice, often highlighting common mistakes and overlooked risks in real-world environments.

With a strong emphasis on community contribution and knowledge sharing, Mirko is actively building a platform that connects experts, shares experiences, and helps organizations get the most out of their Microsoft 365 investments.