In this episode of the M365.fm podcast, Mirko Peters speaks with Microsoft MVP and MCT Anitha Eswaran about the realities of integrating Dynamics 365 Finance & Operations (D365FO) into modern enterprise environments. The discussion focuses on why integrations often become fragile, difficult to scale, and hard to govern when organizations rely on outdated architectural thinking or tightly coupled systems.

The episode explores how Azure-native integration patterns can improve reliability, scalability, and long-term maintainability. Anitha explains the importance of event-driven architecture, asynchronous communication, APIs, and message-based systems when connecting D365FO with Microsoft 365, Power Platform, external SaaS platforms, and legacy enterprise applications. Rather than building direct point-to-point integrations, the conversation emphasizes designing loosely coupled systems that can evolve without constantly breaking dependencies.

Another key topic is governance and operational visibility. The speakers discuss how telemetry, monitoring, and centralized integration management become critical as environments grow more complex. They also highlight common mistakes organizations make during ERP modernization projects, including poor dependency mapping, weak authentication design, and underestimating integration lifecycle management.

The episode also looks at the growing influence of AI within ERP ecosystems. Mirko and Anitha discuss how AI-powered workflows, Copilot experiences, and intelligent automation are changing expectations around business systems, while also increasing the need for secure architecture, clean data models, and strong governance boundaries.

Overall, the conversation provides practical guidance for architects, consultants, and IT leaders who want to build resilient, scalable, and future-ready Dynamics 365 Finance & Operations integrations using Microsoft Azure and modern cloud design principles.

You play a key role in shaping how your enterprise connects and grows with D365FO integrations. As Dynamics 365 Finance and Operations evolves from a traditional ERP into a cloud-native, integration-ready platform, you must understand how scalable architecture supports business needs. The shift to the cloud means integrations now require horizontal scaling and resilience. You can boost reliability by designing for distributed systems and using event-driven patterns. In D365FO, robust and secure integration lets your business adapt quickly, making Dynamics 365 F&O a foundation for digital transformation.

Key Takeaways

  • Understand the importance of D365FO integrations for automating workflows and improving data accuracy.
  • Choose the right integration pattern based on your business needs, such as batch data movement for large migrations or real-time sync for immediate updates.
  • Utilize event-driven architecture to create responsive integrations that react to changes in real time.
  • Implement a hub-and-spoke model for easier management of multiple integrations and to enhance scalability.
  • Focus on modular design to create independent components that can be updated without affecting the entire system.
  • Prioritize security by using Azure Active Directory for authentication and following best practices for access control.
  • Regularly monitor your integrations to quickly identify and resolve issues, ensuring smooth operations.
  • Embrace continuous improvement by reviewing and updating your integration strategies as your business evolves.

D365FO Integrations Overview

What Are D365FO Integrations

You interact with D365FO integrations when you connect Dynamics 365 Finance and Operations to other systems. These integrations let you move data, automate processes, and share information across your enterprise. You can use integration types such as batch data movement, real-time sync, or event-driven triggers. Each approach solves a different business need. For example, batch data APIs support large-scale migrations, while OData services enable real-time operations for web and mobile apps.

Tip: Choosing the right integration pattern depends on your business goals, transaction volume, and how often you need data updates.

Here is a table showing common integration scenarios in Dynamics 365 Finance & Supply Chain Management:

| Integration Pattern | Description |
| --- | --- |
| OData Services | Best for real-time, synchronous data exchange; suitable for CRUD operations from web/mobile apps. |
| Data Management Framework | Ideal for high-volume, asynchronous bulk data operations, such as large migrations or nightly syncs. |
| Business Events | Enables event-driven integrations, allowing systems to react to published business events. |
| Dual-write | Provides near real-time, bidirectional synchronization between Dynamics 365 and Dataverse. |

Why Integrations Matter in Dynamics 365 F&O

You rely on integrations to unlock the full power of D365FO. Integrations help you automate workflows, reduce manual tasks, and improve data accuracy. When you connect Dynamics 365 F&O to other applications, you gain agility and speed. Your business can respond faster to changes, scale operations, and support digital transformation. Integrations also make your ERP more flexible, letting you adapt to new requirements without major system changes.

You see integrations as the backbone of modern enterprise architecture. They enable seamless communication between systems, keeping your data consistent and reliable. You can use integration types such as event-driven triggers or batch processing to match your business needs.

Key Concepts in D365 Integrations

You need to understand several core concepts to master D365FO integrations. Each concept helps you build robust and scalable solutions. Here is a table that summarizes these key ideas:

| Integration Pattern | Description | Use Cases |
| --- | --- | --- |
| Batch Data APIs | Supports data import/export through batch processing. | Large-scale data migrations, periodic updates, master data imports. |
| OData Services | REST-based API for real-time data access. | Real-time operations, mobile apps, lightweight integrations. |
| Business Events | Triggers external systems based on in-app occurrences. | Alerting external systems, real-time event-driven integrations. |
| Recurring Integration | Uses DMF for regularly scheduled imports/exports. | Daily syncs of pricing data, supplier information updates. |
| Electronic Reporting | Enables creation of document formats for compliance reporting. | Exporting financial statements, tax reports, invoice formatting. |

You build integrations by combining these patterns. You select batch APIs for periodic updates, OData for real-time sync, and business events for event-driven automation. You use recurring integration to schedule regular data exchanges. Electronic reporting helps you meet compliance needs.

Note: Understanding these concepts gives you the foundation to design integrations that scale and adapt as your business grows.

Integration Types in Dynamics 365 Finance & Supply Chain Management

When you work with integrations in D365, you need to understand the main types that support your business. Each type helps you connect systems, move data, and automate processes in different ways. Let’s explore the core integration types you will use in Dynamics 365 Finance & Supply Chain Management.

Data Integrations

Data integrations help you move information between D365 and other systems. You can choose the right method based on your needs for speed, volume, and reliability.

Batch Data Movement

Batch data movement lets you transfer large amounts of data at scheduled times. You often use this method for data migrations, nightly updates, or syncing master data. The Data Management Framework (DMF) in D365 gives you tools to import and export data in bulk. You can schedule jobs to run during off-peak hours, which helps you avoid performance issues.

Tip: Use batch data movement when you need to process high volumes of records without impacting real-time operations.
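Conceptually, a batch job moves records in fixed-size chunks that are processed in sequence. The sketch below is plain Python, not the DMF API; the record names and batch size are invented for illustration:

```python
def chunk_records(records, batch_size):
    """Split a large record set into fixed-size batches for scheduled processing."""
    if batch_size <= 0:
        raise ValueError("batch_size must be positive")
    return [records[i:i + batch_size] for i in range(0, len(records), batch_size)]

# Ten customer records moved in batches of four -> three batches of 4, 4, and 2.
batches = chunk_records([f"CUST-{n:03d}" for n in range(10)], batch_size=4)
print([len(b) for b in batches])  # [4, 4, 2]
```

A scheduled job would then process one batch at a time during off-peak hours, keeping load on the system predictable.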

Real-Time Sync

Real-time sync keeps your data up to date across systems as soon as changes happen. You use this approach when you need immediate updates, such as syncing inventory levels or customer orders. Real-time sync relies on APIs or services that push data instantly. This method supports business agility and helps you respond quickly to changes.

Application Integrations

Application integrations connect D365 with other business applications. These integrations let you automate tasks and share information between systems.

OData and REST APIs

OData and REST APIs give you a way to access and update data in D365 from external apps. You can use these APIs for real-time operations, such as creating sales orders from a web portal or updating customer records from a mobile app. OData supports standard CRUD operations, making it easy to build integrations that fit your needs.
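You can see the shape of such a call by building the query URL yourself. The sketch below constructs an OData query against a data entity using standard OData v4 query options; the environment URL and entity name are placeholders, and no request is actually sent:

```python
from urllib.parse import urlencode

def build_odata_url(base_url, entity, select=None, filter_expr=None, top=None):
    """Build an OData query URL using the $select, $filter, and $top options."""
    params = {}
    if select:
        params["$select"] = ",".join(select)
    if filter_expr:
        params["$filter"] = filter_expr
    if top is not None:
        params["$top"] = str(top)
    query = f"?{urlencode(params)}" if params else ""
    return f"{base_url.rstrip('/')}/data/{entity}{query}"

# Hypothetical environment URL and entity; verify names against your system.
url = build_odata_url(
    "https://contoso.operations.dynamics.com",
    "CustomersV3",
    select=["CustomerAccount", "OrganizationName"],
    filter_expr="dataAreaId eq 'usmf'",
    top=10,
)
print(url)
```

A real integration would send this URL with an OAuth bearer token and handle paging; the point here is only how the query options compose.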

Event-Driven Integrations

Event-driven integrations let you react to specific actions in D365. When a business event occurs, such as a new invoice or shipment, D365 can send a notification to another system. You can use this pattern to trigger workflows, alert users, or update external databases. Event-driven integrations help you build responsive and scalable solutions.
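On the receiving side, a system typically inspects the event payload and routes it to a handler. The sketch below assumes an illustrative payload shape; field names such as BusinessEventId are modeled on D365FO business events but should be verified against your environment:

```python
import json

def dispatch_business_event(raw, handlers):
    """Parse an event payload and route it to the handler registered for its id."""
    event = json.loads(raw)
    event_id = event.get("BusinessEventId", "")
    handler = handlers.get(event_id)
    if handler is None:
        return f"no handler registered for {event_id!r}"
    return handler(event)

def on_invoice_posted(event):
    return f"invoice {event['InvoiceNumber']} forwarded to the ledger system"

# Hypothetical event id and fields, for illustration only.
handlers = {"SalesInvoicePostedBusinessEvent": on_invoice_posted}
payload = json.dumps({
    "BusinessEventId": "SalesInvoicePostedBusinessEvent",
    "InvoiceNumber": "INV-001",
})
print(dispatch_business_event(payload, handlers))
```

Registering a new handler extends the integration without touching the sender, which is the core benefit of the event-driven pattern.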

Process Integrations

Process integrations help you automate and coordinate business processes across systems.

Workflow Automation

Workflow automation lets you define steps that run automatically when certain conditions are met. You can use tools like Power Automate to create workflows that connect D365 with other apps. For example, you might automate approval processes or send notifications when tasks are complete.

Orchestration Patterns

Orchestration patterns help you manage complex processes that involve multiple systems. You can use Azure Logic Apps to design orchestrations that call APIs, move data, and handle errors. This approach gives you control over the flow of information and ensures that each step happens in the right order.
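In plain Python, the core idea of orchestration, ordered steps with compensation on failure, looks roughly like this; the step names are invented for illustration, and a real deployment would express the same flow declaratively in Logic Apps:

```python
def run_orchestration(steps):
    """Run (name, action, compensation) steps in order; on failure,
    undo the already-completed steps in reverse order."""
    completed, log = [], []
    for name, action, compensate in steps:
        try:
            action()
        except Exception as exc:
            log.append(f"{name}: failed ({exc})")
            for done_name, undo in reversed(completed):
                undo()
                log.append(f"{done_name}: compensated")
            return log
        completed.append((name, compensate))
        log.append(f"{name}: ok")
    return log

# A two-step flow where the second step fails, so the first is compensated.
def fail():
    raise RuntimeError("downstream API unavailable")

log = run_orchestration([
    ("reserve-stock", lambda: None, lambda: None),
    ("post-invoice", fail, lambda: None),
])
print(log)
```

The compensation pass is what keeps multi-system processes consistent when one step in the middle goes wrong.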

Note: Choosing the right integration type helps you build solutions that scale with your business and support your goals.

Scalable Patterns for D365FO Integrations

Point-to-Point vs. Hub-and-Spoke

You often start with point-to-point integrations when connecting two systems. This pattern links each application directly. You set up a connection between Dynamics 365 Finance and Operations and another system. Point-to-point works well for simple use cases. You can quickly build integrations for small data exchanges or basic automation.

As your enterprise grows, you see challenges with point-to-point architecture. Each new integration adds complexity. You must manage multiple connections, which increases maintenance. You risk data inconsistency and limited scalability.

Hub-and-spoke patterns solve these problems. You connect each system to a central hub. The hub manages data flow and orchestration. You simplify integration architecture and reduce the number of direct connections. You gain better control over data movement and error handling. Hub-and-spoke supports future scalability and helps you adapt to new business requirements.

| Pattern | Description | Pros | Cons |
| --- | --- | --- | --- |
| Point-to-Point | Direct connection between two systems | Simple, fast setup | Hard to scale, complex to maintain |
| Hub-and-Spoke | Central hub manages all integrations | Easy to manage, scalable | Requires a central platform |

Tip: Choose hub-and-spoke when you need to support many integrations and want to simplify maintenance.
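A hub can be sketched in a few lines. The class below is an illustrative in-memory hub, not a real integration platform; each spoke registers a handler once, and the hub routes by message type so no system connects point-to-point to any other:

```python
class IntegrationHub:
    """Minimal hub: spokes register handlers, and the hub routes by message type."""

    def __init__(self):
        self._routes = {}

    def register(self, message_type, spoke_name, handler):
        self._routes.setdefault(message_type, []).append((spoke_name, handler))

    def send(self, message_type, payload):
        """Deliver the payload to every spoke registered for this message type."""
        return [(name, handler(payload))
                for name, handler in self._routes.get(message_type, [])]

# Two hypothetical spokes receive the same customer update via the hub.
hub = IntegrationHub()
hub.register("customer.updated", "crm", lambda p: f"CRM stored {p['id']}")
hub.register("customer.updated", "warehouse", lambda p: f"WMS stored {p['id']}")
results = hub.send("customer.updated", {"id": "CUST-001"})
print(results)
```

Adding a third spoke means one more `register` call; the sender and the other spokes are untouched, which is why the pattern scales better than direct connections.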

Message-Based Patterns

Message-based integration patterns help you build scalable and reliable D365FO integrations. You use messaging to decouple systems. Each application sends and receives messages through a broker. You avoid direct dependencies and reduce risk during updates.

Publish/Subscribe

Publish/subscribe lets you send messages to multiple subscribers. You publish an event in Dynamics 365 Finance & Supply Chain Management. Any system that subscribes receives the message. You use this pattern for real-time notifications, business events, or workflow triggers.

You improve scalability by decoupling producers and consumers. You can add new subscribers without changing the publisher. You support future scalability and adapt to new business needs.

  • Organizations using Azure Service Bus for messaging report fewer integration failures during platform updates, around 40% fewer than with direct API connections.
  • You gain flexibility to add or remove systems as your integration strategy evolves.
  • You ensure reliable delivery of messages, even if some systems are offline.
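The pattern is easy to see in miniature. The broker below is an in-memory stand-in for a messaging topic (a production system would use Azure Service Bus topics); note that the publisher never changes when a subscriber is added:

```python
from collections import defaultdict

class TopicBroker:
    """In-memory pub/sub stand-in: publishers never know who subscribes."""

    def __init__(self):
        self._subscriptions = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscriptions[topic].append(callback)

    def publish(self, topic, message):
        """Fan the message out to all subscribers; return the delivery count."""
        for callback in self._subscriptions[topic]:
            callback(message)
        return len(self._subscriptions[topic])

broker = TopicBroker()
received = []
broker.subscribe("invoice.posted", received.append)
broker.publish("invoice.posted", {"invoice": "INV-001"})

# A new subscriber is added later without any change to the publisher.
audit = []
broker.subscribe("invoice.posted", audit.append)
count = broker.publish("invoice.posted", {"invoice": "INV-002"})
print(count)  # 2
```

A real broker adds what this sketch lacks: durable storage so offline subscribers still receive messages, delivery guarantees, and dead-lettering.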

Queue-Based Processing

Queue-based processing helps you manage workloads and balance system resources. You send messages to a queue. Each consumer processes messages one at a time. You use queues for batch jobs, asynchronous tasks, or error handling.

You improve reliability by storing messages until they are processed. You avoid data loss and handle spikes in transaction volume. You can scale consumers to match demand. Queue-based processing supports robust integrations in D365FO.
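A simplified queue consumer with retry and dead-lettering might look like the sketch below. Real brokers such as Azure Service Bus track delivery counts and provide dead-letter queues natively; this only illustrates the flow:

```python
import queue

def process_queue(work_queue, handler, max_attempts=3):
    """Drain a queue one message at a time; requeue failures until
    max_attempts is reached, then move the message to a dead-letter list."""
    results, dead_letter = [], []
    while not work_queue.empty():
        attempts, item = work_queue.get()
        try:
            results.append(handler(item))
        except Exception:
            if attempts + 1 < max_attempts:
                work_queue.put((attempts + 1, item))
            else:
                dead_letter.append(item)
    return results, dead_letter

# Hypothetical order messages; one of them always fails to post.
q = queue.Queue()
for order in ["ORD-1", "ORD-BAD", "ORD-2"]:
    q.put((0, order))

def post_order(order):
    if order == "ORD-BAD":
        raise ValueError("malformed order")
    return f"posted {order}"

results, dead_letter = process_queue(q, post_order)
print(results, dead_letter)  # ['posted ORD-1', 'posted ORD-2'] ['ORD-BAD']
```

The dead-letter list is what prevents one bad message from blocking the rest of the queue, and it gives operators a place to inspect and replay failures.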

Note: Message-based patterns help you build resilient integration architecture. You reduce risk and improve performance in your ERP environment.

API Gateway Pattern

The API gateway pattern gives you a single entry point for all integrations. You route requests through a gateway and manage authentication, security, and traffic control in one place. You use API gateways to connect D365 to external systems, mobile apps, or partner platforms.

You simplify integration architecture by centralizing APIs. You enforce security policies and monitor usage. You can scale your gateway to handle high transaction volumes. API gateway pattern supports hybrid integrations and future scalability.

| Feature | Benefit |
| --- | --- |
| Centralized APIs | Easy to manage and secure |
| Traffic Control | Balance load and prevent overload |
| Monitoring | Track usage and detect issues |

Tip: Use API gateways to protect your data and streamline integrations in Dynamics 365 F&O.
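The gateway's job, authenticate first and then route, reduces to a small function in sketch form. The request and response shapes below are invented for illustration; a real gateway such as Azure API Management handles this declaratively through policies:

```python
def gateway(request, routes, valid_keys):
    """Single entry point: reject unauthenticated requests, then route by path."""
    if request.get("api_key") not in valid_keys:
        return {"status": 401, "body": "unauthorized"}
    handler = routes.get(request.get("path"))
    if handler is None:
        return {"status": 404, "body": "no such route"}
    return {"status": 200, "body": handler(request)}

# One hypothetical backend route and one valid partner key.
routes = {"/customers": lambda req: ["CUST-001", "CUST-002"]}
ok = gateway({"api_key": "partner-key", "path": "/customers"}, routes, {"partner-key"})
denied = gateway({"api_key": "wrong", "path": "/customers"}, routes, {"partner-key"})
print(ok["status"], denied["status"])  # 200 401
```

Because every request passes through one function, this is also the natural place to add rate limiting, logging, and usage metrics.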

You build your integration strategy by combining these patterns. You choose the right approach based on business needs, transaction volume, and future scalability. You create a flexible architecture that supports growth and innovation in your enterprise.

Microservices and Modular Design

You can boost your integration strategy by using microservices and modular design. Microservices break down large applications into smaller, independent services. Each service handles a specific business function. This approach helps you manage complexity and scale your solutions as your needs grow.

  • Microservices architecture allows you to deploy each service independently. You can update or fix one service without affecting others.
  • You improve modularity by separating business logic into focused components. This makes your integrations easier to maintain.
  • Microservices communicate efficiently with each other. This supports scalability and helps you handle more transactions as your business expands.
  • You can quickly adapt to changes in demand or business requirements. This flexibility supports operational efficiency and keeps your D365 integrations future-ready.

When you use modular design, you create reusable building blocks. You can combine these blocks to solve new business challenges. This design pattern fits well with cloud-native platforms like D365.

Tip: Start small by identifying business functions that can become independent services. Over time, you can expand your microservices landscape as your integration needs grow.

Event-Driven Architecture

Event-driven architecture helps you build responsive and scalable integrations in D365. In this pattern, systems react to events as they happen. You do not need to wait for scheduled jobs or manual triggers.

You can use event-driven architecture in many ways:

  • Trigger external systems when a purchase order is confirmed. This keeps your supply chain partners updated in real time.
  • Notify external systems when an invoice is posted. This speeds up financial processes and improves accuracy.
  • Enable real-time integrations with Microsoft or third-party services. This supports business agility and helps you respond quickly to changes.

Event-driven architecture reduces delays and improves data consistency. You can connect D365 to other systems using business events or messaging platforms. This pattern supports high transaction volumes and helps you scale your integrations as your business grows.

Note: Event-driven integrations help you automate processes and keep your data up to date across all connected systems.

Hybrid Integration Approaches

Hybrid integration approaches combine modern cloud-based solutions with existing on-premises or legacy systems. You often need this pattern when your organization uses both new and old technologies.

Hybrid integrations help you solve common challenges in D365 environments. The table below shows how these approaches address business needs:

| Challenges Addressed | Business Outcomes |
| --- | --- |
| Disconnected CRM and ERP systems | Unified enterprise data |
| Manual data entry and delays | Faster workflows through automation |
| Lack of real-time reporting | Improved operational efficiency |
| Failed or outdated integrations | Better analytics and reporting |
| Legacy systems limiting scalability | Reduced operational costs |

You can use hybrid integration to connect D365 with legacy applications, cloud services, and partner platforms. This approach lets you modernize your architecture at your own pace. You keep your business running smoothly while you upgrade systems over time.

Tip: Use hybrid integration patterns to bridge the gap between old and new systems. This helps you deliver value quickly without disrupting your operations.

Tools and Technologies for D365 Integrations

Microsoft Power Platform

Power Automate

You use Power Automate to build automated workflows that connect D365 with other systems. This tool helps you trigger actions based on changes in data, such as sending notifications or updating records. You can set up flows that move information between modules or external platforms. Power Automate supports both simple and complex integration scenarios, making it easy to maintain consistency across your business.

  • You trigger business events or workflows when data changes.
  • You keep operations synchronized across platforms.
  • You reduce manual tasks and improve efficiency.

Power Apps

Power Apps lets you create custom business applications that work with D365. You design apps for specific tasks, such as managing inventory or tracking expenses. These apps connect to data sources through APIs, giving you real-time access to information. You can build solutions that fit your unique needs without writing much code.

Here is a table showing common Power Platform components and their functions:

| Component | Function |
| --- | --- |
| Power BI | For analytics |
| Power Apps | For building custom business apps |
| Power Automate | For creating automated workflows |
| Power Virtual Agents | For intelligent bots |

Azure Integration Services

Logic Apps

Logic Apps helps you automate workflows across multiple applications. You use this tool to set up event-driven processes, such as real-time notifications or multi-step approvals. Logic Apps connects D365 to cloud services, on-premises systems, and partner platforms. You gain the flexibility to scale your integrations as your business grows.

Azure Service Bus

Azure Service Bus provides reliable messaging queues for your integration needs. You send messages between systems without direct connections. This approach decouples communication and ensures high-reliability message delivery. You handle spikes in transaction volume and avoid data loss.

API Management

API Management gives you a central place to manage and secure your API connections. You control access, monitor usage, and enforce security policies. You protect your data and streamline integration between D365 and other platforms.

Tip: Use Azure Integration Services to enhance scalability and security. Monitor performance, handle throttling, and document endpoints to ensure data consistency.

Data Management Framework (DMF)

You rely on the Data Management Framework for high-volume data operations in D365. DMF supports asynchronous processing, making it ideal for bulk data migrations and updates. You import and export data using batch processing, and DMF accepts Excel, CSV, and XML file formats. This tool fits use cases like nightly data syncs, initial data loads, and master data imports.

| Function/Feature | Description |
| --- | --- |
| High-volume data operations | Designed for handling large-scale data migrations and updates. |
| Asynchronous processing | Supports non-real-time bulk data integrations. |
| Data import/export | Facilitates data import and export through entities using batch processing. |
| File format support | Accepts Excel, CSV, and XML file formats. |
| Ideal use cases | Nightly data syncs, initial data loads, and master data imports. |

Note: Choosing the right tools helps you build scalable and secure integration solutions for D365.

Custom Connectors and Middleware

You often need to connect Dynamics 365 Finance & Operations (D365FO) with systems that do not have out-of-the-box connectors. Custom connectors and middleware help you bridge these gaps. They let you build integrations that fit your unique business needs.

Custom connectors allow you to define how D365FO talks to other applications. You can use them to connect to web services, REST APIs, or even legacy systems. Middleware acts as a central layer that manages data flow between D365FO and other platforms. This approach gives you more control and flexibility.

Middleware solutions often provide visual mapping tools. These tools let you match data fields between systems without writing code. You can see how information moves from one system to another. This makes the integration process easier to understand and manage.

Here is a table that shows how custom connectors and middleware extend your integration capabilities:

| Feature | Description |
| --- | --- |
| Visual mapping | Simplifies the integration process by allowing users to visually connect data points. |
| Cross-system orchestration | Manages workflows across different systems, enhancing data flow and integration. |
| Integrated error handling | Provides mechanisms to handle errors during data transfer, ensuring reliability. |
| Example | A company uses Middleware Integration Toolkit to synchronize customer and order data between D365 Finance and a legacy warehouse system. |

You can use middleware to automate complex workflows. For example, you might need to synchronize customer and order data between D365FO and a legacy warehouse system. Middleware can manage this process and handle errors if something goes wrong.

  • Native connectors in Dynamics 365 sometimes cannot handle complex scenarios like multi-warehouse inventory or real-time synchronization.
  • Middleware solutions automate bidirectional data flow. This is important when you manage large product catalogs or process orders from many channels.

Custom connectors and middleware also help you scale your integrations. As your business grows, you can add new systems or update existing ones without starting from scratch. You keep your architecture flexible and ready for future needs.

Third-Party Integration Tools

You may find that your integration needs go beyond what Microsoft tools offer. Third-party integration tools can help you fill these gaps. These tools often provide pre-built connectors, advanced mapping features, and support for a wide range of systems.

Many third-party platforms focus on making integrations easier to set up and manage. You can use drag-and-drop interfaces to build workflows. Some tools offer monitoring dashboards, so you can track data movement and spot issues quickly.

Popular third-party integration tools include solutions like KingswaySoft, Scribe, and Celigo. These platforms support connections to cloud services, on-premises databases, and industry-specific applications. You can use them to automate data transfers, synchronize records, or trigger actions based on business events.

When you choose a third-party tool, look for features like:

  • Support for both cloud and on-premises systems
  • Built-in error handling and retry logic
  • Detailed logging and monitoring
  • Flexible mapping and transformation options

Third-party tools can save you time and reduce complexity. They help you deliver reliable integrations that keep your business running smoothly.

Best Practices for Dynamics 365 F&O Integrations

Modular and Decoupled Design

You can build strong D365FO integrations by focusing on modular and decoupled design. This approach lets you create systems where each part works on its own. When you use a loosely coupled architecture, you allow producers and consumers to operate independently. This flexibility helps you adapt to changes quickly.

Decoupling your systems means the sending application does not need to know about the receiving systems. This promotes modularity and makes your integration architecture easier to manage. You can tailor specific components to the systems you want to connect. When you design with modularity, you also make debugging simpler because you can track issues at the module level.

You gain higher performance because components can run independently and in parallel. This design also improves scalability, so you can make changes or add new features without affecting the whole system. Testing becomes easier since you can test each module separately. You also increase agility, allowing you to upgrade certain parts without touching the entire application.

  • Loosely coupled architecture enhances flexibility.
  • Decoupling systems promotes modularity.
  • Components execute independently for better performance.
  • Modularity allows for tailored solutions.
  • Debugging is easier at the module level.
  • Scalability supports easy changes and enhancements.
  • Testing is more manageable with separate workflows.
  • Agility enables targeted upgrades.

Tip: Start with small, independent modules and connect them using proven integration patterns. This will help you scale your integrations as your enterprise grows.

Reusability and Maintainability

You should always aim for reusability and maintainability in your D365 integrations. These practices help you save time and reduce errors as your business processes change. When you reuse components, you avoid building the same logic over and over. Maintainable solutions are easier to update and support.

Here is a table that shows best practices for reusability and maintainability in large-scale D365FO integrations:

| Best Practice | Explanation |
| --- | --- |
| Configuration-first approach | Use business rules, Power Automate, and workflows to minimize custom code and lower maintenance. |
| Modularization | Separate region-specific customizations into managed solutions to prevent conflicts. |
| Lifecycle governance | Use structured Application Lifecycle Management (ALM) for predictable deployments and rollbacks. |

You can use a configuration-first approach by relying on business rules and automation tools. This reduces the need for custom code and lowers your maintenance workload. Modularization helps you keep region-specific changes separate, so you avoid conflicts and keep your core logic clean. Lifecycle governance ensures you have a clear process for deploying updates and rolling back changes if needed.

Note: Reusable and maintainable integrations help you keep your Dynamics 365 Finance & Supply Chain Management environment stable and ready for future growth.

Security and Governance

You must protect your integration endpoints and data at all times. Security and governance are key parts of any successful D365FO integration project. You need to define clear policies for access control and incident response. Regularly review your compliance requirements to keep up with new regulations. Invest in continuous monitoring to get real-time insights into your security posture.

You should also deploy web application firewalls to protect your integration endpoints. Monitor network traffic with intrusion detection and prevention systems. Implement DNS security to filter and watch for threats.

Secure App Registration

You need to register every application that connects to your Dynamics 365 Finance and Operations environment. Secure app registration ensures only trusted apps can access your data. Use Azure Active Directory (Azure AD, now branded Microsoft Entra ID) to manage app identities and permissions. Always follow the principle of least privilege, granting each app only the access it needs.

Managed Authentication

You should use managed authentication to control how users and apps connect to your integrations. Azure AD provides strong identity verification and supports multi-factor authentication. This helps you prevent unauthorized access and keeps your data safe. Managed authentication also makes it easier to track who accessed what and when.
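Under the hood, app-to-app authentication against Azure AD usually follows the OAuth 2.0 client credentials flow. The sketch below only builds the token request for the Azure AD token endpoint without sending it; every identifier value is a placeholder:

```python
from urllib.parse import urlencode

def build_token_request(tenant_id, client_id, client_secret, resource_url):
    """Build the OAuth 2.0 client-credentials token request for Azure AD.
    The endpoint and parameter names follow the standard flow; all values
    passed in here are placeholders."""
    endpoint = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": f"{resource_url}/.default",
    })
    return endpoint, body

endpoint, body = build_token_request(
    "tenant-id",                                   # placeholder tenant
    "app-id",                                      # placeholder app registration
    "secret",                                      # placeholder secret (use Key Vault in practice)
    "https://contoso.operations.dynamics.com",     # placeholder environment URL
)
print(endpoint)
```

A real client POSTs this body to the endpoint, caches the returned access token until it expires, and sends it as a bearer token on each API call. Store secrets in a vault, never in code.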

Role-Based Security

Role-based security lets you control what each user or app can do in your D365FO environment. Assign roles based on job functions and limit access to sensitive data. This approach helps you protect your ERP and maintain data accuracy. Review roles regularly and update them as your business changes.

Tip: Combine secure app registration, managed authentication, and role-based security to create a strong security foundation for your integrations.

By following these architectural best practices, you build integrations that are resilient, scalable, and secure. You support business agility and protect your enterprise as you grow with Dynamics 365 F&O.

Monitoring and Observability

You need strong monitoring and observability to keep your D365FO integrations healthy. These practices help you spot issues early and keep your business running smoothly. You can track data flows, watch for failures, and measure performance. Good observability gives you the power to act before small problems become big ones.

Logging and Diagnostics

You should log every important action in your integration workflows. Logs help you understand what happens inside your system. They show you when data moves, when errors occur, and how long each process takes. You can use logs to find the root cause of problems.

Diagnostics tools give you deeper insights. You can trace transactions across systems. You can see where delays happen or where data gets stuck. This information helps you fix issues quickly. You should set up logging and diagnostics from the start. This makes troubleshooting easier and keeps your integrations reliable.

Tip: Store logs in a secure, centralized location. Use tools like Azure Monitor or Application Insights to collect and analyze your data.
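One practical approach is structured logging: emit one JSON object per line so your collector can index fields. The sketch below uses Python's standard logging module; the logger name and field names are illustrative:

```python
import io
import json
import logging

def configure_integration_logger(stream):
    """Route integration logs to a stream, one JSON object per line, so a
    collector (e.g. Azure Monitor / Application Insights) can index fields."""
    logger = logging.getLogger("d365fo.integration")  # illustrative name
    logger.setLevel(logging.INFO)
    logger.handlers = [logging.StreamHandler(stream)]
    logger.propagate = False
    return logger

def log_event(logger, operation, status, duration_ms):
    # Field names are illustrative; pick a schema and keep it consistent.
    logger.info(json.dumps(
        {"operation": operation, "status": status, "duration_ms": duration_ms}))

buffer = io.StringIO()
logger = configure_integration_logger(buffer)
log_event(logger, "customer-sync", "ok", 120)
record = json.loads(buffer.getvalue())
print(record["operation"], record["duration_ms"])  # customer-sync 120
```

In production you would point the handler at a file or a telemetry SDK instead of an in-memory buffer, but the structured-record idea is the same.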

Alerting and Incident Response

You must set up alerts for critical events. Alerts tell you when something goes wrong. You can get notifications by email, text, or through dashboards. Fast alerts help you respond before users notice a problem.

Incident response plans guide your team when issues arise. You should define clear steps for handling incidents. Assign roles so everyone knows what to do. Practice your response plan to make sure your team acts quickly and confidently.

  • Set up alerts for failed data transfers, API errors, or slow performance.
  • Use dashboards to monitor integration health in real time.
  • Review incidents after they happen to improve your response process.

Note: Quick detection and response reduce downtime and protect your business from bigger problems.

Error Handling and Recovery

You must design error handling into your integrations from the beginning. This prevents system failures and protects your data from corruption. A strong error handling strategy helps you recover quickly and keeps your business running.

A robust error architecture should include:

  • Contextual capture: Record details about each error. This helps you understand why failures happen.
  • Intelligent routing: Send errors to the right people or systems. This speeds up resolution.
  • Resolution tracking: Track how you fix errors. Make sure you address root causes, not just symptoms.

You should not wait to add error handling later. Build it into your system from the start. This approach keeps your integrations resilient and reliable.

Tip: Use retry logic for temporary failures. Log all errors and resolutions for future analysis.
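The tip above can be sketched in code. The helper below is a minimal, hypothetical example (not a D365FO API): it retries a transient operation with exponential backoff and records a contextual error entry for every failure, so root-cause analysis has data to work with.

```python
import time

def call_with_retry(operation, max_attempts=3, base_delay=0.1, error_log=None):
    """Retry a flaky operation, capturing context for every failure."""
    error_log = error_log if error_log is not None else []
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except Exception as exc:
            # Contextual capture: what failed, when, and on which attempt.
            error_log.append({
                "attempt": attempt,
                "error": str(exc),
                "timestamp": time.time(),
            })
            if attempt == max_attempts:
                raise  # Permanent failure: surface it after logging.
            time.sleep(base_delay * 2 ** (attempt - 1))  # Exponential backoff.
```

In a real integration, the `error_log` entries would go to a centralized sink such as Application Insights rather than a local list.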

Performance Optimization

You need to optimize performance for high-volume D365FO integrations. Fast and efficient integrations keep your business data fresh and your users happy. You can use different techniques and tools to boost performance.

Here is a table that shows effective performance optimization techniques:

| Technique | Description | Advantages |
| --- | --- | --- |
| Batch Data API | Handles recurring integrations and large data volumes efficiently. | Stable platform, supports importing data in sequence, skips empty files for faster processing. |
| Data Management Framework | Designed for high-volume, asynchronous bulk data operations. | Ideal for large migrations and nightly syncs, avoids real-time data staleness. |

You can choose the right approach based on your needs. For example, use the Data Management Framework for big data migrations. Use Batch Data API for regular, high-volume updates.


Note: Test your integrations under real-world loads. Monitor performance and adjust your approach as your business grows.

Implementation Challenges and Solutions in D365FO Integrations

Testing and Validation Strategies

You face many challenges when you build D365FO integrations. Testing and validation help you avoid errors and keep your enterprise resource planning system reliable. You need a scalable architecture and structured documentation. Comprehensive testing ensures your integrations perform well across the enterprise.

You should validate data before writing it into Dynamics 365 Finance and Operations. Use staging tables or pre-validation logic in the Data Management Framework. Apply field-level validations in Power Automate or Azure Functions. After integration, perform reconciliation checks to confirm data accuracy. Design your flows to handle partial failures and ensure atomic transactions.
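As an illustration of pre-validation, the sketch below partitions incoming records into valid and rejected sets before any write. The field names and rules are hypothetical; a real integration would encode its own entity rules in a staging step, Power Automate, or an Azure Function.

```python
def validate_records(records, required_fields, max_lengths):
    """Split incoming records into valid and rejected sets before import."""
    valid, rejected = [], []
    for record in records:
        errors = []
        # Field-level checks: required values and maximum lengths.
        for field in required_fields:
            if not record.get(field):
                errors.append(f"missing required field: {field}")
        for field, limit in max_lengths.items():
            if len(str(record.get(field, ""))) > limit:
                errors.append(f"{field} exceeds {limit} characters")
        if errors:
            # Keep the record together with its errors for reconciliation.
            rejected.append({"record": record, "errors": errors})
        else:
            valid.append(record)
    return valid, rejected
```

Only the `valid` set would be submitted; the `rejected` set feeds your reconciliation and alerting process.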

Tip: Always select the right integration pattern and enforce principles that keep your solution secure and maintainable. Prepare your integrations for future upgrades.

Here are common challenges you may encounter:

  1. Data synchronization can be difficult and may lead to errors.
  2. Compatibility issues arise when you connect systems with different architectures.
  3. Complex business requirements demand technical skills and deep process knowledge.
  4. Security concerns increase as you integrate multiple systems.
  5. Performance issues may occur with large data volumes.

Change Management and Version Control

You must manage changes carefully in D365 integrations. Version control helps you track and manage updates. You can revert changes if needed. Conduct impact analysis before you implement updates. This helps you understand how changes affect integrations, workflows, and reporting.

Continuous monitoring lets you spot issues early and keep your integrations reliable. Maintain clear documentation of updates and changes. This provides clarity and reference for all users. You build a stable integration environment by following these practices.

  • Use version control to manage changes.
  • Analyze the impact of updates.
  • Monitor integrations regularly.
  • Document every change for transparency.

Data Consistency and Integrity

You need to maintain data consistency and integrity across integrated D365FO systems. Data validation rules ensure incoming and outgoing data meets quality standards. Error handling mechanisms track failures and notify users of discrepancies. Data mapping prevents mismatches during transfers. Data auditing tracks changes and integration history for transparency.

| Method | Description |
| --- | --- |
| Data Validation Rules | Establish rules for data quality. |
| Error Handling | Track errors and notify users. |
| Data Mapping | Prevent mismatches during transfer. |
| Data Auditing | Track changes and integration history. |
| REST APIs | Enable automatic updates with other systems. |
| OData Endpoints | Connect F&O with reporting tools for real-time access. |
| Custom APIs | Develop tailored solutions for unique needs. |
| API Security | Use authentication and encryption to secure exchanges. |

You use REST APIs and OData endpoints to connect D365 with other systems. Custom APIs help you meet unique integration needs. API security protects your data during exchanges. These methods keep your ERP integration reliable and your business data accurate.
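To make the OData part concrete, the sketch below composes a query URL of the kind D365FO exposes under its `/data` endpoint. The environment host and entity name are placeholders, and authentication (an Azure AD bearer token header) is omitted for brevity.

```python
from urllib.parse import urlencode

def build_odata_url(base_url, entity, select=None, filter_expr=None, top=None):
    """Compose an OData query URL for a D365FO data entity."""
    params = {}
    if select:
        params["$select"] = ",".join(select)   # Return only these fields.
    if filter_expr:
        params["$filter"] = filter_expr        # Server-side filtering.
    if top is not None:
        params["$top"] = str(top)              # Page size limit.
    query = urlencode(params)
    return f"{base_url.rstrip('/')}/data/{entity}" + (f"?{query}" if query else "")
```

For example, `build_odata_url("https://contoso.operations.dynamics.com", "CustomersV3", select=["CustomerAccount"], top=10)` yields a URL you would then call with an authenticated GET request.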

Note: Strong data consistency and integrity practices help you build trust in your enterprise resource planning environment.

Managing Complexity and Scalability

You face many challenges when you build integrations for Dynamics 365 Finance & Operations. Complexity grows as you connect more systems and handle larger volumes of data. You need a clear plan to manage this complexity and ensure your integrations scale as your business expands.

Start by mapping out your entire business process. Identify every trigger and action that affects your integration landscape. This step helps you see where data moves and where bottlenecks might appear. You can use diagrams or flowcharts to visualize these connections.

You must determine the frequency and volume of data passing through your integrations. High-frequency updates require different strategies than occasional batch jobs. You can reduce overhead by batching messages. This approach lets you process many records at once, saving time and resources.
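Batching is easy to sketch: group records into fixed-size chunks so each call to the target system carries many records instead of one. The batch size below is arbitrary; the right value depends on your payload limits.

```python
def batch_records(records, batch_size=100):
    """Yield fixed-size batches so one call carries many records."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]
```

Each yielded batch becomes a single message or API call, which reduces per-call overhead for high-volume flows.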

Idempotency plays a key role in handling duplicate messages. You design your integrations so that repeated messages do not cause errors or data corruption. This practice keeps your data accurate and prevents unnecessary processing.
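One common way to achieve idempotency, sketched below, is to key every message with a unique ID and skip any ID that has already been processed. The in-memory set here stands in for a persistent store (a database table or cache) that a real integration would use.

```python
class IdempotentProcessor:
    """Process each message at most once, keyed by its message_id."""

    def __init__(self, handler):
        self.handler = handler
        self.seen_ids = set()  # In production, persist this store.

    def process(self, message):
        message_id = message["message_id"]
        if message_id in self.seen_ids:
            return "skipped"   # Duplicate delivery: do nothing.
        self.handler(message)  # Apply the business effect exactly once.
        self.seen_ids.add(message_id)
        return "processed"
```

With this pattern, a redelivered queue message causes no duplicate postings, so retries and at-least-once delivery become safe.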

Service-level agreements (SLAs) help you set expectations for data freshness. You decide how quickly data must update across systems. SLAs guide your monitoring and alerting strategies. You can use the Data Management Framework (DMF) in Dynamics 365 to track integration issues and monitor data flows.

Planning for failures is essential. You build error logging and continuity plans into your design. When something goes wrong, you capture detailed logs and send alerts to the right people. Monitoring systems cover all integration components. You set up alerts for key processes, so you respond quickly to problems.

Security and privacy matter as you scale your integrations. You ensure your monitoring system complies with security principles and privacy regulations. Protecting sensitive data keeps your business safe and builds trust with partners.

Here is a checklist to help you manage complexity and scalability:

  • Map business processes and integration triggers.
  • Determine data frequency and volume.
  • Batch messages to reduce overhead.
  • Design for idempotency to handle duplicates.
  • Set SLAs for data freshness.
  • Use DMF for monitoring and logging.
  • Plan for failures with error logging and continuity.
  • Ensure compliance with security and privacy standards.

Tip: Review your integration architecture regularly. Adjust your strategies as your business grows and new requirements emerge.

You build a strong foundation by following these steps. Your integrations stay reliable, scalable, and ready for future challenges.

Real-World D365 Integration Scenarios

CRM and D365FO Integration

You often need to connect your customer relationship management system with Dynamics 365 Finance and Operations. This integration helps you share customer data, sales orders, and invoices between teams. You can automate lead-to-cash processes and improve customer service by keeping information consistent across platforms.

When you plan D365FO integrations with CRM systems, you must consider several factors. The table below outlines the most important points:

| Key Consideration | Description |
| --- | --- |
| Real-time vs. Batch | Decide if you need real-time data updates or if scheduled batch syncs are enough. |
| Data Volume | Use batch integrations for large data sets; use APIs for smaller, frequent transactions. |
| Frequency of Updates | Choose REST APIs for frequent updates; use scheduled imports for less frequent changes. |
| Security & Compliance | Align with your company’s data governance, encryption, and authentication policies. |

You should also ensure compliance with data protection rules. Always check that third-party tools work well with Dynamics 365 F&O. Plan for future growth so your integration can scale as your business expands.

Legacy System Connectivity

You may still rely on older systems that are critical to your business. Connecting these legacy platforms to D365 can seem challenging, but you can follow best practices to make the process smoother:

  1. Use service accounts with the least privilege. Assign only the permissions needed for each integration. This reduces the risk of unauthorized access.
  2. Monitor performance and failures. Track how your integrations perform and watch for errors using monitoring tools.
  3. Handle throttling and retries gracefully. Design your integrations to respect API limits and use strategies like exponential backoff when you need to retry.
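Point 3 can be sketched as a small timing helper: honor an explicit `Retry-After` value when the service supplies one, and otherwise fall back to capped exponential backoff. The base and cap values are illustrative, not prescribed by any API.

```python
def next_retry_delay(attempt, retry_after=None, base=1.0, cap=60.0):
    """Delay in seconds before retry `attempt` (1-based).

    An explicit Retry-After from the throttling service wins;
    otherwise use exponential backoff capped at `cap` seconds.
    """
    if retry_after is not None:
        return float(retry_after)          # Service told us when to come back.
    return min(cap, base * 2 ** (attempt - 1))  # 1s, 2s, 4s, ... up to cap.
```

Calling this before each retry respects API limits instead of hammering a throttled endpoint at a fixed interval.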

You can use middleware or custom connectors to bridge the gap between legacy systems and D365FO. This approach helps you automate data flows and keep your ERP running smoothly.

Financial Data Automation

You can streamline your financial operations by automating data movement between D365 and other financial systems. Automation reduces manual entry and errors. You can set up integrations that transfer invoices, payments, and journal entries directly into D365FO. This keeps your records accurate and up to date.

You might use batch jobs for high-volume imports, or APIs for real-time data updates. Automation supports compliance by ensuring all transactions follow your business rules. You can also improve reporting by syncing data between F&O and analytics tools. This gives you a clear view of your financial health.

Tip: Start with simple automation, such as importing bank statements, then expand to more complex workflows as your team gains experience.

Real-Time Inventory Updates

You need accurate inventory data to run your business smoothly. Real-time inventory updates in Dynamics 365 Finance & Operations (D365FO) help you keep every channel in sync. When you connect your ERP with ecommerce, warehouse, and retail systems, you make sure everyone sees the same stock levels.

You can use the Available-to-Promise (ATP) feature to check inventory across all locations. ATP works in real time and supports different calculation formulas. This means you can answer customer questions about product availability and delivery dates right away. You also protect important stock by using inventory allocation. This feature lets you reserve inventory for key customers or sales channels, so you avoid overselling.

Real-time inventory integration gives you several advantages:

  • Your ecommerce site always shows live stock status from your ERP.
  • Online orders move directly into D365FO for processing and shipment.
  • Returns and inventory adjustments sync back to your ERP, keeping records accurate.
  • Warehouse logic, including multi-location inventory, reflects online for better order fulfillment.

You can see how these integration points work together in the table below:

| Integration Point | What It Does |
| --- | --- |
| Inventory Availability | Sends real-time stock levels to your ecommerce storefront |
| Orders & Fulfillment | Pushes online orders to D365FO for processing |
| Returns & Adjustments | Syncs returns and updates to maintain accuracy |
| Warehouse Sync | Reflects multi-location inventory and rules online |

Inventory Visibility is another key feature. It gives you access to the most up-to-date inventory quantities across all channels and locations. You can use its API to post inventory changes instantly from external systems. Soft reservation helps you manage available inventory for orders, which prevents overselling. You also get immediate responses on ATP quantity and delivery dates, so you can set clear expectations for your customers.
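Conceptually, soft reservation promises stock only up to on-hand quantity minus existing holds, which is what prevents overselling. The toy model below illustrates that idea only; it is not the Inventory Visibility API itself.

```python
class SoftReservations:
    """Toy model: promise stock only up to on-hand minus existing holds."""

    def __init__(self, on_hand):
        self.on_hand = dict(on_hand)  # item -> physical quantity
        self.reserved = {}            # item -> soft-reserved quantity

    def available(self, item):
        # Promisable quantity excludes what is already held for orders.
        return self.on_hand.get(item, 0) - self.reserved.get(item, 0)

    def reserve(self, item, qty):
        if qty > self.available(item):
            return False  # Would oversell: refuse the reservation.
        self.reserved[item] = self.reserved.get(item, 0) + qty
        return True
```

Every sales channel checks `available()` before promising stock, so two channels cannot both commit the same last unit.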

Tip: Use central inventory adjustment to update stock levels quickly when you receive new shipments or process returns.

You improve customer satisfaction when you keep your inventory data accurate and up to date. Real-time updates help you avoid stockouts, reduce manual errors, and support fast order fulfillment. You build trust with your customers because they know your inventory information is reliable.

Advancing Your D365 Integration Strategy

Building an Integration Roadmap

You need a clear roadmap to guide your integration journey. Start by identifying your business goals. List the systems you want to connect with D365. Map out the data flows between these systems. Set priorities based on business impact and technical complexity. You can use a simple table to organize your roadmap:

| Step | Description | Priority |
| --- | --- | --- |
| Identify systems | List all systems to integrate | High |
| Map data flows | Show how data moves between systems | Medium |
| Set milestones | Define key project checkpoints | High |
| Assign ownership | Name responsible team members | Medium |

Review your roadmap often. Update it as your business changes. This approach keeps your integration projects on track and aligned with your goals.

Tip: Break large projects into smaller phases. This makes progress easier to measure and manage.

Skills and Training for Teams

Your team needs the right skills to succeed with D365 integration. Focus on both technical and business knowledge. Encourage your team to learn about APIs, data mapping, and workflow automation. Provide training on Microsoft Power Platform and Azure Integration Services. You can use online courses, workshops, or hands-on labs.

Create a learning plan for your team. Include regular skill assessments. Offer opportunities for team members to share what they learn. This builds a culture of continuous improvement.

  • Schedule weekly knowledge-sharing sessions.
  • Assign mentors to help new team members.
  • Use real-world scenarios for practice.

Note: Well-trained teams solve problems faster and deliver better integration results.

Leveraging Microsoft and Community Resources

You have access to many resources to support your integration strategy. Microsoft offers official documentation, learning paths, and webinars. The Microsoft Learn platform provides step-by-step guides for D365 and related tools. You can join the Dynamics 365 Community to ask questions and share experiences.

Explore community forums, blogs, and user groups. These platforms offer practical advice and real-world solutions. Attend virtual events to stay updated on new features and best practices.

| Resource Type | Example | Benefit |
| --- | --- | --- |
| Official Documentation | Microsoft Learn | In-depth technical guidance |
| Community Forums | Dynamics 365 Community | Peer support and Q&A |
| Webinars & Events | Microsoft Ignite, User Groups | Latest trends and networking |

Tip: Bookmark your favorite resources. Visit them regularly to keep your knowledge current.

Continuous Improvement

You need to treat your D365FO integration strategy as a living process. Technology and business needs change quickly. You must adapt your integrations to keep up. Continuous improvement helps you stay ahead and deliver more value to your organization.

Start by reviewing your integrations regularly. Schedule time each quarter to check if your solutions still meet business goals. Look for bottlenecks, outdated workflows, or new requirements. You can use feedback from users to spot areas for improvement. Encourage your team to share ideas and report issues. This open communication helps you find problems early.

Set up a feedback loop for your integrations. Collect data on performance, errors, and user satisfaction. Use dashboards and reports to track key metrics. For example, you can monitor data transfer times, error rates, and system uptime. This information shows you where to focus your efforts.

Tip: Use tools like Azure Monitor or Power Platform analytics to get real-time insights into your integrations.

You should also keep up with new features in Dynamics 365 Finance & Operations and related Microsoft technologies. Microsoft releases updates and enhancements often. These updates can improve performance, security, or add new integration options. Assign someone on your team to follow product announcements and test new features in a safe environment before rolling them out.

Here are some steps you can follow to drive continuous improvement:

  1. Review integration performance: Check logs and dashboards for slowdowns or failures.
  2. Gather user feedback: Ask users about their experience and pain points.
  3. Update documentation: Keep your integration guides current as changes happen.
  4. Test new features: Try out new tools or APIs in a sandbox before production.
  5. Automate monitoring: Set up alerts for critical issues so you can respond quickly.
  6. Train your team: Share lessons learned and best practices after each project.

You can use a simple table to track your improvement actions:

| Improvement Area | Action Taken | Next Review Date |
| --- | --- | --- |
| Performance | Optimized batch jobs | Next quarter |
| User Feedback | Added survey tool | Monthly |
| Documentation | Updated API guide | Ongoing |

Note: Continuous improvement is not a one-time task. Make it part of your team’s routine. Small changes add up to big results over time.

By focusing on continuous improvement, you keep your D365FO integrations reliable, efficient, and ready for the future.


You now see how mastering D365FO integrations shapes a future-ready enterprise. When you use scalable patterns and strong security, you build a foundation for growth. Review your current integration setup in Dynamics 365 Finance & Operations and adopt best practices to keep your integrations resilient. Explore Microsoft resources and connect with the community to keep your ERP integration strategy strong.

FAQ

What is the best integration pattern for high-volume data in D365FO?

You should use the Data Management Framework (DMF) for high-volume data. DMF supports batch processing and handles large imports or exports efficiently. This pattern helps you keep your system stable and your data accurate.

How do you secure integrations in Dynamics 365 Finance & Operations?

You secure integrations by using Azure Active Directory for authentication, registering apps securely, and applying role-based access. Always follow the principle of least privilege. Regularly review permissions and monitor activity to protect your data.

Can you connect D365FO with legacy systems?

Yes, you can connect D365FO with legacy systems. Use middleware or custom connectors to bridge the gap. These tools help you automate data flows and keep your business running smoothly.

What tools help you monitor D365FO integrations?

You can use Azure Monitor, Application Insights, and built-in D365FO monitoring tools. These solutions help you track data flows, spot errors, and measure performance. Set up alerts to respond quickly to issues.

How do you handle errors in D365FO integrations?

You should build error handling into your integrations from the start. Use retry logic for temporary failures. Log all errors with details. Send alerts to the right people so you can fix problems quickly.

Is real-time integration possible with D365FO?

Yes, real-time integration is possible. You can use OData APIs, business events, or Azure Logic Apps for instant data updates. Real-time sync helps you keep systems aligned and supports fast business decisions.

What is the difference between batch and real-time integration?

Batch integration moves data at scheduled times, often in large groups. Real-time integration updates data instantly as changes happen. Choose batch for high volumes and non-urgent updates. Use real-time for immediate needs.

How do you ensure data consistency across integrated systems?

You use data validation rules, mapping, and auditing. Always check data before and after transfers. Set up reconciliation checks to confirm accuracy. Consistent data builds trust and supports better business decisions.

🚀 Want to be part of m365.fm?

Then stop just listening… and start showing up.

👉 Connect with me on LinkedIn and let’s make something happen:

  • 🎙️ Be a podcast guest and share your story
  • 🎧 Host your own episode (yes, seriously)
  • 💡 Pitch topics the community actually wants to hear
  • 🌍 Build your personal brand in the Microsoft 365 space

This isn’t just a podcast — it’s a platform for people who take action.

🔥 Most people wait. The best ones don’t.

👉 Connect with me on LinkedIn and send me a message:
"I want in"

Let’s build something awesome 👊

1
00:00:00,000 --> 00:00:07,000
Yeah, hello everybody to another edition of the M365FM podcast.

2
00:00:07,000 --> 00:00:12,000
And today I have special guest Anitha Eswaran.

3
00:00:12,000 --> 00:00:19,000
And we're talking about mastering Dynamics 365 Finance and Operations

4
00:00:19,000 --> 00:00:24,000
integrations: scalable patterns for modern enterprise architecture.

5
00:00:24,000 --> 00:00:40,000
Yeah, my guest Anitha, she is an MVP and an MCT, and she works as a technical architect and is specialized in building complex, scalable ERP solutions.

6
00:00:40,000 --> 00:00:46,000
And yeah, welcome to the show.

7
00:00:46,000 --> 00:00:49,000
Thanks, Mirko. Thank you for hosting me.

8
00:00:49,000 --> 00:01:05,000
So can you tell us a little bit about your journey into the Microsoft ecosystem and how you became specialized in Dynamics 365 Finance and Operations integrations?

9
00:01:05,000 --> 00:01:14,000
Yeah, so as you already said, I have been in this D365FO world for almost 20 years.

10
00:01:14,000 --> 00:01:19,000
So it was called Axapta when I started my career in 2006.

11
00:01:19,000 --> 00:01:23,000
So with the blue screen where we enter the log-in details.

12
00:01:23,000 --> 00:01:38,000
So this is how I started my career, and over these 20 years I have specialized in designing scalable, intelligent ERP solutions that help organizations modernize and transform.

13
00:01:38,000 --> 00:01:43,000
So, you see, now it is not only D365 Finance and Operations.

14
00:01:43,000 --> 00:01:50,000
So the recent journey is also with AI, Copilot, and the Microsoft 365 ecosystem.

15
00:01:50,000 --> 00:01:57,000
So I need to also focus on bringing practical, real-time AI adoption into my business applications, right?

16
00:01:57,000 --> 00:01:59,000
It is not standalone anymore.

17
00:01:59,000 --> 00:02:17,000
So I work extensively with the F&O Copilot and Copilot Studio, which helps the team understand how to integrate generative AI into their daily workflows, how to automate processes, and how to unlock new productivity patterns.

18
00:02:17,000 --> 00:02:27,000
So one significant milestone in this journey is completing the AB 900 and the AB 731 certifications.

19
00:02:27,000 --> 00:02:42,000
So these certifications strengthened my foundations in responsible AI, prompt engineering, and Copilot extensibility, and all this has also shaped how I approach solution design.

20
00:02:42,000 --> 00:02:52,000
So I can ensure in that case that every AI capability I build or recommend is secure, scalable, and aligned with business value.

21
00:02:52,000 --> 00:03:09,000
Alongside my technical work, I am also an active community contributor, where I share my insights, writing articles, delivering sessions, and mentoring professionals who want to grow in D365 F&O and modern enterprise architecture.

22
00:03:09,000 --> 00:03:12,000
So this is, in short, about my journey.

23
00:03:12,000 --> 00:03:18,000
Awesome. So I will also put all your links and details in the show notes.

24
00:03:18,000 --> 00:03:34,000
Yeah, for listeners like me who are maybe new to the space, how would you describe the modern D365FO integration landscape today?

25
00:03:34,000 --> 00:03:52,000
So Dynamics 365 Finance and Operations is a complete ERP solution that helps organizations modernize and transform. For instance, me, I am from a technical background.

26
00:03:52,000 --> 00:04:04,000
So as a technical architect, what I will do is align and collaborate with my functional consultants, who assess the feasibility with me.

27
00:04:04,000 --> 00:04:10,000
Okay, and check whether it is technically feasible when there is an ask, whether it needs additional customization.

28
00:04:10,000 --> 00:04:17,000
So I assess whether it is possible or not, and then, in case of any integrations, I would tell them:

29
00:04:17,000 --> 00:04:30,000
This is how the integration has to be done. This is how it will fit into our D365FO. I tell them the pros and cons of the particular solution, and then we arrive at a solution design.

30
00:04:30,000 --> 00:04:35,000
So this is how I picture the D365FO enterprise architecture.

31
00:04:35,000 --> 00:04:48,000
And yeah, you are more on the technical side. What would you say are the biggest changes you have seen in D365?

32
00:04:48,000 --> 00:05:03,000
So I started my career as a developer, and I am a technical architect now.

33
00:05:03,000 --> 00:05:09,000
So what has changed is, I mean, every day you have an update in our F&O ecosystem.

34
00:05:09,000 --> 00:05:18,000
Not only the Dynamics ecosystem, it is across the board. So I need to learn and update myself, and then I have a team under me.

35
00:05:18,000 --> 00:05:26,000
I need to guide them: hey, this is what is proper for performance and scalability, and you should not do this.

36
00:05:26,000 --> 00:05:32,000
And then I need to keep myself updated to tell them: hey, this is not the right way you should be doing it.

37
00:05:32,000 --> 00:05:41,000
And this way you can give it a try. I need to train them in such a way that they can work without my assistance on the next assignment.

38
00:05:41,000 --> 00:05:47,000
So this is how my task begins, and on top of that, there are the challenges that I face.

39
00:05:47,000 --> 00:05:53,000
So when I started with AX 3.0, with Axapta, the debugging was very easy.

40
00:05:53,000 --> 00:05:58,000
You finish your code, compile, and you immediately see your result.

41
00:05:58,000 --> 00:06:09,000
But in the case of D365, you have cloud-hosted VMs, and you compile your code, then it takes some time for the AOS to come up.

42
00:06:09,000 --> 00:06:17,000
And then maybe it will take four or five minutes to come up, and then I need to test my solution again, whether it is properly working or not.

43
00:06:17,000 --> 00:06:29,000
The time I am spending in delivering the solution is a little more. I mean, here it is a little more time-consuming compared to the earlier version.

44
00:06:29,000 --> 00:06:37,000
So this is one of the challenges I am having, and even my team is having. At times this building and compilation takes more time.

45
00:06:37,000 --> 00:06:40,000
That is what we feel here.

46
00:06:40,000 --> 00:06:53,000
And for what companies is this product right? Because when I look at it, it looks really complicated.

47
00:06:53,000 --> 00:06:59,000
It is not complicated, but maybe for beginners, until you get some grip, it seems to be complicated.

48
00:06:59,000 --> 00:07:08,000
But when you get into the process, the ERP, right, it is designed by Microsoft mainly to be scalable.

49
00:07:08,000 --> 00:07:18,000
I mean, you can think about how scalable it is and how you need to handle it cautiously, so that you can advise: this is the right approach.

50
00:07:18,000 --> 00:07:22,000
This is not the right one, otherwise your system will collapse. So you need to be careful.

51
00:07:22,000 --> 00:07:27,000
So that is where the solution architect and the technical architect play a role.

52
00:07:27,000 --> 00:07:43,000
So maybe for beginners it might be a little complex, but since I have been in this Dynamics 365 F&O world for almost 20 years, I know where and how to customize it. That is where I stand.

53
00:07:43,000 --> 00:07:58,000
And yeah, let's say a company starts a new project. What integration mistakes do you see often?

54
00:07:58,000 --> 00:08:16,000
So for integration mistakes, one is, we have a few things, such as the volume. See, anybody can do the integration, anybody can suggest you do it this way or that way, but there are a few factors you need to assess before designing the solution for your integration. One is volume.

55
00:08:16,000 --> 00:08:37,000
So you need to understand what volume your current process is going to give you. For instance, if it is more than 10,000 per 5 minutes, then you need to design it as asynchronous; if it is less than 1,000, then you can see it as synchronous. On top of that, you need to understand at what frequency you are going to integrate.

56
00:08:37,000 --> 00:08:58,000
So there are a few factors you need to assess, and then you need to suggest the perfect solution for your project. That is what I think I am going to tell you in our episode, where I can explain the integration landscape and all the factors you need to understand when you design an integration solution.

57
00:08:58,000 --> 00:09:14,000
And yeah, let's dive a little deeper into integration architecture and strategy. So how do you approach designing an enterprise integration architecture around this tool?

58
00:09:14,000 --> 00:09:37,000
Yeah, see, for instance, ten years ago there were not many integrations; they were very simple. For instance, you had a batch job or a file drop, or it might be a nightly sync. But today's enterprise is almost real time, event driven, or API first, also because it is cloud native. So this is where D365 F&O comes in.

59
00:09:37,000 --> 00:10:06,000
It sits in the middle of the CRM systems, e-commerce platforms, the MES, the WMS, and dozens of Microsoft services. So the challenge is no longer "how do I connect system A to system B"; the challenge is "how do I build an integration ecosystem that scales with the business?" So this is the question you need to keep in mind whenever you design an integration.

60
00:10:06,000 --> 00:10:33,000
Awesome, and what factors help decide whether an integration should be real time or batch based? Yeah, real time, as I call it, or batch based, that depends on your scenario, Mirko. For instance, I would not immediately decide or give a solution to my client on whether it should be real time or asynchronous.

61
00:10:33,000 --> 00:10:46,000
I will be studying what the process is about. I need to understand what the frequency is, what the load is, and whether, for this load, it has to be real time, near real time, or asynchronous.

62
00:10:46,000 --> 00:11:10,000
Whether I should use Logic Apps or Power Automate to save on billing for my client, and how I handle failures. See, for instance, if your data does not arrive in the end system, at least within 30 seconds your system should be able to send you an alert: hey, something has failed, something has gone wrong. All of those you need to understand when you design an integration.

63
00:11:10,000 --> 00:11:30,000
Can you explain a little bit? When people talk about Dynamics, I often hear about Dataverse — over the last months we heard a lot about it — and the Power Platform. Are there also integration strategies for these?

64
00:11:30,000 --> 00:11:54,000
Yes, Dynamics 365 F&O is not only about Dataverse. For instance, when I design or review an integration landscape, I usually keep four pillars with me. The first is decoupling, because tightly coupled integrations are the biggest source of long-term pain.

65
00:11:54,000 --> 00:12:16,000
If your systems depend on each other's timing, schema, or availability, you are building in fragility, and you don't want a fragile system. So decoupling means you are going to use queues — sorry, I am being a little technical here, so if you have any doubts you can stop me at any time.

66
00:12:16,000 --> 00:12:44,000
So decoupling means using queues, using events, or using APIs with versioning, and avoiding point-to-point dependencies. Also, if you cannot answer "what happened to this message?" in under 30 seconds, you don't have observability. Observability is the second pillar to note, and logs alone are not enough — I would say a log is not observability.

67
00:12:44,000 --> 00:13:12,000
So what does observability mean? It is tracing, correlation IDs, maybe a dashboard you build to capture failures, alerting, and a replay capability. The third pillar I usually name is scalability. Scalability patterns include event-driven workflows and asynchronous patterns. What are event-driven workflows? In F&O terms, think of a business event.
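To make the correlation-ID idea concrete, here is a minimal Python sketch (not F&O code — the function and field names are illustrative): every message gets one ID at publish time, and every hop logs that same ID, so a failure anywhere in the chain can be traced back in seconds.

```python
import json
import logging
import uuid

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("integration")

def publish(payload: dict) -> dict:
    """Wrap an outbound payload in an envelope carrying a correlation ID."""
    envelope = {
        "correlation_id": str(uuid.uuid4()),  # one ID follows the message across every hop
        "body": payload,
    }
    log.info("published %s", envelope["correlation_id"])
    return envelope

def handle(envelope: dict) -> None:
    """Downstream hop: always log the same correlation ID, success or failure."""
    cid = envelope["correlation_id"]
    try:
        json.dumps(envelope["body"])  # stand-in for the real processing step
        log.info("processed %s", cid)
    except (TypeError, KeyError):
        log.error("failed %s", cid)  # an alert or dashboard can find this hop by ID
        raise

msg = publish({"customer": "C-1001", "amount": 250.0})
handle(msg)
```

The same idea applies whether the hops are Logic Apps, Azure Functions, or custom services: the envelope travels, the systems only log.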

68
00:13:12,000 --> 00:13:41,000
You can raise a business event through Service Bus or Event Grid, and I can suggest asynchronous processing with DIXF — there are APIs available for DIXF batch processing we can use. The last pillar is resilience. Failures, I would say, are not exceptions; they are normal. Every system tends to fail at some point.

69
00:13:41,000 --> 00:14:09,000
The process sending data from your system to the end system will not always succeed, so failures are not exceptions; they are normal. The resilient integration you build should include a retry policy. For instance, where I have implemented Azure Service Bus, I can use a dead-letter queue to capture and replay failed messages. These four pillars form the foundation of everything else.
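A minimal sketch of that retry-then-dead-letter pattern, in plain Python (the `DEAD_LETTER` list stands in for a real Service Bus dead-letter queue; the `send` callable is whatever actually delivers the message):

```python
import time

DEAD_LETTER = []  # stand-in for a Service Bus dead-letter queue

def send_with_retry(send, message, attempts=3, base_delay=0.1):
    """Retry with exponential backoff; dead-letter the message if all attempts fail."""
    for attempt in range(attempts):
        try:
            return send(message)
        except ConnectionError:
            if attempt == attempts - 1:
                DEAD_LETTER.append(message)  # parked for inspection and replay, not lost
                return None
            time.sleep(base_delay * (2 ** attempt))  # 0.1s, 0.2s, 0.4s ...
```

The key design choice is that exhausted messages are parked, never dropped: an operator (or an automated replay job) can fix the root cause and resubmit from the dead-letter queue.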

70
00:14:09,000 --> 00:14:35,000
Awesome. You said there are integration options — APIs, OData, Graph, the data management framework, and of course middleware platforms. How do you decide between all these options?

71
00:14:35,000 --> 00:15:04,000
This is what we call integration patterns — what works and what doesn't, and you need to match the pattern to the scenario. Let me give you an idea of the integration patterns available in D365 F&O. First, recurring data jobs — what I mentioned as DIXF, the Data Import Export Framework. The strength of DIXF is that it is great for bulk export or import of data.

72
00:15:04,000 --> 00:15:28,000
It is easy to configure, there are many standard entities already available in the system, and it is also possible to customize entities. It works well for master data, and it supports BYOD as well. But the limitation is that if you want something real time, DIXF is not the right choice — it is not for real-time processes.

73
00:15:28,000 --> 00:15:51,000
For high-frequency transactions you need to scale accordingly, and there is limited error handling: the system will throw the error, but if you want to capture it outside the system, you need that extra middleware. So that is DIXF. The next integration pattern is OData.

74
00:15:51,000 --> 00:16:09,000
I have written many posts about OData on my blog — how to set up OData, what OData is, how to consume it in F&O — all of this I have covered in detail in my blog posts.

75
00:16:09,000 --> 00:16:35,000
The strength of OData is that it is real time and easy to consume, and it is good for small transactional volumes. But the limitation is throttling if you cross the limit — for instance, in F&O, if you send more than five or six thousand records in a per-minute window, it will be throttled.

76
00:16:35,000 --> 00:16:47,000
So it is not designed for high throughput; that is the major limitation of OData. Whenever I do a design, I make sure OData is not chosen for that.
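When OData is still the right fit but volumes are borderline, a client can at least behave well under throttling. A hedged sketch (the `send_batch` callable and the `(status, retry_after)` return shape are assumptions for illustration, not an F&O API): chunk the records under the per-minute quota and back off when the service signals HTTP 429.

```python
import time

def send_all(records, send_batch, per_minute_limit=5000):
    """Send records in chunks that stay under a per-minute quota, backing off
    when the service signals throttling (HTTP 429 with a Retry-After hint)."""
    for start in range(0, len(records), per_minute_limit):
        chunk = records[start:start + per_minute_limit]
        while True:
            status, retry_after = send_batch(chunk)
            if status != 429:
                break
            time.sleep(retry_after)  # honor the server's Retry-After before resending
```

This does not make OData high-throughput — it only keeps a bursty client from hammering a throttled endpoint; past a certain volume, the answer is a different pattern, not a cleverer retry.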

77
00:16:47,000 --> 00:17:16,000
It is not suggested for high-volume transactions. The next pattern is custom services, either SOAP or REST. The strength here is high performance, and a custom service is also very flexible — if you want to accommodate new changes, it is possible. It also supports complex logic; for instance, in OData you will not be able to handle multiple joins.

78
00:17:16,000 --> 00:17:40,000
But in your custom services you will be able to do multiple joins and accommodate complex logic, and since it is code, it can be optimized for any specific scenario. The limitation is that it requires development — every change needs a developer.

79
00:17:40,000 --> 00:18:09,000
For every change it is "hey, go and do this," so you end up with lots of versions — versioning is a concern — and of course you need to think about security: how secure is your custom service? For OData, security is largely taken care of by the platform, but in the case of custom services, handling security is in our bucket. The last pattern is business events, which, as the name says, are event-driven — that is their strength.

80
00:18:09,000 --> 00:18:38,000
For instance, when you post a payment and want to trigger something, a business event can be used; likewise, when you release a production order, a business event can send the data to the external system. It is scalable, it is decoupled, and it has native integration with Azure. But the limitation is that payloads can be limited — at times it will not be able to handle large payloads.
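On the receiving side, a business-event consumer is usually a small dispatcher keyed on the event identifier. A Python sketch of that shape — the field names (`BusinessEventId`, `SalesOrderNumber`) and the event ID string are illustrative placeholders and should be checked against the actual payloads in your environment:

```python
import json

HANDLERS = {}

def on(event_id):
    """Register a handler function for a given business event ID."""
    def register(fn):
        HANDLERS[event_id] = fn
        return fn
    return register

@on("SalesOrderPostedBusinessEvent")  # hypothetical event ID for illustration
def handle_sales_order(evt):
    return f"notify warehouse for {evt.get('SalesOrderNumber', '?')}"

def dispatch(raw: str):
    """Route one raw event notification to its handler; skip unknown events."""
    evt = json.loads(raw)
    handler = HANDLERS.get(evt["BusinessEventId"])
    if handler is None:
        return "ignored"  # unknown events are skipped, not treated as failures
    return handler(evt)
```

Ignoring unknown event IDs (rather than failing) is what lets F&O add new events later without breaking existing consumers — the decoupling pillar again.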

81
00:18:38,000 --> 00:19:05,000
And it is not available for all entities. Coming to your point about Dataverse, we call it dual-write here. Not every project uses dual-write; it depends on the scenario. When you are integrating Dynamics 365 F&O with the CE system or Field Service, you bring dual-write into the picture. The strength here is

82
00:19:05,000 --> 00:19:33,000
that you have tight, real-time integration with Dataverse. The limitations are that I feel it is complex to troubleshoot at times, and the initial setup is also challenging — when something ends up in an error during the initial sync, we have no clue at first, so we have to spend some time understanding what happened. It is not suitable for all data models;

83
00:19:33,000 --> 00:19:45,000
it is good when you have CRM and ERP alignment scenarios like product sync, customer sync, or order sync.

84
00:19:45,000 --> 00:20:13,000
A few days ago I talked to Nicholas Heydok, and he talked about Power Pages. How do you see the road for Power Pages? Is it a factor for, I don't know, building portals for non-CRM users so they can connect

85
00:20:13,000 --> 00:20:20,000
to the platform, or is there something included in Dynamics?

86
00:20:20,000 --> 00:20:41,000
Yeah, we have an option to call Power Pages — I mean, you cannot call it directly from Dynamics 365 F&O, but there is an option where we set up the Power Platform portals in your system, and using that we have a configuration called

87
00:20:41,000 --> 00:21:04,000
the Microsoft F&O application integration — I don't remember the exact name — where you enter your app registration details, and then you are able to call the portals. Maybe this has not come up much in the scenarios I have worked on, but it is possible.

88
00:21:04,000 --> 00:21:24,000
Yeah, let me go back to a topic from before. I read an article recently that asked whether middleware is dead. How do you see this — does middleware have a future?

89
00:21:24,000 --> 00:21:53,000
Yes, we are working with middleware — that is where the magic happens, right, Mirko? In F&O it is possible to communicate with external systems using custom code, but the AOS — the Application Object Server, as we call it — will take more load during that communication, which in turn affects performance. That is where middleware comes into the picture.

90
00:21:53,000 --> 00:22:19,000
Middleware takes that communication load off the F&O AOS and handles all the transformation and connectivity. For instance, Logic Apps, Service Bus, Event Grid, or Azure Functions allow you to build enterprise-grade, scalable integration patterns.

91
00:22:19,000 --> 00:22:48,000
Let me come back to my favorite, event-driven architecture. F&O publishes some business event. Whenever the business event is published, I have the option to write additional code to connect to the external system — for instance Salesforce — and I can do that. But if I go via Azure Event Grid or Service Bus, then the load is on them. F&O publishes the business event, and

92
00:22:48,000 --> 00:23:12,000
Service Bus or Event Grid receives it, and the downstream system reacts. This pattern gives you high scalability and loose coupling, and it is almost near real time. So I would say middleware is always needed when you have an integration, because it reduces the load on the AOS.

93
00:23:12,000 --> 00:23:22,000
And we talked about patterns — what are some anti-patterns you see in integration design?

94
00:23:22,000 --> 00:23:51,000
Anti-patterns, I would say, start with designing an integration without studying the process properly. For instance, someone says, "hey, my load is 1,000 records." But what is 1,000 records — per day, per hour, or per month? Without understanding these underlying details, you say, "okay, let me design it with OData, that is very good." But what if the load is 1,000 records per minute?

95
00:23:51,000 --> 00:24:13,000
Then we end up in throttling. A proper study of the processes is a must before designing an integration solution. This is where I see most integrations fail: without understanding all these factors,

96
00:24:13,000 --> 00:24:28,000
you end up with a complex design that cannot scale, with no clear failure handling, no retry policy, and no understanding of the load — everything becomes chaos.

97
00:24:28,000 --> 00:24:52,000
Yeah, this was awesome. Okay, let's get really technical. What integration tools or technologies do you use most frequently with Dynamics 365 Finance & Operations?

98
00:24:52,000 --> 00:25:21,000
In my case I have experience with Logic Apps — Logic Apps is mainly used for enterprise-wide scenarios. It depends: when the load is less and the integrations are medium sized, I would first check whether it can be done with the Power Platform and Power Automate; but when I can achieve something via Logic Apps,

99
00:25:21,000 --> 00:25:42,000
for instance if I need to send data to Salesforce, I can use Logic Apps with a business event or the API — sending it via Event Grid from the business event to Salesforce — and decide whether I can use Azure Functions or not. Mostly I try to leverage the Azure capabilities in my projects.

100
00:25:42,000 --> 00:25:54,000
Awesome. And how do you approach security and authentication for enterprise integrations?

101
00:25:54,000 --> 00:26:11,000
Okay, security. Of course security is one of the most important topics, because data is our asset. For instance, if I am using Logic Apps or OData, I make sure I have created enough

102
00:26:11,000 --> 00:26:40,000
security. I mean, when you create an Event Grid connection or communicate with F&O via a business event or Event Grid, I make sure the app registration takes care of all the security aspects. For instance, in Azure we create the app registration with a client secret, and I check whether I can limit access for the applications that use that particular app registration. All of this is taken care of, and I also make sure that

103
00:26:40,000 --> 00:27:03,000
when I am designing custom services, I educate my team: this is how you need to handle security — whether you need to bring Key Vault into the picture, how you use the keys, whether you need to limit the access policies of the Key Vault. All of this I train my team on accordingly.
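The app-registration approach described here is the standard OAuth 2.0 client-credentials flow. A hedged sketch of what that token request looks like (only the request construction, no network call; all values are placeholders, and in practice the secret would come from Key Vault, never from source code or config):

```python
from urllib.parse import urlencode

def client_credentials_request(tenant_id, client_id, client_secret, resource):
    """Build the Microsoft identity platform token request for the OAuth2
    client-credentials flow used by app registrations. The secret should be
    fetched from Key Vault at runtime, not stored alongside the code."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": f"{resource}/.default",  # e.g. your F&O environment URL
    })
    return url, body
```

Posting that body to the token endpoint yields a bearer token; limiting which applications may use the registration, and rotating the secret, is then a governance task rather than a coding one.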

104
00:27:03,000 --> 00:27:29,000
I am more the data guy, and I see a lot of performance bottlenecks. Are there also bottlenecks in D365 F&O integrations, and how can you optimize around them?

105
00:27:29,000 --> 00:27:44,000
Yeah, without handling performance and bottlenecks, the project doesn't move, right? In the initial stage of a project, whenever there is a rollout, we have large volumes of data to load — a set of master data to load into our ERP.

106
00:27:44,000 --> 00:28:13,000
An ERP obviously needs certain data — vendor master, customer master, financial setup — all of this we need to load into the system. So we need to understand the right time to load and which batch AOS we can use. For instance, if the batch AOS server capacity is very limited, it will not be able to support record volumes in the tens of thousands — say you are going to upload a file with more than 20,000 records.

107
00:28:13,000 --> 00:28:41,000
Then you need to make sure the batch server you are using can accommodate that particular load. Otherwise, what usually happens is the batch keeps on executing and blocks the other batches waiting for a thread — a kind of deadlock where this batch does not complete and new batches cannot pick up a thread either.

108
00:28:41,000 --> 00:29:01,000
We have faced this kind of deadlock situation a lot, and it was a learning for me. I then worked with Microsoft to understand how to load the data and how to size the batch servers when setting up the live environment for a particular legal entity.

109
00:29:01,000 --> 00:29:15,000
You don't have to name the client, but can you walk us through a particular challenge you solved during an integration project?

110
00:29:15,000 --> 00:29:29,000
Yes. The project I'm currently working on is a multi-country rollout. We already have six countries live, and whenever we have a new rollout,

111
00:29:29,000 --> 00:29:58,000
we have one single live instance. I think you understand what one instance means — we have only one live system. If you have multiple legal entities, there is also the option of multiple instances, but in our case we are going with only one. So whenever we go live, I need to make sure the integrations I built for the previous entities, the previous rollouts, are not broken

112
00:29:58,000 --> 00:30:18,000
by the new one. Whenever we move to the production environment, I tell myself, "hey Anitha, here is the checklist" — I have prepared a checklist of everything to review before a rollout,

113
00:30:18,000 --> 00:30:36,000
before enabling the integrations, because I am in charge of enabling them. I go through it — yes, these are all enabled, and enabling this one should not break the existing ones. That I make sure of. The next point, what I usually do:

114
00:30:36,000 --> 00:30:56,000
extracting on-hand inventory takes most of the time in our system. I figured out we can handle it via an add-in, so the load doesn't hit our F&O directly — it is called Inventory Visibility. Via Inventory Visibility,

115
00:30:56,000 --> 00:31:20,000
the add-in extracts the on-hand data without impacting performance in the environment, and then sends it to the external system via a Logic App. I got good feedback from my client that this approach I suggested works well.

116
00:31:20,000 --> 00:31:30,000
Okay, awesome. Can you also tell us a little bit about how you handle error management and monitoring in production integrations?

117
00:31:30,000 --> 00:31:44,000
Yes, error management and failure handling. This comes back to scalable architecture: we need to design our systems for failure. For instance, we need to

118
00:31:44,000 --> 00:32:13,000
assume APIs will time out. Being optimistic is good, but at times we need to assume APIs will time out, messages will fail, and systems will go offline. Your architecture should absorb failure, not collapse under it — that is my principle. So make sure you have a backup for that. Your system should also be designed for change; for instance,

119
00:32:13,000 --> 00:32:29,000
an integration can outlive the applications it connects, so your integration must support schema evolution and versioning, and it should also support new consumers. For instance, in my multi-country rollout, how are new customers being

120
00:32:29,000 --> 00:32:57,000
mapped to the new entity? I need to make sure of that. And as I already said, design for observability: if you can't see it, you cannot fix it; if you cannot trace it, you cannot trust it. Observability is not about logs alone — it includes failure alerting. And to reiterate, each system should be able to operate independently, so one system

121
00:32:57,000 --> 00:33:13,000
should not block another. Scale is not about speed; it is about elasticity. This is what I keep in mind whenever I think about failure-handling scenarios.
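The schema-evolution point can be shown in a few lines. A minimal sketch of a tolerant message reader (field names and the `schemaVersion` convention are illustrative assumptions): new optional fields get defaults, unknown fields are ignored, so v1 producers and v2 producers can coexist without breaking any consumer.

```python
def parse_customer(msg: dict) -> dict:
    """Tolerant reader: accept v1 and v2 envelopes, default fields added later,
    and ignore unknown fields so older producers and consumers keep working."""
    version = msg.get("schemaVersion", 1)  # v1 messages predate the version field
    customer = {
        "id": msg["customerId"],
        "name": msg["name"],
        # v2 added a segment field; the default keeps v1 producers working
        "segment": msg.get("segment", "unclassified"),
    }
    if version >= 2:
        customer["country"] = msg.get("country", "unknown")
    return customer
```

This "tolerant reader" style is what lets an integration outlive the applications around it: the contract only ever grows, and every reader treats absence as a default rather than an error.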

122
00:33:13,000 --> 00:33:33,000
We talked a little bit before about a toolset I also often work with: Azure Integration Services. Can you tell us a little more about it, and what role it plays in your architecture decisions?

123
00:33:33,000 --> 00:33:47,000
You mean you are asking about the integrations, right — the sessions I have given on integrations? Or could you rephrase your question?

124
00:33:47,000 --> 00:34:03,000
Yeah, there's a toolset, Azure Integration Services — what role does it play in your work?

125
00:34:03,000 --> 00:34:19,000
Yes, Azure Integration Services. I mainly work with Logic Apps, Azure Data Factory, and Event Grid. These are the major tools used in my project every day, and they help

126
00:34:19,000 --> 00:34:29,000
complete my integrations. In the case of Logic Apps, I have taken tips from your episodes too, Mirko — for instance, you should not leave a workflow

127
00:34:29,000 --> 00:34:41,000
that is not in use lying around, because it adds cost to the bill — it has an additional cost, correct? These are a few of the tips I learned. For instance, if I am

128
00:34:41,000 --> 00:34:55,000
having my team create workflows, I make sure that if one is not used, you either delete it or deactivate it, so it doesn't add any additional billing. Then Azure Service Bus: for instance, when

129
00:34:55,000 --> 00:35:05,000
I am sending data from my D365 F&O to Azure Service Bus, I don't need to go via Logic Apps — F&O has an integration

130
00:35:05,000 --> 00:35:19,000
architecture that can communicate directly; it can send the data straight to Service Bus, and I make use of that. With Event Grid, I have the option to send the data via

131
00:35:19,000 --> 00:35:33,000
Event Grid as well. In the case of Event Grid, I follow a two-step approach: F&O sends the data to Event Grid, and from Event Grid a Logic App reads it and sends the data to the end system. My integrations

132
00:35:33,000 --> 00:35:51,000
are with various systems — for instance Salesforce, SFTP, Azure SQL, on-premises SQL Server, FTP. The end systems vary, and at times the end system also becomes a source system. For instance, I am getting data from

133
00:35:51,000 --> 00:36:09,000
an EDI system that drops files on SFTP; I need to read them and then send the data to my F&O. So it depends — most of my work is based on Logic Apps, Azure Service Bus, and Event Grid, mostly on the event-driven patterns.

134
00:36:09,000 --> 00:36:27,000
It's an enterprise solution, and a lot of companies will use it for years. In the long term, what is your strategy or best practice for versioning and maintenance?

135
00:36:27,000 --> 00:36:42,000
Understood. Yes, it is an ERP, and it spans the enterprise. These integrations, I would say, are no longer just technical connectors; they are strategic assets.

136
00:36:42,000 --> 00:37:05,000
The technology in the market also enables modern enterprise architecture. If you embrace event-driven patterns, decoupled design, strong observability, scalable cloud services, and resilient architecture, you will be able to build an integration landscape that not only works today but grows with your business for years to come.

137
00:37:05,000 --> 00:37:14,000
And what advice would you give developers trying to build scalable integrations with F&O?

138
00:37:14,000 --> 00:37:26,000
Yes, a few tips for developers — maybe I am going more technical here. You need to understand X++; our language in D365 is X++,

139
00:37:26,000 --> 00:37:51,000
not C++ but X++. You need to understand when to use a while loop and when to use a set-based insert. For instance, a while loop will make multiple calls to the database, which takes more time, so decide whether you need insert, insert_recordset, or how you do updates. And there are certain
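The while-loop versus insert_recordset trade-off is a general row-by-row versus set-based distinction, and it can be demonstrated outside X++. A small Python/sqlite3 sketch of the same idea (an in-memory table standing in for an F&O table):

```python
import sqlite3
import time

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE custtable (accountnum TEXT, name TEXT)")
rows = [(f"C{i:05d}", f"Customer {i}") for i in range(10_000)]

# Row-by-row: one database call per record (the X++ while-loop shape).
t0 = time.perf_counter()
for row in rows:
    con.execute("INSERT INTO custtable VALUES (?, ?)", row)
row_by_row = time.perf_counter() - t0

con.execute("DELETE FROM custtable")

# Set-based: one call for the whole set (the insert_recordset shape).
t0 = time.perf_counter()
con.executemany("INSERT INTO custtable VALUES (?, ?)", rows)
set_based = time.perf_counter() - t0

print(f"row-by-row: {row_by_row:.3f}s  set-based: {set_based:.3f}s")
```

The exact ratio depends on the engine, but the shape of the result is the same reason insert_recordset exists in X++: the round-trip per record, not the insert itself, dominates the cost.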

140
00:37:51,000 --> 00:38:20,000
development guidelines in the Microsoft documentation. I usually tell my developers to go through the Microsoft guidelines and get the basics, because if they follow them, they will deliver quality code. On top of that, every client has their own guidelines based on models — we have models in F&O — how to create a model, the naming conventions for a model, and so on.

141
00:38:20,000 --> 00:38:34,000
We have specific guidelines for each client, so we need to train the developers to follow them to achieve quality, performant code.

142
00:38:34,000 --> 00:38:50,000
Let's look a little more at the career and community side. People can read the guidelines on Microsoft Learn, and they can read your blog, so

143
00:38:50,000 --> 00:39:08,000
what tips can you give for becoming an expert in D365 F&O? Where should you start, and do you see different expert roles?

144
00:39:08,000 --> 00:39:37,000
You don't need any extra content when you want to become an expert. For instance, how I started: in F&O we have the development environment with tables, classes, and views. Whatever you want to learn, you can go to that particular standard class and understand how they have designed that

145
00:39:37,000 --> 00:40:06,000
code — read it and practice it by yourself. Or you can open a form in the front end, understand what the form does, then go to the back end of the form and see how the form is expressed in code — how the code and the form UI are in sync. On top of that, development is not the only thing that is going to help you in your career.

146
00:40:06,000 --> 00:40:27,000
We also need to keep up with AI, Copilot, and Microsoft 365, because we need that real-world adoption in our careers as well. That is how I advise my team and my juniors when they start their careers.

147
00:40:27,000 --> 00:40:39,000
Okay, you are an integration architect. What separates a good integration developer from a great integration architect — what skills?

148
00:40:39,000 --> 00:40:53,000
See, if you are a developer today, you should also have exposure to integration. You should not stay in your own bubble of the technical world.

149
00:40:53,000 --> 00:41:06,000
"Hey, I only know X++ code" — no, you should come out of that bubble and get exposure to integration. Being an architect, I should be able to guide both teams:

150
00:41:06,000 --> 00:41:13,000
my development team and the middleware team — hey, this is how you need to get the work done.

151
00:41:13,000 --> 00:41:20,000
This is the API you need to create, and this is how you need to design it in the middleware.

152
00:41:20,000 --> 00:41:33,000
If you want to grow from developer to technical architect, you should not stay only in X++; come out of the technical bubble and learn the integration landscape.

153
00:41:33,000 --> 00:41:45,000
At the start we talked about AI. How is AI beginning to influence integration development and the Microsoft ecosystem, from your perspective?

154
00:41:45,000 --> 00:42:01,000
Yes, as I said earlier, this journey has been a kind of milestone for me. I have completed AI-900, the AI fundamentals certification, and another certification.

155
00:42:01,000 --> 00:42:30,000
That one is for Microsoft Copilot. It strengthened my foundation in responsible AI, Copilot extensibility, and how enterprise-grade AI governance works. It shaped how I approach solution design, ensuring that every AI capability I build or recommend is secure. Such certifications give you an idea of how to securely scale AI for the underlying business.

156
00:42:30,000 --> 00:42:59,000
Such a certification is a must for us. I also work extensively with Copilot for Dynamics 365 and Copilot Studio, which helps my team understand how to integrate generative AI into their everyday workflows — for instance, how do you automate your daily work, how do you unlock new productivity? With that I am able to guide them, so these AI certifications help me a lot.

157
00:42:59,000 --> 00:43:12,000
Can you be more specific — how do Copilot and AI change your role as an architect?

158
00:43:12,000 --> 00:43:28,000
Yes. From a technical perspective, I can say I have automated work. One of the pain points of a developer is creating the technical design document.

159
00:43:28,000 --> 00:43:57,000
Whatever code you have written has to exist as a document too, so we have a tool that reads my code and creates a TDD for me, which saves time. We also have a tool that can create comments — XML documentation that can be generated automatically. So from a technical perspective, it has reduced my mundane work to a great extent, and I can focus on quality.

160
00:43:57,000 --> 00:44:09,000
I can give my energy to experimenting more with my technical challenges.

161
00:44:09,000 --> 00:44:20,000
And what emerging technologies or patterns are you most excited about, looking at today and a little into the future?

162
00:44:20,000 --> 00:44:44,000
Every day this ecosystem of AI is evolving, so we need to keep ourselves updated. For instance, agents are the hot topic right now, and how they integrate with D365 F&O is the next topic I need to check. Maybe I can share my blog with you, Mirko, and we can discuss it in the next one.

163
00:44:44,000 --> 00:44:57,000
Yeah, you're welcome back any time. So let's start the rapid-fire round — I have some questions. Your favorite integration tool?

164
00:44:57,000 --> 00:45:02,000
Okay. REST or SOAP?

165
00:45:02,000 --> 00:45:09,000
Most underrated D365 F&O feature?

166
00:45:09,000 --> 00:45:16,000
One integration that needs to die?

167
00:45:16,000 --> 00:45:21,000
Integrations should never die.

168
00:45:21,000 --> 00:45:35,000
Best debugging tip? Debugging, yes — me being a developer, debugging is our right and left hand; without it we could never work. We make sure the code compiles

169
00:45:35,000 --> 00:45:40,000
And it needs to build error-free before you start your debugging.

170
00:45:40,000 --> 00:45:45,000
And one Azure service you couldn't live without?

171
00:45:45,000 --> 00:45:51,000
Sorry, could you repeat it? One Azure service I couldn't live without...

172
00:45:51,000 --> 00:45:57,000
My alerts, which I have configured in my system.

173
00:45:57,000 --> 00:46:02,000
And what is the most important soft skill for an architect?

174
00:46:02,000 --> 00:46:16,000
I need to be aligned with everyone, be it on a personal or a professional level, and I need to be approachable for everybody. That is one of the soft skills.

175
00:46:16,000 --> 00:46:28,000
Okay, cool. And now, yeah, I had a look a little bit at what people write about in the D365 F&O communities.

176
00:46:28,000 --> 00:46:51,000
We have some hot takes, and you can say what you think. The first: today's conversation was a reminder that integrations are no longer just technical plumbing; they are strategic business infrastructure. Building scalable and secure architecture around D365 is becoming more critical than ever.

177
00:46:51,000 --> 00:47:04,000
Yes, absolutely, because we need to build integrations that scale, designed for resilience, observability, and future growth.

178
00:47:04,000 --> 00:47:12,000
The Microsoft ecosystem keeps evolving rapidly, with Azure, Power Platform, AI, Dataverse, and more, all

179
00:47:12,000 --> 00:47:20,000
influencing integration strategy. Staying adaptable is now part of the job description.

180
00:47:20,000 --> 00:47:38,000
What stood out most is how enterprise integration requires balancing performance, security, scalability, and business value systematically, and that's where the real experience shows.

181
00:47:38,000 --> 00:47:53,000
That is it: your architecture must support schema evolution, your versioning, new customers, and also new business processes. The application you are building should be designed to accommodate change.
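The "design to accommodate change" point can be sketched as a message consumer that upgrades older schema versions to the current one before processing, so adding fields never breaks existing senders. This is a minimal illustration, not a D365FO API; the `schemaVersion`, `customerId`, and `currency` field names are hypothetical:

```python
# Hedged sketch of schema evolution on the consumer side: each incoming
# payload is upgraded step by step to the current schema version, with
# sensible defaults for fields that older senders did not know about.
CURRENT_VERSION = 2

def upgrade(message: dict) -> dict:
    msg = dict(message)  # never mutate the caller's payload
    version = msg.get("schemaVersion", 1)
    if version == 1:
        # v2 added an explicit currency; default it for old payloads.
        msg.setdefault("currency", "USD")
        version = 2
    msg["schemaVersion"] = version
    return msg
```

Because each version bump is an explicit, additive upgrade step, the consumer can accept v1 and v2 messages side by side while the rest of the pipeline only ever sees the current shape.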

182
00:47:53,000 --> 00:48:12,000
And if you're working in Dynamics 365 Finance & Operations, integration knowledge has become one of the most valuable technical skill sets you can develop for the future.

183
00:48:12,000 --> 00:48:22,000
Yes, because of course once we share our knowledge, be it via GitHub, via a blog, or via a newsletter,

184
00:48:22,000 --> 00:48:35,000
we are sharing the knowledge and we are also understanding what is happening elsewhere. I mean, when I read others' blogs, I am learning; when I share, others are learning from me. It is knowledge sharing and growing together.

185
00:48:35,000 --> 00:48:45,000
So thank you for your time, and I will give you the last words to my audience. Over to you.

186
00:48:45,000 --> 00:48:58,000
Sure. Thank you for listening, and I hope this conversation helps you design integrations that are not just functional but truly future-ready. That is the tip you should always keep in mind.

187
00:48:58,000 --> 00:49:05,000
So then I say thank you for this technical deep dive.

188
00:49:05,000 --> 00:49:14,000
This was awesome, and I think there were a lot of insights for the audience. So I say thank you, and, yeah, goodbye.

189
00:49:14,000 --> 00:49:17,000
Thank you, thank you, Mirko. Have a good evening. Bye.

190
00:49:17,000 --> 00:49:17,840
Bye.

Mirko Peters

Founder of m365.fm, m365.show and m365con.net

Mirko Peters is a Microsoft 365 expert, content creator, and founder of m365.fm, a platform dedicated to sharing practical insights on modern workplace technologies. His work focuses on Microsoft 365 governance, security, collaboration, and real-world implementation strategies.

Through his podcast and written content, Mirko provides hands-on guidance for IT professionals, architects, and business leaders navigating the complexities of Microsoft 365. He is known for translating complex topics into clear, actionable advice, often highlighting common mistakes and overlooked risks in real-world environments.

With a strong emphasis on community contribution and knowledge sharing, Mirko is actively building a platform that connects experts, shares experiences, and helps organizations get the most out of their Microsoft 365 investments.

Anitha Eswaran

D365FO MVP Technical Architect

I am a Microsoft MVP in AI ERP and a Microsoft Certified Trainer (MCT) with over 20 years of architectural and technical expertise in Microsoft Dynamics AX and Dynamics 365 Finance & Operations (D365FO). My career spans multiple generations of the product—D365FO, AX2012, AX 5.0/4.0/3.0—and includes 12 successful end‑to‑end ERP implementations across diverse industries.

As a Technical Architect, I specialise in designing and delivering complex, scalable ERP solutions that align technology with business strategy. My work focuses on automation, secure design, and efficiency, enabling organisations to accelerate deployments, streamline operations, and achieve predictable, high‑quality outcomes.
I am an active contributor to the global Dynamics community—speaking at events, mentoring professionals, and sharing knowledge through bootcamps and user groups. My recognition as a Microsoft MVP in AI ERP and Microsoft Certified Trainer reflects both my technical depth and my commitment to empowering others in the Microsoft ecosystem.