Microsoft Dynamics 365 · 9 min read

How to Evaluate a Dynamics 365 Partner for the Agentic AI Era

By George Brown

Microsoft's 2026 Release Wave 1 embeds agentic AI across every Dynamics 365 module. Traditional partner selection criteria — certifications, project count, industry references — are no longer sufficient. Here's how to evaluate partners for the new reality.

TL;DR

  • Wave 1 introduces autonomous AI agents, embedded Copilot, and Work IQ across all Dynamics 365 modules — this is not an incremental update.
  • Traditional partner selection criteria (certifications, project count) remain necessary but are no longer sufficient.
  • New evaluation dimensions: AI agent design, Copilot enablement, Work IQ integration, AI governance, and MCP extension capabilities.
  • 15 specific questions to ask prospective partners during evaluation.
  • M365 pricing increases (5–33%) take effect July 1, 2026 — the window to lock in partner and implementation strategy is now.

On March 18, 2026, Microsoft announced Dynamics 365 Release Wave 1 — the most significant platform shift in a decade. Agentic AI capabilities are now being embedded across every module: Sales, Customer Service, Finance, Supply Chain, HR, and Commerce. For businesses selecting a Dynamics 365 implementation partner, this changes the evaluation fundamentally.

A partner who excels at traditional ERP configuration may have zero experience designing, deploying, and governing AI agents. And that gap can derail an implementation.

What Microsoft Wave 1 Actually Changes

Wave 1 introduces three foundational shifts that directly impact how implementations are planned and executed.

Autonomous AI Agents Across Every Module

Dynamics 365 now supports agents that can qualify leads, process invoices, prevent customer churn, and execute procurement workflows — independently. These agents use Copilot reasoning and Dataverse data access to break complex goals into steps, execute actions, and report results for human review. This is fundamentally different from traditional automation (Power Automate flows) and requires new design, governance, and testing skills.

Copilot Embedded in Sales & Customer Service

Microsoft 365 Copilot is now embedded in Dynamics 365 Sales and Customer Service, with public preview launching in early April 2026. Copilot draws on CRM data plus Microsoft 365 signals (emails, meeting recaps, Teams conversations) to generate actionable insights, draft communications, and surface relevant knowledge. This collapses separate applications into a unified workflow — but only if the implementation partner understands how to configure and optimize it.

Work IQ: The Intelligence Layer

Work IQ, entering general availability in April 2026, connects Microsoft 365 signals (documents, meetings, chats) with operational data from Dynamics 365 and Power Apps. It follows work across business processes, provides adaptive learning, full auditability, and multi-agent orchestration. Partners need cross-platform expertise spanning the entire Microsoft stack to leverage this effectively.

Market Validation & Cost Pressure

The Forrester Wave™: Customer Service Solutions, Q1 2026 named Microsoft a Leader, validating the platform's AI-first direction. Meanwhile, Microsoft 365 pricing increases of 5–33% take effect July 1, 2026, creating urgency for businesses to lock in partner and implementation strategies before costs rise.

Traditional vs. New Partner Evaluation Criteria

The old partner selection playbook focused on five dimensions: certifications, project count, industry experience, methodology, and references. These still matter, but they no longer tell the full story.

What Still Matters

Core competencies remain essential. You still need a partner with deep module knowledge, proven data migration capabilities, strong change management practices, and a structured implementation methodology. Don't abandon these criteria — build on top of them.

What's New: The AI Readiness Layer

Dimension | Traditional (Pre-Wave 1) | AI-Ready (Wave 1+)
Scope | Module configuration, data migration, change management | Module config + autonomous agent design and orchestration
Automation | Power Automate flows, workflow rules | Multi-step agents powered by Copilot reasoning
Data Model | Dataverse + D365 module data | Dataverse + Microsoft 365 signals via Work IQ
Governance | Role-based security, audit logs | Agent approval gates, rollback procedures, incident response
Integration | Standard connectors, custom APIs | MCP connectors extending agents to external systems
Team Structure | Separate M365 and D365 practices | Cross-platform specialists spanning the full Microsoft stack
Traditional vs. AI-ready partner capabilities across six dimensions.

On top of traditional evaluation, you now need to assess partners across five new dimensions:

  • AI Agent Design & Orchestration: Can the partner design autonomous agents that execute multi-step business processes? Have they built agents for lead qualification, invoice processing, churn prevention, or procurement? Ask for specific examples and outcomes.
  • Copilot Enablement Experience: Has the partner deployed Copilot in production environments? Do they understand the data quality prerequisites, governance frameworks, and user adoption strategies that determine whether Copilot delivers value or creates noise?
  • Work IQ Integration Readiness: Can the partner work across the Microsoft 365 and Dynamics 365 boundary? Work IQ requires understanding of Dataverse, Power Platform, and Microsoft 365 productivity tools as a unified system, not separate products.
  • AI Governance & Compliance: Does the partner have frameworks for governing autonomous agents? Agents acting on business data create new compliance, audit, and risk management requirements that must be addressed during implementation, not after.
  • MCP & Extension Capabilities: Can the partner build Model Context Protocol (MCP) connectors to extend agents to third-party systems and external data sources? Most real-world implementations require agents that reach beyond Dynamics 365's native data.

15 Questions to Ask Prospective Partners

Use these questions during partner evaluation to assess AI readiness. How a partner answers — with specificity, concrete examples, and honesty about limitations — matters as much as what they answer.

AI Agent Experience

  1. How many autonomous agent deployments have you completed in Dynamics 365? Describe one in detail.
  2. Walk me through your agent design methodology. How do you identify which processes should be automated with agents vs. traditional workflows?
  3. How do you handle agent governance — approval gates, exception handling, audit trails, and rollback procedures?

Copilot Readiness

  4. What data quality prerequisites do you assess before enabling Copilot? What happens if a client's data isn't ready?
  5. How do you measure Copilot effectiveness post-deployment? What KPIs do you track?
  6. Describe your Copilot user adoption strategy. How do you train end users to validate and effectively use AI-generated outputs?

Work IQ & Cross-Platform

  7. How do you approach Work IQ integration? Do you have experience connecting Microsoft 365 signals with Dynamics 365 operational data?
  8. What is your team's expertise across Power Platform, Dataverse, and Microsoft 365? Do you staff cross-platform specialists or separate teams?

Technical Depth

  9. Have you built custom MCP connectors to extend agents to non-Microsoft systems? Describe one.
  10. How do you approach AI security — tenant isolation, data access controls, and preventing agents from accessing data outside their scope?
  11. What is your approach to testing autonomous agents before production deployment?

Governance & Risk

  12. How do you ensure AI-driven decisions comply with our industry's regulatory requirements?
  13. What is your incident response process when an autonomous agent produces incorrect results or takes unintended actions?
  14. How do you handle the transition from pilot to production for AI features? What gates must be passed?

The Most Important Question

  15. What AI capabilities are you not yet ready to deliver? Where would you bring in additional expertise?

This last question matters most. Partners who claim full readiness across every dimension are either overstating their capabilities or don't understand the complexity of what Wave 1 introduces. The best partners are candid about their gaps and have a plan to address them.

Red Flags in Partner Evaluation

Watch for these warning signs during the evaluation process:

  • "We'll figure out AI as we go." AI agent design requires upfront planning. A partner who treats AI as an afterthought will struggle when agentic capabilities are central to Wave 1.
  • No autonomous agent deployments to date. If a partner has zero production agent experience, your project becomes their training ground. That's a risk you should price into the decision.
  • Copilot treated as "just turn it on." Copilot's effectiveness depends on data quality, governance, and user training. Partners who treat it as a toggle switch underestimate the implementation work.
  • No governance framework for AI. Autonomous agents create new compliance and audit requirements. If the partner can't explain their governance approach, they haven't thought through the risks.
  • Separate teams for M365 and D365. Work IQ requires cross-platform integration. If the partner staffs these as completely separate practices with no overlap, they'll struggle with the unified intelligence layer.
  • Pricing that doesn't account for AI complexity. If the SOW looks identical to a pre-Wave 1 implementation, the partner is either underscoping the AI work or planning to charge for it later.

A Practical Evaluation Framework

The 8-dimension scoring framework: 3 traditional + 5 AI-readiness criteria, each scored 1–5.

Score each prospective partner across these eight dimensions on a 1–5 scale. Weight the AI-specific dimensions based on how central agentic capabilities are to your implementation goals.

Traditional Dimensions (Still Essential)

  1. Module Expertise: Depth of experience in your specific D365 modules (Sales, Finance, SCM, etc.)
  2. Industry Experience: Track record in your vertical with relevant references.
  3. Implementation Methodology: Structured approach with clear milestones, governance, and change management.

AI Readiness Dimensions (New)

  4. Agent Design Capability: Proven ability to design, build, test, and deploy autonomous agents.
  5. Copilot Enablement: Experience deploying Copilot with data readiness assessment and user adoption planning.
  6. Work IQ / Cross-Platform: Ability to integrate Microsoft 365 and Dynamics 365 as a unified system.
  7. AI Governance: Frameworks for agent oversight, compliance, audit trails, and incident response.
  8. Extension Capabilities: Ability to build MCP connectors and extend AI to non-Microsoft systems.

A partner scoring 4–5 on traditional dimensions but 1–2 on AI readiness may still be a strong choice for straightforward implementations with limited AI ambitions. But if agentic AI is core to your business case, AI readiness scores should carry at least equal weight.
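The weighted scoring described above can be sketched as a short script. The dimension names come from the framework in this article; the weights and sample scores below are illustrative assumptions, not recommendations — substitute your own.

```python
# Weighted partner scoring across the 8 evaluation dimensions.
# Dimension names follow the framework above; weights and sample
# scores are hypothetical placeholders.

TRADITIONAL = ["Module Expertise", "Industry Experience",
               "Implementation Methodology"]
AI_READINESS = ["Agent Design", "Copilot Enablement",
                "Work IQ / Cross-Platform", "AI Governance",
                "Extension Capabilities"]

def score_partner(scores: dict[str, int], ai_weight: float = 0.5) -> float:
    """Return a 1-5 composite score. ai_weight is the share given to
    the AI-readiness dimensions (0.5 = equal weight, per the guidance
    when agentic AI is core to the business case)."""
    trad = sum(scores[d] for d in TRADITIONAL) / len(TRADITIONAL)
    ai = sum(scores[d] for d in AI_READINESS) / len(AI_READINESS)
    return round((1 - ai_weight) * trad + ai_weight * ai, 2)

# Hypothetical partner: strong on traditional delivery, weak on AI.
partner_a = {"Module Expertise": 5, "Industry Experience": 4,
             "Implementation Methodology": 5, "Agent Design": 2,
             "Copilot Enablement": 2, "Work IQ / Cross-Platform": 1,
             "AI Governance": 2, "Extension Capabilities": 1}

print(score_partner(partner_a))                 # equal weighting → 3.13
print(score_partner(partner_a, ai_weight=0.2))  # AI less central → 4.05
```

Note how the same partner looks materially stronger (4.05 vs. 3.13) when AI readiness is down-weighted — which is exactly the trade-off to make explicit before signing an SOW.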

Why Acting Now Matters

  • Apr 2026 — Wave 1 rollout begins
  • Apr 2026 — Work IQ general availability
  • Jul 1, 2026 — M365 prices rise 5–33%
  • Sep 2026 — Wave 1 rollout complete
The Q2–Q3 2026 window: Wave 1 rollout and M365 pricing changes compress the partner selection timeline.

Three converging forces create urgency for partner selection in Q2 2026:

  • Wave 1 rollout (April–September 2026): Agentic capabilities are rolling out now. Early movers get implementation support from partners who aren't yet at capacity.
  • M365 pricing increases (July 1, 2026): 5–33% increases across commercial Microsoft 365 suites affect total stack cost. Locking in a partner and implementation plan before July gives leverage on scope and budget.
  • Partner capacity constraints: The most AI-ready partners are in high demand. Waiting means fewer choices and longer timelines.

The window between now and July is the optimal time to evaluate partners, define your AI implementation strategy, and secure engagement commitments.
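To make the pricing pressure concrete, here is a rough budget-impact calculation. The 5–33% range comes from the announced increases; the seat count and per-seat price are hypothetical placeholders to show the arithmetic — substitute your actual licensing figures.

```python
# Rough budget-impact math for the July 1, 2026 M365 price changes.
# The 5-33% range is from the announced increases; seat count and
# per-seat price below are hypothetical - plug in your own numbers.

seats = 500
price_per_seat_month = 36.00  # hypothetical current list price (USD)

annual_now = seats * price_per_seat_month * 12
low = annual_now * 0.05   # best case: 5% increase
high = annual_now * 0.33  # worst case: 33% increase

print(f"Current annual spend: ${annual_now:,.0f}")
print(f"Added annual cost after July 1: ${low:,.0f} to ${high:,.0f}")
```

For this hypothetical 500-seat tenant, the increase alone ranges from roughly $10.8K to $71K per year — a spread wide enough to justify locking in scope and budget before the change lands.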

Use TDP's Free Tools to Evaluate Partners

Top Dynamics Partners provides free, vendor-neutral tools to help businesses evaluate implementation partners with AI readiness in mind:

  • AI-Powered Partner Matching: Our matching tool now incorporates AI readiness criteria to recommend partners based on your specific module needs, industry, and agentic AI requirements.
  • Partner Comparison Tool: Compare up to three partners side-by-side across traditional and AI-readiness dimensions.
  • Evaluation Checklist: A downloadable checklist incorporating all 15 questions from this article, plus scoring rubrics for each dimension.
  • Expert Advisory: Schedule a free consultation with George Brown, a 40-year Dynamics veteran who has evaluated hundreds of partners, to discuss your specific requirements.

Microsoft's 2026 Release Wave 1 changes the Dynamics 365 partner selection equation. Traditional criteria remain necessary but are no longer sufficient. The businesses that adapt their partner evaluation to reflect the agentic AI reality will select partners who can deliver on the full promise of the platform's most significant evolution in a decade.

George Brown

Co-Founder & CEO

George Brown has over 40 years of experience in the Microsoft Dynamics ecosystem, including leadership roles at Partner Economics, Jet Global, and Aston Group NA.

Microsoft Dynamics Expert · 40+ Years ERP Experience · 500+ ERP Implementations Overseen
