Dynamics 365 Partner RFI Template & Evaluation Process
A structured RFI (Request for Information) process with 26–42 questions, weighted scoring criteria (Dynamics 365 experience 40%, team capability 25%, methodology 15%, references 12%, pricing 8%), reference validation, and industry-specific demos conducted over 4–6 weeks enables objective partner shortlisting and removes bias from the critical partner selection decision that often determines implementation success or failure.
Selecting the right Dynamics 365 implementation partner is one of the most critical decisions an organization makes. A great partner accelerates your go-live, delivers quality, and becomes a trusted advisor. A poor partner leads to delays, cost overruns, and failed implementations.
The RFI (Request for Information) process is the proven mechanism for evaluating partners systematically, removing bias, and identifying the best cultural and technical fit. This article provides a complete RFI template, scoring framework, and evaluation process to help you select the right partner.
RFI vs. RFP vs. RFQ: Understanding the Process
Three related procurement documents are often confused:
- RFI (Request for Information): Preliminary questionnaire to evaluate vendor capability and fit. Unscored, used to shortlist vendors. Low effort from vendors. Typical timeline: 2–4 weeks.
- RFP (Request for Proposal): Detailed project scope with specific requirements. Vendors submit detailed proposals with approach, timeline, and pricing. High effort from vendors. Typical timeline: 4–8 weeks.
- RFQ (Request for Quote): Pricing-only document for known/commoditized services. Used after vendor selection to finalize pricing. Quick turnaround: 1–2 weeks.
Typical flow: RFI → Shortlist (3–5 vendors) → RFP → Demos → Negotiations → Contract.
When to Use an RFI
An RFI is appropriate when:
- You have a large number of potential vendors (10+) and need to shortlist.
- You want to evaluate vendor capability before investing in detailed scope/RFP.
- You need to assess cultural fit, team strength, and implementation approach.
- You want to gather reference information before committing to RFP.
An RFI is not necessary if you already have a shortlist of 2–3 vendors or have already selected a vendor.
RFI Template Overview
A complete RFI typically includes these sections:
| Section | Purpose | Number of Questions | Effort to Respond |
|---|---|---|---|
| Introduction & Confidentiality | Explain project context and evaluation process | — | — |
| Company Profile | Assess partner size, stability, and Dynamics 365 focus | 5–8 | 1 hour |
| Dynamics 365 Experience | Evaluate partner expertise and breadth of capability | 6–10 | 2 hours |
| Team Qualifications | Assess proposed team skills and experience | 5–8 | 2 hours |
| Methodology & Approach | Understand partner’s implementation philosophy | 4–6 | 2 hours |
| References | Enable evaluation committee to contact customers | 3–5 | 1 hour |
| Pricing & Resources | Provide order-of-magnitude cost estimate | 3–5 | 2 hours |
| Total | Complete RFI | 26–42 questions | 10–12 hours to respond |
An RFI should not exceed roughly 40 questions or require more than 12 hours to complete. Longer RFIs get lower response rates (vendors decline to respond) and surface less genuine differentiation between partners.
Section 1: Company Profile & Qualifications
Sample questions:
- Company name, founding year, headquarters location, and number of employees globally.
- Is your company independent, or owned by a larger parent? If owned, describe your organizational relationship and decision-making autonomy.
- What percentage of your revenue comes from Dynamics 365 implementations? From other ERP systems?
- How many Dynamics 365 implementations have you completed in the past 3 years? (Provide count by product: Business Central, Finance & Operations, etc.)
- What is your Microsoft partner program status? (Learning Partner, a Solutions Partner designation such as Solutions Partner for Business Applications, or a specialization?)
- Describe your organizational structure. Do you have dedicated practices for Business Central, Finance & Operations, Supply Chain Management, etc.? Or is implementation a secondary service?
- What is your customer retention rate? (What percentage of customers re-engage you for post-go-live support or expansion projects?)
What to look for:
- Partner with 20+ implementations in the past 3 years (active in the market).
- Microsoft Solutions Partner status (formal partnership with Microsoft, not just a learning partner).
- High customer retention (80%+ is excellent).
- Dedicated Dynamics 365 practice (not IT consulting dabbling in ERP).
Section 2: Dynamics 365 Experience
Sample questions:
- In the past 2 years, how many implementations have you completed for each of the following products?
- Dynamics 365 Business Central
- Dynamics 365 Finance & Operations
- Dynamics 365 Supply Chain Management
- Dynamics 365 Project Operations
- Describe your experience with [YOUR SPECIFIC INDUSTRY]. How many implementations have you completed in this industry in the past 3 years?
- Have you completed migrations from [YOUR CURRENT SYSTEM, e.g., Sage, GP, NetSuite] to Dynamics 365? If yes, how many? Provide case study if available.
- Describe your experience implementing [YOUR KEY MODULES: GL, AP/AR, Inventory, Manufacturing, Supply Chain Management]. Which do you consider your strongest module?
- Have you implemented Dynamics 365 integrations with [LIST YOUR KEY INTEGRATIONS: e.g., Shopify, EDI, logistics platforms]? Describe your experience.
- What is your average implementation timeline for projects of [YOUR EXPECTED PROJECT SIZE: e.g., $150K–250K scope]?
- Describe your experience with successful post-go-live optimization and adoption support. How do you measure and improve customer adoption rates?
What to look for:
- Experience with your specific product module (if specialty).
- Industry-specific experience (industry knowledge reduces learning curve).
- Migration experience (if you’re migrating from another system).
- Integration expertise (if key to your project).
- Realistic timelines (be wary of partners quoting 2–3 months for enterprise scope).
Section 3: Team Qualifications & Staffing
Sample questions:
- Describe your proposed project structure and staffing model for this engagement. Who will be the project manager, solution architect, and functional lead?
- Provide résumés (1 page) for the proposed project manager, solution architect, and lead developers. Include: years of ERP experience, Dynamics 365 certifications, and notable past projects.
- Will your proposed team members be dedicated to this project, or shared with other clients? If shared, what percentage allocation and how will you ensure continuity?
- What is your bench depth? If a key team member becomes unavailable, how quickly can you backfill?
- What training and certifications does your team hold? (Microsoft D365 certifications? Agile/Scrum certifications?)
- Describe your approach to knowledge transfer. Will you provide detailed documentation? Will team members remain available for 30/60/90 days post-go-live for support and optimization?
- Have you implemented an offshore or nearshore staffing model for implementations? If yes, describe the structure and how you ensure quality.
What to look for:
- Dedicated team (not rotated across multiple projects).
- Relevant certifications (Microsoft, Agile, project management).
- Years of experience (5+ years in Dynamics for senior roles is typical).
- Strong bench depth (reduces risk of key person dependency).
- Commitment to knowledge transfer and post-go-live support.
Section 4: Methodology & Approach
Sample questions:
- Describe your implementation methodology (Agile, Waterfall, Hybrid?). Provide a high-level timeline showing phases, key deliverables, and go-live approach.
- How do you conduct requirements gathering? Describe your process for documenting business requirements and translating them into system configuration.
- Describe your change management approach. How do you ensure user adoption and change readiness?
- How do you manage scope creep and out-of-scope requests? Describe your change order process.
- What is your testing approach? (UAT timeline, test case development, defect resolution process?)
- Describe your cutover and go-live approach. Do you recommend big-bang or phased go-live? How long do you recommend parallel running?
What to look for:
- Structured methodology (not ad-hoc).
- Clear change management and user adoption process.
- Disciplined testing approach (not rushing to go-live).
- Realistic cutover approach (parallel running, phased rollout for complex projects).
- Emphasis on user adoption (not just system configuration).
Section 5: References & Case Studies
Sample questions:
- Provide 3–5 references of customers with whom you completed implementations in the past 2 years. Include: company name, industry, project size, system scope, go-live date, and reference contact (name, title, email, phone).
- At least one reference should be for a migration from [YOUR CURRENT SYSTEM] to Dynamics 365 (if applicable).
- At least one reference should be for an implementation of [YOUR KEY MODULES] (if applicable).
- Provide a 2–3 page case study of one notable implementation. Include: business problem, solution approach, benefits realized, and customer quote/testimonial.
- Provide a list of 5–10 past customers who are now in support/managed services with your firm (indicating customer satisfaction and retention).
What to look for:
- References willing to take calls (hesitance to connect you is a red flag).
- Mix of reference types (migration, greenfield, different industries).
- High customer retention (references who are still engaged).
- Case studies with quantified benefits (not just feature lists).
Section 6: Pricing & Resource Planning
Sample questions:
- Based on the project scope as described in Section 1 [INSERT PROJECT CONTEXT HERE], provide an order-of-magnitude estimate for total implementation cost. Break down by phase (discovery/design, build, testing, go-live, post-go-live support).
- Provide your standard hourly rates for the following roles: (a) Project Manager, (b) Solution Architect, (c) Senior Developer, (d) Junior Developer, (e) QA/Testing.
- Do you recommend a fixed-price, T&M (time & materials), or hybrid pricing model for this engagement? Justify your recommendation.
- What is your estimated team staffing for this project? (E.g., 1 PM, 1 SA, 2 senior developers, 1 QA for 6 months.)
- Describe your post-go-live support model. What is included in your standard support? What is the cost for extended support (1-year managed services)?
- Are there any out-of-scope items that would increase cost? (Custom development, third-party integrations, change management consulting, training?)
What to look for:
- Realistic cost estimate (validate against benchmarks from other partners or analyst data).
- Clear breakdown by phase and team member.
- Transparent post-go-live support pricing.
- Clear definition of what’s included vs. out-of-scope.
RFI Scoring Rubric
Create a scoring rubric to standardize evaluation across partners. Typical rubric:
| Evaluation Criteria | Weight | Score 1–5 | Key Assessment Questions |
|---|---|---|---|
| D365 Experience & Expertise | 40% | 1–5 | 20+ implementations? Industry experience? Module expertise? Current with latest releases? |
| Team Capability & Experience | 25% | 1–5 | Dedicated team? Relevant certifications? 5+ years experience? Knowledge transfer commitment? |
| Methodology & Risk Management | 15% | 1–5 | Structured methodology? Change management approach? Testing rigor? Change order process? |
| Customer References & Track Record | 12% | 1–5 | Strong references? Willing to provide contact info? Retention rate? Case studies credible? |
| Pricing & Value | 8% | 1–5 | Cost competitive? Transparent pricing? Clear scope? Support pricing reasonable? |
| TOTAL SCORE | 100% | — | Weighted average of scores above |
Scoring scale:
- 5 = Exceptional: Far exceeds expectations, clear differentiator, strong evidence.
- 4 = Strong: Exceeds expectations, solid evidence, minor gaps.
- 3 = Adequate: Meets expectations, adequate evidence, some concerns.
- 2 = Weak: Below expectations, limited evidence, material concerns.
- 1 = Poor: Fails to meet expectations, no credible evidence.
Example scoring (3 partners):
| Criteria (Weight) | Partner A | Partner B | Partner C |
|---|---|---|---|
| D365 Experience (40%) | 5 → 200 pts | 4 → 160 pts | 2 → 80 pts |
| Team Capability (25%) | 4 → 100 pts | 5 → 125 pts | 3 → 75 pts |
| Methodology (15%) | 4 → 60 pts | 4 → 60 pts | 3 → 45 pts |
| References (12%) | 5 → 60 pts | 3 → 36 pts | 2 → 24 pts |
| Pricing (8%) | 2 → 16 pts | 3 → 24 pts | 5 → 40 pts |
| TOTAL SCORE (out of 500) | 436 (87%) | 405 (81%) | 264 (53%) |
Based on this scoring, Partner A would be the clear frontrunner, with Partner B as a strong alternative. Partner C would not advance to shortlist (scoring below ~70%).
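The weighted totals above are simple arithmetic (criterion score × weight, summed, against a 500-point maximum). As a sanity check for your own scorecard, a short Python sketch can reproduce them; the criterion names and weights below come from the rubric table, while the helper function itself is illustrative:

```python
# Illustrative weighted-scoring helper for the RFI rubric above.
# Weights are the rubric percentages; each criterion is scored 1-5.
WEIGHTS = {
    "D365 Experience": 40,
    "Team Capability": 25,
    "Methodology": 15,
    "References": 12,
    "Pricing": 8,
}
MAX_POINTS = 5 * sum(WEIGHTS.values())  # 5 x 100 = 500

def weighted_total(scores):
    """Return (points, percent) for one partner's 1-5 criterion scores."""
    points = sum(scores[criterion] * weight
                 for criterion, weight in WEIGHTS.items())
    return points, round(100 * points / MAX_POINTS)

# Partner A's scores from the example table.
partner_a = {"D365 Experience": 5, "Team Capability": 4,
             "Methodology": 4, "References": 5, "Pricing": 2}
print(weighted_total(partner_a))  # (436, 87)
```

Running each partner through the same function keeps committee scoring consistent and makes the 70% shortlist threshold a one-line check.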
Shortlisting & Evaluation Committee Setup
Evaluation committee composition: Include 4–6 members representing different functions:
- IT/CIO: Assesses technical capability, support model, integration approach.
- Finance/CFO: Assesses pricing, cost control, post-go-live ROI realization.
- Operations: Assesses methodology, user adoption approach, change management.
- Project Sponsor/Executive: Provides strategic alignment, authority to make final recommendation.
- (Optional) Current system subject matter expert: If migration project, validates migration approach and risk.
Committee responsibilities:
- Score RFI responses independently, then discuss and reconcile scores.
- Conduct reference calls (at least 2–3 per finalist partner) and document findings.
- Select top 3–5 partners for shortlist based on weighted scoring.
- Prepare feedback for partners not shortlisted (professional courtesy).
- Schedule demos/presentations for shortlisted partners.
Shortlist criteria: Typically select partners scoring 70%+ on weighted rubric. If no partners exceed 70%, reconsider your criteria or expand the original vendor list and re-release RFI.
Common RFI Mistakes & How to Avoid Them
| Mistake | Impact | How to Avoid |
|---|---|---|
| RFI too long (50+ questions, 15+ hours to respond) | Low response rate; only committed vendors respond (may self-select for desperation, not quality) | Limit RFI to 30–40 questions and 10–12 hours effort. Use follow-up RFP for detail |
| Vague scoring criteria (“good experience” without definition) | Subjective scoring, bias, inconsistent evaluation across team | Define scoring rubric with specific assessment questions per criterion |
| Skipping reference calls, relying only on written responses | Miss critical red flags (references won’t talk = customer dissatisfaction) | Require evaluation committee to call 2–3 references per finalist; use script to ensure consistency |
| Not asking about offshore/nearshore staffing model explicitly | Discover post-selection that partner plans to offshore most work (expectation mismatch) | Ask explicitly: “Will offshore/nearshore staff be assigned? What percentage? How is quality controlled?” |
| Not validating project scope assumptions in RFI | Partner estimates are based on assumptions different from your actual scope (big cost variance later) | Include 1–2 page project context in RFI with scope, timeline, and key assumptions. Ask partner to validate |
| Focusing only on price, ignoring experience and fit | Select lowest-cost partner who lacks experience or commitment (rework, delays, low quality) | Use weighted rubric that weights pricing at 8–10% (not 40%), experience at 40% |
| Not asking about team continuity and backup plans | Key team member leaves during implementation; partner struggles to backfill (project delays) | Ask: “If key team member becomes unavailable, who is backup? How quickly can you backfill?” |
| Providing ambiguous evaluation timeline to vendors | Vendors don’t know urgency; delays responding or deprioritize your RFI | State RFI response deadline, shortlist date, demo date, and expected contract signature date in RFI |
Post-Shortlist Demo & Presentation Evaluation
After shortlisting (3–5 finalists), schedule demos and presentations. Structure these to evaluate how partners would solve your business problems, not just product capabilities.
Demo structure (2–3 hours):
- Partner presentation (30 min): Overview of firm, team, and Dynamics 365 experience. Keep high-level; avoid generic sales pitches.
- Your business scenario deep-dive (60 min): Present your top 3–5 business problems. Ask partner: “How would you solve this in Dynamics 365? What configuration? What customization?” Watch for depth of understanding and realistic approaches.
- System demo (30 min): Walk through Dynamics 365 functionality relevant to your scenario. Avoid product tour; focus on how partner would configure for your use case.
- Q&A and partner questions (30 min): Open dialogue. Assess partner curiosity about your business (they should ask questions).
Post-demo evaluation (committee): Score each partner on:
- Understanding of your business and specific challenges.
- Realistic solution approach (not overselling or underestimating complexity).
- Team capability (chemistry with your team, engagement level).
- Confidence in delivering on promises.
Demo tips:
- Use actual data or realistic scenarios (avoid toy examples).
- Prepare 2–3 tricky questions to test depth of expertise.
- Watch for “I don’t know, but we can figure it out” vs. “We’ve solved this for 10 customers, here’s how” (latter is better).
- Assess team chemistry: Do you want to work with these people for 6–12 months?
Frequently Asked Questions
Q1: How many vendors should we include in the initial RFI?
For most organizations, 8–15 vendors is appropriate. This ensures you have a competitive set but are not overwhelmed with RFI responses. For large/complex projects, 15–20 is reasonable. For smaller projects, 5–8 is sufficient.
Q2: How long should we allow vendors to respond to the RFI?
2–3 weeks is standard. If vendors are slow to respond or request extensions, that's a yellow flag (they may be deprioritizing your project or lack the internal resources to respond promptly).
Q3: Should we disclose our budget in the RFI?
No. Disclosing budget encourages partners to price to the ceiling. Wait until after shortlisting and RFP to share budget constraints. However, you can share order-of-magnitude context (e.g., “We anticipate this is a $150K–250K engagement”) to help partners self-select if way out of alignment.
Q4: Is it okay to re-engage RFI vendors who didn’t shortlist?
Yes, it’s good practice. Send them a note: “Thank you for your RFI response. We selected 3 finalists based on specific fit with our requirements. If circumstances change, we will reach out.” Keep them warm; they may be appropriate for future projects.
Q5: Should we conduct reference checks for all RFI respondents or only shortlist?
Only shortlist (3–5 vendors). Calling references for all 10–15 respondents is too time-consuming. However, do call at least 2 references per shortlisted partner.
Q6: How do we balance vendor diversity (large global firms vs. local boutique firms) in the RFI?
Include a mix. Large global firms bring scale and resources; local boutiques bring relationship and flexibility. Ideally, your shortlist has 1–2 large firms and 1–2 boutiques for comparison. Scoring rubric should not bias toward size.
Q7: What if all RFI responses are mediocre? Should we shortlist the best-of-breed, or re-release the RFI?
If average score is <60%, consider re-releasing the RFI to a different vendor set or expanding your geographic reach. A mediocre partner is a risk. Don’t settle just to move forward. That said, if you have 3 vendors scoring 65–75%, they may be adequate; shortlist and probe deeper in demo phase.
Q8: How do we handle a vendor who doesn’t shortlist but aggressively pursues us afterward?
Be professional and clear: “Thank you for your interest. We selected partners who best fit our specific requirements. We appreciate you being considered and may reach out for future opportunities.” Don’t let persistence force you to reconsider; decisions should be based on evaluation, not vendor pressure.
Methodology
Dataset: This article synthesizes RFI templates from major Dynamics 365 implementation partners, procurement best practices from analyst firms (Gartner, Forrester), and lessons learned from organizations that have conducted successful partner selections.
Analytical approach: We structured the RFI into six functional sections aligned with key evaluation dimensions (experience, capability, methodology, track record, value). The scoring rubric reflects typical weighting: D365 experience and team capability drive most of the score, with methodology, references, and pricing as secondary factors.
Limitations: RFI templates should be customized to your specific project, industry, and organizational priorities. The suggested questions and weighting are templates, not prescriptive. Adjust criteria and questions based on your requirements.
Data currency: Content reflects partner selection best practices and Dynamics 365 market dynamics as of March 2026. Vendor capabilities, certifications, and pricing change frequently; confirm current offerings directly with partners during RFI.
Related Reading
How to Choose a Dynamics 365 Implementation Partner [2026 Guide]
How to Evaluate a Dynamics 365 Implementation Partner [2026 Checklist]
Learn how to evaluate Dynamics 365 implementation partners using weighted scoring frameworks, industry experience verification, team assessment, and reference checks. Includes red flags, demo evaluation strategies, and decision-making frameworks.
Dynamics 365 Partner Types Explained: CSP, ISV, SI & More [2026]
Comprehensive guide to understanding Microsoft Dynamics 365 partner types, including CSP partners, ISVs, systems integrators, and how to choose the right partner for your organization.
ERP Implementation Contract Review Checklist
Complete checklist for reviewing Dynamics 365 and ERP implementation contracts. Understand key contract sections, red flags, pricing models, and negotiation tactics.