
What’s New in CMS’s 1115 Evaluation Guidance & What It Means for State Agencies

Explainer

Author

Karen Swietek

Principal Health Economist

Health Care Evaluation


April 2026

CMS’s latest guidance reshapes 1115 evaluation requirements. Explore the key updates and what state agencies may want to know as they plan ahead.

In late 2025, the Centers for Medicare & Medicaid Services (CMS) released updated guidance for Section 1115 demonstration evaluations. The guidance raises expectations for rigor, transparency, theory-driven design, and policy relevance across the full evaluation lifecycle.

Importantly, the updated guidance emphasizes the value of early evaluation planning and of aligning design decisions with CMS expectations to help states avoid costly rework, meet timelines, and produce meaningful findings. Drawing on NORC at the University of Chicago’s applied evaluation work on Medicaid Section 1115 demonstrations, this explainer summarizes key aspects of CMS’s updated guidance and considerations for states undertaking demonstration design or renewal. The summary reflects the updated CMS evaluation guidance and does not represent NORC’s positions or policy endorsements.

Engage Independent Evaluators Early

The updated CMS guidance underscores the importance of engaging an experienced evaluation contractor, such as NORC, before waiver implementation. Early collaboration between states and evaluators supports the joint development of logic models, research questions, data strategies, and feasible comparison group approaches that align with both policy goals and operational realities.

These steps require careful consideration of opportunities and limitations, with equal focus on policy impact and feasibility. This early investment can mean the difference between a seamless evaluation process and multiple rounds of rework that put timelines, budgets, and compliance with CMS requirements at risk. Engaging evaluators early reduces the risk of proposing designs that later prove infeasible due to data limitations, timing constraints, or misalignment with implementation plans, and supports stronger coordination between evaluation and implementation teams.

For a Stronger Evaluation, Start with a Conceptual Framework

CMS’s updated guidance moves away from driver diagrams and toward conceptual frameworks or logic models that articulate how a demonstration is expected to produce change. Compared to driver diagrams, which often list high‑level activities and outcomes, logic models explicitly map policy mechanisms, short‑term and long‑term outcomes, contextual factors, and potential unintended consequences.

For example, a waiver focused on behavioral health integration might theorize that care coordination and workforce enhancements will lead to improved outpatient engagement, reduced emergency department use, and longer‑term cost reductions. Contextual factors, including provider capacity, rural access barriers, or workforce shortages, may influence whether these mechanisms operate as intended. Without a clear conceptual framework at the outset, evaluation teams may later discover that key outcomes, mediating pathways, or contextual variables were not measured, potentially leading to conclusions that underestimate a demonstration’s true impact.

Plan a Data Strategy Early and Think Beyond Medicaid Claims

In updated guidance, CMS directs states to use data sources that go beyond Medicaid claims when appropriate, including Supplemental Nutrition Assistance Program (SNAP), Temporary Assistance for Needy Families (TANF), child welfare data, all‑payer claims databases, and national surveys. These data present valuable opportunities to broaden research questions and capture cross-system impacts, but they require substantial early planning to secure data use agreements and establish record linkages.
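To make the record-linkage step concrete, the sketch below shows a deterministic linkage of simulated Medicaid and SNAP extracts on a pseudonymous hashed key. The field names, the hashing scheme, and the toy data are all illustrative assumptions, not CMS requirements; real linkages are governed by data use agreements and often call for probabilistic matching when identifiers are incomplete.

```python
import hashlib

import pandas as pd

# Hypothetical illustration: deterministic cross-system record linkage.
# Field names and the hashing scheme are assumptions for illustration only.

def hash_id(ssn: str, dob: str) -> str:
    """Build a pseudonymous linkage key from identifying fields."""
    return hashlib.sha256(f"{ssn}|{dob}".encode()).hexdigest()

# Simulated extracts from two systems
medicaid = pd.DataFrame({
    "ssn": ["111-22-3333", "222-33-4444"],
    "dob": ["1990-01-01", "1985-06-15"],
    "ed_visits": [2, 0],
})
snap = pd.DataFrame({
    "ssn": ["111-22-3333", "333-44-5555"],
    "dob": ["1990-01-01", "1970-12-31"],
    "snap_enrolled": [True, True],
})

# Replace direct identifiers with the hashed key before linking
for df in (medicaid, snap):
    df["link_key"] = [hash_id(s, d) for s, d in zip(df["ssn"], df["dob"])]
    df.drop(columns=["ssn", "dob"], inplace=True)

# An outer merge with an indicator column shows match rates across systems,
# which informs feasibility assessments during evaluation design
linked = medicaid.merge(snap, on="link_key", how="outer", indicator=True)
print(linked["_merge"].value_counts())
```

In practice, the match-rate diagnostics from the `_merge` indicator are exactly the kind of feasibility evidence worth producing early, before committing to research questions that depend on cross-system data.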

CMS also highlights the importance of planning for subpopulation analyses and being transparent when sample size or data limitations constrain causal estimation. Addressing these considerations during the design phase helps ensure that evaluation plans are both comprehensive and feasible.
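As one illustration of what a subpopulation-aware causal design can look like, the sketch below fits a two-group, two-period difference-in-differences model on simulated data, with a subgroup interaction. This is a generic design commonly used in demonstration evaluations, not a method prescribed by the CMS guidance; all variable names and data are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical sketch: difference-in-differences (DiD) with a subgroup
# interaction. Data are simulated; variable names are assumptions.
rng = np.random.default_rng(0)
n = 4000
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),  # demonstration vs. comparison group
    "post": rng.integers(0, 2, n),     # before vs. after implementation
    "rural": rng.integers(0, 2, n),    # subgroup flag for subpopulation analysis
})
# Simulate an outcome with a true effect of -0.5 ED visits for treated-post
df["ed_visits"] = 2.0 - 0.5 * df["treated"] * df["post"] + rng.normal(0, 1, n)

# The treated:post coefficient is the DiD effect estimate
model = smf.ols("ed_visits ~ treated * post", data=df).fit()
print(model.params["treated:post"])

# Subpopulation analysis: the three-way interaction tests whether the
# effect differs for the subgroup. Small cells reduce statistical power,
# the kind of limitation CMS asks states to report transparently.
sub = smf.ols("ed_visits ~ treated * post * rural", data=df).fit()
print(sub.params["treated:post:rural"])
```

The power loss from splitting the sample into subgroups is visible in the wider standard errors of the three-way interaction, which is why assessing cell sizes during the design phase matters.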

Explain Why Outcomes Changed, Not Just Whether They Changed

CMS continues to emphasize that 1115 evaluations must explain not only whether observed outcomes changed, but why they did or did not change. The updated guidance goes further by providing more detailed and explicit expectations for implementation research questions, signaling that understanding how a demonstration operates in practice is now a core evaluation requirement. This requires careful attention to how policies were implemented on the ground and whether implementation aligned with stated intentions.

For example, the same care management model may produce very different results depending on whether it is embedded within managed care organizations, operated through provider‑led entities, or administered centrally by the state, with each approach carrying distinct implications for provider uptake, beneficiary engagement, and consistency of service delivery.

Understanding these distinctions requires integrating quantitative outcome analyses with qualitative information from providers, beneficiaries, and other stakeholders. Working in partnership with the state, an evaluator can apply this framework to integrate quantitative outcomes with qualitative insights, providing a clearer picture of how and why a waiver performs differently across settings.

Make Cost Analyses Transparent and Linked to Outcomes

The updated guidance strengthens expectations for cost analyses by emphasizing transparency and alignment with the demonstration’s conceptual framework. CMS recommends analyses that examine specific pathways through which demonstrations may affect spending, such as reductions in avoidable hospitalizations or emergency department visits, improvements in chronic disease management, or better care transitions.

Evaluations should clearly state cost assumptions, explain observed cost drivers, and distinguish between service expenditures and administrative costs (such as salaries, contracts, monitoring, and evaluation) to ensure findings are relevant for waiver renewals, scaling decisions, and broader Medicaid policy.

Produce Actionable Lessons Learned for Policy Decisions

CMS guidance indicates that interim and summative evaluation reports should include actionable lessons learned and policy implications intended to inform future decision‑making. Rigorous mixed‑methods evidence is critical because it connects what the demonstration achieved with the implementation challenges and contextual factors that shaped those results, helping CMS and states understand how waiver design, implementation strategies, and stakeholder engagement should differ across settings or subpopulations. Together, these insights inform Medicaid policy, targeted supports, and resource allocation in future demonstrations.

Reentry Waivers Have Specific Expectations, but the Principles Are Familiar

CMS’s updated guidance includes specific expectations for Section 1115 reentry waivers focused on improving care transitions for individuals leaving correctional settings. Although the population and service systems are distinct, the reentry evaluation guidance reflects core principles of the broader evaluation framework. CMS expects reentry evaluations to present clear logic models linking pre-release services to post-release outcomes, incorporate mixed methods to examine implementation and context, and plan early for cross‑system data integration across correctional, Medicaid, and community‑based systems.

Attention to population‑specific contextual factors, such as continuity of care, access barriers, and unintended consequences during reentry, is essential for producing accurate, policy‑relevant findings. Early coordination is particularly critical for reentry demonstrations given the complexity of linking data across systems and addressing barriers unique to criminal-legal system‑involved populations.




Quick Reference: Key Changes in CMS’s 1115 Evaluation Guidance

Print this table and keep it handy. Sections match the full explainer above.

 

Topic | What Changed
Early Evaluator Engagement | Partner with an independent evaluator before implementation begins to ensure that the evaluation design is feasible and aligned with implementation strategy.
Conceptual Framework Replaces Driver Diagrams | Evaluations must include a conceptual framework that maps short-, intermediate-, and long-term outcomes and identifies contextual, moderating, and confounding factors.
Hypotheses and Research Questions | Hypotheses must align with demonstration goals. CMS now provides policy-specific hypotheses for family planning, reentry, serious mental illness (SMI)/serious emotional disturbance (SED), and substance use disorder (SUD) demonstrations.
Implementation Research Questions | Include questions examining how policies are operationalized, how implementation evolves, and whether it proceeds as intended. CMS provides policy-specific examples.
Qualitative Analysis | Increased emphasis on primary data collection, including interviews and focus groups with beneficiaries, to contextualize quantitative findings.
Stakeholder Engagement | The guidance notes the importance of engaging beneficiaries, providers, health plans, and advocacy groups to identify unintended consequences, refine measures, and contextualize findings.
Cost Analysis | New guidance covers administrative costs, service expenditures, and economic value. Link cost models to demonstration goals and outcome measures.
Lessons Learned Reporting | According to CMS guidance, reports are expected to include a “Lessons Learned and Policy Implications” section summarizing achievements, challenges, and implications for future Medicaid policy.
Data Planning | CMS guidance advises early planning for cross-system data linkages involving sources such as SNAP, TANF, child welfare, all-payer claims databases (APCDs), and national surveys, including initiating data use agreements in a timely manner.
Reentry Demonstrations | New dedicated guidance for reentry waivers, including suggested hypotheses, research questions, and data sources. Requires linking correctional, Medicaid, and community-based data.
Family Planning Demonstrations | New dedicated guidance with tailored hypotheses, research and implementation questions, data sources, and analytic approaches.
SMI/SED Demonstrations | Updated expectations with more detailed hypotheses, research questions, and analytic considerations.
SUD Demonstrations | Updated expectations, including guidance on identifying beneficiaries via claims-based diagnosis and treatment indicators.


Suggested Citation

Swietek, K. (2026, April 15). What’s New in CMS’s 1115 Evaluation Guidance, and What It Means for State Agencies. [Web blog post]. NORC at the University of Chicago. Retrieved from www.norc.org.

