The CQC assessment framework 2026: what changed on 24 March

Your compliance lead has been asked for the one-page version of what just changed at CQC. The honest answer is that the CQC assessment framework 2026 your evidence library maps to is being retired, and the replacement is still in draft. The board wants reassurance and operations want a plan, and neither is best served by a press release summary. They need the operational picture before the marketing layer reaches them, and they need it this week.

On 24 March 2026 the Care Quality Commission published its March update and opened four draft sector-specific assessment frameworks for response. Covering adult social care, mental health care, primary care and community services, and hospitals (secondary and specialist), the drafts replace the Single Assessment Framework introduced under the 2023 revision. Consultation closes at 5pm on 12 June 2026, with pilots running through the summer, a final framework due by the end of that summer, and implementation landing at the end of 2026.

This piece walks through what has changed, what has been removed from the assessment mechanics, what has been retained, and where compliance teams should put their attention between now and year end. All factual claims trace to CQC, gov.uk, and NHS England primary sources, linked inline.

Why the CQC assessment framework 2026 exists at all

Two reports landed on the same day, 15 October 2024. Dr Penny Dash's full report into CQC's operational effectiveness set out seven recommendations, covering operational performance, the provider portal, the regulatory platform, use of performance data, report quality and timeliness, sector chief inspector appointments, and resetting the relationship with providers. Alongside it, Professor Sir Mike Richards' review of the Single Assessment Framework and its implementation recommended retaining the five key questions, suspending (and almost certainly scrapping) the evidence categories and scoring system, reinstating ongoing inspector-to-provider relationships, and reviewing inspection staffing levels. Those recommendations sit at the heart of what became the CQC assessment framework 2026.

CQC's public response on the same day accepted the findings and committed to address them "with urgency", including stopping the scoring of individual evidence categories while retaining the five key questions, appointing at least three sector chief inspectors, and making the assessment framework sector-relevant. That commitment has now produced the drafts.

What followed the reviews was the "Better regulation, better care" consultation, which closed on 11 December 2025 with 1,703 responses. CQC's initial response confirmed that around 95% of respondents agreed or strongly agreed with each proposed framework change, and around 80% supported the inspection and rating methodology changes. Those two numbers do different work. The first says the sector is aligned on redesigning the framework. The second says providers remain wary of how inspections will actually be run. Credibility will be earned or lost on the second.

CQC's March 2026 update took that consultation verdict and turned it into four sector-specific drafts now out for a second round of response. The consultation closes at 5pm on 12 June 2026.

What is actually in the CQC single assessment framework changes

Strip away the structural language and the practical shape is this.

The five key questions remain

Safe, Effective, Caring, Responsive, and Well-led continue to anchor every assessment across every sector. The four rating levels (Outstanding, Good, Requires Improvement, Inadequate) also remain, along with the "I statements" that reflect lived experience at the point of care.

The 34 quality statements are out

The evidence architecture providers have been building for the last two years, mapped across 34 quality statements, is being dismantled. In its place, CQC is re-introducing Key Lines of Enquiry, framed as structured investigative questions sitting under each of the five key questions. The draft Adult Social Care framework carries 24 KLOEs, split Safe 6, Effective 6, Caring 3, Responsive 4, and Well-led 5. The other three drafts carry equivalent structures tailored to sector.
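As a quick sanity check on the draft split, the per-question counts above do sum to the stated 24 KLOEs. A minimal sketch (the counts are as described in the draft Adult Social Care framework; the dictionary itself is just an illustration):

```python
# Tally of the draft Adult Social Care KLOE split described above.
kloe_counts = {
    "Safe": 6,
    "Effective": 6,
    "Caring": 3,
    "Responsive": 4,
    "Well-led": 5,
}

total_kloes = sum(kloe_counts.values())
print(total_kloes)  # 24
```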

Numerical scoring of evidence categories is out

In CQC's own words from the initial consultation response, "future rating judgements will be made holistically using professional judgement informed by evidence" at key question level. Boards that spent 2024 and 2025 building internal dashboards around evidence-category scores are looking at a tooling and training problem.

Rating characteristics are reintroduced

Anyone who worked under CQC before 2023 will recognise the shape. Each rating level (Outstanding, Good, Requires Improvement, Inadequate) now carries published characteristics describing what it looks like at key-question level. Judgement at that level sits against those characteristics, not against unconstrained discretion.

Sector framing returns

One framework in four tailored shapes, not four separate regulators. Cross-cutting issues still apply, and NHS Confederation has already flagged the risk that narrowly drawn sector frameworks miss integrated care pathways. That tension is live, and compliance teams whose services cross sector boundaries will feel it first.

The CQC reset 2026 timeline compliance teams need on one page

Four dates carry the operational weight. 12 June 2026 closes the consultation. Summer 2026 runs the pilots and publishes the final framework. End of 2026 is stated implementation. And across that same window, CQC has committed in its 2025/26 business plan to complete 9,000 assessments by the end of September 2026 (including 5,013 in adult social care and 1,500 in oral health). Most sectors run to a 50-working-day assessment completion window; mental health and secondary specialist care run to 85 working days.
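For planning purposes, those working-day windows translate into calendar deadlines. A simplified sketch (weekends only; a real compliance calendar would also exclude English bank holidays, and the start date here is purely illustrative):

```python
from datetime import date, timedelta

def add_working_days(start: date, working_days: int) -> date:
    """Advance `start` by a number of working days, skipping weekends.

    Simplification: ignores bank holidays, which a real compliance
    calendar would need to exclude as well.
    """
    current = start
    remaining = working_days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday-Friday only
            remaining -= 1
    return current

# A hypothetical assessment opening Monday 13 April 2026:
print(add_working_days(date(2026, 4, 13), 50))  # 2026-06-22 (most sectors)
print(add_working_days(date(2026, 4, 13), 85))  # 2026-08-10 (MH / secondary specialist)
```

The gap between the two windows is seven calendar weeks on the same start date, which is worth building into any inspection-readiness rota.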

That 9,000-assessment target matters more than most commentary has acknowledged. A material share of providers will be assessed before the final framework lands. The transition is already live, against a framework being retired while its replacement is still in draft.

The leadership context most providers are reading between the lines

CQC's leadership has turned over substantially in eighteen months. Sir Julian Hartley resigned as Chief Executive on 23 October 2025, with Dr Arun Chopra (previously Chief Inspector of Mental Health, appointed March 2025) stepping in as Interim Chief Executive. Sir Mike Richards, appointed Chair on 1 April 2025 following his single assessment framework review, announced on 6 February 2026 that he would stand down once a successor is in post. Chris Dzikiti is acting as Interim Chief Inspector of Mental Health.

None of that changes the framework text, but it shapes how providers should read the regulator's appetite for further change. The HSJ reported on 30 March 2026 that concerns in the sector remain "pervasive" and that "deep wariness" persists about inspection practice. That sits alongside the consultation numbers: 95% on framework, 80% on methodology. Providers are ahead of CQC on the framework design. They are still waiting to see how the drafts translate into live inspection behaviour.

What the CQC quality statements scrapped decision means for your evidence library

Compliance teams that invested in a 34-way evidence taxonomy now have two problems to solve in parallel.

The first is current-framework readiness. CQC assesses against the current framework until implementation. Inspection volume is rising toward the 9,000 target. Evidence built against the quality statements still needs to be current and accessible until the new framework goes live.

The second is transition readiness. Evidence artefacts need to be re-mapped, not rebuilt. The five key questions remain the constant. KLOEs add specificity beneath them. Rating characteristics add clarity on what "Good" and "Outstanding" look like at key-question level. Artefacts that served the scoring mechanism but not the underlying question can be parked. Artefacts that evidence the question directly will carry through.
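The re-mapping step is mechanical once the crosswalk exists. A hypothetical sketch of the idea (the statement names, the crosswalk, and the artefact records are illustrative assumptions, not CQC's published mapping):

```python
# Hypothetical re-indexing of an evidence library from legacy quality
# statements to the five key questions. The crosswalk below is
# illustrative only; the real mapping comes from the draft frameworks.
from collections import defaultdict

KEY_QUESTIONS = {"Safe", "Effective", "Caring", "Responsive", "Well-led"}

statement_to_key_question = {
    "Safeguarding": "Safe",
    "Safe environments": "Safe",
    "Workforce wellbeing and enablement": "Well-led",
}

def remap(artefacts: list[dict]) -> dict[str, list[str]]:
    """Group artefact IDs by key question; artefacts that only served
    the retired scoring mechanism have no mapping and are parked."""
    library = defaultdict(list)
    for artefact in artefacts:
        kq = statement_to_key_question.get(artefact["quality_statement"])
        if kq in KEY_QUESTIONS:
            library[kq].append(artefact["id"])
    return dict(library)

evidence = [
    {"id": "DBS-register", "quality_statement": "Safeguarding"},
    {"id": "score-dashboard", "quality_statement": "Evidence-category scores"},
]
print(remap(evidence))  # {'Safe': ['DBS-register']}
```

The design point is that the key question, not the artefact, is the stable index: anything that cannot be attached to one of the five questions is a candidate for parking.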

The operational implication for workforce compliance is that evidence tied to Regulation 5 (fit and proper persons: directors) and Regulation 19 (fit and proper persons employed) maps cleanly into the new Safe and Well-led key questions. So does evidence tied to the NHS Employment Check Standards (identity, right to work, professional registration, employment history, criminal record checks, and occupational health). Neither regulation has changed. Both still sit under every Safe and Well-led KLOE in the drafts.

Credentially's evidence management material for CQC inspections speaks directly to that mapping problem, and the transition period is where it starts to matter. Automated primary source verification against GMC, NMC, and HCPC, DBS status tracking, and right-to-work re-check schedules produce evidence that sits under Safe and Well-led regardless of the framework version running on the day of inspection.

Continuous compliance readiness as a provider-side operational posture

A quick framing correction, because the point gets lost in sector commentary. CQC has softened on continuous assessment. The 2025/26 business plan and the draft frameworks describe a risk-based, sector-appropriate schedule that takes account of previous ratings and emerging risk, not a rolling continuous-assessment model. Continuous monitoring is no longer the regulator's stated direction of travel.

The operational case for ongoing readiness still stands, but the framing is provider-side. An expired DBS, a missed right-to-work re-check, or a lapsed NMC registration creates a compliance event on the day it happens, not on the day of inspection. Credentially's material on continuous compliance readiness frames the case in those terms. Real-time credential monitoring, configurable onboarding workflows, and automated expiry alerts exist to catch issues as they arise, not to mirror a regulator demand that has moderated.
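The provider-side logic reduces to a daily sweep over credential expiry dates. A minimal sketch (field names and the 60-day alert window are illustrative assumptions, not Credentially's actual data model):

```python
# Hypothetical provider-side expiry monitoring: flag any credential
# that has lapsed, or falls due inside an alert window.
from datetime import date, timedelta

def expiry_alerts(credentials, today, window_days=60):
    """Return (holder, credential type, status) for every credential
    that is already lapsed or due within `window_days` of `today`."""
    alerts = []
    horizon = today + timedelta(days=window_days)
    for cred in credentials:
        if cred["expires"] < today:
            alerts.append((cred["holder"], cred["type"], "LAPSED"))
        elif cred["expires"] <= horizon:
            alerts.append((cred["holder"], cred["type"], "DUE"))
    return alerts

staff = [
    {"holder": "RN-1042", "type": "NMC registration", "expires": date(2026, 5, 1)},
    {"holder": "HCA-0317", "type": "DBS check", "expires": date(2026, 3, 1)},
]
print(expiry_alerts(staff, today=date(2026, 3, 30)))
```

Run daily, the distinction between LAPSED and DUE is the one that matters: the first is a compliance event that has already happened, the second is the window in which a re-check can still be scheduled.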

What to do between now and year end

These are the actions compliance directors can control inside the window.

Respond to the consultation before 12 June 2026. With 1,703 responses on the last round, individual provider submissions carry unusual weight in a sector this large. Care England, NHS Confederation, and NHS Providers are coordinating collective responses; individual providers can respond directly via the CQC response portal.

Audit the existing evidence library against the five key questions, not the 34 quality statements. The artefact set will shrink and sharpen. Evidence that never served the underlying question can go.

Training and competence records deserve a separate review. Completion records are not competence records. Draft rating characteristics are expected to lean harder into demonstrated capability than logged completion, particularly under Safe and Effective.

Plan for the NHS workforce context the frameworks will be read against. NHSE Q2 2025/26 data puts total NHS vacancies at 100,023 (a 6.7% rate) in September 2025, with nursing at 25,500 (6.0%) and medical at 7,250 (4.4%). Vacancy rates are falling, but Well-led rating characteristics on workforce stability will be read against live staffing evidence on the day of inspection.

Credentially's platform is built for that operational picture: platform-managed onboarding steps reduced from 60 days to 5 (third-party checks including DBS run 2 to 6 weeks in parallel), up to 80% reduction in candidate dropout during onboarding, and 68% reduction in admin load on credentialing teams. The mechanics behind the 60-to-5 reduction are documented, and the case for mapping them to the 2026 framework transition is operational rather than theoretical.

Next step

The 2026 transition checklist walks compliance directors through the evidence remapping required between now and year end, aligned to the five key questions and the draft rating characteristics. It is the version most usefully shared with a board or clinical governance committee ahead of the 12 June consultation deadline.

The CQC assessment framework 2026 is being rewritten, and the job between now and year end is making sure the evidence underneath travels with it. If the evidence was sound in 2025, it remains sound: what changes is where it sits against the five key questions and the draft rating characteristics, and that remapping is the live work of the next two quarters.
