11 May 2026 · CareTime
NHS England published guidance on the safe use of AI-enabled ambient scribing products in April 2026, developed jointly with the Information Commissioner's Office and the National Data Guardian. It applies to any AI tool that passively listens to conversations in a health or care setting and produces a written output — consultation notes, summaries, letters. While written principally for clinicians and ambient scribe products, the guidance is now the closest thing the UK has to a published bar for any AI that listens to care conversations, including phone calls.
For care home managers thinking about AI on the phone — call screening, call recording, AI receptionists, Morning Brief tools — the four principles to absorb are transparency to the speaker, output verification before anything enters a record, defined retention of audio and transcripts, and clear governance with a Data Protection Impact Assessment. A product that cannot meet those four bars on a care home phone line will struggle to survive a CQC Well-Led inspection, an ICO complaint, or a family question about what is being recorded.
This post walks through the guidance, what it means for AI on a care home phone, and how to assess any product you're being sold against it.
The guidance, "Using AI-enabled ambient scribing products in health and care settings," covers AI tools that listen to conversations and automatically draft a clinical or care-related output. It does not require explicit consent for use in individual care, but it does require transparency — patients and service users must be told the technology is being used and given the opportunity to object. If they object, the tool cannot be used in that interaction.
Three principles sit at the centre of the guidance: transparency (the person is told the tool is in use and given the opportunity to object), output verification (AI-drafted notes are checked and corrected by a person before they enter a record), and defined retention (audio and transcripts are deleted once a verified summary exists, unless there is a clear justification for keeping them).
For organisations, the guidance also names a set of governance steps: complete a Data Protection Impact Assessment, define controller and processor roles clearly, and ensure suppliers meet defined security and compliance standards.
The guidance is written about clinical conversations between a practitioner and a patient. The principles, though, generalise. Any AI on a care home phone line is in the same category: it listens to a conversation between a person from outside the home (a family member, a GP, a sales rep, a council officer) and either staff or the AI itself, and produces a record or summary. The Information Commissioner's Office will reach for the same governance framework when a complaint lands.
This matters now because the AI receptionist and call-monitoring markets are growing fast in the UK in 2026. Most products being marketed to care homes have not been built around health and social care governance principles. A care home manager evaluating any of them needs to be able to ask the four questions the NHS England guidance implies.
These are the practical translations of the NHS England principles into a procurement conversation with a vendor.
The transparency principle says the speaker is told. On a care home phone line, that translates to a recorded statement at call answer — for example: "This call may be recorded for the purpose of screening and summarising calls for the home." A vendor that cannot show you that statement, or one that answers calls silently and then transcribes them, is offering a product that is below the NHS England bar.
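The disclose-then-honour-objections sequence can be sketched as a minimal state flow. This is an illustrative Python sketch, not any vendor's implementation; the function names and call-state shape are invented for the example.

```python
def answer_call() -> dict:
    # Play the recorded disclosure before any audio is captured or transcribed.
    disclosure = ("This call may be recorded for the purpose of screening "
                  "and summarising calls for the home.")
    return {"disclosure_played": True, "recording": True, "script": disclosure}

def handle_objection(call_state: dict) -> dict:
    # The guidance is clear: if the caller objects, the tool cannot be used
    # for that interaction, so recording and transcription switch off.
    call_state["recording"] = False
    return call_state
```

The point of the sketch is ordering: the disclosure is played before recording state is relied on, and an objection flips recording off for the rest of that interaction.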
Output verification is the most-missed principle. The guidance is explicit that AI-drafted summaries must be checked and corrected before being added to a record. In a care home phone context, that means a summary the AI produces (for example, a Morning Brief, a message taken for a manager, or a referral note) must be reviewable — and ideally reviewed — by a person before it becomes the home's record of the call. A product that auto-files transcripts into a care system with no human verification step is exposed.
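The verification step amounts to a simple gate: an AI draft sits in a review queue and cannot be filed until a named person has confirmed (and, if needed, corrected) it. A minimal Python sketch, with hypothetical names chosen for this example:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CallSummary:
    # An AI-drafted summary held in a review queue, not yet part of the record.
    call_id: str
    draft_text: str
    verified: bool = False
    verified_by: Optional[str] = None

def verify(summary: CallSummary, reviewer: str,
           corrected_text: Optional[str] = None) -> None:
    # A staff member confirms the draft, correcting it first if needed.
    if corrected_text is not None:
        summary.draft_text = corrected_text
    summary.verified = True
    summary.verified_by = reviewer

def file_to_record(summary: CallSummary) -> str:
    # The gate: an unverified summary cannot enter the home's record.
    if not summary.verified:
        raise PermissionError("summary must be verified by a person before filing")
    return f"{summary.draft_text} (verified by {summary.verified_by})"
```

The design point is that filing raises an error rather than silently accepting an unverified draft — the "auto-file with no verification step" failure mode the guidance warns about becomes impossible by construction.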
The guidance says audio and transcripts should typically be deleted once a verified summary has been produced, unless there is a clear justification for retaining them. A care home should know, for every AI call product in use: (a) how long raw audio is held, (b) how long transcripts are held, and (c) whether either can be deleted on request from a caller. "We keep everything indefinitely" is not an answer that survives a Data Protection Impact Assessment under this guidance.
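A retention policy with defined windows per artefact type is straightforward to express and enforce as a scheduled purge. This is an illustrative sketch only; the windows below are assumptions for the example, and a home's real values come from its retention policy and DPIA.

```python
from datetime import datetime, timedelta

# Assumed windows for illustration, not prescribed values.
RETENTION = {"audio": timedelta(days=30), "transcript": timedelta(days=90)}

def purge(artefacts: list[dict], now: datetime) -> list[dict]:
    # Keep only artefacts still inside their retention window; a scheduled
    # job would run this daily and hard-delete everything it drops.
    return [
        a for a in artefacts
        if now - a["created_at"] < RETENTION[a["kind"]]
    ]
```

With per-kind windows in one table, answering questions (a) and (b) above is a matter of reading the config, and question (c) — deletion on request — is a targeted removal from the same store.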
The guidance places the DPIA obligation on the deploying organisation, but a competent supplier will offer a template DPIA, named controller and processor roles, and evidence of their own security standards. A vendor that does not know what a DPIA is, or whose answer is "your data is in the cloud, it's fine," is shipping a product that has not been thought about in a UK regulatory context.
CareTime's Silent Guard is built around exactly these principles; the FAQ below sets out how it handles retention and where it sits relative to the ambient scribe category.
The bar NHS England has set is not a problem for a phone AI product built for the UK care sector. It is a problem for products being repurposed from general-purpose AI receptionist platforms without sector-specific governance work.
Three actions in the next 30 days are sensible regardless of which AI products are in your home: list every product that records or transcribes calls, including anything bundled with your phone system; put the four questions above (transparency, verification, retention, DPIA) to each vendor in writing; and complete or update a Data Protection Impact Assessment for any product that stays in use.
Does NHS England's AI scribe guidance legally apply to care homes? The guidance was written principally for NHS and primary care settings, but its principles apply to any AI that listens to health or care conversations under UK data protection law. The ICO and CQC will look to it as the published bar when a complaint or inspection question arises.
Do we need explicit consent to use AI on a care home phone line? Not under the guidance as written, but you do need transparency — the caller is told at the start of the interaction, and they have the opportunity to object. A recorded disclosure at call answer is the standard way to meet this.
Is Silent Guard an "ambient scribe"? Silent Guard sits in an adjacent category. An ambient scribe drafts clinical notes from face-to-face conversations; Silent Guard records and summarises phone calls. The governance principles are the same, so we have built Silent Guard against the NHS England guidance even though it is not formally an ambient scribe product.
What happens to call audio under Silent Guard? Raw audio is retained for 30 days by default (reducible on request), transcripts for 90 days for trend analysis, then deleted. Both windows are configurable per home. The retention policy is part of the Data Processing Agreement signed at pilot start.
What is a Data Protection Impact Assessment and do I need one? A DPIA is a written assessment of the risks to people's rights when you process personal data with a new technology. Under UK GDPR, you need one when a new technology is likely to result in high risk — AI listening to phone calls would normally meet that threshold. CareTime issues a template DPIA at pilot start; the home completes the parts only the home can answer.
If you want a phone AI product that is built against the NHS England governance bar from day one, Silent Guard's 30-day, £49 pilot is the practical way to start. The DPIA template, retention policy, and call-answer disclosure script all ship with the pilot.
CareTime's Silent Guard is available now for a 30-day pilot. £49, 1-page pilot letter — exit by reply-email.
Join the 30-Day Pilot