The NHS Is Modernising. Are We Remembering the Patient?
So… my bloodstream now has a daily subscription service.
After the knee and hip replacement earlier in the year, I was hoping to be on the steady road back to normality by now. Instead, I’ve acquired a new companion: a PICC line that runs directly to the heart. It’s not painful, but it’s ever‑present… a physical reminder that healing isn’t always a straight line.
This means daily visits from the community nursing team for IV antibiotics, and being immunocompromised for a while. So if you see me turning down invites, cancelling plans, or steering clear of anyone with the sniffles… it’s not personal. It’s just practical.
Something happened today that made me stop, think, and want to write this. Because as much as this chapter is about recovery, it’s also about trust… and the quiet assumptions we don’t notice until we’re vulnerable.
1. The Phone Call
The phone rings.
No Caller ID.
Which is already a bit of a gamble these days, but okay. I answer.
The voice on the other end immediately asks me to confirm my date of birth.
No introduction. No context. No reason.
Just:
“Can you confirm your date of birth, please?”
Statistically, this was probably the NHS. But here’s the thing…
In 2025, a world where:
- Phone numbers are easily spoofed
- SMS messages can be faked
- Caller ID can be forged in seconds
- And healthcare systems are constant targets for data breaches…
…being asked to give personal information to an unverified caller isn’t just uncomfortable.
It’s unsafe.
This is the cybersecurity equivalent of someone knocking at your front door and saying:
“Before I tell you who I am, can you give me your National Insurance number?”
Banks figured this out years ago. Most use mutual authentication, meaning:
- You prove who you are
- They also prove who they are
But in healthcare? We’re still operating on a kind of “just trust us” handshake from the landline era.
And when you’re vulnerable, tired, medicated, or worried… that’s when one-way trust is most risky.
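Mutual authentication doesn’t require anything exotic, either. Here’s a toy Python sketch (not any real NHS or banking protocol; the pre-shared code and the flow are purely illustrative) of a challenge–response exchange where each side proves it knows a secret agreed in advance, without ever saying the secret aloud:

```python
import hmac
import hashlib
import secrets

# Hypothetical pre-shared code, e.g. agreed by post before any phone contact.
SHARED_SECRET = b"code-agreed-in-advance"

def respond(challenge: bytes, secret: bytes) -> str:
    """Answer a challenge by keying an HMAC with the shared secret."""
    return hmac.new(secret, challenge, hashlib.sha256).hexdigest()

# The patient challenges the caller first...
patient_challenge = secrets.token_bytes(16)
caller_proof = respond(patient_challenge, SHARED_SECRET)

# ...and only continues if the caller's proof checks out.
caller_verified = hmac.compare_digest(
    caller_proof, respond(patient_challenge, SHARED_SECRET)
)

# Then, symmetrically, the caller challenges the patient.
caller_challenge = secrets.token_bytes(16)
patient_proof = respond(caller_challenge, SHARED_SECRET)
patient_verified = hmac.compare_digest(
    patient_proof, respond(caller_challenge, SHARED_SECRET)
)

print(caller_verified and patient_verified)
```

In human terms, this could be as simple as the service quoting back a reference number from your last letter before asking you to confirm anything. The point is the symmetry: both sides prove something, not just the patient.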
2. The Home Visit and the “Magic Notes” App
Later that afternoon, my district nurse arrived. And I want to be clear about this part from the start:
They were kind. They were professional. They were doing their job. And, like many staff, they were also clearly following the training and guidance they’ve been given.
They’re walking into people’s homes every day, under pressure, understaffed, and still managing to look after patients with empathy. Truly… respect.
At one point, they asked if they could use an app called Magic Notes (https://magicnotes.ai), an AI-powered recording tool. When I declined, they pushed again, not forcefully, but in the way someone repeats information they’ve been trained to emphasise.
The explanation was simple and genuine:
“It helps me keep accurate notes so I don’t miss anything.”
Which, in principle, is absolutely fair.
Accurate medical notes are important. And anything that reduces admin burden for healthcare staff is generally a good thing.
So my hesitation wasn’t about the nurse.
It was about the system around the nurse.
I asked the question:
“Before we record anything… where does the audio go, and who has access to it?”
The nurse didn’t know.
Not because they were trying to hide anything… but because they hadn’t been given that information either.
And that’s the real issue.
Consent isn’t just saying yes or no.
It’s knowing what you’re saying yes or no to.
I looked to see whether Magic Notes was listed on the NHS Data Security and Protection Toolkit (DSP Toolkit).
https://www.dsptoolkit.nhs.uk/OrganisationSearch
That’s the NHS’s public register of organisations that have completed its annual data security self-assessment: in practice, the list you’d check to see whether a company is cleared to handle patient data.
I couldn’t find it immediately.
A Note on Magic Notes and BEAM UP LTD
After finishing this conversation, I did some digging. The Magic Notes app isn’t actually listed on the DSP Toolkit under the name Magic Notes; it appears under its parent company:
BEAM UP LTD (Organisation code: U2A6H)
- Primary sector: IT Supplier
- Address: Senna Building, Gorsuch Place, London E2 8JF
- Latest DSPT Status: 2024–25 — Standards Exceeded (Published 18/12/2024)
So on paper, the organisation meets the standards for handling NHS data.
But here’s the important part:
None of that is visible to a patient in the moment they are being asked to be recorded.
The nurse didn’t know where the data goes. I wasn’t given any information to review. And unless you already know how to look up supplier organisations in the DSP Toolkit, there’s no realistic way to verify any of this during a home visit.
So the issue isn’t the technology itself. It’s that the path to informed consent isn’t accessible in practice.
Trust can’t only exist in backend compliance documents. It needs to exist in the moment before someone asks, “Is it okay if I record this?”
Again… that doesn’t mean it’s bad technology. It just means I wasn’t given enough information to make an informed decision.
So I declined.
Not dramatically. Not confrontationally. Not suspiciously.
Just:
“I’m not comfortable being recorded today.”
And we carried on like nothing had happened.
Because this isn’t about creating barriers. It’s about patients feeling safe during moments when we are already vulnerable.
3. Why Should You Care?
When you’re unwell, recovering, or simply trying to get through the day, you shouldn’t also have to be a data protection specialist.
Trust in healthcare isn’t just about the treatment itself. It’s about feeling safe in the process.
Most of us assume:
- The person treating us is allowed to be there
- The equipment being used is approved
- Our information is handled carefully
But technology has quietly changed the shape of the room.
It’s no longer just you and your clinician. Sometimes, it’s:
- You
- Your clinician
- Their device
- A third‑party company processing audio
- A server somewhere storing it
- A machine learning model potentially learning from it
And unless we’re told clearly and upfront who those extra “participants” are… we cannot meaningfully consent.
This isn’t about being suspicious. This is about being informed.
When I asked where the recording goes, I realised… if this were an elderly relative being visited at home, they’d have just said yes without thinking twice. Not because they don’t care about privacy, but because when a nurse in uniform asks you something in your own home, the natural instinct is to trust and comply.
That’s exactly when clear information matters most.
Because the most personal things we ever say often happen in medical settings.
And that deserves care.
4. A Simple Analogy
Imagine you’re chatting to a friend in your own kitchen. Cups of tea. Comfortable. Familiar.
Then, halfway through, they say:
“Oh, by the way, I’ve been recording this and sending it to a company you’ve never heard of… but don’t worry, everyone else is fine with it.”
Most of us would pause. Not because we think our friend has bad intentions. But because we didn’t know.
The issue is not the recording. The issue is the missing conversation.
5. How We Can Do Better
This is entirely fixable.
If the NHS wants to integrate AI‑supported clinical note‑taking (and I believe it should), we need:
- Clear, simple explanations of where data goes and who sees it
- Tools to be visible on the DSP Toolkit, where everyone can check them
- A standard script that supports informed consent, not assumed consent
- A way for patients to say “no” without feeling like they’re making someone’s job harder
This isn’t about slowing things down. It’s about keeping trust intact.
Because trust is a form of care, too.
6. Closing Note
Recovery isn’t linear. Some days are heavy. Some days are hopeful. Some days are held together with antibiotics, determination, and a sense of humour.
If you’re reading this as someone who works in healthcare: thank you. None of this is about blaming you. You’re doing more than most of us will ever understand.
This is about the infrastructure around you. And how we can shape it so that patients, especially the vulnerable ones, feel safe, respected, and part of the conversation.
We can do this better. And we should.
If you’ve had an experience with this, whether as a patient, clinician, or someone working behind the scenes in digital health, I’d genuinely welcome your perspective.
Thanks for reading.
Jason

