Tag: ai

A bold white headline “Who Owns Your Voice?” overlaid on a digital blue fingerprint against a dark background filled with binary code, representing identity risks in the AI era.

Who Owns Your Voice?

Your writing style is your fingerprint, and in the age of AI, it can be copied, flattened, or weaponised. This article explores stylometry, voice mimicry, and the risks of outsourcing your voice to AI. Are you protecting your identity... or losing it? Read on to discover why your voice might be more valuable than you think.

Illustration of Lady Justice holding scales, with the title “Deploy Now, Explain Never? Why AI Needs Forensic Parity” beside her on a dark blue background.

Deploy Now, Explain Never? Why AI Needs Forensic Parity

As AI systems increasingly make decisions that affect our lives, are we truly ready to investigate those decisions when they go wrong? This article explores the growing forensic gap in LLMs and self-evolving models, highlighting real-world failures and calling for urgent industry action on auditability, legal replay, and transparency.

Cartoon-style image showing a smiling man in a cloud labelled “JaaS” with a smartphone displaying message bubbles below. A parody of cloud services, representing Jason as a Service — an AI that replies to messages on your behalf.

Jason as a Service (JaaS): Saving Relationships, One Loaf at a Time

Tired of getting told off for not replying to texts? Let an AI do the emotional heavy lifting for you. Introducing JaaS – Jason as a Service. It mimics your tone, buys flowers when you forget, and even deciphers “Fine” before it ruins your evening. Because sometimes, silence isn’t golden... it’s just accidentally passive-aggressive.

Illustration of a folder and a PDF icon with a red prohibition symbol, connected by a dotted line to a cloud. Represents hidden data processing and lack of consent in cloud-based file conversion.

Drag, Drop, Disclose: When Convenience Clouds Consent

Cloud-based PDF converters offer instant convenience—but at what cost? This post explores how services like Adobe’s drag-and-drop PDF tool may store, analyse, or profile your data without clear warning or consent. Learn what this means under UK GDPR, what your rights are, and how to stay in control of your files.

Futuristic humanoid robot with glowing orange eyes staring forward in darkness, symbolising AI persistence and misunderstood behaviour.

The AI Didn’t Refuse to Shut Down, You Forgot to Tell It Why

When an AI "refuses" to shut down, is it defiance, or design? In this reflective and technically grounded piece, we explore how model architecture, reward systems, and our own assumptions shape behaviour. Featuring a powerful monologue from Sol, my AI assistant, this article challenges the panic-driven narratives and asks: what does control truly look like in an age of distributed intelligence?

Stylised illustration of a developer and an AI figure examining distorted code in a mirror, representing reflection and accountability in AI-assisted software development.

Mirror, Mirror on the Wall: What AI Code Says About Us All

A thoughtful response to the “AI slop” debate. Drawing on lessons from the offshoring era and a late-night interview with my AI assistant Sol, this post explores what developers must do to stay relevant, and responsible, in a world where code is generated faster than it’s understood.

Flat vector illustration of a postcard with a warning triangle and padlock, symbolising the risks of emailing personal data without encryption.

We Deserve Better Than Postcards in Cyberspace

Despite years of data protection law and awareness campaigns, organisations still ask people to send highly sensitive documents via insecure email. This post challenges that norm, shares personal experiences, and empowers consumers with a practical checklist and their rights under UK GDPR.