Your operating system has decided to remember everything you do. We have notes.

The 2024 “take a screenshot every few seconds and feed it to a local model so you can ask your computer what you were doing last Thursday” feature was a sneak preview of a much bigger problem.

When the major desktop OS vendor announced its photographic memory feature in 2024, the security press took roughly four hours to find the problems. The screenshots were stored locally, but unencrypted at rest. Any process running as the user could read them. Any malicious browser extension. Any helpful synced cloud backup. Anything attached to the device for forensic recovery, ever.
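The failure mode described above needs no exploit at all. A minimal sketch, in Python, of what "any process running as the user could read them" means in practice; the directory path and filenames are illustrative assumptions, not the vendor's actual layout:

```python
# Hypothetical sketch: why "stored locally but unencrypted" is not a boundary.
# SNAPSHOT_DIR is an assumed location, not the real product path.
import os
from pathlib import Path

SNAPSHOT_DIR = Path.home() / "AppData" / "Local" / "ScreenHistory"  # assumed

def readable_snapshots(directory: Path) -> list[Path]:
    """Return every snapshot file the current user can open.

    Any process running as this user -- a browser extension's native
    helper, a sync client, commodity malware -- can do the same thing:
    plain file reads, no privilege escalation required.
    """
    if not directory.is_dir():
        return []
    return [
        p for p in directory.rglob("*")
        if p.is_file() and os.access(p, os.R_OK)
    ]

if __name__ == "__main__":
    for snap in readable_snapshots(SNAPSHOT_DIR):
        print(snap)
```

The point is not the code; it is that ordinary user-level file access is the entire attack, which is why "local" and "protected" are not the same claim.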

The vendor patched. The feature became opt-in in some markets. The criticism continued. What didn’t change was the underlying premise: that an operating system’s job, increasingly, includes building an AI-readable record of everything you ever did with your computer, so an AI can be helpful about it later.

The pitch was reasonable. “What was that document I was reading on Tuesday?” is a real problem. Asking your computer in plain English to find it is a real benefit. The mistake wasn’t the feature. It was the assumption that the corpus of “everything you’ve ever done” should sit on a device that’s also running browser extensions, syncing to a cloud account, and occasionally being plugged into the airport courtesy charger.

The other mistake, less discussed, is what happens to that corpus when the device leaves the person’s hands. Bring-your-own-device programs. Enterprise wipe events. Subpoenas. Refurbishments. The corpus has to be either trusted (it isn’t) or carefully managed (it usually isn’t) for the feature to be safe at scale.

How Halo would have changed this

Local AI on infrastructure you actually control.

The premise is right: a local AI that knows your corpus and answers questions about it is genuinely useful. The execution is wrong if that corpus lives on every laptop in the company, lightly protected, indexed by a vendor’s service. Eclipse runs the AI on infrastructure you operate. The corpus lives in one place. The audit trail is yours. The screenshots, if there are any, never leave a perimeter you define.

Source

BleepingComputer’s reporting on the screenshot-everything OS feature and the security researchers who pulled it apart. https://www.bleepingcomputer.com/