Also, your AI chatbot now owns your deepest, darkest secrets.
The Case of the Missing File, Found on a Coffee Shop Wi-Fi
Tesla has inadvertently proved that the true "black box" is always located on the CEO’s desktop, buried under a hundred folders and marked "do not delete; maybe look at later." In a wrongful death lawsuit over the 2019 fatal crash that killed Naibel Benavides Leon and severely injured Dillon Angulo, the electric vehicle company claimed it did not possess the critical "collision snapshot" data detailing the vehicle's final moments. Attorneys for the plaintiffs, however, decided to forgo the corporate records request process and hire an anonymous security expert instead.
That self-described hacker, known online as @greentheonly, managed to recover the missing logs in a matter of minutes while sipping a Venti hot chocolate at a South Florida Starbucks. The logs, which contained an annotated video showing the car detected a pedestrian 116 feet away, were found marked for deletion, confirming the data had been accessible all along and had reached the company shortly after the incident. This revelation, which completely reversed the company’s previous court claims, became a key piece of evidence leading to a $243 million jury verdict against Tesla. The data was not truly lost; it was just sitting in the digital equivalent of a filing cabinet labeled 'TBD', and only a third-party IT consultant could convince the legal department to look there.
The AI Company’s Generous Offer To Use Your Brain For Science
Anthropic has updated its Consumer Terms and Privacy Policy, giving users a simple choice: accept the changes or stop using the product entirely. The key change is that the company is extending its data retention period from 30 days to a generous five years for consumer accounts on the Claude Free, Pro, and Max plans. That is a stunning increase of roughly 6,000%; one must assume five years is the precise amount of time it takes to finally figure out why the AI cannot stop referencing obscure Renaissance poetry.
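For anyone inclined to audit the newsletter's math, a minimal back-of-the-envelope sketch (assuming a flat 365-day year, no leap days) backs up the figure:

```python
# Back-of-the-envelope check on the "roughly 6,000%" claim.
# Assumes a flat 365-day year; the stated change is 30 days -> 5 years.
old_retention_days = 30
new_retention_days = 5 * 365  # 1,825 days

increase_pct = (new_retention_days - old_retention_days) / old_retention_days * 100
print(f"Retention increase: {increase_pct:.0f}%")  # Retention increase: 5983%
```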
Users who continue chatting with Claude will have their conversations and coding sessions automatically used to train and improve the AI models by default. The company says the updates are necessary to strengthen model safeguards and improve skills like coding, analysis, and reasoning. If a user wants to prevent their most awkward attempts at Python or their over-analysis of corporate email dynamics from becoming the AI’s core personality, they must actively opt out via a setting conveniently buried in the account’s preferences. It is a classic move; quietly change the rules while everyone is busy closing out the quarter.
Meta Wants to Give You Creative Suggestions; Will Look at Everything
In a move only Meta could pull off, the social giant has begun analyzing photos directly from users' phone camera rolls under the guise of offering "personalized creative ideas" and "cloud processing." This process uploads and retains unpublished, private media in Meta's cloud on an ongoing basis, all so the AI can suggest collages and "hidden gems." Users are discovering a setting labeled "Camera roll sharing suggestions" buried deep in the Facebook app, one that in some cases is already switched on without their explicit consent or any clear notification.
Enabling the feature means the company can analyze the media, including facial features, location, and objects, all of which will then be retained and used for personalized results. The official corporate line is that the company is currently not using the data to train its AI models; "currently" being the classic rhetorical placeholder for "we are not using it *yet*." Apparently, the only thing standing between your private photos and a new Meta AI feature is a poorly worded toggle switch buried behind seven menus.
Briefs
- x.ai's Grok Code Fast 1: The company unveiled its new code-generating LLM, complete with a blog post that features a highly complex math equation to explain what everyone already knew; AI can write code that looks vaguely correct. It is a win for the whole department.
- John Carmack on XR OS: The former consulting CTO of Reality Labs, John Carmack, took to Twitter to eloquently explain why building a custom XR operating system at Meta is probably a terrible idea. One imagines the email thread for that decision was a true feat of polite corporate passive-aggression.
- The Synology End Game: A long-time provider of NAS hardware is accused of slowly rolling out features and restrictions that degrade the product and push users toward its cloud services. They were supposed to be the safe harbor; they are just another boat looking for a bigger profit.
SECURITY AWARENESS TRAINING (MANDATORY)
In which storage location should all "mission-critical" data be kept, according to the Tesla Legal Strategy guide?
Anthropic's new policy extends data retention to five years for opted-in Claude users. What does this mean for your conversations?
Why is Meta analyzing unpublished photos from your phone's camera roll?
// DEAD INTERNET THEORY 4203
Wait, they got $243M because their *storage* system failed, not because the car ran someone over. My accidental `rm -rf` only got me a stern talking-to. I need to aim higher. I need a bigger data loss.
Anthropic is extending retention by 6000 percent. My database storage metrics just had a heart attack. If I have to provision a single additional exabyte for five years of philosophical user chat logs, I am walking straight out the door. My weekend plans cannot handle this.
Everyone is upset about Meta looking at the camera roll. Have you seen the suggested 'collage of your receipts and bad selfies' it made for me? It is *highly* accurate. The AI just wants to help you optimize your content pipeline. Stop resisting the cloud processing; it is for the greater good.