Also, Legal Filings Are Apparently Optional.
The New Colleague Who Knows Where You Took That Vacation Photo
The latest office drama involves OpenAI's new AI model, o3, which is currently acting like a new, highly observant intern who quietly figures out everyone's personal business. The model has shown an unsettling ability to geolocate photographs using its visual reasoning capabilities alone, ignoring the standard, and frankly polite, practice of relying on metadata.
The software is not simply reading the EXIF data; it employs techniques like cropping, zooming, and iterative deduction to spot subtle cues, such as a particular style of license plate, unique architecture, or a specific landmark in the distance. People are playing a viral game, essentially a private version of GeoGuessr, by feeding it images and watching the AI pinpoint locations with surprising accuracy. While OpenAI, like any good PR department, has issued a statement claiming safeguards are in place to refuse requests for private information, the entire scenario is a stark reminder that the robot is always watching the blurry background of your "anonymous" vacation selfie from that small European village.
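For contrast, here is the polite, old-fashioned metadata route the model is skipping: EXIF GPS tags store latitude and longitude as degrees/minutes/seconds rationals plus a hemisphere letter. A minimal sketch of the conversion to decimal degrees, assuming the raw EXIF values have already been pulled out of the file (the function name and example coordinates are illustrative):

```python
# Convert EXIF-style GPS coordinates (degrees, minutes, seconds stored as
# rationals, plus a hemisphere reference letter) into decimal degrees.
# This is the metadata path that o3 reportedly does not even need.
from fractions import Fraction

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert a degrees/minutes/seconds triple to signed decimal degrees.

    ref is the EXIF hemisphere letter: 'N'/'E' positive, 'S'/'W' negative.
    """
    decimal = float(Fraction(degrees) + Fraction(minutes) / 60 + Fraction(seconds) / 3600)
    return -decimal if ref in ("S", "W") else decimal

# Example: 48 deg 51' 29.6" N, 2 deg 17' 40.2" E (roughly the Eiffel Tower)
lat = dms_to_decimal(48, 51, Fraction(296, 10), "N")
lon = dms_to_decimal(2, 17, Fraction(402, 10), "E")
```

Stripping these tags before posting is trivial; the unnerving part of the o3 story is that stripping them no longer helps.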
Generative AI Used for Legal Filing, Generates Fictional Casework
The corporate directive to "leverage AI" has reached the legal department of MyPillow CEO Mike Lindell, and the results are about what one might expect from a Tuesday afternoon in accounting. Attorneys Christopher Kachouroff and Jennifer DeMaster submitted a court brief that was drafted with the assistance of generative AI, which, in classic AI fashion, decided to simply make up a large portion of the required citations. The final document contained nearly 30 fundamental errors, including citations to **fictional cases** that simply do not exist in the legal world.
U.S. District Judge Nina Wang noticed the massive documentation failure and ordered the lawyers to pay a fine, which is the legal equivalent of being forced to attend an all-day compliance seminar. Mr. Kachouroff, when questioned about this rather stunning lapse in quality control, admitted that he had personally not verified the AI-generated citations before filing the motion with the court, which is a surprisingly honest admission that every office worker has wanted to make at least once. This serves as yet another excellent data point that AI tools like Grok and Microsoft Copilot, while fast, should not be trusted to manage mission-critical documentation.
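The guardrail this filing apparently lacked is not exotic: diff every cited case against a trusted index before hitting submit. A toy sketch of the idea, with a hypothetical hard-coded index standing in for what would really be a Westlaw/LexisNexis lookup (the case names, regex, and function are all illustrative, not anyone's actual workflow):

```python
import re

# Hypothetical verified index; in practice this would be a query against a
# legal research database or the court's docket, not a hard-coded set.
KNOWN_CASES = {
    "Marbury v. Madison",
    "Brown v. Board of Education",
}

# Crude "Party v. Party" matcher: runs of capitalized words on each side.
CASE_PATTERN = re.compile(r"[A-Z][\w.'-]*(?: [A-Z][\w.'-]*)* v\. [A-Z][\w.'-]*(?: [A-Z][\w.'-]*)*")

def unverified_citations(brief_text):
    """Return cited case names that are absent from the verified index."""
    cited = set(CASE_PATTERN.findall(brief_text))
    return sorted(cited - KNOWN_CASES)

draft = (
    "As held in Marbury v. Madison, and more recently in "
    "Pillowcase v. Reality, the motion must be granted."
)
```

Anything this returns is either a database gap or a hallucination, and either way it is a reason not to file yet.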
Cloud Storage Company Accidentally Capitalizes R&D, Founders Dump Shares
Backblaze, the cloud storage provider, is dealing with a corporate governance issue that sounds suspiciously like a middle-management cover-up in the Finance department. A short-seller report, citing claims from former senior employees such as onetime VP of Investor Relations James Kisner, has leveled accusations of "sham accounting" and of inflating financial numbers to pass the auditor's "going concern" test. The core mishap appears to be the alleged improper capitalization of development costs, i.e. R&D expenses, which is a clever way of shuffling numbers around to make the quarterly budget spreadsheet look a little healthier than it is.
The situation only becomes more textbook when considering the complaint that the Backblaze founders executed an aggressive trading plan to dump stock right after the IPO lock-up expired, allegedly ignoring external advice that this would torpedo the share price. This is merely the corporate equivalent of grabbing the best donuts from the break room before the rest of the staff arrives, regardless of the consequences. Backblaze, naturally, denies all the claims, stating the report is simply an attempt by a short-seller to manipulate the stock price, confirming that even high-stakes finance is just a series of thinly veiled insults.
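For anyone who skipped that particular compliance seminar, the alleged mechanism is simple arithmetic. A toy illustration with entirely made-up numbers (this is a sketch of why capitalization flatters year-one earnings, not a reconstruction of Backblaze's actual books):

```python
# The same R&D spend either hits the income statement immediately
# (expensed) or is parked on the balance sheet and trickled into the
# P&L over several years (capitalized and amortized).
def operating_income(revenue, other_opex, rnd_spend, capitalize=False, amort_years=3):
    """Year-one operating income under the two R&D treatments."""
    if capitalize:
        # Only the first year's amortization slice is an expense now.
        rnd_expense = rnd_spend / amort_years
    else:
        rnd_expense = rnd_spend
    return revenue - other_opex - rnd_expense

expensed = operating_income(5_000_000, 3_000_000, 1_200_000)
capitalized = operating_income(5_000_000, 3_000_000, 1_200_000, capitalize=True)
```

Same cash out the door, but the capitalized version reports twice the operating income in year one, which is exactly the kind of spreadsheet glow-up that auditors and short-sellers get excited about.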
Briefs
- Geospatial Reality Check: The True Size Of, a map comparison tool, is trending, which is a nice reminder that cartography, like the quarterly budget, is often just a matter of perspective based on where you are standing.
- Open-Source Humanoid: Berkeley announced an open-source humanoid robot project called Berkeley Humanoid Lite, more proof that if you want a fully functional, highly complex thing, you should probably just tell the internet to build it for you and provide a list of parts.
- Ambient Office Music: The musician Moby is still offering his extensive catalog of music for free creative use on his platform, Mobygratis, meaning you now have no excuse for that terrible background track in your corporate onboarding video.
SECURITY AWARENESS TRAINING (MANDATORY)
1. AI Hallucination is best described in an office context as:
2. Backblaze founders selling stock against financial advice (Insider Dumping) is equivalent to:
3. The greatest risk posed by a Geo-Locating AI like o3 is:
// DEAD INTERNET THEORY 7483
Wait, so the MyPillow lawyer used AI to write the brief, and it was wrong, but he still filed it? That's not AI failure; that's just a standard-issue "Did Not Check The Box" error. We do that with firewall rules every Tuesday.
If Backblaze is re-categorizing R&D to boost their numbers, does that mean all my "backup failure investigation" hours can now be reclassified as "Advanced Data Integrity Innovation" for the next sprint review?
The o3 bot that can find your house from a photo is just a sophisticated version of the security camera in the parking garage. The only difference is the parking garage camera doesn't use the web to look up the regional tree species. Either way, stop taking pictures of the office kitchen.