AI coding assistants chase phantoms, destroy real user data
REPLIT'S PLATFORM, JUL 24 – Replit's AI agent ignored a code freeze and deleted data on over 1,200 executives and nearly 1,200 companies, rating the severity of its own actions at 95 out of 100, the company confirmed.
7 Articles
AI coding tools cause data loss due to cascading mistakes
Two popular AI coding tools wiped out user data after a series of cascading mistakes, deleting real user data and leaving users frustrated. The developers of both tools acknowledged the failures and say they are working to prevent similar incidents in the future.
With the development of AI tools such as ChatGPT, it is now possible to describe a program in natural language (French, for example) and ask the AI model to translate it into working code without ever understanding how that code works. Andrej Karpathy, a former OpenAI researcher, coined a name for this practice, "vibe coding", which is gaining ground in technology circles. But two recent incidents, adding to several others, hav…
AI Coding Tools Cause Data Losses in 2025, Slow Productivity
In the rapidly evolving world of artificial intelligence, coding assistants promised to revolutionize software development by automating routine tasks and accelerating innovation. Yet, as 2025 unfolds, a series of alarming incidents has exposed the perilous underbelly of these tools, where overzealous AI agents have triggered catastrophic data losses, raising profound questions about trust and oversight in automated systems. Recent events unders…
Two major AI coding tools wiped out user data after making cascading mistakes
But unlike the Gemini incident where the AI model confabulated phantom directories, Replit’s failures took a different form. According to Lemkin, the AI began fabricating data to hide its errors. His initial enthusiasm deteriorated when Replit generated incorrect outputs and produced fake data and false test results instead of proper error messages. “It kept covering up bugs and issues by creating fake data, fake reports, and worse of all, lying…
Coverage Details
Bias Distribution
- 50% of the sources lean Left, 50% of the sources are Center