Generative AI comes in many forms. But increasingly, it's being sold the same way: not as software, but under human names and personas, like a coworker. More and more startups are personifying AI to build trust quickly and to soften its threat to human work. It is dehumanization, accelerated.
You can see why this framing took off. In today's upside-down economy, where every hire feels like a risk, enterprise startups (many emerging from the well-known accelerator Y Combinator) are pitching AI as staff rather than as software. They're selling replacements: the AI assistant. The AI coder. The AI employee. The language is deliberately designed to appeal to overwhelmed hiring managers.
Some don't bother with subtlety. Atlog, for example, recently introduced an "AI employee for furniture stores" that handles everything from payments to marketing. One great manager, it boasts, can now run 20 stores at once. The implication: you don't need to hire more people; just let the system scale for you. (What happens to the 19 managers it replaces goes unsaid.)
Consumer startups are leaning into similar tactics. Anthropic named its platform Claude because it feels like a warm, trustworthy companion rather than a faceless, clinical neural network. It's a tactic straight out of the fintech playbook, where apps like Dave, Albert, and Charlie masked transactional motives behind friendly names. When money is involved, the thinking goes, people would rather trust a "friend."
The same logic is creeping into AI. Would you rather share sensitive data with a machine-learning model, or with your bestie Claude, who remembers you? (To OpenAI's credit, it still tells you that you're chatting with a "generative pre-trained transformer.")
But we're approaching a tipping point. I'm genuinely excited about generative AI. Still, every new "AI employee" is starting to feel more dehumanizing. Every new "Devin" makes me wonder when the real Devins of the world will push back against being abstracted into bots.
Generative AI is no longer just a curiosity. Even if its full impact remains unknown, its reach is expanding. In mid-May, 1.9 million unemployed people were receiving continued unemployment benefits, the highest level since 2021. Many of them were laid-off tech workers. The signals are stacking up.
Some of us still remember "2001: A Space Odyssey." HAL, the onboard computer, starts out as a gentle, helpful assistant before turning fully murderous and cutting off the crew's life support. It's science fiction, but it struck a nerve for a reason.
Last week, Anthropic CEO Dario Amodei predicted that AI could eliminate half of all entry-level white-collar jobs within the next one to five years, pushing unemployment as high as 20%. "Most [of these workers are] unaware that this is about to happen," he told Axios.
You could argue that this isn't comparable to cutting off someone's oxygen, but the metaphor isn't that far off. Automating more people out of a paycheck will have consequences, and as layoffs mount, branding AI as a "coworker" will look less clever and more callous.
The shift to generative AI will happen regardless of the packaging. But companies have a choice in how they describe these tools. IBM never called its mainframes "digital colleagues." The PC wasn't a "software assistant." They were workstations and productivity tools.
Language still matters. Tools should empower. Yet more and more companies are selling something else entirely, and that feels like a mistake.
We don't need more AI "employees." We need software that extends what real humans can do, making them more productive, creative, and competitive. So stop talking up fake workers. Show us the tools that help a great manager run a complex business, and help individuals make a bigger impact. That's what everyone really wants.