Friday, May 8, 2026

AI clones: the good, the bad, and the ugly

AI is able to mimic a real, specific person. It's clear this capability exists, and the ethics of using AI for this purpose are often very clear. But increasingly, new applications are leading to ethically murky outcomes.

The good

For example, the CEO of a company, or a politician, might choose to create a clone using AI tools — a chatbot plus an avatar, a digital twin — that can interact with people on their behalf. Silicon Valley is big on the idea: Meta's Mark Zuckerberg and LinkedIn co-founder Reid Hoffman are working on, or have already created, digital twins of themselves.

Cloned politicians include Pakistan's Imran Khan, who used an authorized voice clone to campaign from jail, and New York City Mayor Eric Adams, who used voice-cloned robocalls to speak with constituents in languages like Mandarin and Yiddish.

This kind of use case can be ethical — as long as the people interacting know that they're dealing with a digital clone and not a real person.

The bad

The flip side of ethical uses for AI-generated clones is the non-consensual (and therefore unethical) cases. And of these, there are already many.

Other unethical, non-consensual uses for AI cloning include deepfake videos, where a celebrity's face is superimposed on a porn actor's body. In all of the above examples, the ethics are clear: this is all very wrong.

But with China leading the way in the emergence of AI clones, the ethics are becoming far murkier.

And the ugly

One emerging trend involves workers using specialized software to build digital versions of their bosses or colleagues. The most prominent project driving this trend is Colleague Talent, which was posted in late March by its creator, a 24-year-old Shanghai-based engineer named Zhou Tianyi.

Colleague Talent and its forks and copycats, which tend to be open source, let people upload chat histories, emails, and internal documents to create a realistic persona that mimics a specific coworker's professional expertise and communication style. The technology stack includes tools like Claude, Kimi, ChatGPT, the DeepSeek API, OCR (Tesseract), and sentiment-analysis modules.

Colleague Talent uses a person's past communications to build a talking replica of their personality. If you think of a regular AI as a generalist student who knows a little bit about everything, this tool acts like a specialized mask that forces the AI to behave like one specific person.

In other words, it produces a chatbot with the knowledge and speech patterns of a real person.
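To make the "mask" idea concrete, here is a minimal, purely illustrative Python sketch — not the project's actual code, and all names and sample data are hypothetical — of how uploaded messages could be assembled into a system prompt that steers a general-purpose model toward one person's voice:

```python
# Hypothetical sketch of a "persona mask": turning a person's past
# messages into a system prompt for a general-purpose chat model.
# Function names and sample data are illustrative, not from Colleague Talent.

def build_persona_prompt(name: str, messages: list[str], max_examples: int = 5) -> str:
    """Assemble a system prompt instructing a generalist model to
    imitate one specific person, using real messages as style examples."""
    examples = "\n".join(f"- {m}" for m in messages[:max_examples])
    return (
        f"You are a digital replica of {name}. "
        f"Reply in {name}'s voice, vocabulary, and tone.\n"
        f"Examples of how {name} actually writes:\n{examples}"
    )

# The resulting string would be passed as the system prompt to any chat
# API (Claude, DeepSeek, etc.), masking the generalist model as one person.
prompt = build_persona_prompt(
    "Alex",
    ["Ship it Friday, polish later.", "Loop in finance before we commit."],
)
print(prompt)
```

The design point is that the underlying model is unchanged; only the prompt (plus retrieved documents, in the real tools) narrows it to a single persona.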

Colleague Talent started as a satirical commentary on AI-driven layoffs. But some employees began using it in earnest to clone their colleagues. There are several stated reasons for doing so, including retaining institutional knowledge and having an instant sounding board to "discuss" plans and ideas with.

A similar motivation is the use of AI to clone bosses, so employees can better predict how a boss might react to their work.

In most of these scenarios, according to reports out of China, the creation of the boss-bot or colleague clone is nonconsensual.

Is non-consensually basing a custom chatbot on a colleague or boss unethical?

And then it got personal (and peculiar)

Tianyi, creator of Colleague Talent, later forked it into something called Ex-Partner Talent. The idea is to re-create a former partner with AI so the user can continue the relationship.

It operates on the same technical engine but applies it to a much more personal part of life. Users upload photos, social posts, chat logs, and other content. The AI chatbot can then mimic the former partner's tone, catchphrases, and subtle linguistic nuances — something that "really feels like them — speaks with their catchphrases, replies in their style, remembers the places you went together."

This allows a person to simulate conversations with someone who's no longer in their life.

If Colleague Talent is in a gray area, Ex-Partner Talent is in a darker gray area.

(Note: many of the original repositories for Ex-Partner Talent have been removed from public view in China or "sanitized" after regulatory pressure. But the framework reportedly continues to circulate in private developer circles, and similar tools are increasingly used for "digital resurrection.")

Ethically, the concept feels like it exists on a wide spectrum somewhere between therapy at one end and revenge porn at the other. (It's like revenge porn in the sense that "content" consensually made by two people for one purpose is later used by one person, without the other's consent, in a way the other person might find objectionable.)

Or maybe it's closer to the "deathbot" phenomenon, where an AI-generated simulation provides a fake version of the dearly departed. (In both cases, the user interacts with a digital twin of someone who's no longer present in their life.) In fact, some people in China are using Ex-Partner Talent as a deathbot for a deceased loved one.

The lack of consent feels like an ethical lapse. But we don't consider it unethical to think about, remember, imagine conversations with, or journal about ex-partners — or dead relatives.

Boosters of the Ex-Partner Talent idea say that conversations with digital exes are therapeutic. They point out that because it's private, it's not harassment, stalking, or an invasion of privacy. Instead, they argue, it helps with personal reflection and emotional healing.

As for people who have died, according to Chinese media reports, some users say the tool gives them a sense of closure and lets them say the things they wish they could have said to the real person. But is it really closure if one person is still obsessively trying to interact — or pretend to interact — with the other?

It's healthy to talk. But it's not communication when a person is alone, talking to no one and sending messages to someone who never receives them.

While ex-bots are a thing these days in China, the trend is showing up elsewhere. Some Character.AI users outside of China have created chatbots based on ex-partners, although the company has changed its Terms of Service to explicitly ban creating bots using the likenesses of private individuals without their permission.

The emergence of nonconsensual cloning of coworkers, bosses, and ex-partners is a new challenge to our moral sense, and one more way AI is challenging us to step up and figure out how to respond.
