A friendly, yet fake smile. Image: Facebook screenshot / Emily Hart
She’s young, she’s good-looking, and she champions Donald Trump in her social media content. The MAGA community is enthusiastic about “Emily” – much to the delight of one student.
04/24/2026, 07:25
Kathrin Martens / watson.de
She poses in a bikini while ice fishing, shoots guns, drinks beer and spreads tough conservative slogans. Millions click, like, follow. For many, “Emily Hart” is the perfect mix of patriotism and sex appeal.
But Emily doesn’t actually exist.
According to a report by “Wired,” a 22-year-old medical student from India is behind the viral account. With the AI-generated images, he has built a small online empire largely on Instagram and Facebook – and claims to have earned thousands of dollars a month with them.
The idea came from AI
The idea apparently arose from a lack of money. During his studies, the aspiring orthopedist was looking for ways to make money quickly. Classic attempts such as YouTube or selling learning materials failed. It wasn’t until he started developing content with the help of Google Gemini that his breakthrough came.
The AI is said to have advised him to specifically create a character for a conservative US audience – a target group that is considered particularly loyal and wealthy.
This is how “Emily Hart” was created, a young, attractive woman, visually based on stars like Jennifer Lawrence and Sydney Sweeney. In her staged online persona, she is a nurse, loves the Bible, guns, beer – and above all, clear political messages. Her posts are deliberately provocative, often radically conservative, religiously charged and anti-liberal. That’s exactly what brought reach.
Within a month, the account collected tens of thousands of followers, and individual videos reached millions of views. The student perfected the system: daily posts with calculated outrage, so-called “rage bait,” which provokes both agreement and disagreement – and thus feeds the algorithm.
Profits with AI “absurdly easy”
The whole thing was quickly monetized. In addition to merchandise, the student also sold exclusive content via the Fanvue platform, which – unlike many competitors – allows AI-generated content. There, users paid for revealing images and direct contact with the alleged influencer.
Speaking to Wired, the operator himself describes the business model in hindsight as absurdly simple: less than an hour of work per day, for an income he could not have matched with a regular job in his home country.
According to his own statements, he thinks the target group that made him rich is “extremely stupid.” At the same time, he admits that a comparable experiment with a liberal character was a complete failure – the audience recognized the artificial nature of the content more quickly and interacted less.
Experts do not see this as an isolated case. According to the New York Post, Valerie Wirtschafter of the Brookings Institution warns that AI is making such deceptions increasingly convincing. Politically charged content combined with attractive characters, in particular, can be used to deliberately capture attention and manipulate users.
He doesn’t see himself as a fraud
The account apparently went unchallenged for a long time, partly because platforms like Instagram did not consistently enforce their own rules for labeling AI content.
The profile was only deleted in February, on the grounds of fraudulent activity. Other fake accounts, such as a purported soldier-influencer close to Donald Trump, show how widespread the phenomenon has become.
However, the student does not see himself as a fraud. He says he didn’t directly harm anyone. He now wants to concentrate on his medical studies again.