Images provided by Pindrop Security show the job candidate the company dubbed "Ivan X," according to CEO Vijay Balasubramaniyan.
Courtesy: Pindrop Security
When voice authentication startup Pindrop Security posted a recent job opening, one candidate stood out from hundreds of applicants.
The applicant, a Russian coder named Ivan, appeared to have all the right qualifications for the senior engineering role. But when he was interviewed over video last month, Pindrop's recruiter noticed that Ivan's facial expressions were slightly out of sync with his words.
That's because the candidate, whom the company has dubbed "Ivan X," was a scammer using deepfake software and other generative AI tools in a bid to get hired by the tech company, says Pindrop CEO and co-founder Vijay Balasubramaniyan.
"Gen AI has blurred the line between what it means to be human and what it means to be a machine," Balasubramaniyan said. "What we're seeing is that individuals are using these fake identities and fake faces and fake voices to secure employment."
Companies have long fought off attacks from hackers hoping to exploit vulnerabilities in their software, employees or vendors. Now, another threat has emerged: job applicants who aren't who they say they are, wielding AI tools to fabricate photo IDs, generate employment histories and supply answers during interviews.
The rise of AI-generated profiles means that by 2028, one in four job candidates will be fake, according to research and advisory firm Gartner.
The risk a company takes on by bringing in a fake job seeker varies depending on the person's intentions. Once hired, an impostor can install malware to demand ransom from the company, or steal its customer data, trade secrets or funds, according to Balasubramaniyan. In many cases, he said, the deceitful employees are simply collecting a salary that they wouldn't otherwise be able to.
A "major" increase
Cybersecurity and cryptocurrency companies have seen a recent spike in fake job seekers, industry experts told CNBC. Because those companies often hire for remote roles, they present valuable targets for bad actors, these people said.
Ben Sesser, CEO of BrightHire, said he first heard of the issue a year ago and that the number of fraudulent job seekers has seen "a significant increase" this year. His company helps more than 300 corporate clients in finance, tech and health care assess prospective employees in video interviews.
"Humans are generally the weak link in cybersecurity, and the hiring process is an inherently human process with many handoffs and many different people involved," Sesser said. "That's become a weakness that people are trying to expose."
The issue is not limited to the tech industry, however. More than 300 U.S. companies have mistakenly hired impostors with ties to North Korea for IT jobs, including major national television networks, defense manufacturers, automakers and other Fortune 500 companies.
The workers used stolen American identities to apply for remote jobs and deployed remote networks and other techniques to mask their true locations, the DOJ said. They ultimately sent millions of dollars in wages to North Korea to help fund the country's weapons program, the Justice Department alleged.
That case, which involved a ring of alleged enablers including American citizens, exposed just a small portion of what U.S. authorities say is a vast overseas network of thousands of IT workers with North Korean ties. The DOJ has since filed more cases involving North Korean IT workers.
A growth industry
If the experience of Lili Infante, founder and CEO of CAT Labs, is any indication, fake job seekers aren't going away. Her Florida-based startup sits at the intersection of cybersecurity and cryptocurrency, making it especially attractive to bad actors.
"Every time we list a job posting, we get 100 North Korean spies applying to it," Infante said. "When you look at their resumes, they look amazing; they use all the keywords for what we're looking for."
Infante said her company leans on identity verification firms to weed out fake candidates, part of an emerging sector that includes companies such as Idenfy, Jumio and Socure.
An FBI wanted poster shows suspects the agency said are IT workers from North Korea, officially known as the Democratic People's Republic of Korea.
Source: FBI
The fake-employee industry has expanded in recent years beyond North Korea to include criminal groups in Russia, China, Malaysia and South Korea, according to veteran computer security consultant Roger Grimes.
Ironically, some of these fraudulent workers would be considered top performers at most companies, he said.
"Sometimes they'll do the role well, and sometimes they do it so well that people have actually told me they were sorry they had to let them go," Grimes said.
His employer, the cybersecurity company KnowBe4, said in October that it accidentally hired a North Korean software engineer.
The worker used AI to modify a stock photo, combined with a valid but stolen U.S. identity, and passed a background check that included four video interviews, the company said. He was only discovered after the firm noticed suspicious activity coming from his account.
Fighting deepfakes
Despite the DOJ case and several other publicized incidents, hiring managers at most companies are generally unaware of the risk of fake job seekers, according to BrightHire's Sesser.
"They're responsible for talent strategy and other important things, but being on the front lines of security has historically not been one of them," he said. "Folks think they're not experiencing it, but I think it's more likely that they're just not realizing it's happening."
As deepfake technology improves, the problem will only become harder to avoid, Sesser said.
As for "Ivan X," Pindrop's Balasubramaniyan said the startup used a new video authentication program it created to confirm he was a deepfake fraud.
While Ivan claimed to be located in western Ukraine, his IP address indicated he was actually thousands of miles to the east, in a possible Russian military facility near the North Korean border, the company said.
Backed by Andreessen Horowitz and Citi Ventures, Pindrop was founded more than a decade ago to detect fraud in voice interactions, but it may soon expand into video authentication. Its clients include some of the largest U.S. banks, insurers and health companies.
"We can no longer trust our eyes and ears," Balasubramaniyan said. "Without technology, you're worse off than a monkey with a random coin flip."