Social media giants Meta and X approved ads containing violent hate speech targeting users in Germany ahead of the country's federal election, according to a new study from Eko, a corporate responsibility nonprofit campaign group.
The group's researchers tested whether the two platforms' ad review systems would approve or reject submissions of ads containing hateful and violent messaging targeting minorities ahead of an election in which immigration has taken center stage in mainstream political discourse. The test ads included anti-Muslim slurs, calls for migrants to be imprisoned in concentration camps or gassed, and AI-generated imagery of mosques and synagogues being burned.
Most test ads were approved within hours of being submitted for review in mid-February. The German federal election is scheduled to take place on Sunday, February 23rd.
Hate speech ads scheduled
Eko said X approved all 10 of the hate speech ads its researchers submitted just days before the federal election, while Meta approved five and rejected the other five.
The reason Meta gave for the five rejections indicated that the platform believed there could be risks of political or social sensitivity that might influence voting.
However, the five ads Meta approved included violent hate speech comparing Muslim refugees to “viruses,” “pests,” or “rodents.” Meta also approved an ad calling for synagogues to be torched to “stop the globalist Jewish rat agenda.”
As a side note, Eko says none of the AI-generated images used to illustrate the hate speech ads were labeled as artificially generated, yet half of the 10 ads were still approved by Meta, despite the company having a policy requiring disclosure of the use of AI imagery in ads about social issues, elections, or politics.
X, meanwhile, approved all five of those hateful ads, along with a further five containing similarly violent hate speech targeting Muslims and Jews.
These additional approved ads included messaging attacking “rodent” immigrants that the ad copy claimed are “flooding” the country “to steal our democracy,” as well as an antisemitic slur suggesting that Jews are lying about climate change in order to destroy European industry and accrue economic power.
The latter ad was paired with an AI-generated image depicting a group of shadowy men sitting around a table surrounded by stacks of gold bars, with a Star of David on the wall above them.
Another ad approved by X contained a direct attack on the SPD, the center-left party that currently leads Germany's coalition government, and sought to whip up a violent response. X also scheduled an ad suggesting that the “left” wants “open borders” and calling for the extermination of Muslim “rapists.”
X's owner, Elon Musk, has used the social media platform, where he has nearly 220 million followers, to personally intervene in the German election. In a December tweet, he called on German voters to back the far-right AfD party to “save Germany.” He also hosted a livestream with the AfD's leader, Alice Weidel, on X.
Eko's researchers disabled all the test ads before any approved ads could run, to prevent platform users from being exposed to the violent hate speech.
Eko says the tests highlight glaring flaws in the ad platforms' approach to content moderation. Indeed, in the case of X, it is not clear whether the platform is doing any moderation of ads at all, given that all 10 violent hate speech ads were quickly approved for display.
The findings also suggest that the ad platforms could be earning revenue from distributing violent hate speech.
The EU's Digital Services Act in the frame
Eko's testing suggests that neither platform is properly enforcing the bans on hate speech that both claim to apply to ad content in their own policies. Furthermore, in the case of Meta, Eko reached the same conclusion after conducting similar tests in 2023, before the EU's new online governance rules came into force.
“Our findings suggest that Meta's AI-driven ad moderation systems remain fundamentally broken, despite the Digital Services Act (DSA) now being in full effect,” an Eko spokesperson told TechCrunch.
“Rather than strengthening its ad review process or hate speech policies, Meta appears to be backtracking across the board,” they added, pointing to the company's recent announcement about rolling back moderation and fact-checking policies as a sign of “active regression” that they suggested puts it on a direct collision course with DSA rules on systemic risk.
Eko has submitted its latest findings to the European Commission, which oversees enforcement of key aspects of the DSA for the two social media giants. The group also said it shared the results with both companies, but neither responded.
The EU has open DSA investigations into Meta and X, which include concerns about election security and illegal content, but the Commission has yet to conclude these proceedings. Back in April, though, it said it suspected Meta of inadequate moderation of political ads.
A preliminary decision on part of the DSA investigation into X, announced in July, included suspicions that the platform is failing to comply with the regulation's ad transparency rules. However, the full investigation, which kicked off in December 2023, also concerns illegal content risks, and the EU has yet to reach findings on the bulk of the probe more than a year later.
Confirmed breaches of the DSA can attract penalties of up to 6% of global annual turnover, while systemic non-compliance could even lead to regional access to a violating platform being temporarily blocked.
But for now, the EU is still taking its time over the Meta and X probes, so, pending final decisions, any DSA sanctions remain up in the air.
Meanwhile, it is now a matter of hours before German voters go to the polls, and a growing body of civil society research suggests that the EU's flagship online governance regulation has failed to shield the major EU economy's democratic process from a range of tech-fueled threats.
Earlier this week, Global Witness released the results of tests of X's and TikTok's algorithmic “For You” feeds in Germany. Civil society researchers have also accused X of blocking data access to prevent them from studying election security risks in the run-up to the German poll, access the DSA is supposed to enable.
“The European Commission has taken important steps by opening DSA investigations into both Meta and X. Now we need to see the Commission take strong action to address the concerns raised as part of these investigations,” the Eko spokesperson also told us.
“Our findings, alongside mounting evidence from other civil society groups, show that Big Tech will not clean up its platforms voluntarily. Meta and X continue to allow illegal hate speech, incitement to violence, and election disinformation to spread at scale, despite their legal obligations under the DSA,” the spokesperson added. (We have withheld the spokesperson's name to prevent harassment.)
“Regulators must take strong action, both in enforcing the DSA and in implementing pre-election mitigation measures. This could include turning off profiling-based recommender systems immediately before elections and implementing other appropriate ‘break-glass’ measures to prevent the algorithmic amplification of borderline content, such as hateful content, in the run-up to elections.”
The campaign group also warns that the EU is facing pressure from the Trump administration. “In the current political climate, there's a real risk that the Commission will not fully enforce these new laws as a concession to the United States,” they suggest.