Ofcom, the UK's internet safety regulator, has released another draft guidance as it continues to implement the Online Safety Act (OSA). The threats it addresses include harassment, bullying, misogyny, and intimate image abuse.
The government has said that protecting women and girls is a priority in implementing the OSA. Certain forms of misogynistic abuse, such as sharing intimate images without consent or using AI tools to create deepfake porn targeting individuals, are explicitly set out in the law as enforcement priorities.
The online safety regulation, approved by the UK Parliament in September 2023, has faced criticism that it is not up to the task of reining in the platform giants, despite including substantial penalties for non-compliance.
Child safety campaigners have also expressed frustration over how long the law is taking to implement, and doubts over whether it will have the desired effect.
In an interview with the BBC in January, even Technology Minister Peter Kyle, who inherited the law from the previous government, called it "very uneven" and "unsatisfactory." But the government is sticking with this approach. Some of the complaints around the OSA can be traced back to the long lead time ministers allowed for its implementation.
However, enforcement is expected to begin soon in relation to illegal content and core requirements around child protection. Other aspects of OSA compliance will take longer to implement, and Ofcom acknowledges that this latest package of practice guidance will not be fully enforceable until 2027 or later.
Approaching the enforcement start line
"The first Online Safety Act duties will come into force next month," Jessica Smith of Ofcom, who led the development of the guidance focused on women's safety, told TechCrunch in an interview. "So we will be enforcing some of the core duties of the Online Safety Act ahead of this guidance [itself becoming enforceable]."
The new draft guidance on keeping women and girls safe online is intended to supplement Ofcom's earlier, broader OSA guidance on illegal content, which provides recommendations for protecting minors online, for example.
In December, the regulator published its final guidance on how platforms and services should reduce risks related to illegal content, an area where child protection is a clear priority.
Ofcom has also previously produced safety codes for children, recommending that online services dial up age checks and content filtering to prevent children from being exposed to inappropriate content such as pornography. As part of implementing the online safety regime, it has also developed recommendations on age-assurance technology for adult content websites; the aim is for porn sites to take effective measures to prevent minors from accessing age-inappropriate content.
The latest set of guidance was developed by Ofcom with input from victims, survivors, women's advocacy groups and safety experts. It covers four main areas where women say they are disproportionately affected by online harm: online misogyny; pile-ons and online harassment; online domestic abuse; and intimate image abuse.
Safety through design
Ofcom's top-line recommendation encourages in-scope services and platforms to take a "safety by design" approach. Smith said the regulator wants to encourage tech companies to "take a step back" and "think about their user experience in the round." She acknowledged that some services have implemented measures that help reduce online risks in this area, but argued that none has thought holistically about prioritizing the safety of women and girls.
"What we really want is a kind of step change in how the design process works," she said, adding that the goal is for safety considerations to be baked into product design.
She highlighted the rise of AI image-generation services, which she said has driven a sharp increase in deepfake intimate image abuse, as an example of an area where technologists could have taken proactive steps to shrink the risk of their tools being weaponized to target women and girls, but so far had not.
"We think there are some sensible things that services could do at the design stage that would help address some of those harmful risks," she suggested.
Examples of "good" industry practice in the guidance include online services taking actions such as:
- Removing geolocation by default (to reduce privacy/stalking risks)
- Carrying out "abusability" testing to identify how a service could be weaponized or misused
- Taking steps to boost account security
- Designing user prompts that make posters think twice before posting abusive content
- Offering accessible reporting tools that let users report problems
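The first measure on that list, stripping geolocation by default, can be illustrated with a minimal sketch. This is a toy model, not any platform's real pipeline: the metadata keys and function names here are hypothetical, and real implementations would strip Exif GPS tags from the image file itself. The point is the safe default, with location shared only on explicit opt-in.

```python
# Toy illustration (hypothetical keys/names, not a real platform pipeline):
# strip location metadata from an upload by default, keeping it only when
# the user explicitly opts in.

SENSITIVE_KEYS = {"gps_latitude", "gps_longitude", "gps_altitude", "location_name"}

def sanitize_metadata(metadata, share_location=False):
    """Return a copy of upload metadata with geolocation removed unless
    the user explicitly opted in (safe-by-default)."""
    if share_location:
        return dict(metadata)
    return {k: v for k, v in metadata.items() if k not in SENSITIVE_KEYS}

upload = {"camera": "X100", "gps_latitude": 51.5, "gps_longitude": -0.12}
print(sanitize_metadata(upload))  # location keys dropped by default
```

The design choice worth noting is that safety is the default path: a user who does nothing shares no location, which is the "safety by design" posture the guidance is asking for.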
As with all of Ofcom's OSA guidance, not every measure is relevant to every type or size of service, since the law applies to a sweeping range of services: social media, online dating, gaming, forums and messaging apps, to name a few. So a large part of the work for in-scope companies is understanding what compliance means in the context of their own product.
Asked whether Ofcom has identified any services that currently meet the standards in the guidance, Smith suggested it has not. "There's still a lot more to do across the industry as a whole," she said.
She also implicitly acknowledged that the challenge may be heightened given the retrograde steps some major industry players have taken on trust and safety. For example, since taking over Twitter and rebranding the social network as X, Elon Musk has slashed trust and safety staff.
In recent months, Meta, owner of Facebook and Instagram, appears to have taken some similar steps: it has ended third-party fact-checking agreements in favor of deploying an X-style "Community Notes" system for disputed content, for example.
Transparency
On the risk that operator behavior could dial up, rather than attenuate, online harm, Smith suggested Ofcom would focus on exercising the transparency and information-gathering powers it has under the OSA to illustrate impacts and drive user awareness.
So, in short, the tactic here looks set to be "name and shame", at least in the first instance.
"Once we've finalized the guidance, we'll produce a [market] report… really shining a light on who is using the guidance, which steps they're following, what outcomes they're achieving for users who are women and girls, and what protections are in place across different platforms, so that users can make informed choices about where they spend their time online," she told us.
Smith suggested that companies wanting to avoid the risk of being publicly shamed for poor performance on women's safety, and the reputational harm that could follow, could turn to Ofcom's guidance for "practical steps" on how to improve the situation for their users.
"Platforms operating in the UK must comply with UK law," she added, in the context of discussion of major platforms de-emphasizing trust and safety. "That means complying with their illegal harms and child protection duties under the Online Safety Act."
"I think this is where our transparency powers come in. If the industry changes direction and harms are increasing, we can shine a light on that and share relevant information with UK users, the media, and MPs."
Techniques for tackling deepfake porn
On one type of online harm, intimate image abuse, the latest draft guidance strengthens Ofcom's earlier recommendations even before OSA enforcement actively begins: it suggests that services use hash matching to detect and remove such abusive imagery, going further than its previous recommendations did.
"This guidance goes further than what's already set out in the codes," Smith said, confirming that Ofcom plans to update its earlier codes "in the near future" to incorporate the change.
"So this is a way of signaling to platforms that, by following the steps set out in this guidance, they can get ahead of that becoming an enforceable requirement," she added.
Ofcom's recommendation to use hash-matching technology to counter intimate image abuse responds, in particular, to the rise in abuse involving AI-generated deepfake images.
"In 2023, more deepfake intimate image abuse was reported than in all previous years," she said, adding that Ofcom is gathering further evidence on the effectiveness of hash matching in tackling this harm.
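The hash matching the guidance refers to works by fingerprinting known abusive images and checking new uploads against that database. The sketch below is a deliberately simplified illustration, not Ofcom's recommendation or any production system: real deployments use robust perceptual hashes (such as PhotoDNA or PDQ) over actual image data, whereas this toy uses a basic "average hash" over an 8x8 grayscale grid. The matching logic (Hamming distance against a set of known hashes, tolerant of small edits) is the part the sketch aims to convey.

```python
# Simplified "average hash" matching sketch (toy illustration only; real
# systems use robust perceptual hashes like PhotoDNA or PDQ).

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255). Returns a 64-bit int:
    one bit per pixel, set when the pixel is at or above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(image_hash, known_hashes, threshold=5):
    """Flag an image whose hash is within `threshold` bits of any hash
    in the database of known abusive images."""
    return any(hamming_distance(image_hash, h) <= threshold for h in known_hashes)

# Toy example: a reference image and a slightly altered re-shared copy.
reference = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
altered = [row[:] for row in reference]
altered[0][0] = 255  # small edit, as re-shared copies often carry

known = {average_hash(reference)}
print(matches_known(average_hash(altered), known))  # small edits still match
```

The key property, which exact cryptographic hashes lack, is tolerance to minor alterations: a cropped, recompressed, or lightly edited copy still lands within a few bits of the original's hash, so re-shared abusive imagery can be caught without storing the images themselves.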
The draft guidance is now open for consultation, with Ofcom inviting feedback until May 23, 2025. It will then produce the final guidance by the end of this year.
Eighteen months after that, Ofcom will produce its first report reviewing industry practice in this area.
"We'll be into 2027 before we produce our first report on who's doing what [to protect women and girls online], but there's nothing stopping platforms from acting now," she added.
Responding to criticism that Ofcom is taking too long to implement the OSA, she said it was right for the regulator to consult on its compliance measures. But with the first obligations coming into force next month, she noted that Ofcom expects the conversation around the issue to change.
"[T]hat really starts to change the conversation with platforms," she predicted, adding that Ofcom will then be in a position to demonstrate progress in moving the needle on reducing online harm.