SoundCloud appears to have quietly changed its terms of use to allow the company to train AI on audio that users upload to its platform.
As spotted by tech ethicist Ed Newton-Rex, the latest version of SoundCloud’s terms includes a provision allowing the platform to use uploaded content to “inform, train, [or] develop” AI.
“You explicitly agree that your content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services,” read the terms, which were last updated on February 7.
The terms carve out content covered by “separate agreements” with third-party rightsholders, such as record labels. SoundCloud has a number of licensing agreements with indie labels as well as major music publishers, including Universal Music and Warner Music Group.
TechCrunch was unable to find an explicit opt-out option in the platform’s settings menu on the web. SoundCloud did not immediately respond to requests for comment.
SoundCloud, like many large creator platforms, is increasingly embracing AI.
Last year, SoundCloud partnered with nearly a dozen vendors to bring AI-powered tools for remixing, generating vocals, and creating custom samples to its platform. In a blog post last fall, SoundCloud said these partners would be given access to content ID solutions to ensure rights holders [sic] receive proper credit and compensation, and it pledged to “support ethical and transparent AI practices that respect creators’ rights.”
Many content hosting and social media platforms have changed their policies in recent months to allow first-party and third-party AI training. In October, Elon Musk’s X updated its privacy policy to let outside companies train AI on user posts. Last September, LinkedIn amended its terms to allow it to scrape user data for training. And in December, YouTube began letting third parties train AI on user clips.
Many of these moves have prompted backlash from users who argue that AI training policies should be opt-in rather than opt-out, and that contributors should be credited and paid for their role in AI training datasets.
Updated at 2:22 p.m. Pacific: A SoundCloud spokesperson provided the following statement via email:
“SoundCloud has never trained AI models using artist content, nor have we developed AI tools or used SoundCloud content for AI training purposes. In fact, we have implemented technical safeguards, including a ‘no AI’ tag on our site, to prohibit any use that is not expressly permitted.
“The February 2024 update to our Terms of Service was intended to clarify how content may interact with AI technologies within SoundCloud’s own platform. Use cases include personalized recommendations, content organization, fraud detection, and improved content identification with the help of AI technology.
“Future applications of AI on SoundCloud are designed to support human artists and enhance the tools, features, reach, and opportunities available on the platform. Examples include improving music recommendations, generating playlists, organizing content, and detecting fraud. These efforts are consistent with existing licensing agreements and ethical standards. Tools like [those from our partner] Musiio are used strictly to power artist discovery and content organization, not to train generative AI models.
“We understand the concerns raised and remain committed to open dialogue. Artists will continue to have control over their work, and we will keep our community informed every step of the way, especially as legal and commercial frameworks continue to evolve.”