
EyeEm, the Berlin-based photo-sharing community that was sold to Spanish company Freepik last year after going bankrupt, is now licensing its users’ photos to train AI models. The change has raised concerns among photographers and users that their content will be put to this use without explicit consent.
The New Clause in EyeEm’s Terms & Conditions
Earlier this month, EyeEm informed users via email that it was adding a new clause to its Terms & Conditions (T&Cs) granting it the rights to upload users’ content to "train, develop, and improve software, algorithms, and machine-learning models." Users were given 30 days to opt out, and the only way to do so was to remove all of their content from EyeEm’s platform; anyone who left their photos up was deemed to have consented to this use of their work.
EyeEm’s Photo Library and User Base
EyeEm built up a vast photo library and a significant user base over its years as a photo marketplace, and the new clause reaches across that existing archive: because opting out requires deleting every photo from the platform, any content left behind can be swept into AI training.
The Impact on Photographers and Users
The change has sparked outrage among photographers and users who object to their work being used to train AI models without meaningful consent. Some have taken to social media to express their frustration, with one user tweeting: "Has anyone figured out a way to batch delete their photos from #EyeEm? I got this email yesterday. While I only have 60 photos there, I’d prefer not to feed the training data beast for free…"
The Rise of Open Social Web Platforms
In response to EyeEm’s actions, some users are weighing a move to open social web platforms that give them more control over their content. One such option is Pixelfed, a decentralized photo-sharing service built on ActivityPub, the same protocol that powers Mastodon.
Pixelfed’s Commitment to User Privacy
In a post from its official account, Pixelfed announced: "We will never use your images to help train AI models. Privacy First, Pixels Forever." That pledge stands in stark contrast to EyeEm’s new terms.
Conclusion
EyeEm’s decision to license users’ photos for AI training has angered the very community that built its library, largely because consent was assumed rather than asked for. As AI training deals become more common, companies that host user-generated content will face growing pressure to prioritize user privacy and give people genuine control over how their work is used.