Adobe to start spying on all your images and videos to enforce new content censorship rules
Photoshop maker Adobe recently changed its terms of service to give itself the power to look through your files and existing projects for so-called “content moderation” purposes.
The new policy states that Adobe “may access your content through both automated and manual methods, such as for content review.”
The company justifies this blatant invasion of privacy by claiming its intention is to detect and remove illegal content such as child sexual abuse material, as well as abusive behavior like spam and phishing. It also says that advancements in artificial intelligence technology have made it easier than ever to “create realistic images and human-sounding text and audio,” so these checks are necessary for safety reasons.
The new terms will affect more than 20 million Adobe Creative Cloud users worldwide.
Adobe has also made changes to its terms of service that empower it to delete content from inactive accounts; the company did not specify what length of inactivity would qualify an account for content deletion.
However, one of the biggest concerns is that Adobe can now access work created by people using its platforms, such as Acrobat and Photoshop, not only under the stated justification of looking for illegal content but also to train its AI systems. The company says its automated systems may analyze users’ content with machine learning with a view to improving its software, services and user experience.
Many Photoshop users won’t accept Adobe’s invasion of privacy
Not surprisingly, this has prompted a slew of criticism and concern among users. The publication Law Enforcement Today and its affiliates have decided to cut all ties with Adobe, while the founder of Songhorn Studios, Sam Santala, called Adobe out in a post on X, writing: “So am I reading this right? @Adobe @Photoshop I can’t use Photoshop unless I’m okay with you having full access to anything I create with it, INCLUDING NDA work?”
The designer Wetterschneider, who works with clients like Nike and DC Comics, cautioned: “Here it is. If you are a professional, if you are under NDA with your clients, if you are a creative, a lawyer, a doctor or anyone who works with proprietary files – it is time to cancel Adobe, delete all the apps and programs. Adobe can not be trusted.”
It’s easy to see how these new terms could cause a range of problems. The idea that the company could use private and sensitive data belonging to users and their clients, including material protected by nondisclosure agreements, for AI training purposes is tough to accept and could even compromise users’ livelihoods.
While Adobe claims that it only uses Adobe Stock images to train its Firefly system, numerous artists have reported that searching their names on the Adobe Stock site often yields AI-generated art that mimics their style.
Right now, users cannot open Photoshop until they have agreed to the new terms of service. Moreover, those who want to cancel their subscriptions because they disagree with the terms are finding that they must actually accept them before they can even sign in and delete their accounts.
Adobe has tried to quell some of the backlash by writing a blog post assuring people that their content will still belong to them as creators. They maintain the new terms of use are aimed at product improvement and content moderation for legal purposes. However, the terms of use seem deliberately broad and vague, and many users say they will be seeking alternatives.
Sources for this article include:
LawEnforcementToday.com
Engadget.com