Apple, Google Caught Pushing Deepfake Apps
Apple and Google are under fire after a report said both companies were helping people find apps that can generate sexualized AI images from ordinary photos. The concern is not limited to adults being targeted; the report says children could encounter these apps too. According to 9to5Mac, citing findings published in January by the Tech Transparency Project, both the Apple App Store and Google Play were surfacing apps that create deepfake nude images of women. The report said the stores were also promoting some of these apps and even autocompleting search terms tied to them.
The issue centered on searches like “nudify,” “undress,” and “deepnude.” The Tech Transparency Project said about 40 percent of the top 10 apps that appeared in those searches could “render women nude or scantily clad.”
These apps can take an ordinary photo and a sexual image, then blend them into a fake picture that sexualizes the person in the original photo.
9to5Mac also said it reached out to one app developer, who replied that they "had no idea it was capable of producing such extreme content."