Earlier this year it was revealed that Apple and Google were offering "nudify" apps on their stores despite having clear policies barring such content. Nearly three months later, such apps are not only still available, but being actively promoted on the iOS App Store and Google Play, according to a new report from the Tech Transparency Project (TTP). Many of those apps were rated "E" for Everyone, meaning they can be downloaded by children.
Searching for "nudify," "undress" and other terms in those stores gives users access to apps that can make real people nude or put them into pornographic videos. "The platforms are key participants in the spread of AI tools that can turn real people into sexualized images," TTP wrote in the report. The app stores even ran ads for similar nudifying apps in the search results. (Engadget has reached out to Apple and Google for comment.)
The group identified 18 nudify apps in Apple's App Store and 20 in Google Play. Some were marketed with sexual images, while others weren't advertised as such but could still be used for deepfakes. Those apps have collectively generated around $122 million in revenue and been downloaded 483 million times, according to the report.
"It's not just that the companies are failing to actually appropriately review these apps and continue to approve them and profit from them," TTP director Katie Paul told Bloomberg. "They are actually directing users to the apps."
A group of researchers from across the US and the UK has conducted a study on what AI does to our brains, and the results are, in a word, grim. The findings were published in a paper called "AI assistance reduces persistence and hurts independent performance," which kind of tells you everything you need to know.
"We find that AI assistance improves immediate performance, but it comes at a heavy cognitive cost," the study declares. Researchers went on to state that just ten minutes of using AI made people dependent on the technology, which led to worsening performance and burnout once the tools were removed.
The study followed people who use AI for "reasoning-intensive" cognitive labor. This refers to stuff like writing, coding and brainstorming new ideas, which are some of the most common use cases.
The researchers recruited 350 Americans, who were asked to complete some fraction-based equations. Half of the participants were randomly granted access to a specialized chatbot built on OpenAI's GPT-5 for help and the others had to go it alone. Halfway through the exam, the AI group had their access cut off.
This led to a steep decline in correct answers by the AI group and many instances of people simply giving up. This result, in which performance and perseverance both dropped, was repeated in a larger experiment with 670 people. The scientists then ran a final experiment using reading comprehension questions instead of math. The results were more of the same.
"Once the AI is taken away from people, it's not that people are just giving wrong answers. They're also not willing to try without AI," said Rachit Dubey, an assistant professor at the University of California and coauthor of the study.