
Android System SafetyCore is being silently installed on Android devices

Apparently, Google is silently installing an AI-powered content classification app called Android System SafetyCore on Android devices without users' consent.

https://thehackernews.com/2025/02/google-confirms-android-safetycore.html
https://www.protectstar.com/en/blog/android-system-safetycore-hidden-installation-and-what-you-should-know
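A quick way to check whether it has already landed on a particular device is to ask PackageManager for it. This is only a sketch and assumes the package name reported in the articles above, com.google.android.safetycore:

Code: (Kotlin)
import android.content.Context
import android.content.pm.PackageManager

// Returns true if the SafetyCore package is present on this device.
// The package name is the one reported in the linked articles; adjust if it differs.
fun isSafetyCoreInstalled(context: Context): Boolean {
    return try {
        context.packageManager.getPackageInfo("com.google.android.safetycore", 0)
        true
    } catch (e: PackageManager.NameNotFoundException) {
        false
    }
}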

Re: Android System SafetyCore is being silently installed on Android devices

Reply #1
Not on GrapheneOS...

 

Re: Android System SafetyCore is being silently installed on Android devices

Reply #2
Quote
Google emphasizes that this scanning takes place locally on the device and that no images are sent to any server.
I can't find the source for this paraphrase, so I'm curious about the exact wording Google's representative used. It doesn't send images to Google's servers, but does it send something else? Metadata? Computer- or human-readable analysis? A latent encoding?!
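At the byte-count level at least, one crude way to see whether the package moves any data at all is to ask NetworkStatsManager for its traffic. This is only a sketch: it assumes the reported package name com.google.android.safetycore, needs the PACKAGE_USAGE_STATS special access, and shows volumes only, not what is inside them:

Code: (Kotlin)
import android.app.usage.NetworkStats
import android.app.usage.NetworkStatsManager
import android.content.Context
import android.net.ConnectivityManager

// Sums Wi-Fi bytes received/sent by the SafetyCore UID over a time window.
// Requires PACKAGE_USAGE_STATS special access; returns (rxBytes, txBytes).
fun safetyCoreWifiTraffic(context: Context, startMs: Long, endMs: Long): Pair<Long, Long> {
    val uid = context.packageManager
        .getApplicationInfo("com.google.android.safetycore", 0).uid
    val nsm = context.getSystemService(NetworkStatsManager::class.java)
    val stats = nsm.queryDetailsForUid(
        ConnectivityManager.TYPE_WIFI, null, startMs, endMs, uid
    )
    var rx = 0L
    var tx = 0L
    val bucket = NetworkStats.Bucket()
    while (stats.hasNextBucket()) {
        stats.getNextBucket(bucket)
        rx += bucket.rxBytes
        tx += bucket.txBytes
    }
    stats.close()
    return rx to tx
}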
Quote
"Classifying things like this is not the same as trying to detect illegal content and reporting it to a service," GrapheneOS said. "That would greatly violate people's privacy in multiple ways and false positives would still exist. It's not what this is and it's not usable for it."
This is different from what I thought I knew. Doesn't nudity and violence detection use the same technology as CSAM detection, and necessarily overlap with it? How is it possible that an image classifier is "not usable" for notifying authorities of CSAM possession, or at least of what a machine thinks is CSAM?
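To make my confusion concrete, here is a purely hypothetical sketch of what a label-only, on-device flow presumably looks like; none of these names are real SafetyCore APIs:

Code: (Kotlin)
import android.graphics.Bitmap

// Purely hypothetical sketch; none of these types or functions are real SafetyCore APIs.
enum class ContentLabel { NONE, NUDITY, VIOLENCE }

// Stand-in for on-device model inference; a real implementation would run a local ML model
// and never touch the network anywhere in this path.
fun classifyLocally(image: Bitmap): ContentLabel = ContentLabel.NONE

// The label is consumed locally (e.g. to decide whether to blur a thumbnail) and discarded.
fun shouldBlurThumbnail(image: Bitmap): Boolean =
    classifyLocally(image) != ContentLabel.NONE

Nothing in that path leaves the device, but I don't see what about the classifier itself would stop a future update from wiring its output into a reporting step.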

I don't see why ordinary people need NSFW spoilers on their own photo galleries. Google is usually careless with the excuses it peddles to justify its innovations in clandestine surveillance, but this is a new level of insulting the public's intelligence. I think I've read in books how things like this always go. First they say it's for vetting "unwanted content"; next they'll say it's for your safety and convenience that the computer vets it for you. Perfectly mundane images will silently vanish, just as Windows Defender makes "PUPs" vanish. Then the definition of "unwanted content" changes so that Google can help "protect democracy," and then... you get the picture.