Apple is going to scan the photos on your phone looking for child abuse pictures

kdonnel

DVC-BCV
Joined
Feb 1, 2001
Rumors are that Apple is going to start looking through your photos for child abuse material on your phone. These are not photos you have uploaded anywhere, but photos you have taken or stored on your phone that only ever remain on your phone.

https://9to5mac.com/2021/08/05/report-apple-photos-casm-content-scanning/

The phone will create a hash of each photo on your phone and compare that hash to a list of hashes of known child abuse images.

A hash is a short value calculated from the photo's data by an algorithm, effectively a unique fingerprint. If you apply the same algorithm to two identical photos, you will get the same value.
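
To make that concrete, here is a minimal sketch of hash-and-compare matching. It assumes an ordinary cryptographic hash (SHA-256) and a made-up `KNOWN_HASHES` list; the real system would use its own hash function and database, so treat this as an illustration only.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known-image hashes (hex strings); made up here.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_of_file(path: Path) -> str:
    """Hash a file's raw bytes with SHA-256."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_flagged(photo: Path) -> bool:
    """True only if the photo's bytes hash to a value on the list."""
    return sha256_of_file(photo) in KNOWN_HASHES
```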

Turning off iCloud will not stop this process as it is occurring on the phone and not in the cloud.

Questions remain about how positive matches will be handled. Will a person then gain access to the photo on your phone to confirm it is a true match? Will a search warrant be obtained for that check? When in the process will the phone owner be notified that a verification check is occurring? Exactly when will authorities be notified?

I'm all for stopping child abuse, but this seems like a very Big Brother act that could be applied to so many other situations where control is desired.

The entire process is also easily defeated. Just slightly edit the photo and the hash changes. This would only catch the truly stupid child abusers.
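
To illustrate that point (a toy sketch, not Apple's actual pipeline): with a cryptographic hash, changing even one byte of a file produces a completely different digest, so an exact-match check is trivially evaded.

```python
import hashlib

original = b"...photo bytes..."   # stand-in for a real image file's contents
edited = original + b"\x00"       # change just one byte

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(edited).hexdigest())
# The two digests are completely different, so an exact-hash match
# would miss the edited copy entirely.
```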

Seems really strange coming from a company that claims to be all for privacy.

Thoughts?
 
I do find it a bit disturbing, especially in the age of telehealth: your kid trips on the playground, you send a pic of the injury to the pediatrician asking whether to make an appointment, and then Apple flags you for having caused the injury?
 
I have a lot of mixed feelings, but I have some very brief experience working with agents searching for child exploitation material. If the hashes do match, they will get a warrant before viewing. I doubt the phone owner will be notified that a verification check is occurring until afterward. I imagine authorities would be notified the second a hash is matched.

Regarding editing the photo to change the hash, I believe there are ways to detect similarity (so-called perceptual hashes), but I'm not sure whether the code they'll be using has that built in.
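
For what that could look like, here is a minimal sketch of one common similarity technique, average hashing, which is not necessarily what Apple would use. The image is shrunk to an 8x8 grayscale grid and reduced to a bit pattern, so small edits flip only a few bits, and near-duplicates can be found by counting differing bits.

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Perceptual 'aHash': shrink to a tiny grayscale grid, then set one
    bit per pixel depending on whether it exceeds the mean brightness."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits; a small distance means visually similar."""
    return bin(a ^ b).count("1")

# Hypothetical usage: a lightly edited photo typically stays within a few
# bits of the original, unlike with a cryptographic hash.
# if hamming_distance(average_hash("a.jpg"), average_hash("b.jpg")) <= 5: ...
```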
 
I do find it a bit disturbing, especially in the age of telehealth: your kid trips on the playground, you send a pic of the injury to the pediatrician asking whether to make an appointment, and then Apple flags you for having caused the injury?
This would not likely be flagged. Per the original post: "compare that hash to a list of hashes of known child abuse images"
 
I am an open book. Look at all the pics you want - it's just going to be cats and trees, anyway. Occasionally my kids get along long enough for me to snap a pic of them.

Nothing to hide, nothing to fear. Probably something in your contract with them that allows this, I assume?
 
Didn't we already see how badly the analog version of this idea can go, with parents getting visits from the authorities for taking bathtime pictures of their babies/toddlers that were flagged by well-meaning photo clerks? As much as I'd like to think that technology could make it easier to sort those kinds of innocent situations from genuine wrongdoing, I don't really have that sort of faith in algorithms and tech-moderated judgment.
 
I am an open book. Look at all the pics you want - it's just going to be cats and trees, anyway. Occasionally my kids get along long enough for me to snap a pic of them.

Nothing to hide, nothing to fear. Probably something in your contract with them that allows this, I assume?
"Nothing to hide, nothing to fear" is equivalent to saying that only criminals need privacy, or that we are all worthy of suspicion until proven otherwise.

https://www.aclu.org/blog/national-...ve-nothing-hide-you-still-have-something-fear
 
How interesting....
I have (repeatedly) had to tell my BIL & SIL to STOP sending us pictures of our niece that they think are cute and we should see. Sometimes she is in the bath or running around the backyard sans bathing suit. It's innocent enough on their end, but they've gotten downright p-ed off when I've told them they have to stop sending us those photos! We don't want them! They see no problem with it, but you just can't do that in this day & age. They actually said, "How could that get us in trouble, we are her parents?" I tried to explain that that doesn't matter... deaf ears.
 
Thank God my new phone case just came in :rolleyes1

 
I am an open book. Look at all the pics you want - it's just going to be cats and trees, anyway. Occasionally my kids get along long enough for me to snap a pic of them.

Nothing to hide, nothing to fear. Probably something in your contract with them that allows this, I assume?
Right. Tell that to Ryan Ferguson or any of the people who were convicted of a crime and spent years in jail, only to be exonerated later when it was proven someone else committed the crime.
 
Oh shoot. I need to move all the raunchy photos and homemade videos of my SO to Dropbox or something.
 
Ever see those crying baby masks come Halloween time? I'm curious if taking a photo of a shelf load of Halloween masks will immediately trigger this system, thinking a baby is being attacked. Specifically the "baby being eaten by zombie, vampire, Trump, Biden, and a giant poop emoji" trigger.
 
Well crap. DH showed someone we haven't seen in 17 years my picture the other day and they thought I was his daughter. :scared:
 
Rumors are that Apple is going to start looking through your photos for child abuse material on your phone. These are not photos you have uploaded anywhere, but photos you have taken or stored on your phone that only ever remain on your phone. …

This is stupid and illegal. But I guess laws don’t apply in our world anymore.
 
