Apple will soon scan your phone photos

To start off, I applaud Apple for trying to protect children and doing something about Child Sexual Abuse Material (CSAM). I just find their approach scary, along with the ramifications it might have.
In iOS 15 and iPadOS 15, Apple will put a list of hashes of known CSAM material on your device, and when you upload photos to iCloud, Apple will check them against that list and flag any photo whose hash matches known CSAM material. Apple says it will never scan photos on your device unless you upload them to iCloud.
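As a very rough sketch of the matching idea: checking an upload against a list of known hashes is, conceptually, a set lookup. Everything below is hypothetical illustration; in Apple's actual design the list ships on-device in a blinded form and matching happens through private set intersection, so neither a readable set nor SHA-256 is what Apple really uses.

```python
import hashlib

# Illustrative stand-in for the known-CSAM hash list. In Apple's design the
# list is stored on-device in blinded form, so the device can't read it.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_hash(path: str) -> str:
    """SHA-256 hex digest of the file's raw bytes (stand-in for the real hash)."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def is_match(path: str) -> bool:
    """A photo is flagged when its hash appears in the known list."""
    return file_hash(path) in KNOWN_HASHES
```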
If you edit a photo, it gets a new hash, so Apple uses a different kind of hashing, one designed to recognize the same photo even after edits, and that comes with an error rate. Apple claims: "the likelihood that the system would incorrectly identify any given account is less than one in one trillion per year."
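The reason a plain cryptographic hash won't do: change one pixel and the SHA-256 digest changes completely. Apple's algorithm is a neural-network-based perceptual hash called NeuralHash; as a simpler stand-in, here is a classic difference hash (dHash), which maps visually similar images to nearby hashes. Nearby, not identical, which is exactly where the error rate comes from: a tolerance loose enough to catch edited copies can occasionally match unrelated images.

```python
from PIL import Image  # Pillow

def dhash(path: str, hash_size: int = 8) -> int:
    """Perceptual difference hash: survives resizing and small edits."""
    # Shrink to a tiny grayscale grid; fine detail (and thus most edits)
    # is thrown away, so similar images land on similar grids.
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    px = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            # One bit per adjacent-pixel brightness comparison.
            left = px[row * (hash_size + 1) + col]
            right = px[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | int(left > right)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of bits in which two hashes differ."""
    return bin(a ^ b).count("1")

# Two photos "match" when their hashes differ in only a few bits, e.g.
#   hamming(dhash("original.jpg"), dhash("resized_copy.jpg")) <= 5
```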
If enough of your material is flagged (Apple has not specified the threshold), your account will be reported to Apple, who will then review the flagged material and, if it is confirmed to be CSAM, report it to the authorities.
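Apple's technical summary describes the threshold as cryptographically enforced via threshold secret sharing: below the threshold, Apple cannot decrypt anything about individual matches. Stripped of the cryptography, the decision logic amounts to something like this (the threshold value and names here are hypothetical):

```python
REPORT_THRESHOLD = 30  # hypothetical; Apple has not published the actual value

def account_action(match_count: int) -> str:
    """Nothing is revealed to Apple until the match count crosses the threshold."""
    if match_count < REPORT_THRESHOLD:
        return "no action: match details remain cryptographically sealed"
    return "human review; report to authorities if the matches are confirmed"
```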
Backlash
Because of all the backlash Apple has received about this, they've released a six-page FAQ [↗], indicating Apple wasn't clear enough in its initial communication. In it, they state that the hash list will be auditable, that they'll never allow the system to be used for anything other than CSAM, and that Apple will only use hashes that appear in the databases of two or more child safety organizations.
If you turn off iCloud photo uploads, your photos will not be scanned for CSAM hashes. If it is on, existing iCloud photos and all newly uploaded photos will be scanned. The feature only works with iOS/iPadOS paired with iCloud.
Apple has said this feature will be limited to the U.S.
What about the other cloud companies?
Other cloud storage companies already scan for CSAM material on everything stored on their servers. If Apple had just amended their iCloud terms without this big announcement, and branded this more clearly as an iCloud feature (which it is), I believe there wouldn't have been as much misunderstanding and backlash.
My opinion
It's a good thing that they're trying to catch predators and stop the spread of CSAM, but what else does this open the door to down the line? Today it's CSAM, tomorrow it's terrorism, and after that political opinions, etc. While I might not be too afraid of Apple itself going after political opinions and the like, if Apple starts doing this, especially in the way it was initially portrayed, it sets a precedent that other companies might follow, or even be required by law to follow.