As technology has become more ubiquitous, the tension between privacy rights and the use of technology in criminal activities has grown more contentious. Children are among the most vulnerable populations, and the rise in child sexual abuse material (CSAM) is deeply concerning. Law enforcement agencies and others have long called for better ways to detect CSAM. Recently, Apple introduced a new methodology for detecting CSAM uploaded to iCloud accounts. You can read the Wired story here: https://www.wired.com/story/apple-csam-detection-icloud-photos-encryption-privacy/

Those seeking better ways to find these materials see Apple's announcement as a win and a positive step in combating child exploitation. Others have raised valid questions about ways this process could be used beyond its stated purpose. Especially concerning is how this type of detection scheme could potentially be used by governments for purposes beyond CSAM detection.

What do you think? For this discussion, be sure to support your opinion with citations. Your post should not just be "I think..." Please explain why you feel this way and support your assertions with citations.