Apple Inc. plans to scan iPhones in the U.S. for imagery showing child abuse, the Financial Times reported Thursday. The system represents a powerful use of technology to catch violent and sexual crimes but also raises startling questions about privacy and corporate surveillance of millions of people’s phones.

The company outlined its proposed tool, known as neuralMatch, to U.S. academics this week, the newspaper reported, citing two unidentified security researchers. The tool would alert human reviewers to potentially illegal images, and that team would notify law enforcement, according to the report. Apple didn’t immediately respond to a request for comment.