Snapchat faces UK investigation over AI chatbot's potential child privacy risks
Snapchat is under investigation by the UK's Information Commissioner's Office (ICO) over concerns that its artificial intelligence (AI) chatbot, My AI, may pose privacy risks to children. The ICO has issued a preliminary enforcement notice to Snapchat, warning that the company could face a multi-million pound fine for allegedly failing to assess the privacy risks the chatbot poses to its users, particularly children.
The ICO's initial findings suggest that Snapchat's owner, Snap, did not adequately identify and assess the risks to its several million UK users, including children aged 13 to 17. Information Commissioner John Edwards said the preliminary findings indicate a concerning failure by Snap to assess the privacy risks to children and other users before launching My AI.
However, the regulator stressed that these findings do not necessarily mean that Snapchat, whose user base skews young, has breached British data protection law, or that the ICO will ultimately issue an enforcement notice. Snap said it is reviewing the ICO's notice and reiterated its commitment to user privacy.
The ICO emphasized that the findings are provisional and that Snap can submit evidence to rebut them. If the company fails to do so, the My AI feature could be banned in the UK.