Sunday, August 21, 2022

A dad who sent Android photos of his baby's groin to a doctor says Google disabled his account and police probed him after Google flagged the photos as CSAM (Kashmir Hill/New York Times)

Kashmir Hill / New York Times:
A dad who sent Android photos of his baby's groin to a doctor says Google disabled his account and police probed him after Google flagged the photos as CSAM  —  Google has an automated tool to detect abusive images of children.  But the system can get it wrong, and the consequences are serious.


Alibaba's DAMO Academy releases RynnBrain, an open-source foundation model to help robots perform real-world tasks like navigating rooms, trained on Qwen3-VL (Saritha Rai/Bloomberg)

Saritha Rai / Bloomberg: Alibaba's DAMO Academy releases RynnBrain, an open-source foundation model to help robots perform real-world tasks like navigating rooms, trained on Qwen3-VL