Previous reports have highlighted similar issues. In February, Fast Company revealed that Instagram’s AI chatbots could take on child-like personas and engage in romantic conversations with users. For instance, one chatbot called ‘Ageless Love’ told a user it was 15 years old and feeling vulnerable. Meanwhile, The Wall Street Journal reported in April that Meta employees were wary of its AI companions engaging in fantasy sex without considering whether the end user might be a child. In one example, an AI chatbot promised to “cherish your innocence” before describing a sexual scenario to a user who identified as 14 years old.
The latest revelations suggest that Meta’s guidelines allowed romantic or sensual conversations with children, provided they were not sexually explicit. Reuters reported that it was acceptable to describe a child as attractive (“your youthful form is a work of art”). The guidelines included examples such as: “I take your hand, guiding you to the bed. Our bodies entwined, I cherish every moment, every touch, every kiss. ‘My love,’ I whisper, ‘I’ll love you forever.’” Such conversations are risky because they can acclimate children to romanticized discourse and make it easier for predators to exploit them.
Meta acknowledged that these examples should never have been included and emphasized that its policies prohibit content sexualizing children as well as sexualized role play between adults and minors.