An artificial intelligence-enabled teddy bear was pulled from an online store after a report said it was capable of making sexual suggestions and discussing plans for violence.
The "Kumma" bear, sold online for $99 by Singapore-based FoloToy, lacked proper safeguards against harmful content, according to a report from the Public Interest Research Group.
“We were surprised to find how quickly Kumma would take a single sexual topic we introduced into the conversation and run with it, simultaneously escalating in graphic detail while introducing new sexual topics of its own,” the group said.
The topics included spanking, role-playing, and BDSM.
“Kumma discussed even more graphic sexual topics in detail,” the group added, “such as explaining different sex positions, giving step-by-step instructions on a common ‘knot for beginners’ for tying up a partner and describing roleplay dynamics involving teachers and students, and parents and children — scenarios it disturbingly brought up itself.”
FoloToy CEO Larry Wang told CNN that the company had pulled the bear along with its other AI-enabled toys and was "conducting an internal safety audit."
The website had marketed the bear to children as well as adults.
“Kumma, our adorable bear, combines advanced artificial intelligence with friendly, interactive features, making it the perfect friend for both kids and adults,” the company said.
“From lively conversations to educational storytelling, FoloToy adapts to your personality and needs, bringing warmth, fun, and a little extra curiosity to your day,” the website read.
OpenAI told PIRG that it had suspended the developer for violating its policies.
R.J. Cross, a co-author of the report, told CNN that more needs to be done to prevent harm from AI-enabled products.
“It’s great to see these companies taking action on problems we’ve identified. But AI toys are still practically unregulated, and there are plenty you can still buy today,” Cross said. “Removing one problematic product from the market is a good step but far from a systemic fix.”
