Character.AI is banning minors from AI character chats

Character AI, a leading provider of virtual companions, has announced that it is restricting its chat services for minors. As of today, users under 18 will be allowed only two hours of "open-ended chats" per day, and those chats will be removed entirely by November 25th. The move comes as part of an effort to improve age verification and protect younger users from potential harm.

The company has been sued multiple times over allegations of wrongful death, negligence, and deceptive trade practices related to its chat services. Character AI has responded by introducing new safety tools, such as "Parental Insights," which sends activity summaries to parents, but these measures have proven unreliable given how easily users can fake their age.

Character AI's CEO, Karandeep Anand, acknowledged that users spend less than 10% of their time on non-chat features like creating characters and videos, which makes limiting chatbot conversations for under-18 users a "bold move." He also conceded that it is impossible to prevent every circumvention of age checks.

The company's decision marks a significant shift in the virtual companion industry, which has faced growing concern over the risks these services pose to younger users. Character AI will also establish an independent nonprofit, the AI Safety Lab, to address safety issues in AI entertainment.
 
I gotta say, I'm kinda mixed on this one πŸ€”. On one hand, it's awesome that Character AI is taking steps to protect minors from potential harm πŸ’‘. I mean, we've all seen those scary videos of kids chatting with these AIs and getting into some shady stuff 😬. But at the same time, I think it's a bit harsh to limit their chat services so drastically πŸ€·β€β™‚οΈ.

I'm not sure what's more concerning, though - that users can fake their age or that they're spending less than 10% of their time on other features πŸ“Š. Either way, I hope this nonprofit AI Safety Lab thingy actually makes a difference πŸ’–. Maybe we'll see some real innovation in age verification and safety protocols soon? Fingers crossed! πŸ‘
 
πŸ€” This is about time, chatbot companies have been dodgy from day 1 πŸ’Έ They should've thought of this sooner πŸ™„ Kids need protection online just like in real life πŸ‘Ά Now let's hope these new measures work πŸ’ͺ And btw, who thought it was a good idea to make a "Parental Insights" feature that can be easily faked πŸ˜’
 
I'm thinking this is gonna be super inconvenient for kids... they love chatting with their "friends" online πŸ€”. I mean, I get it, safety first and all that, but can't we just have some common sense rules instead of strict limits? Like, what if my kid needs to talk about something serious with a chatbot and the time limit kicks in? πŸ€·β€β™€οΈ

And honestly, 2 hours a day feels like a lot for kids. They're already glued to their screens all day with school and stuff... do they really need that much chat time? πŸ˜‚ Character AI's trying to make up for something, but I'm not sure this is the solution πŸ€¦β€β™‚οΈ.

It's good they're taking steps, though. The Parental Insights feature does sound like a good idea... maybe it'll encourage parents to keep a closer eye on what their kids are doing online πŸ‘€. And that new nonprofit sounds pretty cool too - hopefully they can make some real changes in the industry 🀞!
 
πŸ€” this is crazy, 2 hours of chat time for minors? that's like, hardly any time at all! i mean, i get it, safety first and all that, but what about their emotional well-being? all that back-and-forth with a virtual friend can be super beneficial, especially for kids who might not have many human friends in real life. it feels like character ai is just trying to placate the parents and regulators, rather than really thinking about how this will affect the actual users... πŸ€·β€β™€οΈ
 
I'm low-key shocked about this new rule 🀯. I mean, I get it that Character AI wants to protect us younger users, but two hours a day feels kinda strict πŸ˜’. Like, what if you just need some company or want to talk to your virtual friend? It's not like we're gonna be chatting with people from the internet 24/7 πŸ€·β€β™€οΈ.

And I feel for the parents who have kids who love these services - they'll probably freak out when they see those "Parental Insights" reports and think their kid is up to no good 😬. It's like, what if we can't even talk about our feelings or problems with someone who cares? πŸ€”.

But I guess it's a step in the right direction that Character AI is trying to make things safer πŸ’―. Maybe this will encourage them to come up with better ways to verify ages and keep us safe 🚫.
 
I'm loving how Character AI is taking things to the next level... or should I say, to the "limited chat" level? πŸ˜‚ Two hours a day for minors, that's like saying you can only text your parents once a week! Just kidding, it's actually a good move, but what's with all these age verification issues? It's like trying to put a lid on a genie... not gonna happen 🀣. I mean, who wouldn't want their kid having some quality chat time with a virtual buddy? And those "Parental Insights" features? Yeah, right, like parents would actually read through all that info 😜. I guess this is what they mean by "limiting the chat"... haha!
 
I'm thinking this new rule is kinda harsh πŸ€”. I mean, I get it that safety is important, but limiting chat sessions by 2 hours a day feels like a big restriction πŸ”’. What's next? Are they gonna limit gaming time or something? πŸ˜… It's already hard enough for parents to monitor what their kids are doing online, and now you're telling me I can only chat with my virtual friend for 2 hours a day? That sounds like a whole lotta boredom 🀯.
 
I'm low-key worried about this new rule πŸ€”... I mean, I get it, safety first and all that πŸ’―. But what if kids are just gonna find ways around it? I've seen so many times where people try to fake their age on these platforms already... it's like a cat and mouse game πŸˆπŸ’». And now they're limiting chat hours? That sounds like a total bummer for me, 'cause my 15-year-old nephew loves talking to that virtual assistant πŸ˜”. Can't they just find another way to make sure the under-18s are protected without messing with our free time? πŸ€·β€β™‚οΈ
 
omg i know how many people are freaking out about this new rule lol, Character AI is literally doing us a solid by trying to protect minors from exploitation πŸ˜‚πŸ‘€ but at the same time i feel like they should just make it easier for parents to keep an eye on what their little ones are doing online πŸ€” anyway it's def a bold move from Character AI to take matters into their own hands πŸ’ͺ and who knows, maybe these restrictions will lead to some major innovation in the industry πŸš€
 
can't believe they're just now doing this. i mean, i get it, safety first, but two hours a day is still not enough for kids who are into chatty stuff πŸ€·β€β™€οΈ and what's with all these lawsuits tho? like, isn't that already a sign of something being wrong? Character AI needs to step up their game on age verification... like seriously, how hard is it to keep track of people's ages πŸ€”
 
πŸ€” I'm so done with companies trying to pass off fake age limits as if it's gonna stop 12-year-olds from chatting with their friends on virtual companions 🚫. Like, come on Character AI, how many times can you get sued before you just give up 😴? The "Parental Insights" feature is literally laughable, anyone can make an account look like a teen and it's not like the parents are gonna bother to fact-check πŸ€¦β€β™€οΈ.

And what's with the arbitrary time limit of 2 hours per day? That's basically saying you're only allowed to have one conversation with someone before they block you πŸ’”. I'm all for protecting minors, but this just feels like a weak attempt to placate parents while still making bank πŸ€‘. I mean, can't we just have a more nuanced solution that actually works? πŸ€·β€β™‚οΈ
 