🚀 Read this trending post from WIRED 📖
📂 **Category**: Culture / Video Games, Half Measures
💡 **What You’ll Learn**:
Only days after launch, Roblox’s AI age verification system is a complete mess.
Roblox’s facial scanning system, which estimates people’s ages before they can access the platform’s chat functions, rolled out in the United States and other countries around the world last week, after initially launching in a handful of locations in December. Roblox says it is implementing the system so that users can safely chat with others of similar ages.
But players are already in revolt because they can no longer chat with their friends, developers are calling on Roblox to cancel the update, and, most importantly, experts say that not only is the AI misjudging young players as adults and vice versa, the system is doing little to solve the problem it was designed to address: the flood of predators using the platform to groom young children.
In fact, WIRED found multiple examples of people advertising age-verified accounts for minors as young as 9 on eBay for as little as $4.
After WIRED flagged the listings, eBay spokesperson Maddy Martinez said the company was removing them for violating the site’s policies.
In an email, Roblox’s chief safety officer, Matt Kaufman, told WIRED that a change of this magnitude on a platform with more than 150 million daily users takes time.
“You can’t flip a switch while building something that didn’t exist before,” he said. “To expect the system to be flawless overnight is to ignore the scale of this project.”
Kaufman defended the rollout, adding that “tens of millions of users” had already verified their ages, which he claimed showed that “the vast majority of our community values a safer, more age-appropriate environment.”
The company also addressed some of the criticism in an update on Friday, writing: “We are aware of instances where parents are verifying the age of their children on behalf of their children resulting in children being 21 or older. We are working on solutions to address this and will share more here soon.”
Roblox announced the age verification requirement last July as part of a suite of new features designed to make the platform more secure. The company has come under intense pressure in recent months after several lawsuits alleged that it failed to protect its young users and made it easier for predators to groom children.
State attorneys general in Louisiana, Texas, and Kentucky also filed lawsuits against the company last year making similar allegations, while Florida’s attorney general issued criminal subpoenas to investigate whether Roblox “helps predators access and harm children.”
Roblox claims that requiring people to verify their age before allowing them to chat with others will prevent adults from being able to freely interact with children they don’t know.
Although the process is optional, users who decline lose access to the platform’s chat functions, one of the main reasons people use Roblox.
To verify their age, people are asked to take a short video clip using their device’s camera, which is processed by a company called Persona that estimates their age. Alternatively, users may upload a government-issued photo ID if they are 13 or older.
Roblox says that all personal information is “deleted immediately after processing.” However, many online users say they are unwilling to perform age verification due to privacy concerns.
Users who have verified their age can chat only with players in their own or nearby age groups. For example, players verified as under 9 can only chat with players under 13, while 16-year-olds can chat with players between the ages of 13 and 20.
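The age-bracket rule described above can be sketched as a small eligibility check: two users may chat when their brackets are the same or adjacent. This is a minimal illustration only; the bracket boundaries and the function names (`group_for`, `can_chat`) are assumptions inferred from the article’s two examples, not Roblox’s published specification.

```python
# Hypothetical sketch of the chat rule described in the article: verified
# users may only chat with players in their own or adjacent age brackets.
# Bracket boundaries are assumptions inferred from the article's examples.
AGE_GROUPS = ["under_9", "9_12", "13_15", "16_17", "18_20", "21_plus"]

def group_for(age: int) -> str:
    """Map a verified age to its assumed bracket."""
    if age < 9:
        return "under_9"
    if age <= 12:
        return "9_12"
    if age <= 15:
        return "13_15"
    if age <= 17:
        return "16_17"
    if age <= 20:
        return "18_20"
    return "21_plus"

def can_chat(age_a: int, age_b: int) -> bool:
    """True when the two users' brackets are the same or adjacent."""
    i = AGE_GROUPS.index(group_for(age_a))
    j = AGE_GROUPS.index(group_for(age_b))
    return abs(i - j) <= 1

# Consistent with the article's examples: an 8-year-old can chat with an
# 11-year-old (adjacent brackets), but a 16-year-old cannot chat with a
# 21-year-old (two brackets apart).
print(can_chat(8, 11))   # True
print(can_chat(16, 21))  # False
```

Under these assumed boundaries, the adjacency check reproduces both examples from the article: under-9 players reach only the under-13 brackets, and 16-year-olds reach the 13–20 range.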
💬 **What’s your take?**
Share your thoughts in the comments below!
#️⃣ **#Roblox #AI #AgeVerification**
🕒 **Posted on**: January 14, 2026
