Government Legislation and Regulation Is Coming For Video Games
It's unclear whether the Kids Online Safety Act (KOSA) will survive Congress, but regulation focused on Big Tech is going to include video games. What happens next?
The recent banning of Roblox in Turkey in the wake of Bloomberg’s exhaustive, upsetting report about the platform’s ongoing issues with sexual predators feels like a stunt. But it seems to represent a shift in the winds, and it comes on the heels of the U.S. Senate swiftly passing the Kids Online Safety Act (KOSA), legislation in theory aimed at protecting a minor’s interactions with social media, with nearly universal support. A normally divided chamber passed KOSA near unanimously, 91 to 3.
As currently proposed, KOSA’s pitch on harm reduction comes in a few forms:
Establishing a “duty of care” requirement that would potentially hold social networking companies liable if they fail to filter out abusive or exploitative content for minors
Banning targeted advertising aimed at minors
Strengthening default privacy protections for minors
KOSA has passed the Senate but faces an uncertain future in the House of Representatives, where it’s entirely possible the bill goes to die.
KOSA has been pitched as a remedy for social media, but games are in the bill itself:
(3) COVERED PLATFORM.—The term “covered platform” means a social media service, social network, video game, messaging application, video streaming service, educational service, or an online platform that connects to the internet and that is used, or is reasonably likely to be used, by a minor.
In other words, the Robloxes of the world.
It’s not surprising, then, to learn Roblox has already met with Congress about KOSA, according to a report by Tech Brew. In the piece, Roblox public policy head Nicky Jackson Colaco said its conversations were largely around how KOSA determines harmful content, out of fear the legislation is “overreaching, or it might have more unintended consequences” and have a “chilling effect on imagination and creativity.”
There’s a reason the Entertainment Software Association, a trade organization representing the video game industry, released a statement broadly supporting KOSA.
“The video game industry has long been a leader in children’s online safety,” reads the ESA statement from June, “from the development of age ratings more than 30 years ago to the use of advanced technologies that empower parents and players to manage their game play experiences. Our member companies are committed to creating safe and fun experiences for the entire player community, especially our youngest players. We share the goal of protecting kids online, and we look forward to continuing our work with legislators on the Kids Online Safety Act to ensure that the final legislation reflects the pioneering online safety innovations developed by our industry.”
In other words: if it’s happening, they want a say in how it impacts video games.
“I'd expect it to only gain momentum and that games will attract an unnecessary amount of attention,” said Joost van Dreunen, teacher at the NYU Stern School of Business and author of ONE UP: Creativity, Competition, and the Global Business of Video Games. “Historically that's been the case during election years anyway.”
Some of the stories driving KOSA are nightmarish. Death. Bullying. Accidental overdoses. Sexual assault. The government should find ways to protect minors online.
“We’ve been closely tracking the criticisms of the bill from both advocacy groups and children themselves, who say that the bill risks giving the federal government a bit too much power to decide exactly what content it thinks is harmful,” said Kieran Donovan, CEO of k-ID, a company building technology for developers and parents to have more control over playing games and help creators track changing regulations. “We will see how this bill develops compared to regulations that have come into effect this year in the UK and EU. Either way, if a game is likely to be accessed by children, it must offer an appropriate version of its service to younger users.”
But KOSA has credible critics, and there’s reason to be suspicious about its potential impacts. Who, for example, determines “harm”? In this case, some of the Republican supporters of KOSA, including its co-writer, argue it’s “protecting minor children from the transgender in this culture.” Some of the fiercest opponents of KOSA have been young people, especially those in the L.G.B.T.Q. community, who fear these tools will be immediately wielded to suppress speech. Their grievances are legitimate.
“KOSA would allow the government to pressure social media platforms to erase content that could be deemed ‘inappropriate’ for minors,” reads an organizing plea on the website Stop KOSA, specifically citing, among other concerns, how the bill could limit L.G.B.T.Q. youth from finding supportive communities online. “The problem is: there is no consensus on what is inappropriate for minors. All across the country we are seeing how lawmakers are attacking young people’s access to gender affirming healthcare, sex education, birth control, and abortion.”
There have been some changes made to KOSA over age-gating requirements and the ability of attorneys general in right-leaning states to weaponize anti-L.G.B.T.Q. beliefs, but organizations like the Electronic Frontier Foundation remain against it.
“EFF does not have all the right answers regarding how to address the ways in which young people can be harmed online,” reads the EFF’s analysis, which is exceptionally long but very much worth reading. “Which is why we agree with KOSA’s supporters that the government should conduct much greater research on these issues.”
It’s nearly impossible to find anyone who believes the government shouldn’t do anything. The question is what the government—any government—should do. It was a topic I explored with my podcast co-host, Keza MacDonald, on a recent Spawnpoint.
There are no easy answers, and the wrong answers could do more harm than good.
“There is a ‘policy triangle’ that is often discussed in the world of trust and safety,” said Rachel Kowert, a research psychologist focused on games. “The points of the triangle are privacy, safety, and self-expression. If you pull on one end of the triangle, you are inevitably losing ground on the other two. I am concerned that the current proposed regulatory approaches seem to be pulling almost exclusively on safety at the cost of privacy and self-expression.”
Which gets back to the question at the heart of KOSA’s critics: what defines “harm”?
“It's easy and intuitive to say ‘protect privacy and safety please!’” said Mike Pappas, CEO of Modulate, a company that builds tools to aid game companies with safer and more effective online moderation. “It's understandable why legislators would make this mistake. But the truth on the ground is that this is just like saying ‘I want the purplest, greenest car you've got.’ It can't really be both maximally purple and maximally green—there's a tradeoff someone needs to make, and if legislators are not sophisticated enough to make it, then we end up in a situation where each platform is guessing for themselves how the courts will interpret this tradeoff—which leads to a lot of confusion, a lot of unwarranted costs, and ultimately a less effective bill.”
Pappas believes KOSA identifies real problems, notably “addictive design, problematic recommended content, and platforms neglecting their duty of care.” Any parent who has engaged with the “tools” on these platforms is aware of how flimsy many of them are.
Is it possible to search through the accessible video history of your child’s YouTube account? Yes. Would most parents rather YouTube help protect their kids, instead of making them scroll through lists that can run to hundreds of videos per week? Also yes.
“I think they've gotten wrong the way to solve those problems, which is unsurprising because these problems are really genuinely hard, not just from a technical standpoint but also as a systems-design problem,” said Pappas. “At what point does a feature become ‘addictive’ compared to just being ‘fun’? How do you design for ‘fun’ without ever allowing anything resembling ‘addictiveness’?”
As Keza suggested on the podcast: hey, you could at least start by banning loot boxes.
Designing tools is hard, but transparency is easier, Pappas argues. For example, require platforms to be clear about the data they collect and how they recommend content to minors, and to explain to parents how they try to keep children safe.
Some of these ideas overlap with another tech-focused bill in Congress, COPPA 2.0. That bill focuses on data collection and controlling your personal information online.
“It's possible to be transparent while distilling things down into something meaningful and easy to grasp for the average person,” said Pappas. “The ESRB does this with age ratings for games.”
What the ESRB doesn’t provide recommendations on, however, is online experiences, because the ESRB is judging the box. That’s not its fault; what happens online is in the hands of the companies running those experiences.
“To make it easy for parents to determine which platforms are safe, they need a way to understand the safety/privacy features of each platform,” he explained. “Which in turn means they'd ideally get those answers using a simple vocabulary shared by each online platform. And means these answers must be available somewhere, either in a centralized location or in an easy-to-find place on each platform. Those challenges—uniform vocabulary, obvious places to go to get information—those are exactly the sort of coordinating-many-groups challenges that governments are made for!”
The question remains whether KOSA, COPPA 2.0, or whatever’s next can actually protect children and earn the trust of parents, or if it’s all political grandstanding.
Have a story idea? Want to share a tip? Got a funny parenting story? Drop Patrick an email.
Also:
I bring up the YouTube stuff because it came up in the house recently, when I loaded up my oldest’s “subscribed” channels and found one or two that made me raise an eyebrow. I wish I could get notified whenever she subscribes to one.
You can, for the record, block a child from subscribing to a channel, but that requires a level of proactive attention most parents won’t have time for.
I suppose the most reasonable expectation is that some legislation like this eventually passes, but in the process of being amended so it can’t be weaponized against the people it’s meant to protect, it also ends up sanded down to the point of doing nothing.