House Energy and Commerce Examines Online Children’s Privacy During Pandemic

A key subcommittee turns to children's online privacy issues that could be part of federal privacy legislation.

As part of its examination of privacy issues and online education during the pandemic, the House Energy and Commerce Committee's Consumer Protection and Commerce Subcommittee held a hearing titled “Kids Online During COVID: Child Safety in an Increasingly Digital Age.” The focus was on how well the Federal Trade Commission’s (FTC) regulations and enforcement of privacy statutes like the “Children’s Online Privacy Protection Act” (COPPA) are meeting the challenges children face in their educational and recreational online environments. It is possible that proposed changes to COPPA and/or FTC authority that result from this and other hearings will get folded into a national privacy bill, especially given that this subcommittee would be the primary venue in which such legislation would likely be developed and written.

In a briefing memorandum prepared for the hearing, the majority staff laid out the policy backdrop of the hearing:

    • Children and teens are particularly vulnerable to the negative effects of digital marketing and persuasive design. Children have difficulty distinguishing ads from other content, and today’s digital ecosystem often intentionally blurs content and advertising, adding to the confusion for kids. Frequent use of interactive media online is associated with addiction, anxiety, depression, suicide, sleep deprivation, obesity, and aggression.
    • Children’s Privacy. Despite existing regulations, data collection and third-party tracking are prevalent on children’s apps and apps commonly used by children, which allows for profiling and behavioral advertising targeting children. Like many adults, children and teens do not fully understand how their personal information may be collected and used by companies, but they may be less aware than adults of the consequences of such collection.
    • Manipulation: Persuasive Design, Dark Patterns, and Influencer Marketing. Dark patterns are design features used to manipulate users into behavior profitable for a company but contrary to user intent. Many kids’ apps include features such as cartoon characters, free versions with teasers of the paid version (“freemium apps”), video ads that interrupt play, encouragement of in-app purchases, and prompts for social media sharing. Another example is “gamification,” in which elements of game play (e.g., point scoring, competition, etc.) are used as a marketing technique, often to encourage in-app purchases. “Unboxing” videos featuring so-called “kidfluencers” opening packaged items before discussing them in detail are also popular and often lack disclosures for sponsored content.
    • Exposure to Harmful Content. Exposure to advertising is associated with unhealthy behaviors—such as consumption of unhealthy food and beverages, tobacco products, and electronic cigarettes—yet unhealthy items feature heavily in online videos popular with kids. Inappropriate content has appeared on video websites intended for children, including children’s characters in violent or sexual situations, conspiracy theory videos, and other age-inappropriate content. In addition, cyberbullying has increased with increased online time during the pandemic.

However, either Republicans did not receive this memorandum or they employed the old political tactic of answering not the question asked but the question they wished had been asked, for they focused on the decision in a number of states to keep schools closed during the pandemic. The top Republicans paid scant attention to privacy and online issues as they relate to children and instead sought to emphasize what they framed as the harm children are suffering during online education. Their witness did not address the topic of the hearing either, instead advocating for “school choice” and blaming teachers unions for school closures.

Chair Frank Pallone Jr (D-NJ) (watch his opening statement and read his full written statement) stated:

  • As this subcommittee has heard time and time again, consumers online face manipulative advertising, disinformation, harassment, dark pattern manipulation, and privacy intrusions. For adults, these dangers are extremely hard to manage, but for children such practices are downright predatory. Children do not possess the same levels of cognitive development to defend themselves and are often uniquely vulnerable to any negative effects.
  • The online world can affect children’s mental and physical health. Growing bodies of research confirm the link between increased digital media use and depression and higher instances of addiction, anxiety, sleep deprivation, and obesity. We also have seen harmful behaviors such as cyber bullying increase during the pandemic.
  • Unfortunately, many companies are well aware that children are spending more time online and they are taking advantage of that by proactively targeting, manipulating, and monetizing our children.
  • Despite laws to protect children’s privacy, data collection and tracking of children is disturbingly prevalent. Many apps for kids on mobile devices are notorious for collecting personal information from children. Their personal information is then bought and sold, resulting in targeted advertising designed to influence and manipulate children.
  • Congress granted the Federal Trade Commission (FTC) rulemaking authority under the Children’s Online Privacy Protection Act, or COPPA, precisely so it could update the safeguards for children online as technology advanced. The internet has experienced a sea-change since the last updates to the COPPA Rule in 2013, and it’s clear those rules are out of date and no longer provide the intended protections for our kids.
  • While the FTC has started the process of updating its rules under COPPA, we also must examine whether the statute should be updated and whether other practices targeting children should be regulated. We can’t leave it all to parents.

Subcommittee Chair Jan Schakowsky (D-IL) (watch her opening statement and read her full written statement) asserted:

  • Children are spending twice as much time online compared to before the pandemic. This time is increasingly spent on digital platforms not designed with children in mind. Although we all hope that kids will be able to safely return to school soon, we should not be naive and believe that in-person schooling will mean that companies stop targeting our children online. Techniques honed by companies during the pandemic, and online habits developed by kids, will continue long after they are back in school.
  • Many online platforms are addictive by design, grabbing attention, and maximizing profits. Children are especially vulnerable to addictive or manipulative technologies. They are more susceptible to coercive advertising and have trouble resisting attention-grabbing features. The more time children spend online, the more likely they are to be subjected to harmful or age-inappropriate content.
  • There are few effective barriers protecting children and teens from the harmful content and hateful speech that plague our online discourse. Nor are they shielded from the loss of privacy that has become a feature of online platforms. Platforms that are intended for general audiences aren’t required to protect children’s privacy. Many of the most popular platforms say they do not allow children under the age of 13 but do almost nothing to enforce their minimum age requirement.
  • The harms that children and teens experience online have very real and lasting side effects offline. More screen time has been associated with higher levels of anxiety, depression, sleep deprivation, obesity and even suicide.
  • Children need tailored protections from privacy infringements and manipulative marketing practices. Children’s privacy must be protected by updating COPPA, the current law, for our increasingly complex and connected digital world.

Ranking Member Cathy McMorris Rodgers (R-WA) (watch her opening statement and read her full written statement) claimed:

  • We need to leave politics at the door and have a serious discussion about what is happening to our kids online, their mental health and safety, and what needs to happen to reopen schools immediately. Yesterday, we heard from four doctors who wrote in USA TODAY. Quote: “keeping schools closed or even partially closed, based on what we know now, is harming our children.” They said the Biden administration misinterpreted their research and science when creating the CDC guidance, and it ultimately led to harmful policy that hamstrung states’ efforts to reopen schools quickly.
  • Viral transmission is minimal in schools. Children are also not at significant risk of poor outcomes from COVID-19. It’s time to reopen immediately and listen to the experts who are saying loud and clear: follow the science; school closures are harming children. It’s more than just a homework gap. There are serious health and mental risks associated with our children spending more time online.
  • These are the stories I’m hearing from parents who are pleading for schools to reopen. I understand that our focus today is child safety in an increasingly digital age. For the safety of our children, surely we can all agree science—not fear—should dictate how we protect them and build a better future with hope. We can mitigate a lot of the harms and risks we are talking about today by not letting another day go by of school closures. That’s what will give our children the best chance to succeed and thrive in life.
  • Now, specifically regarding their protection online, I am always open to updating and modernizing our laws. I’m committed to bipartisan work for data and privacy protections, especially children’s privacy. I sincerely hope those efforts resume soon, and this committee does the hard work of legislating in a bipartisan way again.
  • As we look to the future of building a better world for the next generation, I want to be clear. America can lead a new era of technological innovation. We must lead with our values for freedom, human rights, and human dignity. But we are failing with closed schools and this year long experiment of remote learning, more screen time, and more isolation. Technology should add to education. It’s not a substitute for everyday learning.

Subcommittee Ranking Member Gus Bilirakis (R-FL) (watch his opening statement and read his full written statement) contended:

  • Now think about what it is like to be our kids. This is their new reality, and it is a sad one. The COVID-19 pandemic has caused so many Americans to become isolated in their homes, especially our kids. Without opportunities for children to interact with their friends in person, many turn to social media to fill the void. Sadly, this has led to a cascade of negative effects for them. I believe this hearing can serve as an important alarm bell for safely reopening our schools, getting students and teachers back in the classroom, and reversing this trend.
  • To be fair, at the beginning of the pandemic there was much unknown about the virus, and virtual school was a seemingly viable bridge to educating students. Distance learning can certainly be a positive tool for some students, but the facts now make clear that as a primary means of instruction, it just doesn’t work for advancing our kids’ education, especially those with disabilities.
  • That’s why I was pleased that earlier this year President Biden pledged to reopen schools by his 100th day in office and CDC Director Walensky relayed that data indicated schools can begin to safely reopen. Still, we are all alarmed by recent statements that contradict the science behind these commitments, so it will be interesting to find out what changed. Hopefully the panel will have some insight there. I also want to note, as privacy protections are on the agenda today, that I want to be part of a real solution.

Common Sense Media Senior Counsel Ariel Fox Johnson (watch her opening statement and read her full written statement) stated:

  • Technology’s original promise to foster connection has been lost, but it is not too late for us to right the ship. To do so, Congress must take a holistic view of the interconnected harms young people experience online and support a variety of related solutions that overall can improve the online landscape for kids both today and tomorrow. And technology leaders should do what they can immediately to improve their products.
  • Congress has long recognized children’s special vulnerabilities—including in media and technology specifically—and provided additional protections for kids and families. The Children’s Television Act requires a certain amount of educational programming during hours children are likely to be watching, limits commercial time, and requires clear delineations between ads and other content. Congress has also provided for special protections for young children on the internet with the Children’s Online Privacy Protection Act, which sought to re-insert parents as gatekeepers of their children’s lives and protect against unwanted contact from strangers and marketers—in an era when children were dialing up the internet on bulky desktops shared with the whole family. Now, the internet is always on, and always within reach in a pocket, atop a bedside stand, or on a kitchen counter. Simply existing can subject a child or teen to surveillance and data collection; simply pressing a few buttons or saying a few words can broadcast a child into countless homes, connect a child with someone on the other side of the world, or enable a child to watch everything from live streams of other kids playing video games to, much more distressingly, rapes or murders. There is incredible opportunity, for good and for bad.
  • First, Congress must act to better protect young people’s privacy. This includes passing a comprehensive national privacy law, which will limit the impact of inadvertent gaps in child-specific laws, ensuring at least some coverage for children and teens on any site or service. That said, given children’s and teens’ unique vulnerabilities, they need special protections—no behavioral ad targeting, data minimization by design and default—and, consistent with international precedent and more recent state laws like California’s Consumer Privacy Act, those special protections should definitely include teens. Current federal law, COPPA, stops providing protections when a child turns 13, and has other limitations even with respect to children. Happily, there are already good models. The PRIVCY Act would address many of COPPA’s shortcomings: preventing sites from turning a blind eye to young people using their services, offering special protections to and empowering teens to make decisions for themselves, emphasizing the importance of age-appropriate controls and language, and providing bright-line rules prohibiting certain particularly problematic practices like behavioral marketing to young children. It would also enable stronger enforcement, allowing parents to sue on behalf of their kids and enhancing the Federal Trade Commission’s powers. Enhancing regulators’ ability to enforce privacy violations in general is critical, and the Commission should have more resources.
  • Protecting privacy will go a long way in improving children’s overall experiences–companies will be less able to microtarget kids with inappropriate content and advertisements, for example, or use what they have learned to further manipulate children to stay online longer or spend more money. But more is also needed–children’s exposure to unhealthy content via social media and algorithmically-curated feeds should be limited, and companies should not be incentivized to push inappropriate ads and disturbing and even illegal content onto kids.
  • Congress should pass a Children’s Television Act for the internet age. Again, a model exists: legislation that protects kids from manipulative design and commercialization by creating rules that limit how commercial content can be recommended to kids. It would also require platforms to provide families with better guidance on kid-healthy content, label and identify healthy content, and support no-cost and ad-free access to this material.
  • We additionally need to control algorithmic amplification and UX/UI design that undermines users’ choices and amplifies negative content. The bipartisan Deceptive Experiences To Online Users Reduction Act (DETOUR Act) in the Senate targets practices by large online platforms to use deceptive and manipulative design practices known as “dark patterns.”

American Academy of Pediatrics Council on Communications and Media Chair Nusheen Ameenuddin, M.D., M.P.H., M.P.A., F.A.A.P. (watch her opening statement and read her full written statement) stated:

  • In the face of this extraordinarily complex digital ecosystem, the question then becomes how we as a society move forward to make real progress for children and families. We must be bold in our thinking and ensure that government action on technology addresses the most pervasive and concerning industry practices that harm children and teens while preserving and enhancing the positive aspects of technology for young people.
  • It is critical that Congress act to improve and strengthen laws that are designed to protect children online. Updating the Children’s Online Privacy Protection Act is a good place to start. An enhanced COPPA should, at a minimum, be expanded to protect all children under the age of 18 and cover the wide array of devices that collect data from children, including mobile devices, internet-connected toys, and others. Technology platforms should also be required to set the highest level of privacy protections as the default.
  • As a general rule, data collection should be considered an opt-in practice for young people, if the practice is even allowed at all. In the event that data is collected, mandated disclosures with information on what data will be collected, how the data will be used, with whom data might be shared, and the risks and benefits to the consumer should be prominently provided at appropriate literacy and developmental levels. This must also include information about blocking this data collection and how young people can go about deleting their personal information permanently. Congress must also act to close the loophole that has allowed technology companies to evade COPPA regulations by claiming to be “general audience” rather than “child-directed” platforms.
  • We as pediatricians understand that young people are particularly vulnerable to deceptive or unfair marketing practices. We call on Congress to ban targeted (i.e., data-driven behavioral) advertising to all individuals under the age of 18. These invasive and extraordinarily effective ads have no place in our society targeting young people who may not fully understand that they are being sold a product or why. We further request that Congress ban all commercial advertising to children younger than 7 years, and limit advertising to older children and teenagers in light of developmental considerations. Advertising for products with demonstrated health effects on young people, like unhealthy food, alcohol, and cannabis, needs additional attention from Congress.
  • Congress should also fund digital literacy curricula in schools to ensure that children and teens are equipped with the skills they need to navigate an increasingly complex digital ecosystem. Congress can also fund efforts to promote digital equity by expanding access to broadband internet and devices, while also targeting digital media practices and marketing tactics that disproportionately impact youth of color. Lastly, we need more research to better understand how digital media impact children’s health and development and ultimately how we can create a digital ecosystem that is most beneficial to young people.

Reason Foundation School Choice Director Corey A. DeAngelis (watch his opening statement and read his full written statement) said:

  • There have been substantial costs associated with keeping schools closed in terms of students losing ground academically, mentally, and physically – and many of these negative effects have disproportionately impacted less advantaged groups, leading to inequities. Meanwhile, the evidence has generally indicated that schools can safely reopen for in-person instruction and that school reopenings (sic) are not associated with major increases in overall Covid-19 transmission or hospitalizations.
  • In addition to the science, actions by several teachers’ unions – and the stark contrast in the response to the pandemic from the private versus public sectors – suggest that reopening decisions have had more to do with political partisanship and power dynamics than safety and the needs of families.
  • Private schools have been open for most of the past year – or have been fighting to reopen in that time. In fact, private schools in Kentucky took the fight to the Supreme Court in an attempt to provide in-person services, and private schools in states such as Ohio and Michigan took similar legal actions. A private school in Sacramento even rebranded itself as a daycare to try to get around the government’s arbitrary closure rules. But many teachers’ unions have been fighting to remain closed by shifting the reopening goalposts every step of the way. That’s not because of a difference in intentions or benevolence on the part of the employees between the two sectors. The difference is one of incentives. One of these sectors receives children’s education dollars regardless of whether they open their doors for business.

© Michael Kans, Michael Kans Blog and, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and with appropriate and specific direction to the original content.
