
The Wavelength is moving! Next week, you’ll get the same great content, but from the Ghost platform. The move will allow for a price decrease because my costs will be lower.
This week, the United States (U.S.) House Energy and Commerce Committee’s Consumer Protection and Commerce Subcommittee held the third in a series of hearings “focused on holding Big Tech accountable and follows a December Subcommittee hearing on several legislative proposals intended to build a safer, more transparent, and accountable internet ecosystem.” The hearing focused on these bills:
- The “Banning Surveillance Advertising Act of 2022” (H.R. 6416)
- The “Algorithmic Accountability Act of 2022” (H.R. 6580)
- The “Cooperation Among Police, Tech, and Users to Resist Exploitation Act” (H.R. 6755)
- The “Increasing Consumers’ Education on Law Enforcement Resources Act” (H.R. 6786)
- The “Digital Services Oversight and Safety Act of 2022” (H.R. 6796)
Democrats and Republicans engaged in limited skirmishing about which parts of the online world require regulation and reform. And yet, many Members of both parties called for the enactment of a U.S. data privacy regime, and some explicitly and implicitly blamed the other party for the lack of progress in the development of a bipartisan bill. Indeed, going back to late last year, Members of the committee have spoken publicly about discussion drafts being traded between the two sides along with stakeholders. Naturally, the state of these negotiations is not clear. Additionally, some Members stressed the need to focus on heightened data privacy and protection for children and teens in light of the mounting evidence that online platforms pose risks to many.
Having said all that, one might be forgiven for thinking that sophisticated deepfake technology was used to splice together footage from two different hearings in light of the disparate angles taken by Members of the two parties. And while this may seem comical, it illustrates the gulf between the two parties on how they view the pressing issues of social media and online platforms and, consequently, the best remedy for these problems. One could also be forgiven for thinking a data privacy law is not in the cards for this Congress, recognizing that legislating difficult matters gets harder as an election looms closer. And with Republicans currently projected to retake both the House and Senate, as the party not holding the White House usually does in midterms, they may decide to wait until next year to work on a bill more to their liking.
On the other side of the coin, Democrats may soon decide that an imperfect bill under a legislative process they control is better than a Republican bill through a process controlled by a new majority. Finally, it bears noting that the landscape continues to shift nationally as Utah is on the verge of enacting the fourth state data privacy law, and given that three of these states have enacted bills closer to what industry stakeholders want to see, Democrats may make the tactical decision that a stronger bill at the national level that does not give them everything they want is better than no bill. Additionally, should enough states enact bills that privacy and civil liberties groups think are weak, state preemption, one of the Democrats’ sine qua nons, could become a non-issue for many of the party’s Members.
Generally, Democrats extensively referenced their three bills before the committee and the issues they purport to address: algorithms used in decision making or content moderation, transparency and formalized content moderation processes, and surveillance advertising. Likewise, they asked questions almost exclusively of their three witnesses. Only Democrats asked about the impact that “Big Tech” has on people of color, minorities, and women.
Broadly, Republicans were very focused on the needs of law enforcement and the potential harm to children being enabled by insufficient authority for law enforcement agencies. They almost exclusively asked questions of their witness, a representative from Florida’s law enforcement community. More than one Republican referenced TikTok’s refusal to appear before the subcommittee, but, interestingly, this criticism was not leveled at any of the Chinese company’s competitors. This suggests that Republicans may have been intentionally focusing on TikTok as part of their larger policy agenda that the U.S. address more forcefully the threats posed by the People’s Republic of China (PRC).
Subcommittee Chair Jan Schakowsky (D-IL) (watch her opening statement) stated:
- We are done with apologies and denials from tech companies.
- We are done turning a blind eye while billionaires build economic empires by feeding Americans an ever-increasing diet of disinformation.
- It is time to regulate.
- We will consider five bills including Ms. Eshoo’s bill, the “Banning Surveillance Advertising Act,” which I am proud to co-lead, to ban targeted advertising that can track individual users across the internet.
- Ms. Clarke’s bill, the “Algorithmic Accountability Act,” will require tech companies using artificial intelligence to assess the impact of their algorithms on consumers.
- Discrimination of protected classes has no place in our digital world.
- Ms. Trahan’s bill, the “Digital Services Oversight and Safety Act,” improves transparency for consumers and ensures researchers can measure the impact of social media on our society.
- Mr. Bilirakis’s bill, the “CAPTURE Act,” studies whether law enforcement has the resources to keep us safe online.
- Mr. Mullin’s bill, the “Increasing Consumers’ Education on Law Enforcement Act,” empowers consumers to protect themselves.
- As we refine these proposals, I have no doubt they can help create a fairer, safer internet. One that protects consumers. And whose business model isn’t rooted in discrimination.
- I look forward to hearing how we can improve these proposals to achieve that goal.
Subcommittee Ranking Member Gus Bilirakis (R-FL) (watch his opening statement) stated his hope that the next hearing in the series will be on data privacy and security. He noted that TikTok, a company at the center of concerns about “Big Tech,” declined the majority’s invitation to testify at the last hearing and the joint invitation to appear at today’s hearing. Bilirakis said that while he shares many of the Democrats’ concerns, he worries about proposals that miss the mark and hurt other parts of the economy. He claimed the Democratic bills before the committee would lead to worse consumer experiences, less innovation, and the entrenchment of the largest tech companies, especially Google. Bilirakis discussed the two Republican bills on the agenda that would help law enforcement agencies fight to protect children from online predation, which he argued “Big Tech’s” abuses allow to occur. He reiterated his hope that the committee will tackle data privacy and security legislation and called on Democrats to stop “beating around the bush” with “one-off” bills. Bilirakis said he and Ranking Member Cathy McMorris Rodgers (R-WA) stand ready to work with anyone on comprehensive legislation that would institute one privacy standard for the U.S.
After Bilirakis finished, Schakowsky said she looks “forward to dealing with very soon a comprehensive privacy bill.”
Chair Frank Pallone Jr (D-NJ) (watch his opening statement) asserted:
- These hearings come after years of repeated, bipartisan calls for social media companies to change their ways. Since 2018, the Committee has held eight hearings on this subject. We’ve examined these issues from all sides and now it’s time for us to come together and to act. We’re committed to working with our Republican colleagues on legislation to increase transparency, limit online manipulation, and improve online safety.
- We all know how important social media is to our daily lives. It allows us to connect with family and friends, to organize, and stay safe.
- There’s no question that fast moving current events can be difficult for social media companies to respond to quickly, but that is their responsibility and they must be held accountable. We must ensure they are transparent, and their incentives align with the good social media can do for people, not the bad.
- Today, we will discuss five bills that target different parts of the social media ecosystem to make platforms safer for users.
- One of the best ways to make these companies more accountable is to make them more transparent. We will discuss legislation that establishes an “Office of Independent Research Facilitation” at the Federal Trade Commission (FTC). This new office would help facilitate academic research on social media platforms, to help us get the data we need on how these companies are targeting users.
- Another bill requires companies that use automated decision making to conduct impact assessments on their systems and regularly report the results to the FTC. These assessments will help us ensure that machine learning is being employed in a fair and non-discriminatory manner.
- We’ll consider a bill to ban the practice of targeted advertising, which includes a provision prohibiting advertisers from using information that identifies a consumer as a member of a protected class for advertising purposes.
- Finally, we’ll consider bills that will help social media companies work better with federal, state, and local law enforcement to protect users who feel their safety has been violated online.
- These proposals, along with the proposals we considered in the two previous legislative hearings, are collectively major steps in addressing the real harms caused by Big Tech.
- Another part of tech accountability is protecting people’s privacy, especially our children’s privacy, as more and more apps are used by and targeted to our kids. I think every member of this Committee agrees that more must be done on privacy, and that’s why we have been working since last Congress on a bipartisan staff draft. Our work on that legislation continues and I hope that the Republicans will work together with us on that as well.
- The bills before us today are important bills to address tech accountability. The time to act is now and these bills can help us make the internet a safer place.
Ranking Member Cathy McMorris Rodgers (R-WA) (watch her opening statement) criticized platforms like Twitter for allowing Russia to use them to wage information war and said this was another example of needing to hold “Big Tech” accountable. She said that “Big Tech” platforms are her biggest fear as a parent and all the conversations she has with parents are about the same things. Rodgers said “Big Tech” does not care about children, who are being manipulated for profit and induced to self-harm. She argued that if the committee is to address these issues, it must impose transparency on these companies for how they are collecting and using personal information, especially from children. She referenced the data privacy and security framework she and her colleagues released last year. Rodgers stated Republicans had asked that the bipartisan “Preventing Real Online Threats Endangering Children Today (PROTECT) Kids Act” (H.R. 1781) be included in the hearing, but it was not. She asserted that the committee needs to examine in much greater depth issues related to artificial intelligence (AI) before regulating a new realm in which the U.S. can and must lead the world. Rodgers said one of the bills would give the Federal Trade Commission (FTC) sweeping new authority, and Republicans are open to some new authority, but before Congress can broaden the agency’s mandate, there must be an assessment of the accountability and transparency of the agency’s uses of its current power. Rodgers decried TikTok’s failure to appear before the committee and the harm it is visiting on children and teens.
Consumer Reports Policy Analyst Laurel Lehman (watch her opening statement) said:
- As a general rule, and especially in the wake of FOSTA-SESTA, eliminating Section 230(c)(1) immunities—immunities for hosting and disseminating third-party content—by way of subject matter exemptions alone would be irresponsible. Subject matter exemptions alone are wont to drive blunt, over-broad platform responses without addressing existing failures in platform moderation systems. However, there may be room to explore narrow subject matter exceptions when they are combined with additional narrowing factors, such as a platform’s mechanism of delivering the content. Such proposals would be along the lines of the Protecting Americans from Dangerous Algorithms Act, which opened platforms to narrow civil liability for certain existing civil rights claims when they amplified the content in question.
- Regulators also may be already able to take action against platforms that fail to take reasonable measures to protect users from the harmful acts of others on their platforms. Section 5 of the FTC Act broadly prohibits companies from committing “unfair or deceptive acts or practices” in the marketplace. Certainly if a company commits to taking certain acts to remediate bad activity, the FTC could find that failure to follow through on those commitments constitutes a deceptive practice. Moreover, in some situations, failure to enforce clear platform rules against bad actors could be considered deceptive, as the existence of rules in the first place may be reasonably interpreted as an implicit promise to reasonably enforce them.
- The other half of the FTC’s general purpose consumer protection law is “unfairness.” To constitute an unfair business practice, a practice must (1) cause consumers significant injury, (2) not be reasonably avoidable by consumers, and (3) not be offset by countervailing benefits to consumers or competition. Historically, the FTC has brought many cases against companies for failing to reasonably police the behaviors of others. For example, since 2002, the FTC has brought over 80 cases against companies for failure to institute reasonable security measures to protect consumers’ personal information. Even though the threat in those cases was from hackers and other malefactors, the FTC found that companies’ failure to take cost-effective means to prevent those bad actors from accessing consumers’ data constituted an unfair practice: it exposed consumers to the risk of significant injury, the poor security was not reasonably avoidable by consumers, and the failure to institute safeguards was not outweighed by other considerations. Similarly, the FTC could find that platforms’ failure to protect consumers from the bad acts of other users of the platform could constitute an unfair business practice.
- Consumer Reports believes that we need increased regulation of algorithms, including increased transparency measures (on what kinds of data an algorithm uses and how the algorithm comes to a decision) for both the public and regulators, improved testing and auditing standards for algorithms used in areas with significant legal effects, and clearer restrictions on how and when certain algorithms can and should be used. We also need regulation that better outlines the rights of consumers and citizens when using algorithms or when an algorithm makes a decision about an individual — including more individual agency over the use of algorithms, clear explanations of how a particular algorithmic decision works or how an algorithm arrived at its outcome, and the ability to contest decisions.
- We support the goals of the Banning Surveillance Advertising Act. Consumers overwhelmingly object to being tracked across different websites, apps, smart devices, and even in the physical world by hundreds of different companies just to show them relevant ads. We appreciate that the bill is framed as a straight prohibition on surveillance advertising instead of conditioning it on opt-out or opt-in consent. In practice, opt-out rights under the California Consumer Privacy Act (CCPA) have proven to be burdensome and unworkable — it is not practical to expect consumers to navigate and manage hundreds or thousands of individual opt-outs for every site, app, or store they visit. On the other hand, mandating opt-in for tracking creates burdens as well — in response to the GDPR and the ePrivacy Directive, many sites forced users through tedious consent screens every time they visited, often using confusing language and “dark patterns” to get a user to ostensibly provide “consent” to having their data shared with hundreds of companies. A simple prohibition on a universally despised practice is a better approach.
U.S. Duck Duck Go, Inc. Senior Public Policy Manager Katie McInnis (watch her opening statement) stated:
- Contextual advertising contrasts with behavioral advertising, which is based on personal profiles built from data collected both on and offline about the person viewing the ad. Have you ever searched for something online only to see an advertisement for that very thing show up in ads on apps or other websites? Do you ever feel like your phone is listening to you based on the ads you see online? That is surveillance advertising: the invasive ads that follow you around. Rather than finding these ads useful, multiple studies have shown that the majority of people in the U.S. think this kind of advertising is an inappropriate use of their personal data.
- The covert collection of personal information for surveillance advertisements allows companies to granularly target individuals, violating user privacy and expectations on data collection. But what many users are not aware of is that this data collection also results in additional harms like discrimination, identity theft, fraud, and filter bubbles. The same personal information that enables a business to target potential customers with creepy ads also enables companies to discriminate against users in the presentation of critical opportunities, like education, employment, housing, and financial services.
- And these harms are not just theoretical. The Department of Housing and Urban Development charged Facebook with engaging in housing discrimination by allowing advertisers to restrict who sees ads based on race, religion, and national origin, a harm made possible thanks to data collection for surveillance ads. Online surveillance also enabled advertisers to use Google’s ad targeting algorithm to present more men than women with ads for higher-paying jobs. The practice of data collection therefore does not just affect a user’s right to privacy, but also their ability to access critical opportunities and services.
- Consumers should have an easy and effective way to avoid being surveilled online. That’s why DuckDuckGo supports bills that would protect consumers from online and offline collection of their personal information. The Banning Surveillance Advertising Act would effectively protect users from companies collecting their personal data for the purpose of serving them with targeted advertisements. DuckDuckGo has also supported state measures that protect consumers. For example, we have been a lead supporter in the creation of the browser-based Global Privacy Control setting under the California Consumer Privacy Act (see the brief sketch after this list for how the signal works technically).
- As our history makes clear, Internet companies can be successful and profitable with a contextual advertising business model and without surveilling individuals online. In addition, studies from researchers like Alessandro Acquisti have found that behavioral advertising does not result in a significant increase in revenue for publishers. Although dominant tech companies may make significant revenue through surveillance advertising now, there is no reason why contextual advertising couldn’t be as relevant as behavioral advertising, by drawing on the context of the website, video, or audio being consumed.
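For readers curious about the mechanics of the Global Privacy Control setting mentioned above: GPC is a browser-level signal, sent with every request as the `Sec-GPC: 1` HTTP header and exposed to page scripts as `navigator.globalPrivacyControl`, that expresses a do-not-sell/do-not-share preference. The sketch below, in TypeScript using the Express web framework, shows one way a site operator might detect and honor the signal; the middleware, route, and response text are illustrative assumptions, not code from DuckDuckGo, the hearing record, or the GPC specification.

```typescript
// Minimal sketch of honoring the Global Privacy Control (GPC) signal server-side.
// Assumes Node.js with the Express framework installed (npm install express).
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Browsers with GPC enabled send "Sec-GPC: 1" on every HTTP request.
function gpcMiddleware(req: Request, res: Response, next: NextFunction): void {
  // Record the user's preference so downstream handlers can act on it.
  res.locals.gpcOptOut = req.header("Sec-GPC") === "1";
  next();
}

app.use(gpcMiddleware);

app.get("/", (_req: Request, res: Response) => {
  if (res.locals.gpcOptOut) {
    // Under the CCPA, regulators have treated a valid GPC signal as an
    // opt-out of the sale or sharing of personal information.
    res.send("GPC signal detected: serving page without third-party trackers.");
  } else {
    res.send("No GPC signal: default configuration applies.");
  }
});

app.listen(3000);
```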
AI for the People Chief Executive Officer Mutale Nkonde (watch her opening statement) said:
- Machine learning protocols rely on the use of historical data to predict what will happen in the present. The predictions made by these algorithmic actors then create a feedback loop that changes the course of society. For example, social media platforms use recommendation algorithms to predict the type of content we will retweet, share, or comment on, a process referred to as engagement. The fact that advertisers can use social media recommendation algorithms to reach their target audience makes the platforms an extension of the market square. As a result, social platforms are being used to spread divisive and racist content by bad actors who use targeted advertising to recruit people who engage with this type of content.
- For example, the website OpenSea came under fire in February 2022, when Twitter users noticed it was promoting the sale of Black people on its NFT site. NFTs are non-fungible tokens, units of data stored on a blockchain that can be sold and traded. These Black NFTs were called MetaSlaves and were part of a group of 1865 NFTs offered to the site’s users. The number 1865 is a reference to the 13th Amendment, which was passed that year to abolish slavery. OpenSea’s refusal to take the listing down, choosing instead to add Asian and other non-white avatars to the lot, speaks to the underlying racist sentiment of people who buy these tokens. This is just one egregious example of how algorithmic actors can be used to promote division among the American people; while a MetaSlave has no utility, it works to divide Americans against each other on the basis of race, a tactic Special Counsel Robert Mueller uncovered during his investigation of the Russian Internet Research Agency and the 2016 election. This is a form of voter suppression, facilitated by algorithmic advertising.
- This use of algorithmic actors to undermine the rights of people from protected classes goes beyond social media and includes, but is not limited to, the facial recognition being discussed during this hearing. I am happy to discuss facial recognition in more depth during questioning, but in the interest of time will continue in this vein. The harm algorithmic actors have on all members of the protected classes is why we need the passage of the Algorithmic Accountability Act. Not only does it create an office within the FTC that can protect the rights of online consumers, but it raises the larger question of whether we are users or consumers of algorithmically driven systems, as they are used in every element of public life.
Florida Department of Law Enforcement Special Agent Supervisor Mike Duffey (watch his opening statement) stated:
- What I would like to convey to the subcommittee today is that when crimes are committed – both online and in the physical world – technology companies possess a large amount of the essential evidence law enforcement needs to do our job. But the lack of a regulatory framework that enables efficient access to that evidence means that we cannot be as effective at reducing online criminal threats or seeking justice for crime victims.
- As the popularity of tech platforms continues to grow, and as new competitive options enter the market every day, those companies struggle to build out content moderation teams and maintain law enforcement outreach teams that are robust enough to handle our growing needs. To be clear, there are instances where some companies have shown a willingness to be more helpful with compliance because they understand their users are vulnerable to serious personal safety and security risks. But we have seen them hold back because they have no clear legal or regulatory framework that levels the playing field and enables them to protect their users’ safety and bring bad actors to justice while avoiding liability or reputational risk.
- It is past time for policymakers – together with stakeholders from industry, law enforcement, privacy communities – to generate the rules of the virtual playground and a system for enforcing them.
- Most tech companies will insist that they are routinely providing law enforcement information when it is really needed in emergency – or exigent – circumstances. But determination of “exigency” is actually in the hands of the tech companies. We in law enforcement have the most relevant facts and context to determine exigency, yet the companies are the ones who have final say. This presents issues when law enforcement agencies receive complaints of individuals making online statements about causing harm to themselves or others. In one example, an individual made comments online regarding being “excited for July 9th and wanting to do what Nikolas Cruz did.” Cruz was the murderer who took 17 innocent lives and injured 17 others at Marjory Stoneman Douglas High School in Parkland, Florida in 2018. This individual’s social media postings indicated that they idolized Cruz and appeared to have visited the location where Cruz was arrested. Upon review of the information that law enforcement provided to the social media provider when asking for relevant information about the user, the company unilaterally made a determination that they did not think the situation was an “imminent threat at this time.”
- Another ongoing challenge stemming from the lack of a standard framework governing the exchange of legal process between law enforcement and service providers is what we call the “word game” or “guess the magic word.” Specifically, unless the terms we use in formal legal process documentation to obtain content from providers match their own unique corporate terms, law enforcement must engage in a lengthy back-and-forth that costs valuable time in an investigation – for example, in an online child sexual abuse investigation. A word that we think describes a specific type of data that we’re looking for may be interpreted by the provider as something different, resulting in the provider telling us that they have no information that is responsive to our legal demand. The lack of uniform terminology therefore results in frustrating and dangerous delays in our investigations.
Hearing Roadmap

- Schakowsky asked why Duck Duck Go has eschewed targeted advertising, whether the company is still profitable, and whether consumers would benefit from giving FTC whistleblowers protection.
- Bilirakis expressed concern about the sale of illegal goods online, notably controlled substances like fentanyl, and asked how “Big Tech” failed to help in fighting these crimes.
- Pallone asked about the types of harms that can be expected in the metaverse and how marginalized communities would be affected.
- Rodgers urged Democrats to work together on national privacy legislation and asked whether Duck Duck Go still has a relationship with the Russian search engine Yandex. She asked whether Duck Duck Go has ever accepted Russian advertisements.
- Representative Bobby Rush (D-IL) asked about the use of facial recognition software and the difficulty of correctly identifying people of color. He also asked about the role of algorithms in these erroneous facial recognition outcomes.
- Representative Fred Upton (R-MI) asked about how “Big Tech” is impacting the mental health of children and how well they respond to requests from law enforcement agencies.
- Representative Kathy Castor (D-FL) stated it is urgent to move legislation to protect online privacy, especially as it relates to children and teens. She asked if passing a comprehensive privacy bill is a crucial part of holding “Big Tech” accountable.
- Representative Bob Latta (R-OH) asked how difficult it is for law enforcement to work with “Big Tech.”
- Representative Lori Trahan (D-MA) asked why transparency requirements are important for online platforms.
- Representative Brett Guthrie (R-KY) asked whether online platforms have formal processes to work with law enforcement agencies in policing online crime.
- Representative Jerry McNerney (D-CA) asked how a ban on surveillance advertising would change the online world.
- Representative Larry Bucshon (R-IN) asked whether platforms being more transparent with their cyberbullying moderation policies can prevent greater harms to children and teens.
- Representative Yvette Clarke (D-NY) asked for examples of how algorithmic decision making has resulted in discrimination against people of color.
- Representative Neal Dunn (R-FL) asked how Duck Duck Go detects foreign interference campaigns.
- Representative Tony Cardenas (D-CA) asked if forcing companies to publish their content moderation policies in other languages would help fight disinformation in those languages.
- Representative Debbie Lesko (R-AZ) asked about resources for Americans when their safety is violated online.
- Representative Debbie Dingell (D-MI) asked whether online platforms are actively making the choice to prioritize profits and engagement over fighting disinformation and shielding children from harm.
- Representative Greg Pence (R-IN) asked how small businesses have fared advertising on Duck Duck Go, which does not engage in behavioral advertising.
- Representative Robin Kelly (D-IL) asked if impact assessments would have more value than examining an algorithm.
- Representative Darren Soto (D-FL) asked about the dangers online.
- Representative Angie Craig (D-MN) asked how law enforcement agencies work with platforms like Snapchat in fighting illegal substances being sold on their platforms.
- Representative Tim Walberg (R-MI) asked how companies like TikTok can better design and use behavioral advertising to shield children from harm.
- Representative Anna Eshoo (D-CA) asked whether contextual advertising is less effective than behavioral advertising.
Other Developments
- A court in Germany ruled in favor of Meta/Facebook and Google in finding that the Network Enforcement Act (NetzDG), a law designed to address online hate speech, violated European Union law.
- The chair of the Australian Competition and Consumer Commission (ACCC) made his annual address on “the ACCC’s enforcement and compliance policies for 2022-23,” which include “Consumer and fair trading issues relating to manipulative or deceptive advertising and marketing practices in the digital economy;” “Competition and consumer issues arising from the pricing and selling of essential services, with a focus on energy and telecommunications;” and “Competition and consumer issues relating to digital platforms.”
- The National Farmers Union, Iowa Farmers Union, Missouri Farmers Union, Montana Farmers Union, Nebraska Farmers Union, Ohio Farmers Union, Wisconsin Farmers Union, Farm Action, the U.S. Public Interest Research Group, the Illinois Public Interest Research Group, the Digital Right to Repair Coalition, and iFixit filed a complaint asking that the Federal Trade Commission (FTC) “(1) investigate Deere’s anticompetitive tying scheme, as well as its monopolization and/or attempted monopolization of the market for repairs to Deere equipment; (2) take appropriate action to enjoin Deere from continuing to withhold information in its possession that is necessary for the repair and maintenance of Deere equipment; and (3) take appropriate action to afford any other relief necessary to remedy Deere’s unlawful practices described herein.”
- United States (U.S.) Senators Edward Markey (D-MA) and Bill Cassidy (R-LA) wrote Secretary of Commerce Gina Raimondo to commend President Joe Biden’s call for legislation to better protect children online and pointed towards their bill, the “Children and Teens’ Online Privacy Protection Act” (S.1628) as accomplishing much of what the administration wants to achieve.
- United States (U.S.) Senator Elizabeth Warren (D-MA), Senate Intelligence Committee Chair Mark Warner (D-VA), Senate Banking, Housing, and Urban Affairs Chair Sherrod Brown (D-OH), Senate Armed Services Committee Chairman Jack Reed (D-RI) and other Senators wrote to Secretary of the Treasury Janet Yellen “raising concerns regarding the potential use of cryptocurrency to evade sanctions, which have become even more urgent amid the sanctions imposed on Russia after their invasion of Ukraine.”
- Singapore’s Ministry of Communications and Information (MCI) “outlined plans to build a Digitally Secure, Economically Vibrant, and Socially Stable Singapore in its Committee of Supply (COS) speeches delivered in Parliament.”
- The United States Government Accountability Office (GAO) published a report titled “Cybersecurity: Internet Architecture is Considered Resilient, but Federal Agencies Continue to Address Risks.”
- Australia’s eSafety Commissioner issued a report titled “Women In The Spotlight: How online abuse impacts women in their working lives.”
- Canada’s Privacy Commissioner Daniel Therrien wrote a letter to the Standing Committee on Access to Information, Privacy and Ethics to provide his “views on your study, Collection and Use of Mobility Data by the Government of Canada.”
- The United States Government Accountability Office (GAO) published a report titled “Critical Infrastructure Protection: CISA Should Improve Priority Setting, Stakeholder Involvement, and Threat Information Sharing”
Tweet of the Day
Further Reading
- “Here’s a list of all the tech companies taking action against Russia” By Sophie Foggin and Helen Li — Rest of the World
- “DOJ officials criticize Senate-passed cyber bill” By Ines Kagubare — The Hill
- “Chinese tech companies now have to tell users about their algorithms” By Shen Lu — Protocol
- “BBC resurrects WWII-era shortwave broadcasts as Russia blocks news of Ukraine invasion” By Jon Porter — The Verge
- “Cyber in the Biden administration’s latest emergency funding request” By Martin Matishak — The Record
- “I’ve Dealt With Foreign Cyberattacks. America Isn’t Ready for What’s Coming.” By Glenn Gerstell — New York Times
- “Toyota stops production in Japan after a cyberattack at a supplier.” By Ben Dooley and Hisako Ueno — New York Times
- “Modernized IT Dashboard Set to Launch mid-March” By Adam Mazmanian — Nextgov
- “The first TikTok war: how are influencers in Russia and Ukraine responding?” By Chris Stokel-Walker — Guardian
- “How China’s social media handles fake news about Ukraine” By Shen Lu — Protocol
- “How shunning Russia could offer the U.S. tech giants an easy win” By Emily Birnbaum — Politico
- “Google, Meta face penalties in Russia as deadline passes to open local offices” By Paresh Dave — Reuters
- “Watching the World’s “First TikTok War”” By Kyle Chayka — The New Yorker
- “U.S. tech dominance could offer leverage over Russia — or backfire” By Craig Timberg — Washington Post
Coming Events
- 8 March
- The United States (U.S.) Senate Armed Services Committee’s Cybersecurity Subcommittee will have a closed briefing on the Department of Defense’s cybersecurity operations.
- The United States (U.S.) House Intelligence Committee will hold a hearing on worldwide threats.
- 9 March
- The United States (U.S.) Securities and Exchange Commission (SEC) will hold an open meeting to “consider whether to propose amendments regarding cybersecurity risk management, strategy, governance, and incident disclosure.”
- 9-10 March
- The Information Security and Privacy Advisory Board (ISPAB) will hold a quarterly open meeting and the agenda is expected to include the following items:
- Briefing from NIST on recent activities from the Information Technology Laboratory,
- Presentation from NIST on the Artificial Intelligence Risk Management Framework,
- Discussion on Cryptographic Brittleness and issues in implementations,
- Presentation from NIST on Open Source Cybersecurity Assessment Language (OSCAL),
- Discussion on the United States Government participation in National and International Standards Development Organizations,
- Briefing on NIST Cybersecurity Updates,
- Public Comments.
- 10 March
- The United States (U.S.) Senate Intelligence Committee will hold open and closed hearings on worldwide threats.
- 11 March
- The United States (U.S.) Federal Communications Commission (FCC) will hold the “first of a series of virtual public hearings as a part of its broadband consumer labels rulemaking proceeding,” which “will be part of the record in response to the FCC’s recent Notice of Proposed Rulemaking which sought comment on a requirement that broadband providers display simple-to-understand labels that disclose, at the point of sale, accurate information about prices, introductory rates, data allowances, broadband speeds, and management practices, among other things.”
- 16 March
- The United States Federal Communications Commission (FCC) will hold an open meeting with this agenda:
- Preventing Digital Discrimination. The Commission will consider a Notice of Inquiry that would commence a proceeding to prevent and eliminate digital discrimination and ensure that all people of the United States benefit from equal access to broadband internet access service, consistent with Congress’s direction in the Infrastructure Investment and Jobs Act. (GN Docket No. 22-69)
- Resolving Pole Replacement Disputes. The Commission will consider a Second Further Notice of Proposed Rulemaking that would seek comment on questions concerning the allocation of pole replacement costs between utilities and attachers and ways to expedite the resolution of pole replacement disputes. (WC Docket No. 17-84)
- Selecting Final Round of Applicants for Connected Care Pilot Program. The Commission will consider a Public Notice announcing the fourth and final round of selections for the Commission’s Connected Care Pilot Program to provide Universal Service Fund support for health care providers making connected care services available directly to patients. (WC Docket No. 18-213)
- Restricted Adjudicatory Matter. The Commission will consider a restricted adjudicatory matter.
- National Security Matter. The Commission will consider a national security matter.
- The European Union’s Parliament’s Committee on the Internal Market and Consumer Protection will hold a hearing titled “Risks from the use of Dark Patterns for consumers and the Digital Single Market” that “will provide valuable input to the discussions and negotiations on the draft legislation being amended in IMCO, such as the Digital Services Act, the Digital Markets Act and the Data Act.”
- 15-16 May
- The United States-European Union Trade and Technology Council will reportedly meet in France.
- 16-17 June
- The European Data Protection Supervisor will hold a conference titled “The future of data protection: effective enforcement in the digital world.”