Other Developments, Further Reading, and Coming Events (8 April 2021)

Other Developments

  • The European Data Protection Board (EDPB) wrote the European Union Agency for Cybersecurity (ENISA) “to provide feedback on the EUCS candidate scheme, more specifically in relation to the potential synergies between the EUCS scheme with a view to supporting Cloud Service Customers (CSC) and Cloud Service Providers (CSP) to comply with the principles and rules established in the Regulation (EU) 2016/679 (GDPR) with regard to the protection of personal data.” The EDPB stated:
    • The EDPB considers it important to identify and define synergies between the different tools that support data protection compliance and those that support information security. In this context, it is also important to ensure that the controls and requirements in the cybersecurity certification scheme do not conflict with the rules and principles of the GDPR. By doing so, adherence to these tools by the concerned stakeholders would also be facilitated.
    • The EDPB made these recommendations:
      • In order to provide added value to the EUCS scheme and to ensure that, insofar as personal data are concerned, no inconsistencies with the definitions and concepts of the GDPR occur, the EDPB suggests introducing personal data and special categories of personal data as categories of data into the EUCS scheme, taking into account that derived data (not directly provided by data subjects) may be personal data. These two categories of data should also be taken into account to guide CSPs and CSCs when applying for an appropriate assurance level.
      • The EDPB suggests that the EUCS scheme encourage CSPs to choose for their certification a level of assurance that already takes into account, where applicable, the type of personal data likely or intended to be processed and the risks to the rights and freedoms of natural persons resulting from the processing carried out.
      • The EDPB recommends including the requirements about the location of data processing in all assurance levels. Not including these requirements in the basic assurance level, as is currently the case, would prevent this assurance level of the EUCS scheme from facilitating compliance with the GDPR for any personal data processing.
      • Therefore, the EDPB suggests clearly indicating in the scheme that the portability control in the EUCS scheme should not be confused with the right of portability in the GDPR.
      • In this regard, the EDPB recommends the involvement of the CSP’s Data Protection Officer (DPO) in the early stages of the designing of the service to assist the CSP to monitor internal compliance with the GDPR. By doing so, the security by design of the cloud service will be reinforced by contributing to the obligation of ‘data protection by design’.
      • The EDPB recommends envisaging this possibility not only for the “High level” of assurance, but also for the “Basic” and for the “Substantial” levels of assurance. Not including this requirement in the “Basic” and “Substantial” assurance levels would exclude these assurance levels of the EUCS scheme from facilitating compliance with GDPR Article 28 (3) (h).
      • The EDPB suggests adapting the wording of the EUCS requirements to reflect the GDPR requirements for personal data breaches and to make these requirements mandatory for all assurance levels of the scheme. Not including these requirements in the “Basic” and “Substantial” assurance levels would preclude the use of these assurance levels of the EUCS scheme to facilitate compliance with GDPR Article 33 (Notification of a personal data breach to the supervisory authority).
      • Therefore, the EDPB suggests that all levels of the EUCS can benefit from equivalent proofs of compliance by allowing a fully independent third-party assessment. Not including this requirement would prevent EUCS certificates or audit reports based on the self-assessment methodology from facilitating compliance with GDPR accountability tools.
  • In a switch of focus, the top Republicans on the House Energy and Commerce Committee are pressing social media platforms about the mental health effects children and teens experience in using these services. It is unclear whether this signals a shift away from the by now customary and unproven claims (at least one study has found quite the opposite) that platforms are biased against conservative figures, viewpoints, and content. If so, it would dovetail with the emphasis on the effects of online learning Republicans discussed at a recent hearing on child safety online during the pandemic. Committee Ranking Member Cathy McMorris Rodgers (R-WA), Communications and Technology Subcommittee Ranking Member Bob Latta (R-OH), Consumer Protection and Commerce Subcommittee Ranking Member Gus Bilirakis (R-FL), and Oversight and Investigations Subcommittee Ranking Member Morgan Griffith (R-VA) wrote Facebook CEO Mark Zuckerberg, Alphabet and Google CEO Sundar Pichai, and Twitter CEO Jack Dorsey asking them to submit “any documents and related information regarding any internal research or study [your company] has conducted on the effect its products…have on children’s mental health:”
    • 1. Please produce complete copies of the following:
      • Any internal research or study [your company] has conducted on the effect [your company’s] products have on children’s mental health.
      • Any internal research or study [your company] has conducted on the effect [your company’s] products have on children’s mental health for ages under 13.
      • Any internal research or study [your company] has conducted on the effect [your company’s] products have on children’s mental health for ages 13 to 18.
      • Any internal research or study [your company] has conducted on the effect [your company’s] products have on users’ mental health for ages 18 and older.
      • Any internal research or study [your company] has conducted on the effect [your company’s] products have on the health and well-being of children, including risks of child exploitation and trafficking.
      • Any internal communications, including memorandums, emails, or other internal communications among [your company’s] employees, including outside contractors (e.g., content moderators) related to the effect of [your company’s] products on children’s mental health for ages under 13.
      • Any internal communications, including memorandums, emails, or other internal communications among [your company’s] employees, including outside contractors (e.g., content moderators) related to the effect of [your company’s] products on children’s mental health for ages 13 to 18.
      • Any internal communications, including memorandums, emails, or other internal communications among [your company’s] employees, including outside contractors (e.g., content moderators) related to the effect of [your company’s] products on users’ mental health for ages 18 and older.
    • 2. Please identify any outside entity [your company] has contracted with, is in the process of contracting with, or has plans to contract with to conduct research or produce studies on the effect [your company] products have on users’ mental health for each age range delineated above.
    • 3. Please produce complete copies of any research or study conducted by outside entities on behalf of [your company] regarding the effect of [your company] products on users’ mental health for each age range delineated above.
    • 4. Please provide any research or study [your company] has conducted on the impact competitors’ products have on children’s mental health for ages 13 and under.
    • 5. Please provide any research or studies [your company] has conducted on the impact competitors’ products have on children’s mental health for ages 13 to 18.
  • House Transportation and Infrastructure Committee Chair Peter DeFazio (D-OR) wrote acting Federal Communications Commission (FCC) Chair Jessica Rosenworcel to express “his continued strong opposition to sharing the 5.9 gigahertz (GHz) radio frequency band (or Safety Band) with unlicensed Wi-Fi” that “has been reserved for dedicated short-range communications since 1999 to enable vehicle-to-everything (V2X) communications” per his press release. DeFazio argued:
    • Unfortunately, in its actions to date, the FCC appears more concerned with faster Wi-Fi than transportation safety. The parameters in the FCC’s First Report and Order (R&O) will make V2X communications vulnerable to harmful interference and leave V2X with little dedicated spectrum.[2] If the interference issues are not resolved, V2X may have no usable spectrum at all.
    • I remain deeply disturbed that the FCC’s R&O did not address the U.S. Department of Transportation’s (DOT) many technical concerns about the safety impacts of the FCC’s proposal.[3] On October 15, 2020, Secretary Elaine Chao wrote to the FCC in response to its draft R&O, saying that the FCC “ignored or rejected DOT’s previous comments in this proceeding, and has failed to give sufficient weight to the Department’s expertise in matters of transportation safety.”[4]
    • The FCC’s decision ignored the safety concerns raised by DOT, bipartisan opposition from 38 Members of Congress, every state Department of Transportation in the nation, and the entire transportation stakeholder community. Instead, the Commission, “which is not an auto safety expert,” according to FCC Commissioner Michael O’Rielly, approved an unsafe proposal that stands to undermine roadway safety.[5]
    • As Chair of the Committee on Transportation and Infrastructure, ensuring that the investments Congress makes in our transportation networks generate safety improvements is among my top priorities and responsibilities. I have discussed this issue with DOT Secretary Buttigieg, as one of the top transportation priorities of the Biden administration is improving safety. It is my hope that the FCC will take a more measured approach to the 5.9 GHz band under the Biden administration and provide the proper consideration to the impacts that this decision stands to have on the lives of the traveling public.
  • A group of experts is offering their recommendations to the Biden Administration “On Regulating Disinformation and Other Harmful Content on Social Media.” Many of these experts are affiliated with the Harvard Kennedy School and New York University Stern School of Business’s Center for Business and Human Rights, and they explained “[t]he recommendations fall into six categories:”
    • I. Industry standards and regulatory infrastructure: The social media industry has not developed adequate standards and processes for curtailing disinformation and other harmful content. Moreover, no existing government body pays sustained attention to social media. In light of these gaps, the Administration should work with Congress to create such a regulatory body, possibly as a new Digital Bureau within the Federal Trade Commission. Authorizing legislation could require the industry to collaborate with the bureau to develop industry standards of conduct, which the bureau would then enforce.
    • II. Platform liability and incentives for more vigorous content moderation: Section 230 of the Communications Decency Act of 1996 needs to be updated. We recommend that the Administration collaborate with Congress to retain the law’s liability shield for social media platforms but add important exceptions, or “carve-outs,” for certain areas, such as civil rights infractions and cyber-stalking, where the shield would not apply. Limiting the shield in this manner would incentivize platforms to police those areas more vigorously. Modifications of Section 230 would need to be rationalized with the industry standards outlined in Section I.
    • III. Executive branch actions: In some areas, the Administration can act without Congress to improve collaboration between the Executive Branch and industry. For example, the Administration should encourage social media companies to participate more energetically in information-sharing programs, with a commitment to disseminate corporate intelligence on foreign and domestic disinformation activity. The industry also should provide this intelligence to federal law enforcement and intelligence agencies.
    • IV. Financial incentives to encourage desirable company behavior: The Administration should work with Congress to develop a system of financial incentives to encourage greater industry attention to the social costs, or “externalities,” imposed by social media platforms. A system of meaningful fines for violating industry standards of conduct regarding harmful content on the internet is one example. In addition, the Administration should promote greater transparency of the placement of digital advertising, the dominant source of social media revenue. This would create an incentive for social media companies to modify their algorithms and practices related to harmful content, which their advertisers generally seek to avoid.
    • V. Transparent advertising: We recommend that the Administration push for an enhanced version of the previously introduced Honest Ads Act. Rather than focus only on online political advertising, the act would apply new disclosure requirements to all advertising. This would obviate the need for endless debate about what constitutes a “political” ad.
    • VI. Support for credible local news organizations: The Administration should take steps to strengthen credible news organizations, especially at the local level, because of their importance to the functioning of our democracy. The reporting done by these outlets serves as a crucial counterweight to disinformation. But over the past 15 years, social media companies have siphoned off a huge portion of the advertising revenue that had sustained local journalism. The Administration should develop and support legislation that would help local news outlets survive.
  • Representative Jennifer Wexton (D-VA) and Senator Mazie Hirono (D-HI) introduced the “COVID-19 Disinformation Research and Reporting Act” (H.R.2182/S.913) to address misinformation and disinformation in the context of the COVID-19 pandemic. However, if the legislation were enacted and means found to stem the flow of this type of content, it would have obvious applications beyond pandemic misinformation. Wexton and Hirono asserted:
    • Disinformation and misinformation have been rampant and dangerous during the COVID-19 pandemic. Vaccine disinformation remains prolific online and on social media, despite platforms’ efforts to limit its spread, and experts fear it will continue to be a barrier to stopping the spread of COVID and moving past this crisis. The spread of false information during a public health emergency like this can erode trust in science, government officials, and public health experts and make it more difficult to get accurate information to vulnerable communities, particularly vaccine outreach efforts.
    • They explained their bill:
      • The legislation would examine the roles disinformation and misinformation have played in the public’s response to the COVID-19 pandemic, including the public acceptance of vaccines, and the sources of COVID-19 disinformation and misinformation, including the mechanisms by which they influence public debate. The examination would also explore possible financial incentives from the spread of false information, the role of social media in promoting these narratives, and strategies to limit its negative impacts. The bill authorizes $1 million to the National Science Foundation (NSF) to partner with the National Academies of Science, Engineering, and Medicine (National Academies) to conduct the study. 
  • The Department of Energy’s (DOE) Office of Cybersecurity, Energy Security, and Emergency Response (CESER) “announced three new research programs to safeguard the U.S. energy system from growing cyber and physical hazards” as explained in their press release. CESER claimed its “new portfolio will ramp up protections by addressing potential global supply chain security vulnerabilities, protecting critical infrastructure from electromagnetic and geomagnetic interference, and building a research and talent pipeline for next-generation cybersecurity.” CESER claimed “new programs will:
    • Secure against vulnerabilities in globally-sourced technologies: Just as modern consumer electronics are the product of engineers, suppliers, and factories from all over the world, so too is much of the hardware and software deployed in our energy sector. But the complex global supply chains the U.S. relies on in producing this technology create openings for security vulnerabilities. CESER is joining with Schweitzer Engineering Laboratories in the Cyber Testing for Resilient Industrial Control System (CyTRICS™) program, to use state-of-the-art analytics to test the various digital tools used by energy sector partners for security issues. This testing will make it easier to identify and address potential vulnerabilities within industrial control systems before bad actors can exploit them.
    • Develop solutions to electromagnetic and geomagnetic interference: Energy sector players recognize they must anticipate risks posed by electromagnetic pulse (EMP) attacks and the more likely, but typically less devastating, geomagnetic disturbance (GMD) events, both of which could overload and damage energy systems. DOE is now collaborating with various utilities and labs on efforts to test, model, and assess systemic vulnerabilities to electromagnetic and geomagnetic interference. Nine pilot projects are already underway as part of DOE’s Lab Call for EMP/GMD Assessments, Testing, and Mitigation. This research will inform development of methods for protecting and mitigating impacts on energy infrastructure.
    • Cultivate research on cybersecurity solutions and the new talent needed to deploy them: Through CESER’s Cybersecurity for Energy Delivery Systems (CEDS) division, DOE is tapping into the innovative capacity of American universities to develop new cybersecurity technologies and train the next generation of cybersecurity experts employed by the energy sector. Next month, CESER will announce a new funding opportunity to support university-industry partnerships around cyber and physical solutions.
  • Fortnite’s maker, Epic Games, revealed in a press release that it has filed its second antitrust case overseas against Apple over App Store practices that take 30% of all in-app purchases. Epic Games filed a complaint with the United Kingdom’s (UK) Competition and Markets Authority (CMA), which launched its own investigation of Apple’s practices with app developers last month. Epic Games has also filed a complaint with the European Commission (see here for more detail and analysis). Moreover, Epic Games received a split decision from the Competition Appeal Tribunal (CAT or Tribunal), which found the UK not to be the forum for the company to bring an antitrust action against Apple. However, in the same decision, the CAT found that the UK might be the right forum for Epic Games’ suit against Google’s two Irish subsidiaries but not against Google and Alphabet. Epic Games filed suit against both Apple and Google for their app store practices in United States federal court, and a trial in the Apple case could start next month. In announcing this most recent action, the company contended:
    • The complaint alleges that Apple’s anticompetitive behavior and prohibitively restrictive rules governing the distribution of apps and payment processing constitute a clear violation of the UK Competition Act 1998. It also illustrates Apple’s monopolistic practices, which forbid users and developers respectively from acquiring or distributing apps through marketplaces other than Apple’s App Store, while simultaneously forcing any in-app purchase to be processed through Apple’s own payment system.
    • Epic Games added:
      • Epic has also commenced legal proceedings against Apple in the U.S. and Australia, and has filed an antitrust complaint against Apple in the European Union in support of its ongoing investigation into Apple’s harmful App Store conduct.
      • Similar to the cases Epic has filed around the globe, the company is not seeking monetary damages. Instead, Epic is pursuing regulatory remedies that will prevent Apple’s intentional distortion and manipulation of the market and ensure fair access and competition for consumers and developers in the UK and around the world. 
    • In the press release announcing the filing of a suit in an Australian federal court, Epic Games claimed:
      • In the claim Epic alleges that Google’s anti-competitive conduct breaches the Australian Consumer Law as well as various sections of the Competition and Consumer Act 2010 (Cth). The company states that Google abuses its control over the Android operating system (“Android OS”), restricting competition in payment processing and app distribution on the Google Play Store. This harmful conduct stifles innovation, reduces consumer choice and inflates prices. 
  • The American Bar Association (ABA) wrote Secretary of Housing and Urban Development Marcia Fudge, urging her to reverse a Trump Administration interpretive rule that would, in the organization’s view, make it harder for minorities to challenge discriminatory algorithms in housing. The ABA argued:
    • On September 24, 2020, HUD issued an update to an interpretive rule concerning section 100.500(d)(2)(i) of the FHA (“the 2020 FHA Rule”) that potentially permits hidden discrimination to occur against millions of Black and Hispanic families seeking access to mortgage loans on fair terms. Specifically, the 2020 FHA Rule creates a defense to a discrimination claim under the FHA where the “predictive analysis” tools used were not “overly restrictive on a protected class” or where they “accurately assessed risk.” By creating these threshold defenses, the Rule would make it all but impossible to challenge discriminatory algorithms. Any algorithm has the potential to perpetuate biases. Therefore, the question should not be whether it “accurately assessed risk,” but whether it reliably prevents the systemic violation of an applicant’s legal rights. Instead of creating a legal defense based on accuracy in predicting risk, the agency should subject lenders to a legal presumption of violation any time its algorithm produces disparities involving protected classes of Americans. That presumption can be overcome by the lender’s ability to demonstrate that every exclusion of an individual was based on factors already recognized as permissible. That is a high bar, and it should be, given what is at stake.
    • Lenders should not be empowered to use mathematical models that have the effect of discriminating on prohibited bases and in ways that humans would not be permitted to under the Disparate Impact Standard of the FHA. While algorithms can be designed in ways to account for and counteract existing biases, many of the systems are not designed to do so. Indeed, many algorithmic decision-making systems have been shown to replicate, intensify, or create new biases at key decision-making points in the lending process. These systems threaten to amplify already widespread and pernicious practices of racial and ethnic discrimination in mortgage lending.
    • When businesses adopt and use algorithms to achieve ends otherwise prohibited under the FHA, the individuals subject to discriminatory acts should have recourse. Instead, the 2020 FHA rule makes it nearly impossible for victims of algorithm bias to hold companies accountable for demonstrable harm. This lack of accountability, in turn, may incentivize the adoption of tools that potentially threaten the dignity and economic stability of individuals in minority communities who are simply seeking housing security for their families.
  • A group of civil rights organizations wrote officials in five United States (U.S.) jurisdictions charged with protecting consumer welfare, asking that they investigate the major pharmacy and grocery chains that are vaccinating Americans and requiring participants to register online and provide sensitive personal information. The coalition wrote officials in California, Illinois, Massachusetts, New York, and the District of Columbia that some of these chains “are requiring patients seeking access to the vaccine to register through their existing customer portals, which in turn exposes patients to broad personal data collection and marketing.” The organizations asserted:
    • We are specifically concerned about the collection and use of personal data for commercial purposes unrelated to the administration of these life-saving vaccines. Pharmacies are requiring patients seeking access to the vaccine to register through their existing customer portals, which in turn exposes patients to broad personal data collection and marketing. The line between the public health task of administering vaccines and the commercial practices of the administering companies is being blurred. State and federal governments share a common public health goal of promoting vaccinations and should seek to remove barriers to accessing the COVID-19 vaccine and promote public trust in the distribution process.
    • We request that your offices promptly investigate the data collection practices for the vaccine distribution programs…Pharmacies should not require individuals to submit to systems that broadly use the personal information of patients as a prerequisite for receiving the COVID-19 vaccine. There are already reports that some pharmacies plan to use this personal information to market their products to COVID-19 vaccine recipients. Patients should not have to trade unrestricted use of their sensitive personal information for a life-saving vaccine. We believe these practices are unfair and deceptive and should be halted immediately. In order to promote a robust and effective vaccine distribution program, we believe pharmacies should do the following: refrain from automatically enrolling vaccine registrants in their marketing databases; collect and use only the minimum personal data that is necessary to facilitate administration of the vaccine; and segregate vaccine registrant data from all commercial and marketing databases.
  • In a blog posting, New Zealand’s Office of the Privacy Commissioner (OPC) articulated privacy concerns about the use of proposed COVID-19 vaccine passports. The OPC is calling for privacy by design in any technology or application, full disclosure to users, limits on how these data are used (given that health data could end up being used for travel purposes), and serious attention to cybersecurity concerns. The OPC asserted its misgivings about vaccine passports:
    • As you might expect, there are some major questions to think about. Do governments intend to use these for international travel or merely for giving holders greater freedoms? If so, will governments opt for immunity passports (to also include those who have had the virus and recovered) or strictly vaccination passports? Furthermore, what use will an immunity passport be should the vaccines prove ineffective against one of the many new strains that seems to be cropping up every day?
    • From a human rights perspective, vaccination passports could be seen as problematic. Recently, the Royal Society voiced its unease in a report about the potential for discrimination against pregnant people, the young (who will be vaccinated last), and others who cannot be vaccinated for medical reasons. It added that it will be crucial for governments and developers to be clear about whether these passports will be used for greater domestic freedoms or international travel.
    • What about privacy? Is privacy being taken seriously in the development, deployment, regulation and running of these Covid-19 passports?
    • According to a recent report by the open internet advocacy website TOP10VPN, 82 percent of the Covid-19 certificate apps in operation globally have inadequate privacy policies. Moreover, 41 percent can monitor users’ precise location. The report’s author, Samuel Woodhams, is concerned that these passports have been rushed and lack the protections you would expect considering the sensitivity of the information they collect. He says many of these apps have generic, boilerplate privacy policies that don’t specify what information is being collected and don’t tell you how long it will be stored.
    • International experts have voiced their concerns. The Ada Lovelace Institute convened an urgent expert deliberation to consider how governments should act in response to vaccination passports. In its 17 February report, the panel considered that there was a real future risk of normalising health status surveillance through long-term infrastructure created to respond to a temporary crisis. The panel also expressed the view that digital identity systems could be introduced as part of an emergency infrastructure, but used for a different or expanded purpose, or what is known as “scope creep”.
    • There is serious concern that people’s personal information might be used more broadly than intended. Information might flow to third parties, and personal data may be repurposed.

Further Reading

  • “Platforms vs. PhDs: How tech giants court and crush the people who study them” By Issie Lapowsky — Protocol. An interesting piece on the delicate dance between social media platforms and researchers. Facebook, Google, Twitter, and others have made some data available to academics studying online extremism, disinformation, and other topics potentially unflattering for the platforms. The social media giants often make data available under conditions of limited utility for research, and so academics have employed workarounds, some of which the platforms claim violate their terms of service. Frequently these companies argue the researchers are violating the privacy of users, a claim somehow made with a straight face considering the harvesting of personal data some of these companies engage in. My suspicion is the companies do not want researchers getting a free run of the data for fear of what they would turn up that would subsequently be used against them.
  • “Clubhouse is being investigated by a French privacy watchdog group” By Shelby Brown — c/net. The Commission nationale de l’informatique et des libertés (CNIL), France’s data protection authority (DPA), is investigating Clubhouse, the newest Silicon Valley social media sensation, for possible violations of the General Data Protection Regulation (GDPR). Because Clubhouse does not have a European Union presence, CNIL does not need to defer to the DPA of the EU member state where the company is headquartered. CNIL revealed other DPAs are also investigating.
  • “TikTok banning some accounts in Myanmar in attempt to stop the spread of violent videos” By Kim Lyons — The Verge. The short video platform belatedly cracked down on violent content and misinformation posted mostly by Myanmar’s military during the ongoing coup d’état. It remains to be seen whether TikTok will ban the military and other state entities like Facebook did when it removed them from Facebook and Instagram.
  • “How a Stabbing in Israel Echoes Through the Fight Over Online Speech” By David McCabe — The New York Times. A lawsuit that ultimately failed to hold Facebook and its algorithms accountable for their alleged role in the death of an American in Israel may succeed in spurring changes to either 47 USC 230 or case law about the treatment of algorithmic content.
  • “Amazon’s Twitter Army Was Handpicked For “Great Sense Of Humor,” Leaked Document Reveals” By Ken Klippenstein — The Intercept. There seems to be evidence that Amazon workers were coached and groomed as defenders of the company, especially against Senator Bernie Sanders (I-VT) and those who echoed his criticism of the company’s pay and labor practices. Internal documents showed the effort was planned and coordinated, not organic and spontaneous as the company likely hoped it would appear.

Coming Events

  • The Senate Appropriations Committee’s Commerce, Justice, Science, and Related Agencies Subcommittee may hold a hearing on FY 2022 budget request for the National Science Foundation and the competitiveness of the United States on 13 April.
  • The Senate Appropriations Committee’s Defense Subcommittee may hold a hearing on the Department of Defense’s innovation and research on 13 April.
  • On 14 April, the Senate Intelligence Committee will hold open and closed hearings with the heads of the major United States intelligence agencies and Director of National Intelligence Avril Haines on worldwide threats.
  • The House Veterans’ Affairs Committee’s Technology Modernization Subcommittee may hold a hearing on the Department of Veterans Affairs’ Electronic Health Record Modernization Program on 14 April.
  • On 15 April, the House Intelligence Committee will hold a hearing with the heads of the major United States intelligence agencies and Director of National Intelligence Avril Haines on worldwide threats.
  • The Federal Communications Commission (FCC) will hold an open meeting on 22 April with this draft agenda:
    • Text-to-988. The Commission will consider a Further Notice of Proposed Rulemaking to increase the effectiveness of the National Suicide Prevention Lifeline by proposing to require covered text providers to support text messaging to 988. (WC Docket No. 18-336)
    • Commercial Space Launch Operations. The Commission will consider a Report and Order and Further Notice of Proposed Rulemaking that would adopt a new spectrum allocation for commercial space launch operations and seek comment on additional allocations and service rules. (ET Docket No. 13-115)
    • Wireless Microphones. The Commission will consider a Notice of Proposed Rulemaking that proposes to revise the technical rules for Part 74 low-power auxiliary station (LPAS) devices to permit a recently developed, and more efficient, type of wireless microphone system. (RM-11821; ET Docket No. 21-115)
    • Improving 911 Reliability. The Commission will consider a Third Notice of Proposed Rulemaking to promote public safety by ensuring that 911 call centers and consumers receive timely and useful notifications of disruptions to 911 service. (PS Docket Nos. 13-75, 15-80; ET Docket No. 04-35)
    • Concluding the 800 MHz Band Reconfiguration. The Commission will consider an Order to conclude its 800 MHz rebanding program due to the successful fulfillment of this public safety mandate. (WT Docket No. 02-55)
    • Enhancing Transparency of Foreign Government-Sponsored Programming. The Commission will consider a Report and Order to require clear disclosures for broadcast programming that is sponsored, paid for, or furnished by a foreign government or its representative. (MB Docket No. 20-299)
    • Imposing Application Cap in Upcoming NCE FM Filing Window. The Commission will consider a Public Notice to impose a limit of ten applications filed by any party in the upcoming 2021 filing window for new noncommercial educational FM stations. (MB Docket No. 20-343)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.
  • The Federal Trade Commission (FTC) will hold a workshop titled “Bringing Dark Patterns to Light” on 29 April.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Roger Starnes Sr on Unsplash
