Other Developments
- Two provisions that would address the sharing of non-consensual sexual material were attached to the “Violence Against Women Reauthorization Act of 2021” (H.R.1620), which the House passed. The bill’s prospects in the Senate are dim, however. Nonetheless, as summarized by House Majority Whip James Clyburn’s (D-SC) office:
- The amendment offered by Delegate Stacey Plaskett (D-VI) “[e]stablishes a civil cause of action against a person that discloses an intimate image of an individual without the depicted individual’s consent, if the person disclosed the image with knowledge of or reckless disregard for such lack of consent.”
- The amendment offered by Representatives Jackie Speier (D-CA) and John Katko (R-NY) “[a]dds the Stopping Harmful Image Exploitation and Limiting Distribution Act (the “SHIELD Act”) to the bill, which makes the malicious sharing of private, intimate images, known as “nonconsensual pornography” or “revenge porn” unlawful.”
- In their press release, Speier and Katko asserted “[t]he SHIELD Act would:
- Ensure that the Department of Justice has an appropriate and effective tool to address these serious privacy violations.
- Narrowly establish federal criminal liability for individuals who share private, sexually explicit or nude images without consent.
- Strike an effective balance between protecting the victims of these serious privacy violations and ensuring that vibrant online speech is not burdened.
- Prosecution under the SHIELD Act would require proving that the defendant was aware of a substantial risk that the person depicted in an image expected the image would remain private and that the person did not consent to the image’s distribution. A prosecution would also have to prove that no reasonable person would consider the shared image to touch on a matter of public concern.
- Though 46 states and the District of Columbia have enacted statutes in this area, they offer incomplete and inconsistent coverage.
- During debate, Plaskett stated the amendments (she appeared to be referring to both) are
- the culmination of a years’ long effort to authorize explicit Federal legal action against the nonconsensual disclosure and public transmission of intimate visual imagery, following the lead of dozens of the States.
- Nobody, under any circumstances, should have private intimate imagery shared on the internet without their consent. The pain that is caused by perpetrators who knowingly share sexually explicit or nude images of someone without their consent has ruined lives and, in many instances, the lives of their family as well. It is weaponized to humiliate, harass, intimidate, and even exploit people who are primarily women.
- I am proud to support this amendment that will give prosecutors and victims important tools to bring perpetrators to justice and further deter offenders from committing such a terrible and egregious violation of privacy.
- The United Nations (UN) Open-ended Working Group (OEWG) on developments in the field of information and telecommunications (ICT) in the context of international security has issued its “Final Substantive Report.” The OEWG is one of two UN bodies created by resolution to study international issues in technology and cybersecurity. Its creation was championed by the Russian Federation, the People’s Republic of China (PRC), and other nations dissatisfied with the body established at the urging of the United States (U.S.) and its allies, the Group of Governmental Experts (GGE). The U.S. and its allies, in turn, saw the OEWG as an attempt to put the UN’s imprimatur on the policy priorities of Russia, the PRC, and like-minded states, including allowing each nation to police the internet within its borders in ways human rights activists claim violate international law. The OEWG’s recommendations include:
- States should not conduct or knowingly support ICT activity contrary to their obligations under international law that intentionally damages critical infrastructure or otherwise impairs the use and operation of critical infrastructure to provide services to the public. Furthermore, States should continue to strengthen measures to protect all critical infrastructure from ICT threats, and increase exchanges on best practices with regard to critical infrastructure protection.
- States in a position to do so continue to support, in a neutral and objective manner, additional efforts to build capacity, in accordance with the principles contained in paragraph 56 of this report, in the areas of international law, national legislation and policy, in order for all States to contribute to building common understandings of how international law applies to the use of ICTs by States, and to contribute to building consensus within the international community.
- The OEWG chair also issued a summary of the OEWG’s sessions:
- All the sessions of the OEWG were characterized by substantive, interactive exchanges among States, as well as with civil society, the private sector, academia and the technical community. The commitment demonstrated by States and other stakeholders throughout the work of the OEWG, with growing engagement even as some of its meetings transitioned to a virtual format, is an undeniable indication of the increasingly universal relevance of the topics under its consideration as well as the growing recognition of the urgent need to collectively address the threats to international security posed by the malicious use of ICTs.
- This summary is issued under the responsibility of the Chair and reflects his understanding of the main points that were discussed during the meetings of the Open-ended Working Group. It may not reflect the full contributions of all delegations and should not be seen as reflecting the consensus view of States on any specific points covered in it.
- The U.S. Department of State issued a statement on the final report and asserted:
- This final report is not perfect in our opinion. And we continue to have reservations about the need for the new OEWG to run until 2025. However, we recognize that we are not alone in our disappointment; many states have said they wanted to see more issues important to them addressed in the report. We therefore support the initiative to share a two-part chair’s summary on the extent of our discussion and the many proposals from member states.
- As we have indicated throughout our negotiations, the United States cannot subscribe to calls for new legal obligations. If some states refuse to explicitly affirm essential elements of existing international law and are unwilling to comply with the affirmed voluntary norms, what possible confidence could we gain from negotiating a new treaty instrument? We remain of the view that ICTs are simply not susceptible to traditional arms control arrangements. It would be futile – and a tremendous distraction – to spend a decade or more negotiating a new legally binding instrument.
- The Federal Communications Commission’s (FCC) Public Safety and Homeland Security Bureau (Bureau) has issued its list of equipment and services “that are deemed to pose an unacceptable risk to the national security of the United States or the security and safety of United States persons,” a step required by a recently enacted law, according to the agency’s public notice. Universal Service Fund monies cannot be used to buy equipment or services from designated firms, and the FCC and Congress are seeking to use a bar on federal funding as a means of stopping internet service providers and telecommunications companies from buying what they consider unsafe and suspect technology. Not surprisingly, all of the equipment and services thus designated are provided by major technology firms from the People’s Republic of China (PRC). The FCC has created a webpage for the list, which currently names Huawei, ZTE, Hytera Communications Corporation, Hangzhou Hikvision Digital Technology Company, and Dahua Technology Company. The Bureau explained:
- The Secure Networks Act (P.L.116-124) became law in March 2020. Among other things, the Secure Networks Act established a process to prohibit the use of federal subsidies to purchase equipment or services deemed to pose national security risks, as well as a reimbursement program that provided for the replacement of communications equipment or services that posed such risks. The Commission has implemented the requirements of the Secure Networks Act in a series of recent orders. Sections 2(b) and (c) of the Secure Networks Act require that the Commission place on the Covered List “any communications equipment or service that poses an unacceptable risk to the national security of the United States or the security and safety of United States persons” based exclusively on any of four sources for such a determination and that such equipment or services possess certain capabilities as enumerated in the statute.
- The Supply Chain Second Report and Order adopted rules governing the publication of the Covered List and tasked the Bureau with both publishing and maintaining it on the Commission’s website in accordance with the Commission’s rules. These rules contain criteria for inclusion on the Covered List. Further, the Commission specifically found that the Secure Networks Act’s mandate to include telecommunications equipment and services defined in section 889(f)(3) of the John S. McCain National Defense Authorization Act for Fiscal Year 2019 required the inclusion of the following services and equipment on the Covered List: (1) “telecommunications equipment produced or provided by Huawei [Technologies Company] or ZTE [Corporation] capable of the functions outlined in sections 2(b)(2)(A), (B), or (C) of the Secure Networks Act”; and (2) video surveillance and telecommunications equipment produced by Hytera Communications Corporation, Hangzhou Hikvision Digital Technology Company, and Dahua Technology Company “‘to the extent it is used for public safety or security,’ capable of the functions outlined in sections 2(b)(2)(A), (B), or (C) of the Secure Networks Act.”
- The European Union Agency for Cybersecurity (ENISA) issued a new report “EU Cybersecurity Initiatives in the Finance Sector” that “outlines such European cybersecurity initiatives in the sector and it is a first depiction of the complex landscape of initiatives related to cybersecurity at an EU level.”
- ENISA explained:
- In this document, we have included initiatives of European scope, i.e. ones that are implemented in at least two EU Member states. This document focusses on financial entities, EU institutions, bodies and agencies of the Finance sector, as well as the finance community at large. The document was created in an effort to shed light on the initiatives and to guide interested parties in engaging with them and benefit from their produced results. Furthermore, it aims to make cooperation between the initiatives and their different groups work more seamless. The document may facilitate future assessments on the complementarity, overlaps and gaps of the respective initiatives and identify synergies.
- In the agency’s press release, ENISA stated:
- The need for a strengthened cooperation between the key actors of the finance sector at the European level has become urgent now, as the sector faces larger-scale cyber challenges of a more harmful nature.
- The information presented in this document seeks to add more clarity and improve the cooperation between the different groups involved in these initiatives. In presenting to what extent the initiatives complement or overlap with one another, it provides the possibility of identifying potential gaps and existing synergies. It also helps to draw attention to existing initiatives and their results (guidelines, standards, legislation, etc.).
- The European cyber initiatives in the finance sector are grouped according to topics defined in the Cybersecurity Act, namely:
- Development and implementation of policy;
- Information sharing and capacity building;
- Cyber crisis management;
- Awareness-raising and training;
- Standardisation and certification;
- Research and innovation.
- South Africa’s Information Regulator (IR) noted its concern about “the statement released by WhatsApp, detailing changes that a user will face if they ignore Facebook’s terms by the May 15 deadline.” The IR asserted:
- The IR has written to Facebook South Africa and provided an analysis of some of the concerns that it has about the privacy policy of Facebook as it relates to South Africa. For example, it is the IR’s view that the processing of cellphone numbers as accessed on the user’s contact list for a purpose other than the one for which the number was specifically intended at collection, with the aim of linking the information jointly with the information processed by other responsible parties (such as Facebook companies) does not require consent from the data subject, but prior authorisation from the IR.
- Accordingly, WhatsApp cannot without obtaining prior authorisation from the IR in terms of section 57 of Protection of Personal Information Act (POPIA), process any contact information of its users for a purpose other than the one for which the number was specifically intended at collection, with the aim of linking that information jointly with information processed by other Facebook companies.
- The IR has also raised as a central concern that citizens of the European Union will receive significantly higher privacy protection than people in South Africa, and Africa.
- New Zealand’s Office of the Privacy Commissioner (OPC) and the Independent Police Conduct Authority (IPCA) published “the Terms of Reference for their current joint inquiry into Police conduct involving the photographing of members of the public” that will answer, in part, whether this practice violates the nation’s new privacy law. The OPC and IPCA asserted:
- The joint inquiry will focus on the practice of photographing members of the public who were not being detained or suspected of committing an offence.
- The intention to jointly inquire into this practice was announced in December 2020 following substantial media publicity about Police taking photographs of Māori young people in Wairarapa in August 2020. While the photographing of youth remains a key focus of the inquiry, the terms of reference are broadened and include the photographing of other members of the public.
- The key issues for consideration include:
- Whether Police actions with respect to the Wairarapa incidents complied with Police policy, the Privacy Act, and any other legislation
- The extent to which, and the reasons why, Police are photographing members of the public in public places
- The variations in practices in this respect across Police districts
- What Police policy and practice in this area should be, including the extent to which any specific restriction or requirement ought to govern the photography of children and young people
- Any compliance and enforcement actions which are required if it is found that Police breached the privacy of the individuals involved.
- The United Kingdom’s (UK) new Digital Regulation Cooperation Forum (DRCF) issued “its priorities for the coming year, marking a step-change in coordination of regulation across digital and online services” in its workplan for 2021/2022. The DRCF is an entity established in July 2020 by three British regulators, the Competition and Markets Authority (CMA), the Information Commissioner’s Office (ICO), and the Office of Communications (Ofcom), “to ensure a greater level of cooperation, given the unique challenges posed by regulation of online platforms.” In December 2020, these entities recommended that Prime Minister Boris Johnson and his Conservative government remake digital regulation in the UK, especially with respect to competition policy. Specifically, they called for the establishment of a new Digital Markets Unit (DMU) inside the CMA that would be particularly focused on policing potential harm before it occurs. Thus far, this entity has not been established. Moreover, another British regulator, the Financial Conduct Authority (FCA), will join the DRCF as a full member next month. In the workplan, the DRCF explained it “will focus on three priority areas.”
- A. Responding strategically to industry and technological developments: establish joint strategic projects where our cooperation will help to provide clarity for businesses and digital service users; and regulatory coherence. This includes service design, algorithmic processing, digital advertising technologies (with the Advertising Standards Authority (ASA)), and service encryption. We will also jointly horizon scan to identify future areas for cooperation.
- B. Joined-up regulatory approaches: develop approaches for delivering coherent regulatory outcomes where different regulations overlap, such as the ICO’s Age Appropriate Design Code and Ofcom’s approach to regulating video-sharing platforms. This work will also consider how planned new regimes for online regulation may interact with wider existing regulation such as financial regulation, intellectual property rights, and content regulation (including advertising content regulated by the ASA).
- C. Building skills and capabilities: develop practical ways of sharing knowledge, expertise, capabilities and resources, for example in artificial intelligence (AI) and data analysis.
- The DRCF added “we will also be taking steps to strengthen our wider stakeholder engagement and transparency, and to further develop the functioning of the DRCF to support our ambitions:
- D. Building clarity through collective engagement: through our workplan we will seek to build better clarity for our stakeholders, including through our planned joint public documents and engagement with them. We will also use our joint work to strengthen our existing domestic and international engagement, for example by attending international forums to exchange knowledge and share best practice.
- E. Developing the DRCF: we will continue to build the operational capabilities of the DRCF to ensure it is fit for purpose.
- none of your business (noyb), the advocacy organization established by Maximilian Schrems and others, announced that a lawsuit seeking a ruling that Facebook’s consent mechanism under the General Data Protection Regulation (GDPR) is illegal has advanced to Austria’s highest court. The organization revealed it had also requested that the case be referred to the Court of Justice of the European Union (CJEU), the court in which two earlier cases brought by Schrems resulted in the invalidation of two adequacy decisions that had allowed the data of European Union citizens to be transferred to the United States. noyb also framed the appeal in the context of what it characterizes as foot-dragging by Ireland’s Data Protection Commission, the lead supervisory authority over Facebook under the GDPR, in acting against the social media giant. noyb stated:
- As the Austrian Press Agency (APA) and Der Standard report, a case that may determine the legality of Facebook’s business in Europe has reached the Austrian Supreme Court (OGH). Facebook and Mr Schrems have both filed appeals against an earlier judgment by the Higher Regional Court of Vienna (OLG Wien). Among other issues, the alleged “bypass” of the strict GDPR’s consent rules became central in the case. The Supreme Court was asked to refer the case to the European Court of Justice (CJEU) for clarification.
- noyb added:
- Facebook’s “consent bypass”. When the GDPR came into effect, one big benefit was the duty to have a clear opt-in consent when companies want to process user data. In addition to consent, there are five other legal bases to process data under Article 6(1) GDPR. One of these basis is processing that is “necessary for the performance of a contract”. On 25.5.2018 at midnight, when the GDPR became applicable, Facebook has simply named things like “personalized advertisement” in its terms and conditions. Facebook now argues that it has a “duty to provide personalized advertisement” to the users, therefore, it does not need the user’s consent to process his or her personal data.
- The big difference between consent and contract? The GDPR has very strict rules on consent. Users must be fully informed, have a free choice to agree or to disagree and must be able to consent to each type of processing specifically. Users can also withdraw consent at any time and at no costs. Contracts are, however, a matter of each national law and are usually much more flexible. Users must not have understood a contract to be bound, details can be hidden in “terms and conditions” and they may come on a “take it or leave it” basis.
- Inactivity by Data Protection Authority. The same matter was also raised by noyb before the Irish Data Protection Commission (DPC) more than 2.5 years ago. The three investigations in the alleged “forced consent” were however moving slowly and the core issue of a “bypass” was found to be out of the scope of the procedure. In the Austrian case Facebook argued that the “bypass” was implemented after it had ten meetings with the DPC that developed the “bypass” with the Social Media Company. The DPC has denied this, but refused to disclose details of the ten confidential meetings with Facebook in the run-up to the GDPR. Two of the Irish cases are now the matter of a Judicial Review before the Irish High Court, the third case is on appeal before the Austrian Federal Administrative Court (BVwG).
- Promising track record. The Austrian Supreme Court has previously referred similar cases to the CJEU (see e.g. C-18/18 – Glawischnig-Piesczek, C-498/16 – Schrems). The CJEU in turn has previously decided mostly against Facebook in privacy matters (see e.g. C-40/17 – Fashion ID or C-210/16 – Wirtschaftsakademie Schleswig-Holstein), most notably two cases against Facebook on EU-US data transfers dubbed “Schrems I” and “Schrems II”. It is therefore not unlikely that there are serious troubles ahead for Facebook. The Austrian Supreme Court does not conduct oral hearings and usually decides about references to the CJEU in a matter of months and in a written decision. The CJEU itself however takes up to 1.5 years to conduct all hearings and reach a decision.
Further Reading
- “How Amazon Crushes Unions” By David Streitfeld — The New York Times. In the face of the pending unionization vote at one of its Bessemer, Alabama facilities, Amazon’s past labor practices are receiving new scrutiny, as this article details a settlement with the National Labor Relations Board (NLRB) over its allegedly illegal practices in stopping a union from forming at a Virginia facility. In Virginia, Amazon pulled out all the stops, including intimidating, surveilling, and firing employees, in its ultimately successful effort to fend off unionization in 2014. Of course, New York Attorney General Letitia James recently “filed a lawsuit against Amazon over its failures to provide adequate health and safety measures for employees at the company’s New York facilities and Amazon’s retaliatory actions against multiple employees amidst the COVID-19 pandemic” four days after Amazon sued James seeking an injunction to bar her from regulating its New York facilities on a number of grounds.
- “America’s Drinking Water Is Surprisingly Easy to Poison” by Peter Elkind and Jack Gillum — ProPublica. The same poor cyber and digital hygiene that plagues other companies (e.g., SolarWinds allegedly using “solarwinds123” as a password) afflicts the United States’ (U.S.) water supply. A near miss in an attempted hack and poisoning of the water in Oldsmar, Florida has cast a harsh light on a sector that is apparently lightly regulated with respect to cybersecurity. Again, the U.S. government has gone with a trust and almost never verify model. The “America’s Water Infrastructure Act of 2018” (P.L. 115-270) merely requires water systems to inform the Environmental Protection Agency (EPA) that risk assessments have been performed, not to actually submit the assessments. And even if they did, the EPA is understaffed and possibly not focused on cybersecurity anyway. Experts are saying extreme good fortune and dumb luck in Oldsmar averted a very bad outcome. And it is not like this has not happened before. In 2000, in Maroochy Shire, Queensland, Australia, a former contractor hacked into a wastewater system and released more than 264,000 gallons of sewage in an act of revenge. This state of affairs should not surprise anyone, as multiple warnings have been issued over the years, most recently in last year’s Cyberspace Solarium Commission final report, which warned: “even though our water supply is known to be a target for malign actors, water utilities remain largely ill-prepared to defend their networks from cyber-enabled disruption.”[i]
- “‘Deepfake is the future of content creation’” By Bernd Debusmann Jr — BBC. There is another side to deepfakes: the same technology that enables life-like fakes may revolutionize and disrupt the news and entertainment industries. A South Korean news station recently used deepfake technology to present a synthetic version of its news anchor that some viewers found strikingly similar to the real person. A range of legitimate uses is being developed.
- “Russian Disinformation Campaign Aims to Undermine Confidence in Pfizer, Other Covid-19 Vaccines, U.S. Officials Say” By Michael R. Gordon and Dustin Volz — The Wall Street Journal. According to United States (U.S.) government officials, Russian intelligence services are seeking to discredit western COVID-19 vaccines through an online disinformation campaign in order to boost the Russian Federation’s Sputnik V vaccine. They are using small media outlets controlled or manipulated by the SVR or FSB to spread rumors or to amplify and distort legitimate news, all in the name of making western vaccines, notably the Pfizer-developed vaccine, look bad. The Alliance for Securing Democracy, a part of the German Marshall Fund, echoes this claim in its detailed analysis of Russian, Chinese, and Iranian lies, misinformation, and disinformation about the COVID-19 responses, including vaccine development and rollout, of democratic nations like the U.S.
- “Underpaid Workers Are Being Forced to Train Biased AI on Mechanical Turk” By Aliide Naylor — Vice’s Motherboard. Remote workers around the globe are essentially doing piecework on Amazon’s Mechanical Turk platform to help train artificial intelligence (AI) systems and are often paid far less than the minimum wage in the United States. Worse still, they feel pressure to conform to the majority sentiment about an image or idea and register the same response. For example, in evaluating paintings, these workers feel inclined to provide answers similar to others’ or risk not being paid for their work or not landing another assignment. Reports like these raise questions about bias in algorithms and AI developed through such methods, to say nothing of the potential exploitation of workers.
Coming Events
- On 18 March, the Senate Homeland Security and Governmental Affairs Committee will hold a hearing titled “Understanding and Responding to the SolarWinds Supply Chain Attack: The Federal Perspective.”
- On 18 March, the House Judiciary Committee’s Antitrust, Commercial, and Administrative Law Subcommittee will hold a hearing titled “Reviving Competition, Part 3: Strengthening the Laws to Address Monopoly Power.”
- The House Financial Services Committee’s Diversity and Inclusion Subcommittee will hold a hearing titled “By the Numbers: How Diversity Data Can Measure Commitment to Diversity, Equity and Inclusion.”
- On 18 March, the Senate Finance Committee will hold a hearing titled “Fighting Forced Labor: Closing Loopholes and Improving Customs Enforcement to Mandate Clean Supply Chains and Protect Workers” with these witnesses:
- Joseph Wrona, Member, United Steelworkers
- Martina E. Vandenberg, Founder and President, Human Trafficking Legal Center
- Julia K. Hughes, President, United States Fashion Industry Association
- Leonardo Bonanni, Ph.D., Founder and CEO, Sourcemap
- On 19 March, the House Armed Services Committee’s Cyber, Innovative Technologies, and Information Systems Subcommittee will hold a hearing titled “Department of Defense Electromagnetic Spectrum Operations: Challenges and Opportunities in the Invisible Battlespace,” with these witnesses:
- Dr. Joseph Kirschbaum, Director, Government Accountability Office
- Bryan Clark, Senior Fellow, Hudson Institute
- Dr. William Conley, Chief Technology Officer, Mercury Systems, Inc.
- The U.S.-China Economic and Security Review Commission will hold a hearing titled “U.S. Investment in China’s Capital Markets and Military-Industrial Complex” on 19 March that “will examine the Chinese government’s use of capital markets to advance its technology and defense capabilities and evaluate the risks of U.S. investors’ capital being leveraged for such ends:
- The first panel will examine the evolving role of the state in China’s capital markets, including the Chinese Communist Party’s involvement in corporate governance.
- The second panel will review China’s financial opening and U.S. and foreign investor participation in China’s capital markets.
- The third panel will assess U.S. national security risks posed by investment in Chinese companies.
- The fourth panel will evaluate U.S. legal authority and current restrictions on outbound investment to China’s capital markets.
- The House Energy and Commerce Committee’s Communications and Technology and Consumer Protection and Commerce Subcommittees will hold a joint hearing on 25 March “on misinformation and disinformation plaguing online platforms” with these witnesses: Facebook CEO Mark Zuckerberg, Google CEO Sundar Pichai, and Twitter CEO Jack Dorsey.
- The Federal Trade Commission (FTC) will hold a workshop titled “Bringing Dark Patterns to Light” on 29 April.
- On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.
[i] Page 62, Cyberspace Solarium Commission’s (CSC) Final Report, March 2020.
© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.