Other Developments, Further Reading, and Coming Events (26 July 2021)

Subscribe to my newsletter, The Wavelength, if you want the content on my blog delivered to your inbox four times a week before it’s posted here.

Other Developments

  • The Federal Trade Commission (FTC) held its second monthly meeting and voted to change agency policy on two issues: the right to repair and mergers.
    • In its press release on its newly adopted right to repair statement of policy, the agency explained:
      • The Federal Trade Commission today unanimously voted to ramp up law enforcement against repair restrictions that prevent small businesses, workers, consumers, and even government entities from fixing their own products. The policy statement adopted today is aimed at manufacturers’ practices that make it extremely difficult for purchasers to repair their products or shop around for other service providers to do it for them. By enforcing against restrictions that violate antitrust or consumer protection laws, the Commission is taking important steps to restore the right to repair.
      • In May, the FTC released a report to Congress that concluded that manufacturers use a variety of methods—such as using adhesives that make parts difficult to replace, limiting the availability of parts and tools, or making diagnostic software unavailable—that have made consumer products harder to fix and maintain. The policy statement notes that such restrictions on repairs of devices, equipment, and other products have increased the burden on consumers and businesses. In addition, manufacturers and sellers may be restricting competition for repairs in a number of ways that might violate the law.
      • In the policy statement, the Commission said it would target repair restrictions that violate antitrust laws enforced by the FTC or the FTC Act’s prohibitions on unfair or deceptive acts or practices. The Commission also urged the public to submit complaints of violations of the Magnuson-Moss Warranty Act, which prohibits, among other things, tying a consumer’s product warranty to the use of a specific service provider or product, unless the FTC has issued a waiver.
      • The Commission voted 5-0 to approve the policy statement during an open Commission meeting live streamed to its website. Chair Lina Khan issued a statement. Commissioner Rohit Chopra issued a separate statement.
    • The FTC also rescinded a 1995 policy statement on mergers and explained:
      • The Federal Trade Commission voted in an open Commission meeting to rescind a 1995 policy statement that made it more difficult and burdensome to deter problematic mergers and acquisitions. The 1995 Policy Statement on Prior Approval and Prior Notice Provisions ended the Commission’s longstanding practice of requiring parties that proposed unlawful mergers to receive prior approval and give prior notice for future transactions. By rescinding this policy statement, the FTC regains a valuable law enforcement tool.
      • Prior to 1995, the Commission required all companies that had violated the law in a previous merger to obtain prior approval by the FTC for any future transaction in at least the same product and geographic market for which a violation was alleged. In 1995, the Commission decided to do away with the requirement, based on the presumption that the Hart-Scott-Rodino premerger notification requirements would suffice. The resulting policy statement required prior approval and prior notice provisions only when there was a “credible risk” of an unlawful merger, with no regard for market conditions or a company’s prior actions.
      • Since the 1995 Policy Statement was implemented, the Commission has been forced to re-review the same transaction on numerous occasions at considerable expense. The FTC twice litigated (and won) legal challenges to Staples’ acquisition of Office Depot. Other industries involving FTC re-review of the same deal include gasoline retailing and wholesaling, gasoline import terminaling, hot oil used to process aluminum, and industrial chemicals. Just last week parties to a transaction involving the same pipelines in Utah abandoned a transaction after a lengthy Commission review, where the Commission had previously rejected the same combination.
      • The FTC is significantly under-resourced and its staff count remains roughly 50 percent less than it was in 1980, at a time when the economy was many times smaller in size than it is now. Today, as the Commission is handling a surge in merger filings, reversing the misconceived 1995 policy will stop repeat offenders – and the illegal mergers they propose that siphon resources and staff – while preserving competition in the markets.
      • The Commission voted 3-2 to rescind the policy statement in an open Commission meeting live streamed to its website.
  • California Attorney General Rob Bonta announced “successful enforcement efforts and urged more Californians to take advantage of their new rights” under the current privacy law, the California Consumer Privacy Act (CCPA). Bonta also updated the CCPA FAQs to clarify that the Global Privacy Control’s universal opt-out signal must be honored under the CCPA. The California Privacy Rights Act (CPRA) of 2020 becomes effective on 1 January 2023. Bonta asserted that “upon receiving a notice of alleged violation, 75% of businesses acted to come into compliance within the 30-day statutory cure period…[and] [t]he remaining 25% of businesses that received a notice of alleged violation are either within the 30-day cure period or are under active investigation.” He added that “the California Department of Justice is seeing a wide range of numbers of consumer requests reported by businesses as required under the law….[and] [a]mong similarly sized and scoped companies, some have reported requests in the millions while others in the hundreds.” Bonta also “launched a new online tool that allows consumers to directly notify businesses of potential violations.” Bonta further explained:
    • On July 1, 2020, the California Department of Justice began enforcing the CCPA by notifying businesses found not to be in compliance with the law. Under the CCPA, businesses that received notices had 30 days to cure or fix the alleged violation before an enforcement action could be initiated. Notices to cure have been issued to entities including data brokers, marketing companies, businesses handling children’s information, media outlets, and online retailers. Examples of notices to cure included:
      • A business that manufactures and sells cars failed to notify consumers of the use of personal information when collecting personal information from consumers seeking to test drive vehicles at a dealership location, in addition to other omissions in its privacy policy. After being notified of alleged noncompliance, the business implemented a notice at collection for personal information received in connection with test drives and updated its privacy policy to include required information.  
      • A grocery chain required consumers to provide personal information in exchange for participation in its company loyalty programs. The company did not provide a Notice of Financial Incentive to participating consumers. After being notified of alleged noncompliance, the company amended its privacy policy to include a Notice of Financial Incentive. 
      • A social media app was not timely responding to CCPA requests, and users publicly complained that they were not receiving notice that their CCPA requests had been received or effectuated. The business explained its response processes and submitted detailed plans showing that it updated its CCPA consumer response procedures to include timely receipt confirmations and responses to future requests.
      • An online dating platform that collected and sold personal information did not have a “Do Not Sell My Personal Information” link on its homepage and disclosed that a user clicking an “accept sharing” button when creating a new account was sufficient to establish blanket consent to sell personal information. After being notified of alleged noncompliance, the business added a clear and conspicuous “Do Not Sell My Personal Information” link and updated its privacy policy with compliant sales disclosures.  
    • Attorney General Bonta today also launched a new online Consumer Privacy Tool that allows consumers to directly notify businesses that do not have a clear and easy-to-find “Do Not Sell My Personal Information” link on their homepage. As part of the CCPA, businesses are required to have a link to their privacy policy on their website at the bottom of the homepage. Businesses that sell personal information about consumers must also include a “Do Not Sell My Personal Information” link on their websites or mobile apps. The tool, available here, asks guided questions to walk consumers through the basic elements of the CCPA before generating a notification that the user can then email to the business. This email may trigger the 30-day period for the business to cure their violation of the law, which is a prerequisite to the Attorney General bringing an enforcement action. The tool does not constitute legal advice. 
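    The Global Privacy Control signal that Bonta’s updated FAQs say must be honored is, per the published GPC proposal, conveyed by participating browsers as a `Sec-GPC: 1` request header (and a `navigator.globalPrivacyControl` JavaScript property). A minimal sketch of how a covered business might detect the signal server-side follows; the function name and the plain dict-of-headers interface are illustrative assumptions, not part of the spec:

    ```python
    def honors_gpc(headers: dict) -> bool:
        """Return True if the request carries a Global Privacy Control
        opt-out signal (the Sec-GPC header set to "1"). A business subject
        to the CCPA should then treat the request as a do-not-sell opt-out
        for that consumer."""
        # HTTP header names are case-insensitive, so normalize before lookup.
        normalized = {k.lower(): v.strip() for k, v in headers.items()}
        return normalized.get("sec-gpc") == "1"
    ```

    For example, `honors_gpc({"Sec-GPC": "1"})` returns `True`, while a request without the header returns `False`.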
  • The European Data Protection Board (EDPB) adopted final guidelines on virtual voice assistants (VVA). The EDPB stated:
    • Data controllers providing VVA services and their processors therefore have to consider both the GDPR and the e-Privacy Directive.
    • These guidelines identify some of the most relevant compliance challenges and provide recommendations to relevant stakeholders on how to address them.
    • Data controllers providing VVA services through screenless terminal devices must still inform users according to the GDPR when setting up the VVA or installing or using a VVA app for the first time.
    • Consequently, the EDPB recommends that VVA providers/designers and developers build voice-based interfaces to facilitate providing the mandatory information.
    • Currently, all VVAs require at least one user to register in the service. Following the obligation of data protection by design and by default, VVA providers/designers and developers should consider the necessity of having a registered user for each of their functionalities.
    • The user accounts employed by many VVA designers bundle the VVA service with other services such as email or video streaming. The EDPB considers that data controllers should refrain from such practices, as they involve the use of lengthy and complex privacy policies that would not comply with the GDPR’s transparency principle. The guidelines consider four of the most common purposes for which VVAs process personal data: executing requests, improving the VVA machine learning model, biometric identification, and profiling for personalized content or advertising.
    • Insofar as VVA data is processed in order to execute the user’s requests, i.e., as strictly necessary to provide a service requested by the user, data controllers are exempted from the requirement of prior consent under Article 5(3) of the e-Privacy Directive.
    • Conversely, the consent required by Article 5(3) of the e-Privacy Directive would be necessary for storing or gaining access to information for any purpose other than executing users’ requests.
    • Some VVA services retain personal data until their users require their deletion. This is not in line with the storage limitation principle. VVAs should store data for no longer than is necessary for the purposes for which the personal data are processed.
    • If a data controller becomes aware (e.g. due to quality review processes) of the accidental collection of personal data, they should verify that there is a valid legal basis for each purpose of processing of such data. Otherwise, the accidentally collected data should be deleted.
    • VVAs may process data of multiple data subjects. VVA providers/designers should therefore implement access control mechanisms to ensure personal data confidentiality, integrity, and availability. However, some traditional access control mechanisms such as passwords are not fit for the VVA context since they would have to be spoken aloud. The guidelines provide some considerations in this regard, including a section specific to the processing of special categories of data for biometric identification.
    • VVA providers/designers should consider that when collecting a user’s voice, the recording might contain other individuals’ voices or data, such as background noise, that is not necessary for the service.
    • Whenever possible, VVA designers should therefore consider technologies that filter out the unnecessary data and ensure that only the user’s voice is recorded.
    • When evaluating the need for a Data Protection Impact Assessment (DPIA), the EDPB considers that it is very likely that VVA services fall into the categories and conditions identified as requiring a DPIA.
  • The United States Surgeon General Vivek Murthy issued “the first Surgeon General’s Advisory of this Administration to warn the American public about the urgent threat of health misinformation.” In a press release, the Department of Health and Human Services (HHS) claimed:
    • Health misinformation, including disinformation, has threatened the U.S. response to COVID-19 and continues to prevent Americans from getting vaccinated, prolonging the pandemic and putting lives at risk. The advisory encourages technology and social media companies to take more responsibility to stop the online spread of health misinformation.
    • During the COVID-19 pandemic, Americans have been exposed to a wide range of misinformation about masks and social distancing, treatments, and vaccines. As of late May, 67% of unvaccinated adults had heard at least one COVID-19 vaccine myth and either believed it to be true or were not sure of its veracity. Health misinformation has already caused significant harm, dividing families and communities and undermining vaccination efforts. An analysis of millions of social media posts found that false news stories were 70 percent more likely to be shared than true stories. And a recent study showed that even brief exposure to misinformation made people less likely to want a COVID-19 vaccine.
    • Health misinformation is information that is false, inaccurate, or misleading according to the best available evidence. It is not a recent phenomenon: persistent rumors about HIV/AIDS have for decades undermined efforts to reduce infection rates in the U.S. During the Ebola epidemic, misinformation spread rapidly on social media. A 2014 study found that Ebola-related tweets that contained misinformation were more likely to be politically charged and have content promoting discord.
    • This advisory lays out how the nation can confront health misinformation by helping individuals, families, and communities better identify and limit its spread, and issues a number of ways institutions in education, media, medicine, research, and government stakeholders can approach this issue. It also underscores the urgent need for technology and social media companies to address the way misinformation and disinformation spread on their platforms, threatening people’s health.
  • Former President Donald Trump has continued his war against social media platforms through the filing of three class action suits against Facebook, Twitter, and Google. In a Wall Street Journal op-ed, Trump made his case:
    • One of the gravest threats to our democracy today is a powerful group of Big Tech corporations that have teamed up with government to censor the free speech of the American people. This is not only wrong—it is unconstitutional. To restore free speech for myself and for every American, I am suing Big Tech to stop it.
    • Social media has become as central to free speech as town meeting halls, newspapers and television networks were in prior generations. The internet is the new public square. In recent years, however, Big Tech platforms have become increasingly brazen and shameless in censoring and discriminating against ideas, information and people on social media—banning users, deplatforming organizations, and aggressively blocking the free flow of information on which our democracy depends.
    • No longer are Big Tech giants simply removing specific threats of violence. They are manipulating and controlling the political debate itself. Consider content that was censored in the past year. Big Tech companies banned users from their platforms for publishing evidence that showed the coronavirus emerged from a Chinese lab, which even the corporate media now admits may be true. In the middle of a pandemic, Big Tech censored physicians from discussing potential treatments such as hydroxychloroquine, which studies have now shown does work to relieve symptoms of Covid-19. In the weeks before a presidential election, the platforms banned the New York Post—America’s oldest newspaper—for publishing a story critical of Joe Biden’s family, a story the Biden campaign did not even dispute.
    • Perhaps most egregious, in the weeks after the election, Big Tech blocked the social-media accounts of the sitting president. If they can do it to me, they can do it to you—and believe me, they are.
    • In the complaints filed with a federal court in southern Florida, Trump contended:
      • If Defendants’ use of an unconstitutional delegation of authority to regulate free speech under pressure from Congress can effectively censor and impose a prior restraint on the protected political speech of a sitting President of the United States, then the threat to Putative Class Members, our citizens, and our United States Constitution and form of government, is imminent, severe, and irreparable.
      • Plaintiff respectfully asks this Court to declare that Section 230 on its face is an unconstitutional delegation of authority and that the Defendants’ actions directed at Plaintiff and Putative Class Members are a prior restraint on their First Amendment right to free speech, to order the Defendants to restore the [Twitter, Facebook, and YouTube] account[s] of Plaintiff, as well as those deplatformed Putative Class Members, and to prohibit Defendants from exercising censorship, editorial control, or prior restraint in its many forms over the posts of President Trump and Putative Class Members.
  • The Office of the Inspector General (OIG) of the Department of Energy (Energy) has issued an audit regarding “Allegations Related to the Office of Cybersecurity, Energy Security, and Emergency Response” (CESER). The OIG stated:
    • In late 2019, the Office of Inspector General received multiple complaints related to CESER. For the purposes of this inspection, we summarized the details of the complaints into four allegations. Specifically, it was alleged that CESER lacked internal control policies and procedures and a full-time staff to oversee its budget. In addition, the Office of Inspector General received allegations that $7.5 million in CESER funds were allocated to Idaho National Laboratory (INL) to finance a startup company; software licenses purchased at a cost of up to $2.2 million were not used; and $2 million in CESER funds were inappropriately spent to update a General Services Administration (GSA) web portal. We conducted this inspection to determine the facts and circumstances surrounding the allegations related to CESER.
    • Our review substantiated certain allegations related to CESER’s management. In particular, we fully substantiated two of the allegations. Although we did not substantiate the remaining allegations, we did question the use of funds related to CESER’s activities. We determined the following regarding each of the allegations:
    • We substantiated that there was a lack of internal controls established for CESER even though the office received more than $275 million since its inception. Specifically, we found that written internal control policies and procedures were not developed for CESER to help ensure appropriate funds management. Further, a workforce management plan had not been developed, which could have guided the hiring of full-time budget personnel to oversee expenditures. These issues were particularly concerning because CESER’s September 2019 Assurance Memo, which is required by the Federal Managers’ Financial Integrity Act, asserted that CESER internal controls were operating effectively.
    • We substantiated that CESER purchased $2.1 million in cybersecurity data analysis software licenses which were to be used to monitor utility companies. Because some of the licenses purchased were utilized as part of a 1-month pilot project, we were unable to substantiate the portion of the allegation that none of the licenses were used. While the licenses were purchased over the alleged time period, we identified that only a limited number of the licenses were provided to monitor utility companies more than a year after acquiring the software. However, there was a lack of industry interest in using the software, and ultimately the licenses were not used. As such, we determined that CESER had spent $2.1 million more than necessary for unused software.
    • Although we determined that funds were provided to INL, we did not substantiate the allegation that they were used to fund a startup company. Specifically, we determined that $7.5 million in CESER funds were allocated to INL to further develop the Cyber Analytics Tools and Techniques program, which sought to enhance CESER’s capability to analyze publicly accessible energy sector internet protocol addresses and determine if there was communication with malicious or suspect threat actors. However, we did not substantiate that the funds provided to INL were used to finance a startup company. While $4 million of the funds were returned to the Department’s Office of the Chief Financial Officer after a change in management within CESER in February 2020, the project was being reconsidered near the end of our review. However, management indicated that this effort was paused pending completion of our review.
    • We did not substantiate that $2 million was spent on updates to the GSA login.gov web portal. However, we determined that $2 million was allocated for an Interagency Agreement between CESER and GSA to provide consulting and implementation work from the login.gov team of engineers, designers, and product managers to improve user integration with the Cyber Analytics Tools and Techniques program. Despite the use of a portion of the allocated CESER resources, a CESER official stated that the program remained non-operational at the time of our review. Therefore, we questioned the use of more than $128,000 in expenditures by CESER.
    • We made four recommendations in our report designed to improve the management of CESER. Specifically, we recommended that the Acting Assistant Secretary for CESER: (1) develop and implement an internal control program that includes documented policies and procedures related to areas such as contract and financial management, procurement, and staffing; (2) ensure that Federal and Department procurement requirements are followed related to areas such as acquisition of commercial software licenses, contract management, and the use of Interagency Agreements; (3) evaluate and determine whether GSA’s login.gov services should be utilized within CESER and, if not, ensure that funds are returned to CESER; and (4) ensure the Department’s Office of the General Counsel has access to all meetings related to CESER’s procurement process and a concurrence role when program decisions deviate from Federal requirements or Office of the General Counsel’s advice.
  • The United States (U.S.) Consumer Product Safety Commission (CPSC) filed an administrative complaint against Amazon in order “to force Amazon to accept responsibility for recalling potentially hazardous products sold on Amazon.com.” The CPSC is bringing this action because Amazon is selling, through its Fulfilled by Amazon program, children’s sleepwear that fails to meet federal flammability standards, carbon monoxide detectors that do not detect carbon monoxide, and hair dryers that fail to meet industry standards. The CPSC’s complaint follows a report it submitted to Congress showing a dramatic reduction in port inspections of consumer goods entering the U.S. because of the COVID-19 pandemic. In its press release, the CPSC alleged:
    • The complaint charges that the specific products are defective and pose a risk of serious injury or death to consumers and that Amazon is legally responsible to recall them. The named products include 24,000 faulty carbon monoxide detectors that fail to alarm, numerous children’s sleepwear garments that are in violation of the flammable fabric safety standard risking burn injuries to children, and nearly 400,000 hair dryers sold without the required immersion protection devices that protect consumers against shock and electrocution.
    • The Commission voted 3-1 to approve the complaint, which seeks to force Amazon, as a distributor of the products, to stop selling these products, work with CPSC staff on a recall of the products and to directly notify consumers who purchased them about the recall and offer them a full refund. Although Amazon has taken certain action with respect to some of the named products, the complaint charges that those actions are insufficient.
    • In the complaint, the CPSC stated:
      • This administrative enforcement proceeding is instituted pursuant to Sections 15(c) and (d) of the Consumer Product Safety Act (“CPSA”), as amended, 15 U.S.C. §§ 2064(c) and (d), seeking public notification and remedial action to protect the public from the substantial product hazards presented by certain consumer products sold on amazon.com, and distributed by Amazon.com, Inc. through its Fulfilled by Amazon (“FBA”) program. These consumer products are set forth in more detail below.
  • The Cybersecurity and Infrastructure Security Agency (CISA) issued the results of its “Risk and Vulnerability Assessments (RVAs) of Federal Civilian Executive Branch (FCEB), Critical Infrastructure (CI), and State, Local, Tribal, and Territorial (SLTT) stakeholders.” CISA explained the “report analyzes a sample attack path that a cyber threat actor could take to compromise an organization with weaknesses that are representative of those CISA observed in the FY20 RVAs.” CISA continued:
    • The path comprises six successive tactics, or “steps”: Initial Access, Command and Control, Lateral Movement, Privilege Escalation, Collection, and Exfiltration. In addition to this analysis, the report includes the following observations:
      • Most of the successful attacks proved to be methods commonly used by threat actors, e.g., phishing, use of default credentials.
      • The list of tools and techniques used to conduct these common attacks is ever changing.
      • Many of the organizations exhibited the same weaknesses.
    • Methods such as phishing and the use of default credentials were still viable attacks. This shows that the methodologies used to compromise much of our infrastructure have not changed drastically over time. As a result, network defenders must refocus their efforts on deploying the myriad mitigation steps already known to be effective.
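    The six successive tactics in CISA’s sample attack path form a simple ordered chain, which can be sketched as data; the structure and helper below are an illustrative representation, not anything published by CISA (the inline weakness examples for the first step come from the report’s observations):

    ```python
    # CISA's sample attack path: six successive tactics ("steps") a threat
    # actor could chain to compromise an organization.
    ATTACK_PATH = [
        "Initial Access",        # e.g., phishing, default credentials
        "Command and Control",
        "Lateral Movement",
        "Privilege Escalation",
        "Collection",
        "Exfiltration",
    ]

    def next_step(current):
        """Return the tactic that follows `current` in the sample path,
        or None if `current` is the final step (Exfiltration)."""
        i = ATTACK_PATH.index(current)
        return ATTACK_PATH[i + 1] if i + 1 < len(ATTACK_PATH) else None
    ```

    For instance, `next_step("Initial Access")` yields `"Command and Control"`, mirroring the order in which the report presents the steps.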

Further Reading

  • Google App-Store Antitrust Suit Is Right Target, Wrong Idea” By Tae Kim — Bloomberg. Opening an antitrust action against a company whose service offers better choice and flexibility than its main rival would be a bizarre move. But that is exactly what a group of 36 state attorneys general did when they sued Alphabet Inc.’s Google on Wednesday. The announcement follows a series of major governmental antitrust actions against the internet giant. Last year, the U.S. Department of Justice and dozens of states filed complaints against Google over its dominant search engine business, while another group of states targeted the company for its ad-technology services. While the prior suits had legitimacy, this latest one seems like overreach.
  • Inside the Industry That Unmasks People at Scale” By Joseph Cox — Vice. Tech companies have repeatedly reassured the public that trackers used to follow smartphone users through apps are anonymous or at least pseudonymous, not directly identifying the person using the phone. But what they don’t mention is that an entire overlooked industry exists to purposefully and explicitly shatter that anonymity. They do this by linking mobile advertising IDs (MAIDs) collected by apps to a person’s full name, physical address, and other personal identifiable information (PII). Motherboard confirmed this by posing as a potential customer to a company that offers linking MAIDs to PII.
  • How the Postal Service Can Help Local Retailers Beat Amazon” By Eric Cortellessa — Washington Monthly. In June of 2018, CVS announced a first-of-its-kind deal: a partnership with the U.S. Postal Service to deliver prescription medications and over-the-counter products, such as toilet paper and facewash, to customers within one to two days for a modest delivery fee. The arrangement was driven by the obvious need to help the drugstore giant compete with an even bigger corporate behemoth that had all but taken over e-commerce—Amazon. Indeed, the move came as Amazon was preparing to roll out a prescription drug offering and ramp up its sale of medical supplies, according to The Wall Street Journal. CVS soon launched the pilot from its nearly 10,000 pharmacy locations throughout the United States.
  • As Cubans protest, government cracks down on internet access and messaging apps” By Kevin Collier — NBC News. As protests grip Cuba, the country’s government has taken steps to block citizens’ use of the encrypted chat apps WhatsApp, Signal and Telegram, researchers say. The entire country went offline for more than 30 minutes on Sunday, according to researchers who study internet censorship. Since then, virtual private networks, which are tools used to reroute internet traffic that can circumvent some internet censorship, and popular communication apps in Cuba have been blocked.
  • Concern trolls and power grabs: Inside Big Tech’s angry, geeky, often petty war for your privacy” By Issie Lapowsky — Protocol. James Rosewell could see his company’s future was in jeopardy. It was January 2020, and Google had just announced key details of its plan to increase privacy in its Chrome browser by getting rid of third-party cookies and essentially breaking the tools that businesses use to track people across the web. That includes businesses like 51Degrees, the U.K.-based data analytics company Rosewell has been running for the last 12 years, which uses real-time data to help businesses track their websites’ performance.
  • Exclusive extract: how Facebook’s engineers spied on women” By Sheera Frenkel and Cecilia Kang — The Telegraph. It was late at night, hours after his colleagues at Menlo Park had left the office, when the Facebook engineer felt pulled back to his laptop. He had enjoyed a few beers. Part of the reason, he thought, that his resolve was crumbling. He knew that with just a few taps at his keyboard, he could access the Facebook profile of a woman he had gone on a date with a few days ago. The date had gone well, in his opinion, but she had stopped answering his messages 24 hours after they parted ways. All he wanted to do was peek at her Facebook page to satisfy his curiosity, to see if maybe she had gotten sick, gone on vacation, or lost her dog – anything that would explain why she was not interested in a second date.
  • Amazon asked Apple to remove an app that spots fake reviews, and Apple agreed” By Annie Palmer — CNBC. Apple has removed Fakespot, a well-known app for detecting fake product reviews, from its App Store after Amazon complained the app provided misleading information and potential security risks. Fakespot’s app works by analyzing the credibility of an Amazon listing’s reviews and gives it a grade of A through F. It then provides shoppers with recommendations for products with high customer satisfaction.
  • “Humanoid Robot Keeps Getting Fired From His Jobs” By Miho Inada — The Wall Street Journal. Having a robot read scripture to mourners seemed like a cost-effective idea to the people at Nissei Eco Co., a plastics manufacturer with a sideline in the funeral business. The company hired child-sized robot Pepper, clothed it in the vestments of Buddhist clergy and programmed it to chant several sutras, or Buddhist scriptures, depending on the sect of the deceased. Alas, the robot, made by SoftBank Group Corp., kept breaking down during practice runs. “What if it refused to operate in the middle of a ceremony?” said funeral-business manager Osamu Funaki. “It would be such a disaster.”
  • “In 2030, You Won’t Own Any Gadgets” By Victoria Song — Gizmodo. Owning things used to be simple. You went to the store. You paid money for something, whether it be a TV, clothes, books, toys, or electronics. You took your item home, and once you paid it off, that thing belonged to you. It was yours. You could do whatever you wanted with it. That’s not how it is today, and by 2030, technology will have advanced to the point that even the idea of owning objects might be obsolete.
  • “Outrage As A Business Model: How Ben Shapiro Is Using Facebook To Build An Empire” By Miles Parks — NPR. In 2021, Ben Shapiro rules Facebook. The conservative podcast host and author’s personal Facebook page has more followers than The Washington Post, and he drives an engagement machine unparalleled by anything else on the world’s biggest social networking site. An NPR analysis of social media data found that over the past year, stories published by the site Shapiro founded, The Daily Wire, received more likes, shares and comments on Facebook than any other news publisher by a wide margin.
  • “The struggle to make health apps truly private” By Sara Morrison — recode. Jonathan J.K. Stoltman already knew how hard it can be for people with addiction to find the right treatment. As director of the Opioid Policy Institute, he also knew how much worse the pandemic made it: A family member had died of an opioid overdose last November after what Stoltman describes as an “enormous effort” to find them care. So Stoltman was hopeful that technology could improve patient access to treatment programs through things like addiction treatment and recovery apps.
  • “Return Scams Jump as Fraudsters Exploit E-commerce Boom” By Suzanne Kapner — The Wall Street Journal. Retailers say they are seeing a sharp increase in a type of return fraud in which consumers claim they never received their online orders even though they did. The practice, known as “item not received” fraud, took off during the pandemic, when warehouses were backed up and carriers were overwhelmed by a surge in e-commerce orders. In some cases, consumers are hiring professional fraudsters, who market their services on social media and advertise refunds of as much as $20,000 at chains such as Amazon.com Inc., Walmart Inc. and Target Corp.

Coming Events

  • 27 July
    • The Federal Trade Commission (FTC) will hold PrivacyCon 2021. The FTC has announced this agenda:
      • Introduction: Jamie Hine, Senior Attorney, Federal Trade Commission, Division of Privacy & Identity Protection
      • Welcome to PrivacyCon: Rebecca Kelly Slaughter, Commissioner, Federal Trade Commission
      • Opening Remarks: Erie Meyer, Chief Technologist, Federal Trade Commission
      • Panel 1: Algorithms
        • Basileal Imana, University of Southern California, Auditing for Discrimination in Algorithms Delivering Job Ads
        • Hongyan Chang, National University of Singapore, On the Privacy Risks of Algorithm Fairness
        • Martin Strobel, National University of Singapore, On the Privacy Risks of Model Explanations
        • Moderator: Devin Willis, Attorney, Federal Trade Commission, Division of Privacy & Identity Protection
      • Algorithms Presentation
        • Ziad Obermeyer, University of California at Berkeley, Algorithmic Bias Playbook Presentation
        • Moderator: Lerone Banks, Technologist, Federal Trade Commission, Division of Privacy & Identity Protection
      • Panel 2: Privacy – Considerations and Understanding
        • Nico Ebert, Zurich University of Applied Sciences, Bolder is Better: Raising User Awareness Through Salient and Concise Privacy Notices
        • Siddhant Arora, Carnegie Mellon University, Finding a Choice in a Haystack: Automatic Extraction of Opt-Out Statements from Privacy Policy Text
        • Cameron Kormylo, Virginia Tech, Reconsidering Privacy Choices: The Impact of Defaults, Reversibility, and Repetition
        • Peter Mayer, Karlsruhe Institute of Technology, Now I’m a bit angry – Individuals’ Awareness, Perception, and Responses to Data Breaches that Affected Them
        • Moderator: Danielle Estrada, Attorney, Federal Trade Commission, Division of Privacy & Identity Protection
      • Panel 3: AdTech
        • Imane Fouad, Inria (France), Missed by Filter Lists: Detecting Unknown Third-Party Trackers with Invisible Pixels
        • Janus Varmarken, University of California Irvine, The TV is Smart and Full of Trackers: Measuring Smart TV Advertising and Tracking
        • Miranda Wei, University of Washington, What Twitter Knows: Characterizing Ad Targeting Practices, User Perceptions, and Ad Explanations Through Users’ Own Twitter Data
        • Moderator: Miles Plant, Attorney, Federal Trade Commission, Division of Privacy & Identity Protection
      • Panel 4: IoT
        • Anupam Das, North Carolina State University, Hey Alexa, is this Skill Safe: Taking a Closer Look at the Alexa Skill Ecosystem
        • Jeffrey Young, Clemson University, Measuring the Policy Compliance of Voice Assistant Applications
        • Pardis Emami-Naeni, University of Washington, Which Privacy and Security Attributes Most Impact Consumers’ Risk Perception and Willingness to Purchase IoT Devices?
        • Genevieve Liberte, Florida International University, Real-time Analysis of Privacy (un)Aware IoT Applications
        • Moderator: Linda Holleran Kopp, Attorney, Federal Trade Commission, Division of Privacy & Identity Protection
      • Panel 5: Privacy – Children and Teens
        • Mohammad Mannan, Concordia University (Canada), Betrayed by the Guardian – Security and Privacy Risks of Parental Control Solutions and Parental Controls: Safer Internet Solutions or New Pitfalls?
        • Cameryn Gonnella, BBB National Programs, Risky Business – The Current State of Teen Privacy in the Android App Marketplace
        • Moderator: Manmeet Dhindsa, Attorney, Federal Trade Commission, Division of Privacy & Identity Protection
      • Panel 6: Privacy and the Pandemic
        • Marzieh Bitaab, Arizona State University, Scam Pandemic: How Attackers Exploit Public Fear through Phishing
        • Christine Geeng, University of Washington, Social Media COVID-19 Misinformation Interventions Viewed Positively, But Have Limited Impact
        • Moderator: Christina Yeung, Technologist, Federal Trade Commission, Office of Technology Research and Investigation
      • Closing Remarks
        • Lerone Banks, Technologist, Federal Trade Commission, Division of Privacy & Identity Protection
    • The House Oversight and Reform Committee’s National Security Subcommittee will hold a hearing titled “Defending the U.S. Electric Grid Against Cyber Threats.”
    • The Senate Banking, Housing, and Urban Affairs Committee will hold a hearing titled “Cryptocurrencies: What are they good for?”
    • The Senate Homeland Security and Governmental Affairs Committee will hold a hearing titled “Resources and Authorities Needed to Protect and Secure the Homeland” with Secretary of Homeland Security Alejandro Mayorkas.
    • The Senate Judiciary Committee will hold a hearing titled “America Under Cyber Siege: Preventing and Responding to Ransomware Attacks.”
    • The Senate Commerce, Science, and Transportation Committee will hold a hearing titled “Pipeline Cybersecurity: Protecting Critical Infrastructure.”
  • 28 July
    • The House Energy and Commerce Committee’s Consumer Protection and Commerce Subcommittee will hold a hearing titled “Transforming the FTC: Legislation to Modernize Consumer Protection” with the five FTC Commissioners.
    • The House Oversight and Reform Committee’s Government Operations Subcommittee will hold a hearing titled “FITARA 12.0” to review the federal government’s Federal Information Technology Acquisition Reform Act (FITARA) compliance.
    • The House Administration Committee will hold a hearing titled “Election Subversion: A Growing Threat to Electoral Integrity.”
    • The House Armed Services Committee’s Cyber, Innovative Technologies, and Information Systems Subcommittee will mark up its portion of the committee’s FY 2022 National Defense Authorization Act (H.R.4395).
  • 5 August
    • The Federal Communications Commission (FCC) will hold its monthly open meeting with this tentative agenda:
      • Establishing Two New Innovation Zones. The Commission will consider a Public Notice that would create two new Innovation Zones for Program Experimental Licenses and the expansion of an existing Innovation Zone. (ET Docket No. 19-257)
      • Numbering Policies for Modern Communications. The Commission will consider a Further Notice of Proposed Rulemaking to update the Commission’s rules regarding direct access to numbers by interconnected Voice over Internet Protocol providers to safeguard the nation’s finite numbering resources, curb illegal robocalls, protect national security, and further promote public safety. (WC Docket Nos. 13-97, 07-243, 20-67; IB Docket No. 16-155)
      • Appeals of the STIR/SHAKEN Governance Authority Token Revocation Decisions. The Commission will consider a Report and Order that would establish a process for the Commission to review decisions of the private STIR/SHAKEN Governance Authority that would have the effect of placing voice service providers out of compliance with the Commission’s STIR/SHAKEN implementation rules. (WC Docket Nos. 17-97, 21-291)
      • Modernizing Telecommunications Relay Service (TRS) Compensation. The Commission will consider a Notice of Proposed Rulemaking on TRS Fund compensation methodology for IP Relay service. (CG Docket No. 03-123; RM-11820)
      • Updating Outmoded Political Programming and Record-Keeping Rules. The Commission will consider a Notice of Proposed Rulemaking to update outmoded political programming rules. (MB Docket No. 21-293)
      • Review of the Commission’s Part 95 Personal Radio Services Rules. The Commission will consider a Memorandum Opinion and Order on Reconsideration that would grant three petitions for reconsideration of the Commission’s May 2017 Part 95 Personal Radio Services Rules Report and Order. (WT Docket No. 10-119)
  • 1 September
    • The House Armed Services Committee will mark up the FY 2022 National Defense Authorization Act (H.R.4395).

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

