A Privacy Bill A Week: Online Privacy Act of 2019

Last week, we dove into the last White House proposal on privacy, the discussion draft of the “Consumer Privacy Bill of Rights Act of 2015” released by the Obama Administration. This bill was released in conjunction with a report on privacy issues and then proceeded to go nowhere, as there was scant appetite on Capitol Hill to legislate on privacy. Let us flash forward to the present, where privacy has moved to the fore and the first of the long-anticipated privacy bills has been released.

Representatives Anna Eshoo (D-CA) and Zoe Lofgren (D-CA) unveiled the “Online Privacy Act of 2019” (H.R. 4978), which they started working on earlier this year when it seemed clear that the House Energy and Commerce Committee’s effort to craft a bill had stalled, as Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky’s (D-IL) timeline for when a bill might be unveiled repeatedly slipped. It must be said that this bill is going to be a non-starter with Republicans in the Senate and White House, not least because it gives consumers a private right of action, creates a new federal agency to police data security, and does not preempt state statutes. Moreover, given Eshoo’s close political relationship with Speaker Nancy Pelosi (D-CA), this bill may be viewed in two contexts: 1) Pelosi may approve of the substance of the bill; and 2) in not trying to dissuade Eshoo and Lofgren, the Speaker may have intended to prod House Energy and Commerce to produce a bill. Finally, it bears noting that Eshoo challenged current House Energy and Commerce Chair Frank Pallone Jr. when former Representative Henry Waxman (D-CA) stepped down as the top Democrat, and relations between the two reportedly remain affected by the bruising contest. In any event, it is a comprehensive bill that takes a number of new approaches to some aspects of privacy and data security, and Eshoo and Lofgren also released a one-page summary and a section-by-section summary.

Big picture, the bill would create a new agency to oversee a new privacy and security regime, the United States Digital Privacy Agency (DPA), meaning that, unlike under virtually every other privacy bill, the Federal Trade Commission (FTC) would not be the primary enforcer. However, there may still be a role for the FTC to play, as discussed below. The bill unites privacy with data security, which has been a policy preference of a number of high-profile Democrats, including Schakowsky and Senate Commerce, Science, and Transportation Committee Ranking Member Maria Cantwell (D-WA); Republicans, however, have been lukewarm on this notion. Moreover, express, affirmative consent would generally be needed before most businesses could collect, process, maintain, or disclose a person’s personal information, subject to a number of exceptions. Businesses would need to state clearly and concisely their privacy and security policies, be responsive to people exercising their rights vis-à-vis their data, and closely supervise the service providers and third parties to whom personal information is disclosed.

As always, it is crucial to digest the key definitions, for these will inform the scope of the Act. The entities covered by the “Online Privacy Act of 2019” are a broad group spanning most businesses in the U.S.: “a person who…intentionally collects, processes, or…maintains personal information; and…sends or receives such personal information over the internet or a similar communications network.” There are two crucial exemptions: 1) people not engaged in commercial activities and those engaging in commercial activities considered “de minimis;” and 2) small businesses, which are defined as entities not selling personal information, earning less than 50% of revenue from processing personal information for targeted or behavioral advertising, not having held the personal data of 250,000 or more people in the last six months, having 200 or fewer employees, and earning $25 million or less in gross revenue in the preceding year. If a small business’s status changes and it crosses the threshold into being a covered entity, then there is a nine-month grace period before it must begin complying with the Act.
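
To make these thresholds concrete, here is a minimal sketch of the small business test, assuming a simple business profile; the field names and the qualifies_as_small_business helper are illustrative, not terms drawn from the bill:

    # Hypothetical sketch of the bill's small business exemption thresholds.
    from dataclasses import dataclass

    @dataclass
    class BusinessProfile:
        sells_personal_information: bool
        ad_processing_revenue_share: float   # fraction of revenue from targeted/behavioral ad processing
        people_held_last_six_months: int     # people whose personal data was held in the last six months
        employees: int
        gross_revenue_prior_year: int        # dollars

    def qualifies_as_small_business(b: BusinessProfile) -> bool:
        # All five conditions must hold for the exemption to apply;
        # failing any one makes the entity a covered entity (after the grace period).
        return (
            not b.sells_personal_information
            and b.ad_processing_revenue_share < 0.50
            and b.people_held_last_six_months < 250_000
            and b.employees <= 200
            and b.gross_revenue_prior_year <= 25_000_000
        )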

Besides covered entities, two other classes of entities figure prominently in the bill: “service providers” and “third parties.” A “service provider” is a “covered entity” that generally “processes, discloses, or maintains personal information, where such person does not process, disclose, or maintain the personal information other than in accordance with the directions and on behalf of another covered entity.” A “third party” is “a person…to whom such covered entity disclosed personal information; and…is not…such covered entity…a subsidiary or corporate affiliate of such covered entity…or…a service provider of such covered entity.” Consequently, almost all disclosures of personal information made by a covered entity would likely be to either a service provider or a third party, the latter of which can be a covered entity itself.

The bill defines “data breach” in fairly standard terms as “unauthorized access to or acquisition of personal information or contents of communications maintained by such covered entity.” This term has evolved over the last decade to include mere access as opposed to exfiltration or acquisition. The Act coins a new term to cover some possible privacy violations: “data sharing abuse.” This means “processing, by a third party, of personal information or contents of communications disclosed by a covered entity to the third party, for any purpose other than—

  • a purpose specified by the covered entity to the third party at the time of disclosure; or
  • a purpose to which the individual to whom the information relates has consented.”

Personal information is simply and very comprehensively defined as “any information maintained by a covered entity that is linked or reasonably linkable to a specific individual or a specific device, including de-identified personal information and the means to behavioral personalization created for or linked to a specific individual.” Moreover, “personal information” does not include “publicly available information related to an individual” or “information derived or inferred from personal information, if the derived or inferred information is not linked or reasonably linkable to a specific individual.” Under this definition, is there an inadvertent loophole created whereby information not maintained by a covered entity is not personal information for purposes of this Act, and therefore, such information would be beyond many of the requirements of the Act?

The bill uses the definition of “contents” of a communication from the “Electronic Communications Privacy Act” (ECPA) (P.L. 99-508) that is used for wiretapping and electronic surveillance, among other purposes, which shows the intent to allow the legal structure for government surveillance to coexist without friction alongside the new privacy regime. However, most metadata, which includes the call detail records currently being debated regarding reauthorization of National Security Agency authority, would be covered at a lesser level by this Act, meaning private sector entities could collect, process, maintain, and disclose metadata.

De-identified information is generally data “that cannot reasonably identify, relate to, describe, reference, be capable of being associated with, or be linked, directly or indirectly, to a particular individual or device.” However, this definition stipulates further conditions that must be met: “provided that a business that uses de-identified information—

  • has de-identified the personal information using best practices for the types of data the information contains;
  • has implemented technical safeguards that prohibit re-identification of the individual with whom the information was linked;
  • has implemented business processes that specifically prohibit re-identification of the information;
  • has implemented business processes to prevent inadvertent release of de-identified information; and
  • makes no attempt to re-identify the information.”

This language is going in the right direction, for de-identification of personal information will likely need to be permanent or as close to permanent as possible to forestall the temptation some entities will invariably face to re-identify old personal information and derive value from it.
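
Read as a compliance test, the five quoted conditions are cumulative, as this minimal sketch illustrates; the condition strings paraphrase the statutory text and are not defined terms:

    # The five cumulative conditions for data to count as de-identified;
    # failing any one of them means the data remain "personal information."
    DEIDENTIFICATION_CONDITIONS = frozenset({
        "de-identified using best practices for the data type",
        "technical safeguards prohibit re-identification",
        "business processes specifically prohibit re-identification",
        "business processes prevent inadvertent release",
        "no attempt made to re-identify",
    })

    def counts_as_deidentified(conditions_met: set) -> bool:
        return DEIDENTIFICATION_CONDITIONS.issubset(conditions_met)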

The “Online Privacy Act of 2019” spells out what constitutes a “privacy harm” and a “significant privacy harm,” two key definitions in helping covered entities gauge the sensitivity of certain information and their legal obligations in handling such information. “Privacy harm” is “adverse consequences or potential adverse consequences to an individual or society arising from the collection, processing, maintenance, or disclosure of personal information.” Such harms are identified in the definition and worth quoting in full:

  • direct or indirect financial loss or economic harm;
  • physical harm;
  • psychological harm, including anxiety, embarrassment, fear, and other demonstrable mental trauma;
  • adverse outcomes or decisions with respect to the eligibility of an individual for rights, benefits, or privileges in employment (including hiring, firing, promotion, demotion, and compensation), credit and insurance (including denial of an application or obtaining less favorable terms), housing, education, professional certification, or the provision of health care and related services;
  • stigmatization or reputational harm;
  • price discrimination;
  • other adverse consequences that affect the private life of an individual, including private family matters and actions and communications within the home of such individual or a similar physical, online, or digital location where such individual has a reasonable expectation that personal information will not be collected, processed, or retained;
  • chilling of free expression or action of an individual, group of individuals, or society generally, due to perceived or actual pervasive and excessive collection, processing, disclosure, or maintenance of personal information by a covered entity;
  • impairing the autonomy of an individual, group of individuals, or society generally; and
  • other adverse consequences or potential adverse consequences, consistent with the provisions of this Act, as determined by the Director.

This list of privacy harms is as expansive as, and perhaps even more expansive than, that of almost any other bill analyzed. Additionally, the list is not exhaustive, as the DPA may add other harms.

A related, crucial definition is that of “significant privacy harm” which is “adverse consequences to an individual arising from the collection, processing, maintenance, or disclosure of personal information, limited” to three specific privacy harms:

  • direct or indirect financial loss or economic harm;
  • physical harm; and
  • adverse outcomes or decisions with respect to the eligibility of an individual for rights, benefits, or privileges in employment (including hiring, firing, promotion, demotion, and compensation), credit and insurance (including denial of an application or obtaining less favorable terms), housing, education, professional certification, or the provision of health care and related services.”

A term related to these two is “protected class,” which is “the actual or perceived race, color, ethnicity, national origin, religion, sex (including sexual orientation and gender identity), familial status, or disability of an individual or group of individuals.”

The Act intends to create a safe harbor from many of the notice and consent requirements that will encourage greater use of encryption or similar methods that would make personal information and the contents of communications very hard to access. To this end, the bill defines the term “privacy preserving computing” as “the collecting, processing, disclosing, or maintaining of personal information that has been encrypted or otherwise rendered unintelligible using a means that cannot be reversed by a covered entity, or a covered entity’s service provider,” subject to further requirements. Additionally, the DPA “may determine that a methodology of privacy preserving computing is insufficient for the purposes of this definition,” so covered entities and service providers would not be free to deem any security measure “privacy preserving computing.”

The Act also makes clear that a covered entity’s sharing of personal information with a third party for any sort of remuneration will be a sale or selling, and hence the definition is “the disclosure of personal information for monetary consideration by a covered entity to a third party for the purposes of processing, maintaining or disclosing such personal information at the third party’s discretion.”

Now, let’s turn to the substance of the Act. The bill makes clear that no one may waive its requirements, and any contracts or instruments to do so are null and void. Additionally, no one may agree to any pre-dispute arbitration under this bill, meaning that no person will be forced to accept mandatory arbitration as is often the case when one agrees to the terms of service for an application or to use a device.

The “Online Privacy Act of 2019” would take effect one year after enactment, but there is curious language making clear the effective date does not “affect[] the authority to take an action expressly required by a provision of this Act to be taken before the effective date.”

The Act carves out journalism from the privacy and security requirements of the bill to the extent an organization like The New York Times is engaged in bona fide journalism as opposed to other commercial activities such as selling photographs, the latter of which may qualify such an entity as a covered entity. There is a definition of journalism, and one wonders if companies like Facebook or Google will try to get some of their activities exempted on the basis that they qualify as journalism.

The Act adds a section to the federal criminal code on extortion and threats titled “Disclosure of personal information with the intent to cause harm,” which makes a criminal offense of the actual or attempted disclosure of personal information to threaten, intimidate, or harass another person in order to commit or incite an act of violence. It is also a criminal offense to do so if a person is placed in reasonable fear of death or serious bodily injury. Violators would face fines and prison sentences of up to five years in addition to any state liability. This would seem to cover a number of crimes ushered in by the digital age: doxing, revenge porn, and making public a person’s private information in order to intimidate.

Title I of the “Online Privacy Act of 2019” would provide individuals with a number of rights regarding their personal information and how it may and may not be used by covered entities. First, people would receive a right of access, which entails each covered entity making available a reasonable mechanism by which a person may find out the categories of personal information and contents of communications being held and those obtained from third parties. Moreover, this information must also identify all the third parties, subsidiaries, and affiliates to whom personal information has been disclosed. Also, individuals must be able to easily access a clear and concise description of the commercial and business purposes for which the covered entity collects, maintains, processes, and discloses personal information. Finally, covered entities must provide a list of all automated decision-making processes they employ and those for which a person may ask that a human being decide instead. Covered entities may sidestep a number of these requirements by making this information publicly available on their websites in a conspicuous location, obviating the need for people to make requests.

Individuals would also get a right of correction allowing for the use of a reasonable mechanism to dispute the accuracy and completeness of personal information being held by a covered entity, but only if this personal information is processed in such a fashion as to “increase reasonably foreseeable significant privacy harms.” This language suggests that data processing that would result in mere privacy harms, say those that would impact one’s personal, familial communications, would not need to be corrected or completed. In any event, covered entities have the option to correct or complete as requested, tell the requester the information is already complete or correct, respond that insufficient information does not allow for the correction or completion, or deny the request on the basis of the exemptions discussed below. Small businesses are exempted from this responsibility. Of course, what ultimately is determined to be a significant privacy harm will be the result of case-by-case adjudication by the new DPA, likely in court.

People could ask that their personal information be deleted, including those data acquired from third parties or inferred by the covered entity. Again, on the basis of Section 109 exemptions, this request could be denied.

Individuals will receive a right of portability, and in order to effectuate this right, the DPA must annually publish in the Federal Register categories of online services and products that are determined to be portable. However, before a final list is published, the DPA must release an initial list of portable services and products and accept comments. Once it has been established which services and products are portable, then covered entities must allow individuals to request and receive their personal information and/or contents of communications for purposes of taking their business to a competitor. There is also language contemplating that a person may ask one covered entity to transmit this information directly to another.

Upon request, covered entities must have humans make decisions instead of an “automated processing of personal information of an individual, if such processing increases reasonably foreseeable significant privacy harms for such individual.”

Before a covered entity may engage in behavior personalization, it must obtain express, affirmative consent from a person to collect, process, maintain, or disclose personal information for this purpose. Behavior personalization is a term defined in the Act and “means the processing of an individual’s personal information, using an algorithm, model, or other means built using that individual’s personal information collected over a period of time, or an aggregate of the personal information of one or more similarly situated individuals and designed to—

  • alter, influence, guide, or predict an individual’s behavior;
  • tailor or personalize a product or service; or
  • filter, sort, limit, promote, display or otherwise differentiate between specific content or categories of content that would otherwise be accessible to the individual.”

This right seems squarely aimed at the use of people’s data to show them advertising based on their browsing history, searches, location, occupation, and the huge volumes of other data collected daily. Moreover, if a person denies such consent, then the product or service must be provided without the behavior personalization unless this is infeasible, at which point only the core service or product need be provided. And, if it is infeasible to provide core services or products, then a covered entity may deny the product or service altogether. It is likely covered entities will seek to define “infeasible” as broadly as possible in order to leverage consent for their products and services so that they may continue the lucrative practice of personalized advertising.

A person would also get the right to be informed, which entails that any covered entity that begins collecting personal information on a person with whom it has no direct relationship must inform that person in writing within 30 days.

There would be established a right to impermanence that would limit the holding of a person’s personal information to no more time than she consented to. Covered entities must obtain affirmative, express consent from people to maintain categories of personal information either until the original purpose for collection is completed or until a certain date. And yet, there is an exemption for implied consent when long-term maintenance of personal information is an obvious, core feature of a product or service and these data are maintained only to provide the product or service.

As mentioned, Section 109 details the exemptions that may allow a covered entity to disregard the rights bestowed on people under the “Online Privacy Act of 2019,” which include:

  • Detecting, responding to, or preventing security incidents or threats.
  • Protecting against malicious, deceptive, fraudulent, or illegal activity.
  • Complying with specific law enforcement requests or court orders.
  • Protecting a legally recognized privilege or other legal right.
  • Protecting public safety.
  • Collection, processing, or maintenance by an employer pursuant to an employer-employee relationship of records about employees or employment status, except—
    • where the information would not be reasonably expected to be collected in the context of an employee’s regular duties; or
    • was disclosed to the employer by a third party.
  • Preventing prospective abuses of a service by an individual whose account has been previously terminated.
  • Routing a communication through a communications network or resolving the location of a host or client on a communications network.
  • Providing transparency in advertising or origination of user generated content.

However, the covered entity will need to have “technical safeguards and business processes that limit the collection, processing, maintaining, or disclosure of such personal information” to the aforementioned purposes.

This section also details the reasons why a covered entity may decline a request made pursuant to one of the rights listed in Title I:

  • A requester’s identity cannot be confirmed.
  • The request would create a legitimate risk to the privacy, security, safety, or other rights of another person.
  • The request would create a legitimate risk to free expression.
  • In regard to requests to complete or delete information, if doing so would stop a transaction or process set into motion but not completed per a person’s request, or if such a request would undermine the integrity of a legally significant transaction.

Service providers are altogether exempted from Title I, and covered entities employing privacy preserving computing are exempted from certain rights of people: the right of access, the right to human review of automated decisions, and the right to individual autonomy. However, this exemption applies only to the data processing performed with privacy preserving computing.

Covered entities must reply to requests within 30 days and may not normally charge a fee for fulfilling them; if requests are determined to be excessive or unfounded, however, a covered entity may charge a fee subject to DPA approval.

Title II of the “Online Privacy Act of 2019” details the requirements placed on covered entities, service providers, and third parties.

Covered entities must have a reasonable, articulable basis for collecting, processing, maintaining, and disclosing personal information related to the reasonable business needs of the entity. Additionally, the covered entity must keep no more personal information than is necessary to effectuate the business or commercial purpose, and these needs are to be balanced against privacy intrusions, possible privacy harms, and the reasonable expectations of the people whose information is in question. Additionally, covered entities should not collect more personal information than is necessary to carry out their business purposes, nor should they hold these data longer than necessary. However, covered entities may engage in “ancillary” collection, processing, maintenance, and disclosure of personal information in certain circumstances subject to certain requirements. For example, if these activities are substantially similar to the original ones, the same type of personal information is being collected, and no privacy harms would result, then notice and consent are not required. However, notice is required for ancillary activities if:

  • The ancillary activities the covered entity is engaged in are similar to the original activities and there is a privacy harm risk;
  • The ancillary activities are not substantially similar and there is no risk of privacy harms; or
  • The activities are substantially similar and would result in privacy harm, but privacy preserving computing is used.

Consequently, notice and consent would be required for any other ancillary activities that do not fall into those categories.
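
Compressed into decision logic, the tiers described above would look roughly like the following sketch; the boolean inputs paraphrase the bullets, and note that under this reading the third bullet’s privacy preserving computing condition collapses into the first notice-only case:

    # A rough sketch of the ancillary-use tiers described above;
    # function and parameter names are illustrative, not the bill's terms.
    def ancillary_requirements(substantially_similar: bool,
                               same_type_of_data: bool,
                               privacy_harm_risk: bool) -> str:
        # Carve-out: substantially similar activity, same type of personal
        # information, and no resulting privacy harms.
        if substantially_similar and same_type_of_data and not privacy_harm_risk:
            return "neither notice nor consent"
        # Notice-only cases per the bullets above (first and third bullets
        # both reduce to similar activity plus a privacy harm risk).
        if substantially_similar and privacy_harm_risk:
            return "notice only"
        if not substantially_similar and not privacy_harm_risk:
            return "notice only"
        # Everything else falls back to full notice and consent.
        return "notice and consent"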

Covered entities would also need to limit the access of employees and contractors to personal information and the contents of communications on the basis of an articulable rationale that balances reasonable business needs, the potential for privacy harm, and the reasonable expectations of individuals. Moreover, covered entities must maintain records on all access.

There is a requirement that covered entities cannot collect any personal information unless they are in compliance with the Act. However, this requirement does not cover processing or maintaining personal information.

The disclosure of personal information by covered entities to third parties is limited to situations in which a person consents. And any such consent is only valid after a person has been notified of all the categories of third parties to which the personal information may be disclosed, the personal information to be shared, and the business purposes for doing so. Sales of personal information would be more severely constrained: each sale to a third party by a covered entity must be agreed to by the person. What’s more, covered entities must disclose the parameters of the original purpose for the collection of the information when selling it to a third party. Regarding privacy preserving computing and de-identified personal information, disclosure does not require consent for either designation, but consent is always required for sales of personal information.

There are provisions designed to sweep into U.S. jurisdiction players in the data ecosystem that are outside the country. The bill bars covered entities from disclosing personal information to entities not subject to U.S. jurisdiction or not in compliance with the Act. However, a safe harbor is created under which covered entities and non-U.S. entities could do business, largely premised on the latter being willing to comply with the Act, having the cash available to pay fines for violations, and evincing a willingness to be subject to DPA enforcement. The non-U.S. entity also needs to sign an agreement with the DPA. This section, however, makes clear it is not seeking to create a data localization requirement in the U.S. or to restrict a covered entity’s internal disclosures, so that Microsoft, say, could continue shuttling personal data around the globe to its servers without running afoul of this section.

Covered entities are barred from re-identifying de-identified information unless allowed by one of the Section 109 exemptions, and this prohibition attaches to third parties that may have the de-identified information. However, “qualified research entities” are not covered by this restriction, and it would be up to the DPA to determine who may be considered one.

A covered entity’s ability to collect, process, maintain, or disclose the contents of communications would be limited to situations in which there is a security incident or threat, the processing is expressly requested by one of the parties to the communication, or other specified purposes apply. There is an exception for publicly available communications, and covered entities cannot stop people using their services or products from encrypting their communications. There is a safe harbor for service providers acting at the direction of a covered entity with a reasonable belief that the directions comply with the Act.

Covered entities could not process personal information in a way that impinges on a person’s opportunities, on the basis of a protected class, in education, employment, housing, credit, healthcare, finance, and a range of other areas. The same is true of public accommodations. Moreover, the DPA is required to promulgate regulations to effectuate this section.

The use of genetic information would be very severely limited; these types of data would, more or less, only be available for medical testing, and even then subject to restrictions.

The DPA will establish a minimum percentage of people who must read and understand a notice for purposes of consent or a privacy policy, a threshold covered entities would need to meet or exceed before their notices or privacy policies would be allowed to be used. The DPA will establish a procedure to vet the data submitted by covered entities to show compliance with this requirement. Moreover, the DPA will make available the notices and privacy policies of all covered entities. All covered entities must make available reasonable mechanisms for people to revoke consent. And, not surprisingly, deceptive notices and privacy policies are barred.

Pursuant to these DPA-approved notices, covered entities must provide clear and concise notice of the personal information being collected, maintained, processed, or disclosed. Additionally, covered entities may not collect, process, maintain, or disclose personal information without consent if doing so creates or increases the risk of foreseeable privacy harms. However, consent will be implied if the personal information activities of an entity are obvious on their face and notice is provided. Privacy preserving computing, however, would be exempt from the notice and consent requirements.

Covered entities shall, of course, have privacy policies regarding their personal information activities, including a general description of their practices, an explanation as to how individuals may exercise their Title I rights, the categories of personal information collected, the business or commercial purposes for which such data will be used, and other required items.

Information security would be a part of the new regime covered entities must comply with. Consequently, covered entities must design and establish an information security system to protect personal information based on the sensitivity of the data and the types of activities in which the covered entity is engaged. The information security system must include

  • A written security policy;
  • A means of identifying, assessing, and mitigating security vulnerabilities;
  • A process for disposing of personal information securely;
  • A process for overseeing those with access to personal information; and
  • A plan or protocol to respond to data breaches or data sharing abuses.

In the event a data breach or data sharing abuse occurs, the covered entity must report it to the DPA within 72 hours of discovery unless the event is unlikely to create or increase foreseeable privacy harms. Any notification made after this 72-hour window must be accompanied by the reasons why it was delayed. Additionally, a covered entity must alert other covered entities from which it obtained personal information, and people must be notified if there is a risk of increased privacy harms.
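
For illustration, the 72-hour window might be tracked as in this minimal sketch, assuming “discovery” is a timestamp the covered entity records; the function names are invented:

    from datetime import datetime, timedelta

    REPORTING_WINDOW = timedelta(hours=72)

    def dpa_report_deadline(discovered_at: datetime) -> datetime:
        # The clock runs from discovery of the breach or abuse,
        # not from when the event itself occurred.
        return discovered_at + REPORTING_WINDOW

    def report_requires_explanation(discovered_at: datetime, reported_at: datetime) -> bool:
        # Reports made after the window must state the reasons for the delay.
        return reported_at > dpa_report_deadline(discovered_at)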

Title III details the DPA’s structure and powers. The DPA would be headed by a Director appointed by the President and confirmed by the Senate, and the Director could appoint a Deputy Director. The Director would serve a five-year term, and the bill is silent on how many terms a Director may serve. The agency would receive broad powers to set itself up and to promulgate regulations for its operations or to regulate entities under its jurisdiction. The DPA must consult with other federal agencies and state agencies in policing privacy and security. Finally, the agency would have authorized appropriations of $550 million per year for the next five years, but the Appropriations Committees would have to actually make these funds available annually in an appropriations bill.

Title IV lays out the enforcement of the Act. The DPA could enforce the Act in two separate ways, much like the FTC’s current means of enforcement. It could initiate an internal, administrative process that would result in a cease and desist order, allowing any defendant in such an action the opportunity to challenge the agency at an agency hearing, then appeal from what would presumably be an administrative law judge’s decision to the full agency, and then to a U.S. Circuit Court of Appeals. Or, the DPA could file a complaint in a U.S. District Court and litigate against a defendant. In either case, the agency could seek civil penalties of up to $42,530 per person, and this number could get high depending on the number of people involved. For example, in the Facebook/Cambridge Analytica case, where more than 87 million people were affected, if the DPA sought the maximum civil fine for each person, the potential liability would be more than $3.7 trillion. It is important to note that civil penalties are calculated per person and not per violation, for the latter method could yield even larger numbers, as it is easy to contemplate multiple violations per person. However, a court’s directions under the Act in terms of the factors to consider when meting out a fine would weigh against such a gigantic, company-crushing fine.
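
As a back-of-the-envelope check on that figure, multiplying the per-person cap by the commonly cited 87 million affected users:

    MAX_PENALTY_PER_PERSON = 42_530      # dollars, per person
    AFFECTED_PEOPLE = 87_000_000         # Facebook/Cambridge Analytica estimate

    max_exposure = MAX_PENALTY_PER_PERSON * AFFECTED_PEOPLE
    print(f"${max_exposure:,}")          # $3,700,110,000,000 -- about $3.7 trillion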

In enforcing the Act, the DPA must coordinate with other federal regulators, which means multiple, overlapping jurisdictions are the likely future landscape if this bill is enacted. These agencies may refer cases to the DPA for prosecution, and yet, should one federal agency initiate an action for a privacy or security violation, the DPA may not also bring an action. Moreover, the Act requires the DPA to execute an agreement with the FTC to coordinate enforcement. State attorneys general may bring actions under this Act, but only if the DPA is not doing so, and the state needs to provide notice to the DPA before proceeding.

As noted, a private right of action is available for people to allege violations of the Act. However, a class action seeking civil damages could only be brought by a non-profit and not plaintiffs’ attorneys, suggesting a class action for injunctive relief may be brought by a plaintiffs’ attorney. There is also a provision allowing a whistleblower to bring an action after first allowing the DPA the option to litigate. If the DPA accepts and prevails, the whistleblower would be entitled to 15% of the award, but if the whistleblower litigates the case, she may be entitled to between 25% and 50% of the award.

In terms of the relief the DPA or a state attorney general may obtain aside from civil penalties, a range of equitable relief is available:

  • Rescission or reformation of contracts;
  • Refund of moneys;
  • Restitution;
  • Disgorgement or compensation for unjust enrichment;
  • Payment of damages or other monetary relief;
  • Public notification regarding the violation, including the costs of notification; and
  • Limits on the activities or functions of the person.

Additionally, the DPA or state attorneys general may seek to recover all the costs of prosecuting the case.
