Sports Wagering: New Data Privacy Regulations in Massachusetts

Plus: the FTC cracks down on a genetic testing company; a short primer on Privacy UX; why car companies are among the worst privacy offenders; and more.

Welcome to Issue #7 of Privacy Picks, your weekly dose of interesting stories and insights from the world of data privacy. 

My privacy picks for this week include:

🎲 A new privacy law targeting sports wagering operators in Massachusetts.

🧬 An FTC enforcement action against a genetic testing company.

🎨 Learning the basics of Privacy UX.

🚗 Why cars are the worst privacy offenders (according to one study).

🚸 A U.S. District Court in California halts enforcement of the CAADCA.

📰 In the news . . .

Thanks for visiting, and please consider subscribing to receive future updates by email!

🎲 Massachusetts isn’t gambling with your privacy

A new set of data privacy regulations, known as the Sports Wagering Data Privacy (SWDP) regulations, recently took effect in Massachusetts. The regulations apply to sports wagering operators, including those who operate sports wagering areas, facilities, or platforms.

As this article notes, the SWDP is an interesting example of how, in the absence of a comprehensive state privacy regime, sector-specific regulations may expand on the provisions of general consumer data privacy laws. 

For instance, in addition to regulating the use, disclosure, access rights, and security of a patron’s confidential information and personally identifiable information, the Massachusetts SWDP regulations also require sports wagering operators to:

“collect and aggregate patrons’ Confidential Information and Personally Identifiable Information to analyze patron behavior for the purposes of identifying and developing programs and interventions to promote responsible gaming and support problem gamblers, and to monitor and deter Sports Wagering in violation of G.L. c. 23N and 205 CMR.”

Based on the current state of U.S. privacy law, businesses can expect to see more regulations like the SWDP, which combine standard privacy protections with unique sector-specific requirements.

For more about the SWDP regulations in Massachusetts, read here and here.

🧬 The FTC cracks down on genetic privacy claims

As 1Health.io claimed on its website, “The most personal information must also be the most private.”

That’s according to the complaint filed by the FTC, which cited this claim along with several other deceptive assurances made by 1Health.io (aka Vitagene, Inc.) concerning the privacy and security of its customers’ sensitive DNA testing information.

On September 7, the FTC finalized an order with 1Health.io, settling charges against the genetic testing company for committing unfair or deceptive practices in violation of Section 5 of the FTC Act.

Specifically, an investigation by the FTC revealed that 1Health.io was leaving sensitive genetic and health data unsecured, deceiving consumers about their ability to have their data deleted, and retroactively changing its privacy policy without notifying consumers or obtaining their consent.

Read this article to learn more about the case against 1Health.io, the details of the FTC’s order, and why the U.S. needs to do more to address privacy protections for sensitive personal data.

🎨 The Basics of Privacy UX

I’ve been interested in learning more about Privacy UX lately, starting with the basics of how it should be implemented on websites, apps, and other online platforms.

If you’re also new to the concept, here’s an intro: As an emerging subset of User Experience (UX) design, Privacy UX seeks to enhance user experience by prioritizing clear, transparent, and user-friendly interactions concerning personal data.

In contrast to deceptive dark patterns, Privacy UX is a simple and cost-effective way to reduce the risk of non-compliance with domestic and international privacy laws and, as this article explains, “build trust, promote loyalty, and avoid the negative impacts of Dark UX.”

For example, designers can implement transparent and user-friendly UX elements by (see the sketch after this list):

  • Explaining why consent will benefit the user

  • Only introducing consent options for more intrusive privacy settings when the user wants to use or access related functionality

  • Prioritizing clear and simple language

  • Providing immediate confirmation when privacy settings are changed

  • Making privacy controls easy to access
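
To make these principles concrete, here’s a minimal TypeScript sketch of a contextual consent flow. All of the names (ConsentStore, requestConsent, promptUser) are hypothetical rather than from any real library, and a production version would wire the prompt to an actual consent dialog:

```typescript
// A minimal sketch of contextual, user-friendly consent (hypothetical API).
type Purpose = "location" | "analytics" | "marketing";

interface ConsentRecord {
  granted: boolean;
  timestamp: Date; // when the choice was made
}

class ConsentStore {
  private records = new Map<Purpose, ConsentRecord>();

  // Ask only at the moment the related feature is used (contextual consent),
  // with plain language explaining why consent benefits the user.
  async requestConsent(purpose: Purpose, benefit: string): Promise<boolean> {
    const existing = this.records.get(purpose);
    if (existing) return existing.granted; // don't re-prompt needlessly

    const granted = await this.promptUser(
      `Allow ${purpose}? ${benefit} You can change this anytime in Settings.`
    );
    this.records.set(purpose, { granted, timestamp: new Date() });
    this.confirm(purpose, granted); // immediate confirmation of the change
    return granted;
  }

  // Privacy controls should be as easy to revoke as to grant.
  revoke(purpose: Purpose): void {
    this.records.set(purpose, { granted: false, timestamp: new Date() });
    this.confirm(purpose, false);
  }

  private confirm(purpose: Purpose, granted: boolean): void {
    console.log(`${purpose} is now ${granted ? "on" : "off"}.`);
  }

  // Stand-in for a real consent dialog; declines by default in this sketch.
  private async promptUser(message: string): Promise<boolean> {
    console.log(message);
    return false; // a real UI would return the user's actual choice
  }
}

// Usage: consent is requested only when the user taps "Find stores near me".
async function onFindNearbyStores(consent: ConsentStore): Promise<void> {
  const ok = await consent.requestConsent(
    "location",
    "We use your location only to show nearby stores."
  );
  if (!ok) console.log("Showing all stores instead (no location used).");
}

void onFindNearbyStores(new ConsentStore());
```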

For more on this topic, I also suggest this write-up discussing Privacy UX and “Legal Design” in the context of AI platforms.

🚗 When it comes to privacy, cars are the worst

This summer, the California Privacy Protection Agency (CPPA) announced an enforcement review of connected vehicle manufacturers and technologies, noting that modern vehicles are "effectively connected computers on wheels…able to collect a wealth of information via built-in apps, sensors, and cameras, which can monitor people both inside and near the vehicle."

Now, a report published by the Mozilla Foundation, as part of its Privacy Not Included series, has singled out the car industry as among the worst offenders when it comes to consumer data protection.

The report analyzed 25 car brands and discovered that every surveyed brand collects more personal data than is necessary for vehicle operation and customer relationship management.

The researchers also found flaws in data handling (sharing and selling), limited user access and deletion rights, weak security standards, and a lack of clear and affirmative consent options.

Read Mozilla’s report here along with their follow-up article, which attempts to shed light on the legality of these privacy practices given the sectoral and regional privacy landscape in the U.S.

🚸 U.S. District Court in California Halts Enforcement of CAADCA

On September 18, a U.S. District Court judge in California granted a preliminary injunction preventing enforcement of California's Age-Appropriate Design Code Act (CAADCA).

The law, which was set to take effect on July 1, 2024, requires businesses to comply with certain mandates and prohibitions when releasing online products and services likely to be accessed by children under the age of 18.

For example, among other obligations, before releasing any new product, service, or feature to the public, a company would need to create a Data Protection Impact Assessment that identifies “the purpose of the online service, product, or feature, how it uses children’s personal information, and the risks of material detriment to children that arise from the data management practices of the business.”
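
For illustration only, here’s a hypothetical TypeScript sketch of the fields such a DPIA record might capture, mirroring the statutory language quoted above. The Act itself does not prescribe any data format, and every field name here is invented:

```typescript
// Hypothetical sketch of a CAADCA-style DPIA record (the Act prescribes
// no format; these field names are illustrative only).
interface DataProtectionImpactAssessment {
  featureName: string;              // the product, service, or feature assessed
  purpose: string;                  // "the purpose of the online service, product, or feature"
  childrensDataUses: string[];      // "how it uses children's personal information"
  materialDetrimentRisks: string[]; // "risks of material detriment to children"
  completedOn: Date;                // must be completed before public release
}

// Example entry for a hypothetical recommendation feature.
const dpia: DataProtectionImpactAssessment = {
  featureName: "Video recommendations",
  purpose: "Suggest relevant videos to signed-in users",
  childrensDataUses: ["watch history", "session length"],
  materialDetrimentRisks: ["algorithmic rabbit holes", "excessive engagement"],
  completedOn: new Date("2024-06-01"), // before the law's July 1, 2024 effective date
};

console.log(`DPIA on file for: ${dpia.featureName}`);
```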

The lawsuit was filed by NetChoice, a trade association of online businesses, which argued that the CAADCA is unconstitutional based on the First Amendment and Dormant Commerce Clause, and preempted by federal laws, specifically the Children’s Online Privacy Protection Act (COPPA) and Section 230 of the Communications Decency Act (CDA).

In granting the injunction, the Court found only that NetChoice is likely to prevail on its claim that the law violates the First Amendment.

As this article notes, the state may appeal the decision, but the holding may still provide a roadmap for others to challenge expansive privacy laws based on free speech arguments.

In Professor Solove’s opinion, the holding is a “ridiculously expansive interpretation of the First Amendment, one that would annihilate most regulation if applied elsewhere.”

📰 In the News

  • The EDPB adopted guidelines on transfers of personal data under the Law Enforcement Directive [EDPB News]

  • Pizza Hut in Australia confirmed it suffered a data breach in early September affecting 193,000 customers [CPO Magazine]

  • The FTC extended its deadline by 120 days to consider whether companies can use “Privacy-Protective Facial Age Estimation” technology under COPPA [FTC]

  • Ireland’s DPC issued a 345 million euro children’s privacy fine against TikTok [IAPP]

  • The SEC announced approval of its revised Privacy Act Rules, effective October 26, 2023 [JD Supra, SEC]