Is Your AI Tool a "Business Associate" Under HIPAA?
Plus: KOSA and COPPA 2.0 Amendments, CPPA Enforcement Priorities, and More
A lot of the news around data privacy these days reflects the rise of generative AI.
As our first article illustrates, when it comes to the healthcare industry, just using an AI tool in the same vicinity as a patient’s protected health information raises some immediate privacy concerns under HIPAA.
Other hot privacy topics addressed in this issue include proposed amendments to privacy and safety laws protecting children and teens, the enforcement priorities of the California Privacy Protection Agency (CPPA), and a resource that clarifies the different mechanisms for transferring data under the EU-US Data Privacy Framework.
Enjoy!
Featured Topics 👇
ChatGPT and HIPAA
By now, many of us have experimented with generative AI tools like ChatGPT to see if they can actually make us more productive and efficient in our personal lives or at work.
However, if you work in the healthcare field, are subject to HIPAA, and use tools like ChatGPT as part of your workflow, be sure to proceed with caution.
As this article points out, employees of HIPAA covered entities who share patients' protected health information (PHI) with third-party tools, such as ChatGPT, need to consider whether the tool qualifies as a business associate under the HIPAA Privacy Rule.
As a reminder, a “business associate” performs functions or activities on behalf of a covered entity involving the use or disclosure of PHI. These activities may include anything from claims processing or administration, data analysis, and utilization review to quality assurance, billing, accounting, legal work, and much more.
A business associate must provide certain assurances concerning the use or disclosure of any PHI:
The Privacy Rule allows covered providers and health plans to disclose protected health information to these “business associates” if the providers or plans obtain satisfactory assurances that the business associate will use the information only for the purposes for which it was engaged by the covered entity, will safeguard the information from misuse, and will help the covered entity comply with some of the covered entity’s duties under the Privacy Rule.
[Source]
These “satisfactory assurances” must be defined in a business associate agreement between the covered entity and the third party. Without an agreement in place, a covered entity cannot share PHI with a third party unless it has the patient’s authorization or another exception under HIPAA applies.
To be sure, as the article’s author discovered, even ChatGPT advises against sharing PHI with its platform:
Sharing patient data on this platform is not recommended, as it could compromise patient confidentiality.
The Takeaway:
If you operate a covered entity under HIPAA, ensure your employees fully understand the implications of using generative AI tools like ChatGPT to perform even simple tasks and job functions. Implement appropriate training, documentation, and security measures governing the use of these tools with any PHI.
Kids’ Privacy and Safety Redux: Amended KOSA and COPPA 2.0 Advance By Voice Vote
The Senate Commerce Committee has approved amended versions of the Kids Online Safety Act (KOSA) and the Children and Teens' Online Privacy Protection Act (aka COPPA 2.0) to protect kids' privacy and safety.
KOSA is a safety bill that aims to reduce harmful content on social media and provide more tools and controls for minors and parents.
COPPA 2.0, on the other hand, is a privacy bill that extends privacy protections from COPPA 1.0 to teens aged 13 to 16 and changes the "actual knowledge" standard.
Both bills have faced opposition and criticism over the potential for content blocking and increased data collection.
To learn more, read this article, which gives a rundown of notable moments from the July 27 Senate Commerce Committee Executive Session addressing the amended bills.
CPPA Enforcement Priorities ⚖️
Moving on from federal privacy bills... Despite the court-ordered stay on regulatory updates in California, enforcement by the CPPA is ramping up.
It's been widely reported that the CPPA is starting to review the data privacy practices of connected vehicle (CV) manufacturers and related CV technologies.
However, as this article points out, the CPPA has a broader enforcement agenda for the coming year than just connected vehicles.
In particular, the CPPA Deputy Director of Enforcement, Michael Macko, indicated that the Agency will focus on three enforcement priorities for the coming year:
Privacy Notices and Policies, including whether businesses are honoring disclosures to their customers on how they collect, share, and use data.
The Right to Delete, including whether entities are honoring consumers’ requests to delete their personal data.
Implementation of Consumer Requests, with a focus on assessing whether companies are erecting barriers for consumers who seek to exercise their rights.
📚 Featured Resource
Implementing Transatlantic Transfers (New IAPP Chart)
The IAPP recently published* a chart outlining the key changes and requirements for U.S. organizations participating in the Data Privacy Framework.
The chart clarifies different obligations for companies transferring personal data from Europe to the U.S. as either: 1) a current Privacy Shield participant that is converting to the Data Privacy Framework, 2) a new DPF participant, or 3) a U.S. entity not self-certified to the DPF.
The chart breaks down this information depending on where data transfers originate:
The EU, Norway, Iceland, and Liechtenstein
The UK and Gibraltar, and
Switzerland
*By Cobun Zweifel-Keegan, CIPP/US, CIPM, IAPP Managing Director, Washington, D.C., and Joe Jones, IAPP Director of Research and Insights
📰 News Bytes
Final Rule on Cybersecurity Disclosures for Public Companies: The Securities and Exchange Commission (SEC) issued new rules requiring public companies to disclose material cybersecurity incidents on Form 8-K and to provide disclosure in their annual report about their processes for assessing and managing cybersecurity risks. [Learn More]
Zoom’s Updated TOS Draws Criticism: Zoom updated its terms of service to allow the use of user-level data to train its AI models, but after facing criticism, the platform revised them to give users the option to opt out. Privacy experts maintain that the consent provision is still unclear and have raised concerns that sensitive company information (such as trade secrets) could inadvertently be exposed to AI during Zoom meetings. [Read]
AI Can Identify Your Keystrokes by Sound: Researchers have developed a machine learning system that can accurately identify keystrokes based on sound, highlighting the potential risk of acoustic "side-channel attacks" and the need for increased security measures. [Read]
TikTok Facing Fines Over Processing of Children’s Data: TikTok is expected to be fined by the European Data Protection Board for violating children's privacy, following previous fines imposed by the Dutch DPA and the UK's Information Commissioner's Office. [Read]