AI Transparency and Neural Data
California regulates AI in healthcare communications and gets businesses out of your head
In September 2024, California Governor Gavin Newsom signed Assembly Bill 3030 (AB 3030) into law. AB 3030 regulates health care businesses’ use of communications generated by AI.
Key Provisions of AB 3030
AB 3030 mandates that any health facility, clinic, physician’s office, or group practice utilizing generative AI to produce written or verbal communications related to patient clinical information must:
1. Include a Disclaimer: Inform patients that the communication was generated by generative AI. The format of the disclaimer varies based on the medium (see the sketch following these provisions):
• Written Communications: For letters, emails, and similar messages, the disclaimer must appear prominently at the beginning.
• Continuous Online Interactions: In chat-based telehealth sessions, the disclaimer should be displayed throughout the interaction.
• Audio Communications: The disclaimer must be provided verbally at both the start and end of the interaction.
• Video Communications: The disclaimer should be prominently displayed throughout the interaction.
2. Provide Human Contact Information: Offer clear instructions on how patients can reach a human health care provider, employee, or other appropriate person.
An exemption exists for communications that, although generated by generative AI, have been read and reviewed by a licensed or certified human health care provider.
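For health-tech teams wondering what this looks like operationally, the sketch below is a minimal, hypothetical illustration in Python of how a messaging system might attach the required disclaimer based on the communication medium and skip it when a licensed provider has reviewed the message. Every name in it (Medium, apply_ab3030_disclaimer, the disclaimer wording) is invented for illustration; this is a sketch of the rule structure described above, not legal advice or a definitive compliance implementation.

```python
# Hypothetical sketch of AB 3030's disclaimer rules. All names here
# (Medium, apply_ab3030_disclaimer, the disclaimer text) are invented
# for illustration and are not taken from the statute.

from enum import Enum

DISCLAIMER = (
    "This message was generated by artificial intelligence. "
    "To speak with a human health care provider, contact the clinic directly."
)

class Medium(Enum):
    WRITTEN = "written"   # letters, emails: disclaimer at the beginning
    CHAT = "chat"         # continuous online interactions: shown throughout
    AUDIO = "audio"       # spoken at the start and the end
    VIDEO = "video"       # displayed prominently throughout

def apply_ab3030_disclaimer(body: str, medium: Medium, ai_generated: bool,
                            reviewed_by_licensed_provider: bool) -> dict:
    """Return the message body plus how the disclaimer must be delivered."""
    # Exemption: the AI-generated communication was read and reviewed
    # by a licensed or certified human health care provider.
    if not ai_generated or reviewed_by_licensed_provider:
        return {"body": body, "disclaimer": None}

    if medium is Medium.WRITTEN:
        # Prominently at the beginning of the written communication.
        return {"body": DISCLAIMER + "\n\n" + body, "disclaimer": "prepended"}
    if medium in (Medium.CHAT, Medium.VIDEO):
        # Displayed throughout the interaction, e.g. a persistent on-screen banner.
        return {"body": body, "disclaimer": "persistent_banner"}
    if medium is Medium.AUDIO:
        # Provided verbally at both the start and the end of the interaction.
        return {"body": body, "disclaimer": "spoken_at_start_and_end"}
    raise ValueError(f"Unsupported medium: {medium}")
```

A real system would also need to surface the human-contact instructions required by the second provision; the point here is only that the placement rules differ by medium and that the provider-review exemption has to be tracked for each message.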
Implications for Health Care Providers
Health care entities employing generative AI must now implement systems to ensure compliance with AB 3030. This includes updating communication protocols to incorporate the required disclaimers and providing accessible human contact information. Non-compliance could result in disciplinary actions by the Medical Board of California or the Osteopathic Medical Board of California, as applicable.
Broader Context
The enactment of AB 3030 reflects growing concerns about the integration of AI in sensitive sectors like health care. While AI offers potential benefits in efficiency and personalization, it also raises issues related to transparency, accountability, and patient trust.
Why Should We (Patients) Care?
Below are five hypothetical situations, some of which are likely already occurring, that this new California law would affect. Each of these communications would now require a prominent disclaimer alerting the patient that AI was used to generate the information being provided.
1. AI-Generated Prescription Instructions
A patient receives an email from their health care provider with detailed instructions for taking a new prescription. Unbeknownst to the patient, the email’s content was fully generated by a generative AI system based on their medical record.
2. Telehealth Chat with a Virtual Assistant
A clinic uses a chat-based telehealth platform where patients interact with an AI chatbot to discuss symptoms and receive general medical advice before being directed to a human provider.
3. AI-Driven Patient Diagnosis Summary
A hospital sends a patient a video summary of their recent diagnostic results, generated entirely by AI. While the video includes graphs and voice narration explaining the findings, there is no visual or verbal disclaimer indicating the content was created by AI. Furthermore, the video does not include contact information for a human provider. This scenario directly breaches the requirements of AB 3030.
4. Automated Post-Surgery Follow-Up Call
A patient receives a phone call from their surgeon’s office after a procedure. The call, conducted entirely by an AI system, provides recovery tips and follow-up care instructions.
5. AI-Assisted Mental Health Counseling
An online mental health platform uses generative AI to assist therapists by providing written advice to patients during counseling sessions. In one instance, the advice is fully generated by AI and directly sent to the patient without review by a licensed therapist.
Thoughts Are Now Private Property: California’s Brainy New Law
California Expands “Sensitive Personal Information” to Include Neural Data: Understanding SB 1223
The landscape of consumer privacy law in California continues to evolve. California often gets criticized for having “out there” regulations and approaches to curing social problems. And, with the state being the hub of the entertainment industry, legislators and constituents there are particularly attuned to how AI is encroaching on their way of life and their ways of making a living.
The most recent addition to the state’s regulatory approach to how businesses collect and use our personal information, however, does not seem all that radical. The latest development comes with the enactment of Senate Bill 1223 (SB 1223), introduced by Senator Becker, which amends the California Consumer Privacy Act of 2018 (CCPA) to include “neural data” under the definition of “sensitive personal information.” This expansion has significant implications for businesses that collect or process such data.
Background: The CCPA and CPRA
The CCPA grants California consumers various rights concerning their personal information collected by businesses. These rights include the ability to know what personal information is collected, to request deletion of personal information, and to opt out of the sale of personal information. Importantly, the CCPA also provides consumers the right to direct businesses to limit the use of their “sensitive personal information.” Nothing seems extreme about such consumer protections, and, like some other California laws, this one just might propagate to other states, including those with more conservatively controlled legislatures.
In November 2020, California voters approved the California Privacy Rights Act (CPRA) through Proposition 24. The CPRA amended and expanded the CCPA, enhancing consumer privacy rights and imposing additional obligations on businesses. One key area of focus is the handling of “sensitive personal information,” which includes data such as Social Security numbers, driver’s license numbers, biometric information, and precise geolocation data.
SB 1223: Inclusion of Neural Data
SB 1223 amends the definition of “sensitive personal information” within the CCPA to include a consumer’s “neural data.” According to the bill, “neural data” is defined as:
“Information that is generated by measuring the activity of a consumer’s central or peripheral nervous system, and that is not inferred from nonneural information.”
This means that any data directly obtained from measuring nervous system activity—such as brainwaves, nerve signals, or other neurological indicators—now falls under the category of sensitive personal information. Importantly, this does not include data inferred from non-neural sources.
Non-Neural Sources Explained
• Data Inferred from Non-Neural Sources (Not Covered by SB 1223):
A fitness tracker monitors a consumer’s heart rate, skin temperature, and galvanic skin response to estimate stress levels or emotional states. Although the device provides insights into the consumer’s psychological state, it does so by analyzing physiological signals rather than by directly measuring neural activity. The stress level estimation is inferred from non-neural data (heart rate, skin conductance), so it does not fall under the definition of neural data in SB 1223.
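To make the direct-versus-inferred distinction concrete, here is a small hypothetical sketch in Python of how a device maker might decide which of its data streams fall within SB 1223’s definition. The signal names and the is_neural_data helper are invented for illustration; the function simply encodes the two-part test quoted above: the data must be generated by measuring nervous system activity, and it must not be inferred from non-neural information.

```python
# Hypothetical classification of data streams under SB 1223's "neural data"
# definition. Signal names and is_neural_data are invented for illustration.

# Streams generated by measuring central or peripheral nervous system activity.
DIRECT_NEURAL_SIGNALS = {"eeg", "ecog", "peripheral_nerve_recording"}

def is_neural_data(source_signals: set) -> bool:
    """True if a metric is 'neural data': generated by measuring CNS/PNS
    activity and not inferred from non-neural information."""
    return bool(source_signals) and source_signals <= DIRECT_NEURAL_SIGNALS

# The fitness-tracker example above: stress estimated from heart rate and
# skin conductance is inferred from non-neural sources, so it is not covered.
assert is_neural_data({"heart_rate", "galvanic_skin_response"}) is False

# Raw brainwave recordings from a consumer EEG headset would be covered.
assert is_neural_data({"eeg"}) is True
```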
Implications for Businesses
Businesses that collect or process neural data are now subject to the stringent requirements associated with handling sensitive personal information under the CCPA and CPRA. This includes:
• Limitation of Use: Consumers have the right to direct businesses to limit the use of their neural data to what is necessary to provide the requested goods or services.
• Enhanced Disclosure Obligations: Businesses must disclose the categories of neural data collected and the purposes for which it is used.
• Data Security Requirements: Given the sensitive nature of neural data, businesses must implement robust security measures to protect it from unauthorized access or breaches.
• Potential for Increased Liability: Non-compliance can result in enforcement actions by the California Privacy Protection Agency and potential civil penalties.
Neurotechnology and Emerging Industries
The inclusion of neural data reflects the rapid advancements in neurotechnology and the increasing ability of devices to measure and interpret neural activity. Industries involved in developing brain-computer interfaces, neuroprosthetics, or even consumer products like EEG headsets must now carefully consider how they handle neural data to comply with California law. While this is only a state law, so many manufacturers of health care, fitness, and other monitoring equipment rely on the California market that the regulation effectively changes those companies’ operations nationwide.
Legislative Authority and Related Bills
The CPRA permits the Legislature to amend its provisions to further its purposes and intent, provided such amendments are consistent with the act and passed by a majority vote in both legislative houses. SB 1223 explicitly declares that its provisions further the purposes and intent of the CPRA.
Additionally, SB 1223 notes that it incorporates changes proposed by Assembly Bill 1008 (AB 1008) to Section 1798.140 of the Civil Code. These additional changes become operative only if both bills are enacted and SB 1223 is enacted after AB 1008.
Conclusion
SB 1223 represents a significant development in California’s consumer privacy landscape by expanding the definition of sensitive personal information to include neural data. Businesses operating in California or dealing with California consumers must assess whether they collect neural data and, if so, ensure compliance with the CCPA and CPRA’s requirements.
For legal practitioners, it’s crucial to advise clients on:
• Reviewing and updating privacy policies to reflect the inclusion of neural data.
• Implementing mechanisms to allow consumers to exercise their rights concerning neural data.
• Enhancing data security measures to protect neural data from unauthorized access.
As I have often told lawyer audiences at CLE events, there is no need to be an expert on all the latest technology out there. Your focus is serving the legal needs of your clients. But it is important to at least be aware that something new in tech, or in tech regulation, is occurring so you do not miss legal issues that WILL affect your clients.