Earlier this week, Attorney General Andrea Campbell issued an advisory providing guidance to developers, suppliers, and users of AI systems regarding their obligations under existing state consumer protection, anti-discrimination, and data security laws. The advisory clarifies that these existing laws apply to emerging technology, including AI systems, to the same extent as to any other product or application within the meaning of the Attorney General's Consumer Protection regulations, and is intended to address and ultimately mitigate the risks AI poses to consumers.
AI systems must also comply with the Commonwealth's Standards for the Protection of Personal Information of Residents of the Commonwealth, promulgated under Chapter 93H. This means that AI developers, suppliers, and users must take necessary and appropriate steps to safeguard personal information used by those systems and are expected to comply with the breach notification requirements set forth in that statute.
Additionally, the Commonwealth's Anti-Discrimination Law prohibits developers, suppliers, and users of AI systems from deploying technology that discriminates against residents on the basis of a legally protected characteristic. This includes algorithmic decision-making that relies on discriminatory inputs or produces discriminatory results, such as decisions that have the purpose or effect of disfavoring or disadvantaging a person or group of people based on a legally protected characteristic.
The advisory includes a non-exhaustive list of acts and practices that may be considered unfair and deceptive under the Massachusetts Consumer Protection Act (Chapter 93A of the Massachusetts General Laws). Please click here to review the advisory.