Last updated: 29 Oct 2024 11:00
Companies that employ generative AI (gen-AI) need to consider and clarify their role in determining how personal data is processed by those tools.
The recommendation, from the UK’s data protection authority, the Information Commissioner’s Office (ICO), is particularly relevant to businesses buying ‘off-the-shelf’ AI models for use in developing their own systems and services, according to data protection expert Kathryn Wynn of law firm Pinsent Masons.
She said those businesses must ensure that they understand when and how they, rather than the AI developer, will be responsible as controller in relation to personal data within the model.
The ICO made its recommendation in a consultation document about the allocation of responsibilities under data protection law within the gen-AI supply chain. Its call for evidence centres on the concept of ‘controllership’ in UK data protection law.
Wynn explained that ‘controllers’ of personal data are organisations that exercise overall control over the purposes and means of the processing of personal data, while ‘processors’ of that data are organisations that merely process the data on behalf of, and under the instruction of, controllers.
However, the ICO said that “allocation of accountability is complicated” in the context of generative AI “because of the different ways in which generative AI models, applications and services are developed, used and disseminated, but also the different levels of control and accountability that participating organisations may have”.
The regulator added: “In practice, the relationship between developers and third-party deployers in the context of generative AI will mean there are often shared objectives and influence from both parties for the processing, which means it is likely to be a joint controllership instead of a processor-controller arrangement.”
Wynn said: “Although developers may seek to sell access to AI models on non-negotiable standard terms and conditions under which the customer is assumed to be the controller and the developer the processor, a ‘one-size-fits-all’ approach is not likely to produce accurate results, fuelling the ICO’s concerns about gaps in accountability.
“Customers looking to buy-in and deploy AI models as controllers should ensure that the developer provides sufficient information to allow them to meet data protection obligations. Where, on proper analysis, the developer and customer are joint controllers, they must ensure that responsibilities are fully identified and allocated to the party best placed to manage compliance. In those ‘joint controllership’ cases, the developer and customer will also be obliged to communicate the essence of their arrangement to individuals whose personal data they are processing,” Wynn added.