ICO UPDATES PAPER ON BIG DATA, ARTIFICIAL INTELLIGENCE, MACHINE LEARNING AND DATA PROTECTION TO INCLUDE A GDPR COMPLIANCE ELEMENT

Greg Whitaker

This week, the ICO published the latest version of its paper on big data, AI and machine learning. Though not an official GDPR guidance document or code of practice, the paper sets out the ICO’s views on the issues and has been updated to show how big data, AI and machine learning relate to the GDPR (though not to the new draft PEC Regulation).

Of note to Datonomy readers are the six key recommendations the paper gives to help organisations achieve data protection compliance in a “big data world”. The ICO states that organisations should:

  1. Carefully consider whether the big data analytics to be undertaken actually requires the processing of personal data. Often this will not be the case; in such circumstances, organisations should use appropriate techniques to anonymise the personal data in their dataset(s) before analysis (a simple sketch of what such a step can look like follows this list).
  2. Be transparent about their processing of personal data by using a combination of innovative approaches in order to provide meaningful privacy notices at appropriate stages throughout a big data project. This may include the use of icons, just-in-time notifications and layered privacy notices.
  3. Embed a privacy impact assessment framework into their big data processing activities to help identify privacy risks and assess the necessity and proportionality of a given project. The privacy impact assessment should involve input from all relevant parties including data analysts, compliance officers, board members and the public.
  4. Adopt a privacy by design approach in the development and application of their big data analytics. This should include implementing technical and organisational measures to address matters including data security, data minimisation and data segregation.
  5. Develop ethical principles to help reinforce key data protection principles. Employees in smaller organisations should use these principles as a reference point when working on big data projects. Larger organisations should create ethics boards to help scrutinise projects and assess complex issues arising from big data analytics.
  6. Implement innovative techniques to develop auditable machine learning algorithms. Internal and external audits should be undertaken with a view to explaining the rationale behind algorithmic decisions and checking for bias, discrimination and errors (the second sketch below shows one very basic check of this kind).
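
Purely by way of illustration (the paper does not prescribe any particular technique, and the column names below are hypothetical), a first anonymisation pass over a tabular dataset might drop direct identifiers and generalise quasi-identifiers before any analysis takes place:

```python
# Illustrative only: one possible anonymisation pass before analysis.
# The column names (name, email, postcode, age, spend) are hypothetical,
# not taken from the ICO paper.
import pandas as pd

def anonymise(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    # Drop direct identifiers rather than carrying them into the analysis set.
    out = out.drop(columns=["name", "email"])
    # Generalise quasi-identifiers: truncate postcodes to the outward code,
    # and replace exact ages with broad bands.
    out["postcode"] = out["postcode"].str.split().str[0]
    out["age_band"] = pd.cut(out["age"], bins=[0, 25, 40, 60, 120],
                             labels=["<25", "25-39", "40-59", "60+"])
    return out.drop(columns=["age"])

df = pd.DataFrame({
    "name": ["A. Smith", "B. Jones"],
    "email": ["a@example.com", "b@example.com"],
    "postcode": ["SW1A 1AA", "M1 2AB"],
    "age": [34, 67],
    "spend": [120.0, 75.5],
})
print(anonymise(df))
```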
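
Equally illustrative, and again not drawn from the paper itself: a very basic audit step might compare a model’s positive-decision rates across a protected attribute. The column names and the 80% rule of thumb below are assumptions made for the example:

```python
# Illustrative audit check: compare positive-decision rates across groups.
# The DataFrame columns ("group", "decision") and the 0.8 ratio threshold
# are assumptions for this example, not figures from the ICO paper.
import pandas as pd

def positive_rates(df: pd.DataFrame, group_col: str, decision_col: str) -> pd.Series:
    """Rate of positive decisions per group, as a simple disparity measure."""
    return df.groupby(group_col)[decision_col].mean()

decisions = pd.DataFrame({
    "group": ["A", "A", "B", "B", "B", "A"],
    "decision": [1, 0, 0, 0, 1, 1],
})
rates = positive_rates(decisions, "group", "decision")
print(rates)

# Flag for human review if the lowest rate falls below 80% of the highest
# (a common rule of thumb, not an ICO requirement).
if rates.min() < 0.8 * rates.max():
    print("Disparity between groups exceeds the illustrative threshold; review the model.")
```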

Datonomy will keep you up to date with the latest developments in this area; of particular interest will be how the pending WP29 guidance on profiling relates to this week’s paper.
