In The University of Bristol v John Peters and The Information Commissioner (EA/2018/0142), the First-tier Tribunal held that anonymised clinical trial data is not exempt from disclosure under the Freedom of Information Act 2000 (FOIA). The extent to which the trial data was “sufficiently anonymous” played a key role in the Tribunal’s decision.

What is anonymised data?

Under the General Data Protection Regulation (GDPR), information is anonymous if the individual to whom it relates is no longer identifiable. Once personal data has been anonymised, it falls outside the requirements of the GDPR and can therefore, in theory, be disclosed without breaching the GDPR. Given the intimately personal nature of health data and the proliferation of “big data” analytics, the extent to which clinical trial data can truly be anonymised has been the subject of much debate.

The facts

The University of Bristol conducted a clinical trial which tested the effectiveness of certain treatments for children suffering from myalgic encephalomyelitis, also known as ME or chronic fatigue syndrome. The University recruited 100 children aged between 12 and 18, primarily in the Bath area, to participate in the trial, and the results were published in September 2017.

Mr Peters disputed the research findings and requested, under section 1(1)(b) FOIA, access to the anonymised data used in the trial. Each line of the requested data related to an individual trial participant. The University’s decision to refuse access rested primarily on its belief that, although the data had been anonymised, it could not be “certain” that release of the requested data would not lead to re-identification of the research participants. As such, the University argued that the requested data was exempt from disclosure under section 40 of FOIA.

Mr Peters had previously successfully challenged Queen Mary University of London (QMUL), compelling it to disclose anonymised clinical trial data in similar circumstances. However, the Information Commissioner (ICO) distinguished the QMUL case on the basis that the database at issue there was a large national database of adults, whereas the University’s trial covered 100 school children in a very limited geographical area. The ICO concluded that re-identification of participants in the present trial was therefore “reasonably likely”. Mr Peters appealed the ICO’s decision to the Tribunal.

The Tribunal’s decision

On appeal, the key issue was whether there was a reasonable likelihood that trial participants could be re-identified from a combination of the requested data and other data which was, or might be, generally accessible. In this respect, the University argued that the trial data could be linked with school attendance records to re-identify participants.

The Tribunal disagreed. Applying the “motivated intruder” test, the Tribunal concluded that a public body does not need to be “certain” that release of the requested data would not lead to re-identification; that approach sets the bar too high. Instead, the University should have considered the “likelihood” of re-identification. In this respect, the Tribunal noted that the University had not explained what (lawful) investigative techniques a “motivated intruder” might employ to access school records, observing that such records are confidential and themselves have “motivated defenders”.

The Tribunal also took account of Mr Peters’ argument that he did not need to re-identify participants in order to challenge the results of the trial.


Comment

As with the earlier QMUL case, this decision will be of interest to universities and other public bodies that maintain medical research databases. It serves as a welcome reminder that the ICO does not require data anonymisation to be completely risk free, only that the risk be mitigated until it is sufficiently remote. This should be of comfort to the wider medical research community, which relies heavily on inter-institutional data flows for research purposes.

However, for public bodies, it seems that the quid pro quo for successfully anonymising a dataset could be a requirement to disclose it to competitors or other third parties under FOIA. The interplay between FOIA and the GDPR in this regard seems uneasy. Data anonymisation is effectively encouraged by the ICO as a means of escaping the shackles of the GDPR. As a result, public bodies often spend a vast amount of time, money and effort anonymising their carefully curated datasets. These anonymisation efforts are sometimes undertaken with a view to establishing data repositories accessible by third parties for research purposes, subject to payment of a licence fee. There is a risk that the ability to request access to anonymised data under FOIA could undermine both the commercial model on which these repositories operate and the incentive to anonymise data in the first place. This outcome would be of no benefit to study participants or the greater public good.

Note: Mr Peters’ request was dealt with under the Data Protection Act 1998 (DPA 1998) as it was made before the Data Protection Act 2018 (DPA 2018) came into force. The Tribunal stated that the result would have been the same under the DPA 2018.