ACGT: Risky Business?
Safeguarding the privacy of patients is an essential task for anyone dealing with medical data, as a breach can do tremendous damage to the individuals involved (note that privacy breaches are irreparable: an information leak cannot be undone). With respect to the research itself, such incidents are likely to lead to patients withdrawing from participation (damaged trust) and could well lead to prosecution of those responsible for the breach.
The ACGT project has contributed considerable effort to solving the data protection-related issues which accompany the creation of a transnational biomedical research infrastructure. The project aimed to provide researchers with a convenient and easily implementable solution to deal with legal and regulatory compliance regarding data privacy. The solution has been defined in the form of a framework consisting of a combination of technical, organisational and legal measures.
This ACGT Data Protection Framework (DPF) builds upon the concept of a “context of anonymity”, the establishment of a Data Protection Authority (DPA) and the integration of a Trusted Third Party. In short, the main goal of the DPF is to ensure that, within the ACGT environment, data can be considered “de-facto” anonymous, thereby freeing researchers from the tedious administrative tasks connected to data protection compliance for each individual research initiative, as long as they adhere to the global ACGT rule set.
A key factor in determining the success of this approach is the extent to which the DPF offers guarantees with respect to regulatory compliance. To answer this question and assess the strengths and weaknesses of the DPF, a risk analysis concerning data security and data protection was performed. The four main threats that were examined are:
Non-compliance
At all levels, be it technical, procedural or contractual.
Infrastructure and technology-related security risks
Contextual anonymity
Specific risks associated with the establishment of a controlled context in which data can be considered de-facto anonymous.
Long-run sustainability
Risks associated with maintaining the research infrastructure in the long term.
Within ACGT, a set of interdependent base components forms the foundation of the entire infrastructure. The security-related components are largely based on a set of proven concepts, technologies and implementations. Risks associated with purely technological aspects are therefore limited.
However, technical security measures leave a number of gaps open, due to architecture and design limitations or simply because it is technically impossible to prevent certain actions. For example: a system administrator can fail to disable the accounts of people leaving an organisation (not following employee removal policies), so that those people can still access data although they are no longer authorised.
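The account-revocation gap described above is, in essence, a reconciliation problem between an HR roster and an account directory. A minimal, hypothetical sketch (not part of the ACGT infrastructure; the data sources and names are invented for illustration) of an audit check that flags such stale accounts:

```python
# Hypothetical audit sketch: flag accounts that are still enabled
# even though their owner no longer appears on the active-staff roster.

def stale_accounts(active_employees, enabled_accounts):
    """Return account names that are enabled but not backed by an
    active employee -- the compliance gap described in the text."""
    return sorted(set(enabled_accounts) - set(active_employees))

# Invented example data:
hr_roster = {"alice", "bob"}             # people still employed
directory = {"alice", "bob", "carol"}    # accounts still enabled

print(stale_accounts(hr_roster, directory))  # carol left but can still log in
```

Such a periodic check is a corrective safety net; it does not replace the procedural requirement to disable accounts promptly, which is exactly why the DPF backs technical measures with contractual ones.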
To cover these gaps, the DPF resorts to procedures, contracts, End User License Agreements (EULAs) and other (legal) agreements. Non-compliance with these procedures (related to maintenance and management of operations) and contracts (legal requirements imposed by ACGT) poses an increased risk. Irrespective of the cause of this non-compliance (ignorance, sloppiness or malicious intent), the consequences can be devastating, as a technical security architecture can only function properly if the legitimate users adhere to the policies.
This largely human factor is the biggest risk to data privacy. The sheer size of the project (for one, the geographical distribution of the people involved) is an additional complicating factor in controlling compliance. Controlling this risk includes preventive measures, such as training and keeping procedures simple (a goal not easily reached), and corrective measures, such as financial penalties (the ACGT agreements for participation foresee such penalties).
On the plus side, the risk analysis shows that, thanks to the DPF, most identified threats involve the leaking of de-identified data. In general these will have a limited impact, as it is highly unlikely that the data will end up with people capable of re-identification (especially not in the case of accidental breaches).
Finally, the analysis once again confirms the need for a separate governing body, such as the Center for Data Protection (CDP), in international collaborative environments: an infrastructure that remains operational over a long period clearly requires coordinated technical and procedural auditing.
Brecht Claerhout,
Custodix