
Privacy Challenges of Big Data in Healthcare: Balancing Innovation and Patient Confidentiality

Reading Time: 7 minutes

Big data and healthcare are increasingly spoken of as an inseparable duo. Breakthrough technology brings immense possibilities: streams of facts and details can now be collected, recorded, and applied with lightning speed and precision through ML- and AI-fuelled systems. But just as one cloud is enough to eclipse the sun, violations of patient privacy during data collection and use can outweigh the merits. Such infringements can be viewed from both ethical and legal angles.

Big data in healthcare: what is it for?

To understand the perks and stumbling blocks that big data in healthcare involves, we must first grasp the essence of the notion. It refers to the vast amounts of miscellaneous data that are generated from many sources and handled at tremendous speed, increasingly with the help of ML- and AI-based systems.

In which areas can big data and healthcare intersect, and how is this beneficial?

  • Through multiple records and statistics, the quality of service in care institutions can be assessed rapidly and precisely, and the areas for improvement become obvious;
  • Large datasets are helpful when analysing treatment schemes and drawing conclusions on the most efficient ones;
  • They also facilitate scientific breakthroughs through the verification of multiple theories;
  • Lastly, gathering and analysing feedback on the use of drugs and medical devices, namely reported side effects, contributes to a complete picture of their safety and adverse reactions.

One of the rapidly evolving areas in this relationship is medical image analysis. By comparing and evaluating hundreds of images, algorithms make valuable predictions on the progression of cancers, diabetes-related vision disorders, heart disease, and other conditions, and give valuable recommendations on the measures to be taken in each case.

In any case, even when records come in forms other than images, reaching a precise conclusion on their significance requires vast amounts of detail on both patients and their circumstances, as well as on the effects and dynamics of the phenomenon being tracked, whether it’s a drug, an image, or a set of reports.

An illustrative example of big data and healthcare software that tackles these sorts of tasks is modules dedicated to specific conditions and embedded into clinics’ EHR systems. When a physician faces a variety of options for treating a certain problem, they turn to a specific module, for example, one for a certain type of cancer or diabetes. The system quickly pulls miscellaneous details on a patient from their health records and other publicly available sources, e.g. their gym plans, food orders, etc. This information is then organised and compared against dozens of records of patients with similar patterns who have already undergone treatment. The resulting recommendations on treatment schemes are generated based on successful cases in the past.
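The matching step such a module performs can be sketched as a nearest-neighbour search: find past patients whose profiles resemble the current one and take the treatment that worked for most of them. Everything below is a minimal, illustrative sketch; the feature vectors, treatment names, and weighting are invented, and a real system would use far richer clinical features.

```python
from collections import Counter

# Hypothetical, simplified patient records: a numeric feature vector
# (e.g. age, BMI, two binary markers) plus the treatment scheme that
# ultimately succeeded for each past patient.
past_patients = [
    {"features": [54, 27.1, 1, 0], "treatment": "scheme_A"},
    {"features": [61, 30.4, 1, 1], "treatment": "scheme_B"},
    {"features": [49, 24.9, 0, 0], "treatment": "scheme_A"},
    {"features": [58, 29.8, 1, 1], "treatment": "scheme_B"},
    {"features": [52, 26.0, 1, 0], "treatment": "scheme_A"},
]

def euclidean(a, b):
    """Distance between two equally sized feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def recommend(new_patient, records, k=3):
    """Return the treatment most common among the k most similar past patients."""
    ranked = sorted(records, key=lambda r: euclidean(r["features"], new_patient))
    votes = Counter(r["treatment"] for r in ranked[:k])
    return votes.most_common(1)[0][0]

print(recommend([55, 27.5, 1, 0], past_patients))  # scheme_A
```

The privacy tension is already visible here: the recommendation only works because dozens of other patients’ detailed records are held and compared behind the scenes.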

Another illustration of big data applied in healthcare is when predictive analytics engines recommend the order of priorities when deciding on organ transplantation or the admission of critically ill patients to ICUs with a limited number of beds. They base their conclusions on dozens of health records and immediate updates on people’s condition to help care professionals make critical, life-saving decisions within minutes.
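At its core, such an engine scores each candidate from their current data and ranks them against the available beds. The sketch below is purely illustrative: the vital-sign fields and weights are invented for demonstration and are not a clinical scoring system.

```python
# Hypothetical ICU candidates with a few live vital signs.
patients = [
    {"id": "p1", "heart_rate": 128, "spo2": 88, "age": 71},
    {"id": "p2", "heart_rate": 95,  "spo2": 96, "age": 45},
    {"id": "p3", "heart_rate": 140, "spo2": 84, "age": 63},
]

def severity(p):
    # Invented weights: higher heart rate, lower oxygen saturation,
    # and higher age all raise the score.
    return 0.02 * p["heart_rate"] + 0.5 * (100 - p["spo2"]) + 0.01 * p["age"]

def allocate_beds(candidates, beds):
    """Return the ids of the `beds` highest-scoring (most severe) patients."""
    ranked = sorted(candidates, key=severity, reverse=True)
    return [p["id"] for p in ranked[:beds]]

print(allocate_beds(patients, beds=2))  # ['p3', 'p1']
```

In practice the score would be re-computed as new readings stream in, which is exactly why such engines need continuous access to sensitive, identifiable health data.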

The crucial thing to understand here is whether these and other programmes violate patient privacy, and if they do, what can be done to prevent it.

What does the notion of health privacy imply?

Although the notion of privacy is challenging to delineate explicitly, one thing is certain: it’s inextricably linked with the concept of context. From the outset, there are clearly defined parties who contribute to the transmission and keeping of details, the ways those details can rightfully be handled, and the purposes for which they are handled. If, anywhere in this chain, flaws or discrepancies occur as to who uses them, how, where, and when, we can speak of a privacy violation.

When the confidentiality of big data in healthcare (or any other sphere) is compromised, the impact on both parties can be catastrophic. Health records are often sensitive and intimate in nature. Knowing that they fell into the wrong hands can cause emotional distress to the affected party. At this scale, the number of affected patients is rarely limited to one or two and can reach hundreds or even thousands. The financial losses that a care facility, manufacturer, or other actor ends up bearing can be substantial, not to mention that the toll of such an infringement can be measured in people’s lives. And even if the consequences are not that dramatic, and the records that, say, had been mistakenly viewed weren’t spread any further, the disclosure of sensitive health information in this way has ethical implications of its own.

How can facilities collect data without breaking laws?

To begin with, big data in healthcare can flow from myriad sources, for example, IoT trackers, EHRs, insurance databases, etc. Across the EU, all such records have the same status and are referred to as “data concerning health”. Thus, they are all covered by the same laws, the principal one being the GDPR.

By contrast, in the US, these records fall into distinct categories according to their origin and the person or entity handling them, which can be a medical worker, a care system, or its partner. The confidentiality of this information is secured by the Privacy Rule enacted under HIPAA.

Confidential health information, according to US regulations, is information that can be linked to a person through identifiers like their name, ID, email, and so on. The obvious way to share this information without violating its confidentiality is thus to break its link to the person it pertains to, e.g. by removing their name. This method has a substantial flaw, though: the information can be re-identified by matching various datasets, including those in the public domain. The other case in which patient details can be legally handled by another person is when the patient gives explicit consent.
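Both the de-identification step and its flaw can be shown in a few lines. The record below and the field names are hypothetical: stripping direct identifiers looks safe, yet the remaining quasi-identifiers (ZIP code, birth date, sex) can still match a public dataset, such as a voter roll, and link the “anonymous” record back to a named individual.

```python
# Naive de-identification: drop the fields that directly name a person.
DIRECT_IDENTIFIERS = {"name", "email", "patient_id"}

def deidentify(record):
    """Return a copy of the record without direct identifiers."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

record = {
    "name": "Jane Roe", "email": "jane@example.com", "patient_id": "48-1102",
    "zip": "10013", "birth_date": "1980-04-02", "sex": "F",
    "diagnosis": "type 2 diabetes",
}
released = deidentify(record)  # no name, email, or patient id

# A hypothetical public dataset sharing the same quasi-identifiers.
public_row = {"name": "Jane Roe", "zip": "10013", "birth_date": "1980-04-02", "sex": "F"}

# Re-identification: if ZIP + birth date + sex match, the released record
# is linked back to a named individual despite the removed identifiers.
QUASI = ("zip", "birth_date", "sex")
relinked = all(released[k] == public_row[k] for k in QUASI)
print(relinked)  # True
```

This is why regulators treat quasi-identifiers seriously: removing names alone is rarely enough when other datasets can be joined against the release.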

To sum up, the core distinction between the US and European systems is that the latter is broader and assigns confidential status to a wider range of details. This suggests that US consumers are more vulnerable to privacy breaches, as many types of information are not even considered private, e.g. those collected by apps or search engines.

Do patients have a say in the matter of distribution of their details?

Even if we view the disclosure and subsequent use of someone’s health-related records as something done to their benefit, e.g. by putting these facts to use in research or improving the quality of care, the question remains: does this individual have a say in the disclosure, and to what extent?

From the ethical point of view, the answer largely depends on what sort of information it is, as well as where and how it will be used. The matter is complicated by the fact that, with big data and healthcare, it becomes increasingly challenging to separate care-related details from non-care-related ones. Another challenge in this regard is the growing adoption of Blockchain in Healthcare. On the one hand, the technology gives individuals unprecedented control over their files. On the other, details recorded on a distributed ledger are susceptible to the re-identification we’ve touched upon above, as it is not impossible to link a blockchain address to its owner.

Opinions on soliciting patient consent vary, from entitling individuals to grant permission for access to and disclosure of every single piece of their data, to permitting access to personal records whenever the merits outweigh the harms. Perhaps the happy medium would be forming representative groups, or councils, of patients and care representatives who would rule on who is eligible to use certain collections of records, and how.

How severe can disclosure consequences be?

An enormous pool of big data in healthcare already exists, owing partly to rapidly evolving AI tools. What consequences, from a legal and an ethical point of view, does the disclosure of pieces of these records imply, and how can the aftermath be mitigated?

Various discrimination concerns

Let’s imagine an employee who suffers from a severe chronic condition or is in remission from an addiction or another sensitive issue, and the information about this becomes publicly available. This can mark the person as risk-prone in the eyes of their employer and put their employment at risk. It can also influence an insurer’s decision to take on this potentially risky customer and provide the requested coverage plan, knowing that there are health problems in place.

What do the laws state in this regard? The principal ones, such as GINA and the ADA, explicitly prohibit employment-related prejudice, especially discrimination based on insights into one’s disabilities or genetic information. The PPACA regulates the work of insurers in this respect. However, these and other regulations aim to mitigate the consequences of insights that are already in the public domain rather than prevent them from getting there. Additionally, in some cases it’s impossible to tell whether certain decisions are biased if the insights behind them are not that ‘critical’. To illustrate, an employer may implicitly base their decisions on the knowledge that an employee suffers from a sexually transmitted condition.

Seemingly harmless disclosures that can cause emotional trauma

The disclosure of one’s health circumstances may not entail financial losses, dismissal, or other tangible consequences. However, the invisible trauma caused by distress and the fear that details of one’s private life will come to light can be graver than objectively measurable damage. The regulations we’ve touched upon above are of little help in such cases.

When we deal with big data and healthcare, there is a high probability that some pieces of information considered private can still be accessed. To illustrate, the police might get access to DNA databases in order to trace a criminal. At that point, the data of those under suspicion will inevitably be interlinked with the data of their families, who never consented to these details being viewed by third parties.

Curiously, big data in healthcare not only records certain confirmed facts but also leads its custodians to a whole range of assumptions that logically stem from the known data. For instance, one can infer a woman’s pregnancy from her eating behaviour and purchase history, collected by app recommendation engines. Do such assumptions violate privacy? There is no definite answer to this question.

Overcoming privacy violation issues in big data and healthcare

When it comes to big data in healthcare, both excessively rigorous and overly vague privacy regulations can harm patients and become a stumbling block for innovation, as scientists lack free access to the massive arrays of facts and files needed for their research. The obvious way out is to rely on de-identified records. In any case, if we’re determined to come up with groundbreaking AI and ML solutions, we as developers must strike a balance, rightfully leveraging the available data without violating their holders’ rights.

Adam Mulligan, a psychology graduate from the University of Hertfordshire, has a keen interest in the fields of mental health, wellness, and lifestyle.

© Copyright 2014–2034 Psychreg Ltd