This talk on Privacy and Confidentiality in Digital Health was presented by Vanessa Halter – privacy and eHealth compliance advisor – at HealthTech Sydney's Security, Privacy & Patient Data for Digital Health event on 23 February 2016.

Security in healthcare systems

The headline below is, unfortunately, all too common. It tells the story of a healthcare organisation that was subject to a cyber attack: hackers encrypted the hospital's medical records and held them to ransom for $3.6m. The doctors at the hospital had to revert to the dark ages, using fax and paper.

Los Angeles hospital returns to faxes and paper charts after cyberattack

Source: The Guardian

This article is important for two reasons:

  1. Not only is health information an asset, but access to that information is an asset; and
  2. Society expects us to store and share health information electronically.

With the expectation that health data should be stored and shared electronically comes the expectation that it is kept secure.

This blog post will explore the aspects of security, privacy and the protection of personal information for healthcare products and services, specifically:

  • Privacy and confidentiality risks of collecting and sharing health information electronically
  • The business case for privacy
  • Embedding privacy into process and policy design
  • Implications of cross-border disclosure

Privacy & security

People often have a misconception that there is nothing more to privacy than ‘security’.

Privacy vs Security

Privacy and security are symbiotic in nature, but they're not one and the same. Privacy is about the right of individuals to keep information private and to share it only with the people they choose.

Privacy concerns personal information:

  • Collection, use and disclosure of personal information
  • Data quality
  • Access that the patient – or individual – has to the data

Security on the other hand, is about protecting all information:

  • Confidentiality
  • Integrity
  • Availability

Privacy and security intersect at the point where personal information needs protection.

Confidentiality

Healthcare providers have an ethical, professional and legal duty to respect patient rights to privacy and confidentiality regarding their health information.

Confidentiality is the obligation of healthcare providers to hold the information they receive in confidence; it stems from the Hippocratic Oath. This is important because your clients and your stakeholders carry this obligation.

The business case for ‘privacy’

There are four aspects to any business case addressing privacy.

(1) You have a legislative obligation to care about privacy:

  • Commonwealth privacy laws (The Privacy Act 1988) govern the private sector nationally, as well as State & Territory privacy laws
  • If you fail to meet privacy standards, and there is a serious or repeated breach of privacy, you may face a fine of up to $1.7m

(2) The integrity of the health system depends on strong privacy:

  • People would be reluctant to seek medical advice and attention if their information was disclosed to others without their permission. This is particularly important in cases where a patient has a condition with social stigmas attached (e.g. HIV, STIs)
  • A health system with strong privacy mechanisms will promote public confidence in health care services

(3) Reputational damage should privacy be breached:

  • Health information is considered 'sensitive' information under privacy legislation and is given a higher level of protection than other types of personal information.

(4) Privacy is an asset for any product or service:

  • Privacy compliance can be a point of difference between you and your competitors: a privacy-compliant product or service can give you a competitive advantage.


Privacy risks

You need to understand the privacy risks associated with collecting digital health information.

If health information is collected, used and shared, the risk is that it may be accessed without authorisation. For example:

  • Misuse of records by a staff member – people and process are an often overlooked, yet significant, internal threat.
  • Failure to store records containing personal information appropriately, or to dispose of them securely.
  • Loss or theft of computer equipment or portable storage devices containing personal information.
  • Mistaken release of records to someone other than the intended recipient.
  • Hacking or other illegal access of databases by someone outside the entity.

On the other hand, if it is not collected or shared, there is a risk that patients won’t be able to benefit from the data. For example:

  • Privacy controls are too restrictive to effectively access/share health information.
  • Not enough confidence in the product/service by end-users, therefore not used.

Privacy risks of digital health

Unauthorised access to health information

Unauthorised access to health information can have very real consequences. Take the recent example of an employee of AHPRA (the Australian Health Practitioner Regulation Agency) who used the agency's database to track down and assault a nurse. This wasn't the first breach of AHPRA's database and, according to this report from The Guardian, it seems unlikely to be the last.
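
One practical mitigation for this kind of insider misuse is to record every access to a patient record in an append-only audit trail and review it for unusual patterns. A minimal Python sketch, where the in-memory store, field names and review threshold are illustrative assumptions rather than a prescribed design:

```python
from datetime import datetime, timezone

# In-memory stand-in for an append-only audit store.
audit_log = []

def log_record_access(user_id, patient_id, reason):
    """Record who accessed which patient record, and why."""
    audit_log.append({
        "user": user_id,
        "patient": patient_id,
        "reason": reason,
        "at": datetime.now(timezone.utc).isoformat(),
    })

def users_with_unusual_access(threshold=20):
    """Return users who accessed more distinct patient records than expected."""
    seen = {}
    for event in audit_log:
        seen.setdefault(event["user"], set()).add(event["patient"])
    return [user for user, patients in seen.items() if len(patients) > threshold]
```

An audit trail doesn't prevent misuse on its own, but it deters it and makes breaches like the AHPRA example detectable.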

While this is an example of a malicious breach of private information, you shouldn't underestimate non-malicious threats. Common sense is not so common, as was evident with the recent hacking of the Ashley Madison website.

According to Quartz, the most common password used on www.ashleymadison.com was “123456”.

Top 100 passwords on Ashley Madison

Source: Quartz

Poor password choices are evidently an internal threat. The problem is compounded in the healthcare sector, where the users of software products are often time-poor and not very technically savvy.
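
A basic mitigation is to reject known-weak passwords at the point where they are set. A minimal sketch, assuming a small hard-coded blocklist (in practice you would load a published breached-password list):

```python
# A small sample; in practice load a published breached-password list.
COMMON_PASSWORDS = {"123456", "12345", "password", "123456789", "qwerty", "abc123"}

def is_acceptable_password(password, min_length=12):
    """Reject passwords that are too short or appear on a common-password list."""
    return len(password) >= min_length and password.lower() not in COMMON_PASSWORDS

print(is_acceptable_password("123456"))                        # False
print(is_acceptable_password("correct-horse-battery-staple"))  # True
```

For time-poor users, pairing a check like this with single sign-on or passphrase suggestions can work better than ever-stricter complexity rules.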

Inability to collect or share health information

In the health sector there is a missed clinical benefit when information is not collected, stored and shared. A patient can miss out on the right clinical care if the right information can't be accessed.

This can stem from privacy controls that are too restrictive, with designers of health systems often deferring to “BOPA” (Because Of the Privacy Act).

In other instances, users lack confidence in the system and as a result don’t use it.

See privacy as an opportunity, not a challenge

Suppose that you are building a product or service for the health sector – what will clients be looking for?

You’ve got their back:

  • Remember that they have a professional, legal and ethical duty to protect their patients' privacy and confidentiality. Clients in the health sector are often distrustful of new technology and reluctant to change. They want to know that you have their interests as a priority.

Use of your product or service does not result in any new or increased privacy risk:

  • Especially in relation to their professional indemnity providers. If a client decides to start using your product or service, their indemnity provider will need to be satisfied that there is no increased privacy risk.

Privacy is embedded in design:

  • Where privacy is considered at the inception of the product or service and remains at the core of its evolution.

Practical example: Privacy embedded in design

To better illustrate privacy embedded in design, consider a hypothetical smartphone app called ‘Bump & Me’.

Bump & Me app

The user of the Bump & Me app can:

  • Establish a profile including name, age and expected due date
  • Keep a Health Diary
  • Monitor their nutrition
  • Monitor their exercise with the ability to sync wearables for heart rate, daily steps
  • Share aggregated health data with connected healthcare providers (stored in ‘the cloud’)

The Bump & Me app can:

  • Use the aggregated health data and disclose the information to a research company (a possible data model for this is sketched below)
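
The two lists above imply a data model. A hypothetical Python sketch (names and types are assumptions for illustration): separating direct identifiers from health information from the outset makes later sharing and de-identification decisions much easier to implement.

```python
from dataclasses import dataclass, field

@dataclass
class Profile:
    # Direct identifiers: needed to run the service, never disclosed for research.
    name: str
    age: int
    due_date: str  # expected due date

@dataclass
class HealthRecord:
    # Health information: shared with connected healthcare providers, and only
    # in de-identified, aggregated form with any third party.
    diary_entries: list = field(default_factory=list)
    nutrition_log: list = field(default_factory=list)
    heart_rate_samples: list = field(default_factory=list)
    daily_steps: list = field(default_factory=list)
```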

Keep privacy in mind from the start

Perform a threshold assessment and/or a Privacy Impact Assessment (PIA).

The threshold assessment asks whether any ‘identifiable’ personal information is collected, stored, used or disclosed:

  • In the ‘Bump & Me’ example, the app does collect both personal and health information (a simple version of this check is sketched below)
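
In code terms, a threshold assessment asks whether any of the fields you handle fall into the identifiable or health categories. The field categories below are illustrative assumptions; a real threshold assessment is a structured questionnaire, not a lookup table:

```python
IDENTIFIABLE_FIELDS = {"name", "date_of_birth", "address", "email"}
HEALTH_FIELDS = {"health_diary", "nutrition", "heart_rate", "daily_steps"}

def threshold_assessment(collected_fields):
    """Return True if identifiable personal or health information is handled,
    i.e. a fuller Privacy Impact Assessment is warranted."""
    fields = set(collected_fields)
    return bool(fields & (IDENTIFIABLE_FIELDS | HEALTH_FIELDS))

# Bump & Me collects both identifiers and health information.
print(threshold_assessment({"name", "due_date", "heart_rate"}))  # True
```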

Also consider undertaking a Privacy Impact Assessment (PIA):

  • The downside is that if you undertake a PIA externally, it can be costly.
  • The upside is that it is an excellent asset to demonstrate that your product has been designed with privacy in mind.

Even if you don’t undertake the threshold assessment, or a PIA, you need to:

  1. Assess any privacy risks; and
  2. Consider mitigations

You should undertake a threshold assessment and PIA at the point where they can actually influence design and build. Don't leave it too late to consider privacy: a system rebuild could be extensive (and expensive), or you run the risk of releasing a non-compliant product or service into the market.

Map the personal information

The best way to assess the privacy risk is to begin by mapping the information flows for your product or service. For example, how is the information collected, how is it used, how is it disclosed, is there cross-border disclosure?

Map personal information flows
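
An information-flow map can be as simple as a table of what moves, from where, to whom, and why. A sketch for the hypothetical ‘Bump & Me’ flows (the flows and field names are assumptions based on the feature list above):

```python
from dataclasses import dataclass

@dataclass
class InformationFlow:
    data: str           # what personal information moves
    source: str         # where it is collected from
    destination: str    # who receives it
    purpose: str        # why it is used or disclosed
    cross_border: bool  # does it leave Australia?

flows = [
    InformationFlow("profile", "user", "app database (cloud)", "provide the service", False),
    InformationFlow("health diary", "app database", "connected healthcare providers", "clinical care", False),
    InformationFlow("aggregated health data", "app database", "research company", "research", True),
]

# Flows to third parties or offshore deserve the closest scrutiny.
for f in flows:
    if f.cross_border or f.destination == "research company":
        print(f"Review: {f.data} -> {f.destination} ({f.purpose})")
```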

Disclosure

Let’s consider the privacy risks associated with ‘Disclosure’ for our hypothetical smartphone app called “Bump & Me”.

Privacy impact:

  • Bump & Me discloses personal information to an external research company

Privacy risks:

  • Information cannot be disclosed for a secondary purpose (e.g. to a research company) without the person's consent
  • A user is unlikely to expect that their personal information will be disclosed to a third party

Privacy mitigations:

  • If disclosing data to a research company, share only de-identified data (see the sketch below)
  • Have a privacy notice that makes it clear that only de-identified data will be shared with third parties
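
As a first cut, de-identification means stripping direct identifiers before a record leaves the service. A minimal sketch; the identifier list is an illustrative assumption, and real de-identification must also consider re-identification risk from quasi-identifiers (e.g. postcode plus age):

```python
# Illustrative list of direct identifiers for the Bump & Me example.
DIRECT_IDENTIFIERS = {"name", "date_of_birth", "email", "address", "due_date"}

def de_identify(record):
    """Strip direct identifiers from a record before it is disclosed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

record = {"name": "Jane", "due_date": "2016-09-01", "daily_steps": 7421}
print(de_identify(record))  # {'daily_steps': 7421}
```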

Cross-border disclosure

When disclosing personal information overseas, you have to meet certain obligations around cross-border disclosure.

What is cross-border disclosure?

  • Where the information is transferred offshore; or
  • Where an overseas entity can access information stored in Australia

An example of cross-border disclosure could be:

  • You employ a contractor based in London who is able to access personal information/patient records for the purpose of providing remote IT support

Disclosure would be unlikely to apply in the case of:

  • Personal information received by a cloud provider with servers based in Washington, USA, for the purpose of storage services only, provided that there is a contract in place restricting the access that the cloud provider has to the personal information


Obligations & liabilities

You have legal obligations & liabilities when disclosing personal information to overseas recipients. These are:

(1) Where you disclose personal information overseas you must take steps to ensure the overseas recipient does not breach the Australian Privacy Principles (APPs)

  • Exceptions apply where there is:
    • An equivalent law. The overseas recipient is subject to a similar law or binding scheme with protections substantially similar to the APPs; or
    • Consent from the individual. The individual consents after being expressly informed that their information will be sent overseas

(2) Where the overseas recipient breaches the APPs, the Australian organisation will be held liable for the breach, with penalties of up to $1.7m

Your obligations remain the same, regardless of your size:

  • Even if you are a small startup, you have the same obligations to protect health information as everyone else
  • Regardless of your size, ensure that your contractors protect your clients' information (the decision logic above is summarised in the sketch below)
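
A highly simplified sketch of how these obligations might be encoded as a pre-disclosure check. The parameter names are assumptions, and the logic is only a summary of the points above, not legal advice:

```python
def may_disclose_overseas(equivalent_law, express_consent, reasonable_steps_taken):
    """Gate an overseas disclosure along the lines of APP 8.

    An exception (substantially similar law, or express informed consent)
    allows the disclosure; otherwise the discloser must have taken
    reasonable steps (e.g. contractual safeguards) and remains liable
    for any breach by the overseas recipient.
    """
    if equivalent_law or express_consent:
        return True
    return reasonable_steps_taken

# London contractor with remote access, no exception, but safeguards in place:
print(may_disclose_overseas(False, False, True))  # True (liability retained)
```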


Privacy as a strategic asset

By being open with your clients about the privacy controls that you have built into your product or service, you have a better chance at building their trust in your offering.

Conclusion

Privacy is an essential part of collecting, storing and sharing patient health information. Strong privacy protections promote confidence in healthcare services and ensure information can be shared to benefit patients.

Innovative service and product solutions entering the market need to be aware of the ethical, professional and legal obligations in the healthcare sector to keep health information private. These solutions should consider privacy as a strategic advantage, an asset to build trust in an offering.


Speaker profile


Vanessa Halter is an advisor on privacy and eHealth compliance. She currently works with NEHTA (the National eHealth Transition Authority) and has been involved in the design and implementation of a number of national eHealth products.

Vanessa provides advice to stakeholders in the healthcare industry in relation to privacy, digital health legislation and policy. She also identifies issues and solutions to manage medico-legal impacts on eHealth uptake. In her role at NEHTA, she is responsible for delivering NEHTA’s corporate privacy programme, which includes compliance, training and awareness.

Vanessa became interested in health IT and privacy when working in corporate law, and specialised in advising Australian and international organisations on best privacy practice. She also has regulatory experience having undertaken privacy investigations for the Australian Privacy Commissioner.
