1
Introduction to Health Informatics:
Health Informatics in Low- and Middle-Income Countries
Short Course for Health Information System Professionals
Introduction to Health Informatics: Privacy, Security, and Confidentiality
2
Privacy, Security, and Confidentiality
The objectives for Privacy, Security, and Confidentiality are to:
Define and discern the differences between privacy, security, and confidentiality.
Discuss methods for using information technology to protect privacy and confidentiality.
Describe and apply security, privacy, and confidentiality under the tenets of the U.S. Health Insurance Portability and Accountability Act (HIPAA) Privacy and Security Rules.
Discuss the intersection of a patient’s right to privacy with the need to share and exchange patient information.
3
Principles of Ethics Respect for persons Beneficence Justice
This unit defines important terms related to privacy, security, and confidentiality. It discusses reasons for concerns about privacy and security in the context of health information. Tools for protecting health information are examined, followed by a discussion of the Health Insurance Portability and Accountability Act, or HIPAA [hip-uh] regulations and what additions have been made in the HITECH [high-tek] (Health Information Technology for Economic and Clinical Health Act) legislation.
4
Respect for Persons Autonomy and self-determination Informed consent
Protection of vulnerable populations
5
Beneficence Weigh risks versus benefits
Physical, mental, and social well-being Risk reduction (to a minimum) Protection of participants
6
Justice Equal distribution of risk and benefits
Equitable recruitment of participants Protection of vulnerable groups
7
Privacy, Security, and Confidentiality
Definitions
Concerns about privacy and security
Tools for protecting health information
Health Insurance Portability and Accountability Act (HIPAA): Privacy Rule and Security Rule
Enhancements in the Health Information Technology for Economic and Clinical Health (HITECH) Act legislation
Implications
8
Definitions Privacy The right to be left alone
Privacy is the individual’s right to keep information to himself or herself: the right to be left alone, the right to keep personal information secret, and, in essence, the right to control personal information.

Confidentiality, by contrast, is the individual’s right to keep information about himself or herself from being disclosed to other people. When a patient vests confidentiality in a physician and a health care system, the expectation is that personal information is kept confidential and not disclosed to others; data is shared or disseminated only to those with a “need to know.”

Security is the activity of protecting personal information. It consists of mechanisms to ensure the safety of data and the systems in which the data reside. Security should address the physical security of buildings, equipment, and storage media, as well as the data and informational assets retained by all health care organizations.
9
Individually identifiable health information, or IIHI [eye-eye-H-eye], is any data that can be correlated with an individual, for example, information in a medical record or a database that can be linked to a specific patient. A closely related term is protected health information, or PHI. The HIPAA Privacy Rule defines individually identifiable health information as a subset of health information, including demographic and other health information related to the past, present, or future physical or mental health or condition of an individual, that is created or received by a health care provider, health plan, employer, or health care clearinghouse. Finally, consent is a broad term, but it is defined here in the context of privacy: when consent is given to the health care provider organization and/or physician, it entails written or verbal permission to allow use of individually identifiable health information for the activity of providing health care or for participation in a research project or related activity.
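The distinction between identifiable and de-identified data can be made concrete with a short sketch. The record fields and identifier list below are hypothetical; HIPAA's Safe Harbor method actually enumerates eighteen categories of identifiers that must be removed, not the handful shown here:

```python
# Toy sketch of de-identification: strip direct identifiers from a record.
# Field names are illustrative only, not the full HIPAA Safe Harbor list.

DIRECT_IDENTIFIERS = {"name", "address", "phone", "email", "ssn", "mrn"}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

patient = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "diagnosis": "hypertension",
    "year_of_birth": 1970,
}

print(deidentify(patient))  # identifiers gone; clinical facts remain
```

Note that, as discussed later in this lecture, stripping direct identifiers alone does not guarantee anonymity; quasi-identifiers can still be linked back to an individual.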
10
Concerns about Privacy
The remainder of this lecture focuses on concerns about privacy and security, beginning with the notion of personal privacy versus the common good. The discussion then turns to disclosures of personal health information and the concerns the public has about the privacy of health information. Finally, the lecture closes with a few comments about de-identified data.
11
Digital Health and Privacy, Security, and Confidentiality
12
Personal Privacy versus the Common Good
Consider the notion of personal privacy versus the common good. Some of the concerns are well demonstrated in a video produced in 2011 by the American Civil Liberties Union. In this video, a pizza restaurant has access to customers’ medical information and penalizes them for things like ordering extra cheese when their cholesterol levels are shown to be high. The video is worth watching even though it takes a very specific point of view. There is a broad spectrum of views on personal privacy versus the common good, often reflecting underlying political beliefs. At one end of the spectrum is the view that although personal privacy is important, there are some instances when the common good of society outweighs personal privacy. An example that is often given is biosurveillance [buy-oh-sur-vay-lehns], whether it is monitoring emerging natural diseases or threats like bioterrorism; early intervention and response are possible with more information. Another example is clinical research: when more clinical research is conducted, the ability to provide quality health care increases. The other end of the spectrum holds that personal privacy trumps everything, and that there should be no reason to violate a person’s privacy without explicit consent. Others have called for a more balanced approach between personal privacy and the common good. For more information on this topic, good articulations can be found in documents from the California Health Care Foundation, an editorial by Dr. Don Detmer, and a policy paper from the American College of Physicians. As with many ethical issues, there are no explicitly right or wrong answers, and each individual has to decide where their views fall on the spectrum; however, the U.S. political process, not the individual, will more than likely determine how personal privacy and the common good are balanced in health care.
Personal Privacy versus the Common Good
Concerns expressed in the American Civil Liberties Union video Scary Pizza
A spectrum of views:
One end holds that although personal privacy is important, there are some instances when the common good of society outweighs it, such as in biosurveillance (Gostin & Hodge, 2002; Hodge, Gostin, & Jacobson, 1999).
The other end holds that personal privacy trumps all other concerns (Privacy Rights Clearinghouse, 2009; see also the video by Deborah Peel, Designing Technology to Restore Privacy).
More “balanced” views? (California Health Care Foundation, 2008; Detmer, 2010; American College of Physicians, 2011)
Where do your views fit?
13
Patient Information Disclosures
It is important to know how patient information disclosures occur and how to prevent them. Disclosures occur for a variety of reasons, including mobile devices or data storage media that are lost or stolen, as well as cybersecurity attacks on an organization’s technology infrastructure. Not all cybersecurity attacks result in patient information disclosure, but any threat of an attack or breach places the organization at high risk, and hackers may not reveal they have stolen information until long after the event. Health care providers are a prime target for cyberattacks, due in part to the value of PHI on the black market. Anyone can be subject to a breach, including health care providers, vendors, insurance companies, patients, and consumers. The increasing use of mobile devices such as smartphones, tablets, and laptops poses unique issues in protecting both physical and data assets. Any device that connects to a network is vulnerable, including medical devices; even implantable devices such as pacemakers are vulnerable to hackers. This slide provides just a sampling of the many types of events that can result in disclosure of PHI. These examples range from 2005 to 2016, which demonstrates that this is not a recent issue. One particularly egregious [ih-gree-juhs] story happened in Portland, Oregon, on New Year’s Eve. An individual left in his car disks, backup tapes, and other media that contained records of about 365,000 patients who were seen by a visiting nurse association. This indiscretion naturally received a lot of press and demonstrated the need to be careful when managing devices with PHI. This type of event has continued to occur over the years regardless of the amount of press. The Veterans Administration system has had a number of episodes, probably the largest of which was when a laptop with the data of over a million veterans was stolen.
The laptop was recovered, and it appeared that the data was not accessed, but of course, no one knows exactly what went on with the machine while it was in the hands of those who stole it. Improper disclosure of research participants’ PHI resulted in a HIPAA settlement. Anthem, a large insurance payer organization, was hacked, exposing over 80 million customers’ PHI. Over the past several years, many health care providers have had their clinical and operational software systems and networks frozen until some type of “ransom” was paid. These events do not necessarily expose PHI, but they demonstrate the organization’s vulnerability and place its PHI at high risk. Again, these are only a few of the many examples of breaches, attacks, and loss of PHI affecting health care organizations, providers, and their patients.
Patient Information Disclosures
Health care cybersecurity attacks have increased 125 percent over the past five years as the industry has become an easy target; PHI is 50 times more valuable than financial information on the black market (Kutscher, 2016).
In Portland, Oregon, thieves broke into a car with backup disks and tapes containing records of 365,000 patients (Rojas-Burke, 2006).
The Veterans Administration has had several episodes, including a laptop with data of more than 1 million veterans that was recovered without apparent access (Lee & Goldfarb, 2006).
A hack of Indianapolis-based payer Anthem’s information technology (IT) systems exposed the personal data of approximately 80 million customers (Perma, 2015).
Improper disclosure of research participants’ PHI resulted in a $3.9 million HIPAA settlement (U.S. Department of Health and Human Services [HHS], 2016b).
A hospital paid hackers $17,000 to unlock electronic health records frozen in a “ransomware” attack (Conn, 2016).
14
Patient Information Disclosures
Two websites are devoted to documentation of disclosures. The Privacy Rights Clearinghouse provides a searchable Chronology of Data Breaches; the data includes medical breaches but is not limited to them. The Department of Health and Human Services (HHS [aych-aych-ess]) is required under the HITECH Act to post a list of breaches of unsecured PHI affecting 500 or more individuals, called by some its “wall of shame” (HHS, 2016a). This website contains a running list of all reported breaches. The top ten data breaches in 2015 accounted for just more than 111 million records, with the top six breaches each affecting at least 1 million individuals.
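The HITECH 500-individual public-reporting threshold can be sketched as a simple filter; the breach records below are fictional and invented only for illustration:

```python
# Sketch: select breaches that meet the HITECH public-reporting threshold
# (unsecured PHI affecting 500 or more individuals). All data is fictional.

REPORTING_THRESHOLD = 500

breaches = [
    {"entity": "Clinic A", "individuals_affected": 120},
    {"entity": "Hospital B", "individuals_affected": 78_000},
    {"entity": "Payer C", "individuals_affected": 500},
]

def reportable(breach_list):
    """Return breaches large enough to appear on the public list."""
    return [b for b in breach_list
            if b["individuals_affected"] >= REPORTING_THRESHOLD]

for b in reportable(breaches):
    print(b["entity"], b["individuals_affected"])
```

Note that breaches below the threshold must still be reported to HHS, just not on the public list; the threshold only governs public posting.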
15
Breaches Adversely Impact Organizations
The Ponemon [pone-eh-mon] Institute publishes an annual report on the impact of security breaches on health care organizations. The 2015 report estimated that data breaches may [quote] “be costing the industry six billion dollars. More than 90 percent of health care organizations represented in this study had a data breach, and 40 percent had more than five data breaches over the past two years.” [end quote] (Ponemon Institute 2015). According to the study, the average cost of a data breach for health care organizations is estimated to be more than $2.1 million. A significant part of the cost was “lost business” by the organization. For the first time, criminal attacks were the number one cause of data breaches in health care in 2015, according to the study. Criminal attacks on health care organizations are up one hundred twenty-five percent compared to five years ago. In fact, forty-five percent of health care organizations say the root cause of the data breach was a criminal attack, and twelve percent say it was due to a malicious insider. Half of all organizations indicated that they have little or no confidence in their ability to detect all patient data loss or theft. The HIMSS 2015 Security Survey identified the sources for breaches as the following: foreign sources, hacktivist, nation-state actor, malicious insider, hacker, social engineering, and online scam artist. Interestingly, this threat has not impacted the security budgets for health care providers. The HIMSS report identified that health care providers spend on average less than six percent of their IT budget for security expenditures even though security is a top business priority. In contrast, the federal government spends sixteen percent of its IT budget on security, while financial and banking institutions spend twelve to fifteen percent. Defenses are not keeping pace with the volume of attacks and the new trends and methods of threats and breaches. 
Breaches Adversely Impact Organizations
Costs (Ponemon Institute, 2015)
Estimated cost to the industry: $6 billion
Average cost per breach: $2.1 million
Significant part of cost: “lost business”
Criminal attacks up 125 percent in health care organizations, now the number one cause of breaches
Attack sources (HIMSS, 2015): foreign sources, hacktivist, nation-state actor, malicious insider, hacker, social engineering, online scam artist
Security budgets (HIMSS, 2015)
Health care providers average less than 6 percent of the IT budget for security
The federal government spends 16 percent of its IT budget on security, while financial and banking institutions spend 12 to 15 percent
Information security is a business priority
16
Challenges from Proliferation of Technologies and Applications
Security challenges multiply with the proliferation of health IT technologies and software applications. For example, there is ever-growing use of electronic data in clinical workflows and use of technology by all health care providers. Likewise, health information exchange (HIE) and data-sharing activities across multiple networks, along with cloud computing, greatly expand the required perimeter of data protection. Financial constraints often result in shrinking technology budgets, which presents another point of potential vulnerability for the health care organization because it becomes more difficult to monitor and quickly respond to threats. Patient and family engagement activities are increasing as patients and families become more involved in their care using their own devices and applications. There are also new models of health care, such as accountable care organizations (ACOs, ay-see-ohs) and care transitions (care across the patient care continuum), that require more members of a care team to access information. Clinicians also want to use their own devices, such as personal laptops, tablets, and smartphones. This increases use of cellular and other wireless networks, which may be vulnerable if not properly encrypted, threatening the security of medical devices and implantable devices.
Challenges from Proliferation of Technologies and Applications
Increased technology use by all care providers
Health information exchange and data-sharing activities across multiple networks
Cloud computing and third-party outsourcing
Increased use by patients, families, and consumers of their own devices (tablets, smartphones, etc.)
New models of care require more care providers to access data across the patient care continuum
Clinicians using their own devices: personal laptops, tablet devices, smartphones, and so on
Connected medical devices and implantable devices
17
Some Technologies Can Worsen the Problem
And, of course, technology itself can worsen the problem. A widely cited study by Wright looked at the USB drives (sometimes called thumb drives) commonly plugged into computers. These drives run a program that enables their use when they are plugged in, and that program can be modified to extract data from the computer. So if that computer has personal health information on it, the thumb drive can basically copy it from the computer. Some personal health record systems and other consumer-targeted health applications may or may not have encryption functionality and could be easily compromised. Another interesting analysis found that ten percent of hard drives sold by second-hand retailers in Canada had remnants of personal health information on them. Often, when computers are disposed of, the hard drives are not completely wiped clean, potentially providing access to personal information for the next user if that user knows how to extract it. This applies to both patient and consumer mobile devices and computers, as well as equipment owned by health care organizations. Also of note is that PHI can be discovered by files available from peer-to-peer (P2P, pee-two-pee) file-sharing networks. One analysis found that half of one percent of all IP addresses on the Internet in the United States have discoverable PHI. Finally, another technology that can store PHI is the digital photocopier, which stores all copies on an internal hard disk. If this information is compromised, PHI can potentially be leaked. Fax machines and scanners may also store data that can include PHI. A rule of thumb is to restrict physical access when possible and always encrypt. Physical access includes access to hardware devices but also the physical area where computers, servers, and network equipment are housed. 
Some Technologies Can Worsen the Problem
USB (“thumb”) drives run programs when plugged into a USB port; they can be modified to extract data from the computer (Wright & Sittig, 2007a, 2007b).
Personal health records and other systems may lack encryption and can be easily compromised.
Ten percent of hard drives sold by a second-hand retailer in Canada had remnants of PHI (El Emam, Neri, & Jonker, 2007).
Peer-to-peer file sharing: 0.5 percent of all U.S. IP addresses have discoverable PHI (El Emam et al., 2010).
Digital photocopiers store all copies made (Keteyian, 2010); scanners may also store copies.
Restrict physical access where possible and always encrypt!
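The "always encrypt" rule of thumb means that data recovered from a lost thumb drive or a discarded photocopier disk is unreadable without the key. The XOR one-time-pad sketch below is a toy that illustrates only that property; real systems must use a vetted library (for example, the third-party Python cryptography package), never a hand-rolled cipher:

```python
import secrets

# Toy illustration only: a one-time-pad-style XOR shows why encrypted media
# leak nothing without the key. Do NOT hand-roll ciphers in real systems.

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the corresponding byte of key."""
    return bytes(d ^ k for d, k in zip(data, key))

phi = b"MRN 004521: cholesterol 280 mg/dL"   # fictional record
key = secrets.token_bytes(len(phi))          # random key as long as the message

ciphertext = xor_bytes(phi, key)        # what a thief on a lost drive would see
recovered = xor_bytes(ciphertext, key)  # readable only with the key
```

Only the ciphertext lives on the portable medium; the key is stored and managed separately, which is exactly the discipline that full-disk encryption products automate.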
18
Healthcare Organizations Are Not Well Prepared for Security
There are many challenges facing health care organizations in establishing and maintaining proper security measures. These are just a few: Security budgets are not keeping pace with complex technology environments and the growing risk of attacks, which limits the ability of health care organizations to implement proper security measures. The significant increase in threats and the growing sophistication of attacks have created a situation in which providers cannot maintain an adequate defensive front. There is a need for more innovative, advanced security tools and in-depth approaches to keep pace with security threats and vulnerabilities. There is not enough qualified, skilled security expertise: slightly more than half (fifty-three percent) of organizations have personnel with the technical expertise needed to identify and resolve data breaches involving the unauthorized access, loss, or theft of patient data. Paper remains an issue; fifty-four percent of respondents indicate security incidents occurred involving paper documents, with most involving fewer than one hundred PHI records. Data leakage is a primary threat, and identity and access management is a top priority. Not all organizations have, or can afford, a full-time chief information security officer (CISO). This role is found primarily in larger organizations, while smaller organizations may fold these duties into another position.
Healthcare Organizations Are Not Well Prepared for Security
Security budgets are not keeping pace with complex IT environments and the growing risk of attacks (HIMSS, 2015).
The growing sophistication of attacks and the number of threats make it hard to keep up.
There is a need for innovative, advanced security tools and in-depth approaches to address threats and vulnerabilities (HIMSS, 2015).
There is not enough highly skilled security expertise (Ponemon Institute, 2015).
Paper security breaches remain an issue (Ponemon Institute, 2015).
Data leakage is a primary threat. Identity and access management is a top priority. Chief information security officers are not in all organizations.
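Identity and access management puts the "need to know" principle from the definitions into practice: each role is granted only the actions it requires. The roles and permission names below are hypothetical, not drawn from any standard:

```python
# Minimal sketch of role-based, "need to know" access control.
# Role and permission names are invented for illustration.

PERMISSIONS = {
    "physician":     {"read_chart", "write_chart"},
    "billing_clerk": {"read_billing"},
    "researcher":    {"read_deidentified"},
}

def can_access(role: str, action: str) -> bool:
    """Allow an action only if the role was explicitly granted it."""
    return action in PERMISSIONS.get(role, set())

print(can_access("physician", "read_chart"))      # clinical need to know
print(can_access("billing_clerk", "read_chart"))  # denied: no clinical need
```

The default-deny behavior (an unknown role gets an empty permission set) is the design point: access must be granted explicitly, never assumed.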
19
What is the role of your government?
One question to ask is, “What is the role of government in protecting privacy and confidentiality?” This discussion begins by looking at the United States and then moves to other countries. There are many state- and federal-level activities focused on privacy and security of PHI and data sharing. Under HITECH, two committees were formed: an HIT Policy Committee and an HIT Standards Committee. Two groups under these committees focus on privacy and security. The Privacy and Security Workgroup, working under the HIT Policy Committee, provides input and makes recommendations on policy issues and opportunities to ensure that electronic data captured and exchanged is protected and shared only by means consistent with consumer needs and expectations. The API Task Force, working under the Health IT Joint Committee Collaboration, identifies both perceived and real privacy and security concerns that are barriers to the widespread adoption of open APIs in health care. ONC publishes a flow chart depicting its Federal Advisory Committee process for developing these recommendations. Several previous federally supported groups have produced many publications that focus on privacy and security. These include HISPC, which focused on multi-state harmonization, and the Privacy and Security Tiger Team, which developed policies and recommendations. Publications from these groups and others are available online. The National Committee on Vital and Health Statistics, or NCVHS [N-C-V-H-S], has weighed in over the years on a number of privacy and security issues with various publications and recommendations for policies concerning health privacy. Another example is the 2015 Precision Medicine Initiative, which provided ONC $5 million to support development of interoperability standards and requirements addressing privacy and enabling secure data exchange.
What is the role of your government?
20
European Union General Data Protection Regulation
The General Data Protection Regulation (GDPR) imposes rules on organizations that offer goods and services to people in the European Union (EU), or that collect and analyze data tied to EU residents, no matter where the organizations are located. Enforcement began on May 25, 2018.
21
GDPR Data Definitions Regardless of Nationality or EU Residence
Personal Data (from GDPR): “…means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.”
Examples:
Name
Identification number (e.g., Social Security number)
Location data (e.g., home address)
Online identifier (e.g., email address, screen names, IP address, device IDs)
Genetic data (e.g., biological samples from an individual)
Biometric data (e.g., fingerprints, facial recognition)
“The GDPR also requires compliance from non-EU organizations that offer goods or services to EU residents or monitor the behavior of EU residents.”
Source: Brief: You Need An Action Plan For The GDPR; Forrester Research; October 2016
22
GDPR Basics
Provide notification to data subjects in clear and plain language.
Request and obtain the data subject’s affirmative and granular consent.
Discontinue processing activities if the data subject denies consent.
Provide a mechanism for data subjects to withdraw consent.
For children under the age of 16, obtain affirmative consent from a parent or guardian.
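The consent points above can be sketched as a simple validity check. The field names are invented for illustration and this is in no way a legal compliance mechanism; note also that GDPR sets 16 as the default age of consent but lets member states lower it to 13:

```python
# Sketch of GDPR consent rules as a data check. Field names are hypothetical;
# this illustrates the logic only and is not a compliance mechanism.

AGE_OF_CONSENT = 16  # GDPR default; member states may lower it to 13

def consent_is_valid(subject: dict) -> bool:
    if subject["withdrawn"]:
        return False                        # processing must stop on withdrawal
    if subject["age"] < AGE_OF_CONSENT:
        return subject["parental_consent"]  # child: parent/guardian must consent
    return subject["affirmative_consent"]   # adult: affirmative opt-in required

adult = {"age": 34, "withdrawn": False,
         "affirmative_consent": True, "parental_consent": False}
child = {"age": 12, "withdrawn": False,
         "affirmative_consent": True, "parental_consent": False}

print(consent_is_valid(adult))  # adult opted in
print(consent_is_valid(child))  # child without parental consent
```

The check is deliberately conservative: withdrawal overrides everything, and a child's own opt-in does not count without a parent or guardian.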
23
What do other governments do?
The United States is not the only government that has been addressing privacy and security. In January 2012, the European Commission proposed a comprehensive reform of data protection rules in the European Union (EU), with the objective of giving citizens back control over their personal data and simplifying the regulatory environment for business. The data protection reform is a key enabler of the Digital Single Market. Under EU law, personal data can be gathered legally only under strict conditions, for a legitimate purpose. Furthermore, persons or organizations that collect and manage personal information must protect it from misuse and must respect certain rights of the data owners, which are guaranteed by EU law. The belief driving this initiative is that everyone has the right to the protection of personal data. In February 2016, the European Commission finalized the [quote] “reform of EU data protection rules, which apply to all companies providing services on the EU market. The Commission negotiated the EU–U.S. Umbrella Agreement ensuring high data protection standards for data transfers across the Atlantic for law enforcement purposes. The Commission achieved a renewed sound framework for commercial data exchange: the EU–U.S. Privacy Shield.” [end quote] The United States will publish written commitments in the U.S. Federal Register and assurance on the safeguards and limitations concerning public authorities’ access to data. Previously, the European Commission devoted efforts to the protection of individual privacy: Directive 95/46/EC, adopted in 1995, provided a set of fairly stringent rules that essentially allow data processing only with consent or in some highly specific circumstances, such as a legal obligation or what is defined as a public necessity, usually revolving around public health. These are just examples of what other governments are doing around privacy and security.
What do other governments do?
24
Related Issues for Medical Privacy
There are a number of related issues for medical privacy. Ownership of health information is complex and varies from state to state. Additionally, court decisions may relate to health information in one state but not apply to health information from other states. In general, the person or organization that holds the record is considered to be the owner of the information. Once health information has been given to a provider or organization, patients have rights to access and copy their health information but are not considered to own the information. A comparison of state policies on medical record ownership is available online. For example, if an office practice or hospital had paper charts, and had bought and owned the paper the charts were printed on, it was presumed that the practice or hospital owned the information on that paper. However, in the electronic era, information moves freely across networks from one system to another, and ownership of that information becomes less clear. As the amount of information increases, there is increased economic value for health care providers, pharmaceutical companies, insurance companies, research institutions, and others who may want to use that data for various purposes. The article by Rodwin, in particular, argues that when an economic advantage is gained by the use of that information, at least some of that gain should be shared back with the patient. Another concern is compelled disclosure of information: even though laws and regulations may highly protect information, individuals may sometimes be compelled to disclose information for reasons other than clinical care in the health care setting. Employers, insurance companies, and even government agencies sometimes require people to sign authorizations releasing their health information for various purposes. Health care providers need to be cautious about requiring individuals to disclose information that is not really being used for health-related activities.
Another growing issue concerns the human genome [jee-nohm], which may be a person’s ultimate personal identifier. A person’s genome is what makes each person an individual; individual genes and their variation from others’ genes are unequivocally unique to each person. Health information can be de-identified, but with genomic information, individuals may be easily identifiable. Access to genomic [ji-noh-mik] information manifests itself in a number of ways. For example, a person’s genome can be identified from the genomic information of his or her siblings. There are a growing number of genome-wide association studies that attempt to associate variation in an individual’s genome with different diseases. There is actually a requirement for researchers to put this data in public databanks, although usually the individual personal information is protected and is available only to the researchers who can legitimately access it. It is not too difficult to identify an individual from genomic data, so as research moves forward with genomics and personalized medicine, more privacy issues will come to the fore. The Genetic Information Nondiscrimination Act of 2008 was enacted as a mechanism to ensure that genetic information is not used to discriminate against an individual in health insurance and employment settings.
Related Issues for Medical Privacy
Who “owns” medical information?
Complex; varies from state to state May have economic value, which should be shared back to the patient (Hall & Shulman, 2009; Rodwin, 2009) “Compelled” disclosures (Rothstein & Talbott, 2006) We are often compelled to disclose information for nonclinical care reasons Ultimate “personal identifier” may be individual’s genome (McGuire & Gibbs, 2006) Even “de-identified” data may compromise privacy (Malin & Sweeney, 2005) Genome of family members can identify siblings (Cassa et al., 2008) Data from genome-wide association studies can reveal individual-level information (Lumley & Rice, 2010)
25
Health Information Rights
A number of organizations have tried to define health information rights. One example is the Declaration of Health Data Rights, which comes from a group of mostly personal health record (PHR [pee-aych-are]) vendors. This group advocates that all individuals should have the right to their own health data. They should also have the right to know the source of each health data element. In addition, individuals should have the right to take possession of a complete copy of their individual health data, without delay, at minimal or no cost. If data exist in computable form, they must be made available in that form. Finally, individuals should have the right to share their health data with others as they see fit. The American Health Information Management Association (AHIMA, a-hee-mah) also has a Health Information Bill of Rights that is slightly more detailed but has similar provisions. The Health Insurance Portability and Accountability Act of 1996 (HIPAA) Privacy Rule includes provisions for patients’ health information privacy rights. These include the right to access health information, the right to an accounting of disclosures of health information, the right to correct or amend health information, the right to notice of privacy practices, and the right to file a complaint. 
More information can be found online; the Privacy Rights Clearinghouse also provides guidance on HIPAA’s Privacy Rule. Health Information Rights Declaration of Health Data Rights Right to our own health data Right to know the source of each health data element Right to take possession of a complete copy of our individual health data, without delay, at minimal or no cost; if data exist in computable form, they must be made available in that form Right to share our health data with others as we see fit American Health Information Management Association (AHIMA) Health Information Bill of Rights (2009) Slightly more detailed but with similar provisions HIPAA Privacy Rule provisions (2016)
26
Are “de-identified” data more secure?
When data are referred to as de-identified, it means that personally identifying characteristics, such as name or address, or other fields that make up personal health information have been removed. Is de-identification secure? It may not always be as secure as intended. Latanya Sweeney brought this to light and has received notice in the popular press. When she was completing her PhD at MIT, she did a widely cited study that identified William Weld, the governor of Massachusetts at the time, through information found by linking to publicly available data sources. Her research also showed that eighty-seven percent of the U.S. population could be uniquely identified by their five-digit ZIP code, gender, and date of birth. So when relatively common data elements are combined, individuals may be easily re-identified. In the case of William Weld, Sweeney was able to access a health insurance database for state employees, and Governor Weld was obviously a state employee. Sweeney also was able to purchase the voter registration list for the city of Cambridge, Massachusetts, where the governor lived. She then combined these two databases, linking on ZIP code, gender, and date of birth, and was able to identify the governor, as will be demonstrated further in the next slide. Just as genomic data generated in clinical research studies may make individuals identifiable, some recent research has shown how individuals’ Social Security numbers can be predicted from public data, because so many data sets contain Social Security numbers. Are “de-identified” data more secure? Not necessarily 87 percent of the U.S. population can be uniquely identified by five-digit ZIP code, gender, and date of birth (Sweeney, 2002). 
Sweeney identified William Weld, governor of Massachusetts, in a health insurance database for state employees by purchasing voter registration for Cambridge, Massachusetts, for $20 and linking ZIP code, gender, and date of birth to the “de‑identified” medical database (Sweeney, 1997). Genomic data can aid in re-identification in clinical research studies (Malin & Sweeney, 2005; Lumley & Rice, 2010). Social Security numbers can be predicted from public data (Acquisti & Gross, 2009).
27
How Governor Weld Was Identified
This slide demonstrates how Governor Weld was identified. On the left is the so-called de-identified state employee health database, which included state employees’ ethnicity, visits to health care providers, diagnoses, procedures, medications, and charges. It also contained ZIP codes, dates of birth, and gender. The Cambridge voter registration database included name, address, registered party affiliation, and the same ZIP codes, dates of birth, and gender. Governor Weld was one of those eighty-seven percent who had a unique combination of ZIP code, date of birth, and gender. Sweeney took Weld’s voter registration information and then accessed his entire medical record. How Governor Weld Was Identified Figure: Two overlapping sets of fields, the health database (ethnicity, visit date, diagnosis, procedure, medication, charge) and the voter list (name, address, date registered, party affiliation, date last voted), linked by the shared ZIP code, date of birth, and gender, enabled identification of the governor. (Adapted from Sweeney, 1997)
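The linkage Sweeney performed can be sketched in a few lines of Python. The records below are entirely made up (names, dates, ZIP codes, and diagnoses are hypothetical), and a real attack would join far larger files, but the join on the quasi-identifier triple works the same way:

```python
# Hypothetical "de-identified" claims data: no names, but quasi-identifiers remain.
health_db = [
    {"zip": "02138", "dob": "1945-07-31", "sex": "M", "diagnosis": "hypertension"},
    {"zip": "02139", "dob": "1962-03-14", "sex": "F", "diagnosis": "asthma"},
]

# Hypothetical public voter roll: names alongside the same three fields.
voter_db = [
    {"name": "J. Doe", "zip": "02138", "dob": "1945-07-31", "sex": "M"},
]

def link(health, voters):
    """Join the two files on the (ZIP code, date of birth, gender) triple."""
    key = lambda r: (r["zip"], r["dob"], r["sex"])
    names = {key(v): v["name"] for v in voters}
    # Any unique match attaches a name to a "de-identified" medical record.
    return [(names[key(h)], h["diagnosis"]) for h in health if key(h) in names]

matches = link(health_db, voter_db)  # [("J. Doe", "hypertension")]
```

Because eighty-seven percent of the population has a unique combination of these three fields, most such joins produce exactly one match.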
28
Concerns about Security
This lecture discusses concerns that people have about the security of health information. One of the ways to protect privacy is to make information more secure. There are many books and resources available that provide insight into security best practices, including assessment, ongoing management, and training and education. The Office of the National Coordinator for Health Information Technology (ONC), in coordination with the U.S. Department of Health and Human Services (HHS) Office for Civil Rights (OCR), created the Guide to Privacy and Security of Electronic Health Information to help you integrate privacy and security into a health care practice. These organizations have also provided a YouTube video, also titled “Guide to Privacy and Security of Electronic Health Information,” to complement the guide. What concerns do people have about security? The following slides look at the many points of leakage in the system, some of the consequences of poor security, and the related topic of medical identity theft. It’s important to remember that security is not unique to electronic systems; it is also an issue for paper systems. Concerns about Security Comprehensive overview (Herzig, 2010) Guide to Privacy and Security of Electronic Health Information (Office of the National Coordinator for Health Information Technology [ONC] & Office for Civil Rights [OCR], 2015) Many points of leakage Also a problem for paper records Consequences of poor security Medical identity theft
29
Flow of Information in Healthcare: Many Points to “Leak”
As anyone who works in a health care setting knows, there are many points where information can leak out of the system. This figure, adapted from Rindfleisch [rihnd-flahysh], shows how information flows through the health care provider organization. Information is first generated in the provision of patient care by health care providers, clinics, and hospitals. It then flows to health care support activity, such as payers of health care, the insurance companies that reimburse, quality reviews that measure the quality of care delivered, and other types of administration. There are also what Rindfleisch describes as social uses of information, everything from insurance eligibility to reporting to public health authorities and to using data in medical research. Fortunately, use of health information is now regulated by the Health Insurance Portability and Accountability Act (HIPAA). There are also commercial uses of information, such as for marketing, participating in managed care organizations that may use data for various purposes to try to improve the quality or efficiency of the care they deliver, and the monitoring of drug usage. Another leakage point revolves around patients and their families as they engage in the care process. Flow of Information in Healthcare: Many Points to “Leak” Chart. Flow of information in healthcare (Rindfleisch, 1997)
30
Security for Paper Records Is a Significant Problem
It’s important to note that even though concerns about privacy and security are heightened with electronic systems, paper records have their own set of privacy and security problems. In fact, some have argued that paper medical records may be more prone than electronic records to breaches of security and disclosure. Unlike an electronic record, a paper chart is very difficult to audit. Even with paper-based tracking documents, it is not clear exactly where the chart goes and who has looked at it. Electronic information systems are able to provide a detailed audit trail as a background process that identifies everyone who logs into the software application and tracks their specific system activities with a date and time stamp. There are also issues with fax machines and scanners. Even in this electronic era, many people still rely on fax machines and scanners to move information. When a document comes out of a fax machine, anyone can view the paper, and it is difficult to track where the paper goes. Also, data stored in fax machines and scanners can easily be accessed. Records also continue to be photocopied. We photocopy for many reasons: the patient goes to a new provider, the insurance company needs documentation that a specific procedure was done or a referral was made, and records get abstracted by individual people. After copying, the paper copies may be scanned into information systems. More advanced systems allow scanning of documents directly into the software application. Whether they are paper or electronic, records are also copied for research or quality assurance purposes. Most health care insurers belong to the Medical Information Bureau, which monitors for insurance fraud and other insurance-related concerns. The Medical Information Bureau has developed a huge database of individuals’ health care claims, looking quite properly for health insurance fraud but also collecting quite a bit of information on individuals’ personal health. 
Security for Paper Records Is a Significant Problem Difficult to audit trail of paper chart Fax machines, scanners are easily accessible Records frequently copied for many reasons New providers, insurance purposes Records abstracted for variety of purposes Research Quality assurance Insurance fraud → Medical Information Bureau (Rothfeder, 1992)
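The audit trail that electronic systems provide, and paper charts cannot, can be sketched as follows. The function and field names are illustrative, and a production system would write to tamper-resistant storage rather than an in-memory list:

```python
import datetime

AUDIT_LOG = []  # append-only in this sketch; real systems protect the log itself

def log_access(user, patient_id, action):
    # Record who touched which record, when, and what they did,
    # with a date and time stamp, as a background process.
    AUDIT_LOG.append({
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user,
        "patient": patient_id,
        "action": action,
    })

log_access("nurse_jones", "MRN-1001", "view")
log_access("dr_smith", "MRN-1001", "update")

# Unlike a paper chart, the log can answer "who has looked at this record?"
viewers = [e["user"] for e in AUDIT_LOG if e["patient"] == "MRN-1001"]
```

An auditor (or, as discussed later, the patient) can then review exactly who accessed a chart and whether each access was appropriate.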
31
Potential Consequences of Poor Security
Aware of the consequences of poor security, Rindfleisch pointed out in the late 1990s that patients take various actions to protect their privacy. They avoid seeking health care. They lie or withhold information so private information won’t end up in their charts. Health care providers also have concerns about security, so they may avoid entering sensitive data that could be important in the care of a patient by others, and they may also devise workarounds to entering that information. A California Health Care Foundation survey of health care consumers found that thirteen percent engaged in activity that the foundation termed privacy-protective, activities that might put their health at risk, such as asking a doctor to leave out a diagnosis, perhaps to prevent someone from knowing that they have a certain diagnosis. Some consumers also pay for tests out of pocket because they do not want to submit an insurance claim, knowing that when a claim is submitted, the insurance company then knows that the test was done. Others avoid seeing their regular doctor for some problems because they want to protect their privacy and avoid revealing sensitive information. These examples demonstrate why clinicians may use patient-sourced information as complementary data to provider-sourced information. Potential Consequences of Poor Security According to Rindfleisch (1997): Patients avoid healthcare Patients lie Providers avoid entering sensitive data Providers devise workarounds California Health Care Foundation (2005) Thirteen percent of consumers admit to engaging in “privacy-protective” behaviors that might put their health at risk, such as: Asking a doctor to lie about a diagnosis Paying for a test because they did not want to submit a claim Avoiding seeing their regular doctor
32
Medical Identity Theft
A final security concern is medical identity theft, which has significantly increased as the use of information technology has expanded in health care. Medical identity information is more valuable than financial information, and the theft can go undetected for some time. When this happens, the victims are not only the individuals whose medical records have been compromised but also health care providers, health plans, and the society at large that pays for health care. In 2008, the American Health Information Management Association (AHIMA [uh-hee-muh]) determined that general identity theft is a growing concern and that the value of medical identity information is much higher than that of other information, such as Social Security numbers. Today, medical identity theft is one of the top crimes facing the country, and projections indicate that this problem will only continue. The 2015 Medical Identity Fraud Alliance annual report, supported by the Ponemon Institute, showed that medical identity theft incidents increased twenty-one-point-seven percent over the previous year. The study estimated that two-point-three-two million adult-aged Americans or close family members became victims of medical identity theft during or before 2014. Medical identity theft is costly. Sixty-five percent of medical identity theft victims in the 2015 study had to pay an average of thirteen thousand five hundred dollars to resolve the crime. In some cases, they paid the health care provider, repaid the insurer for services obtained by the thief, or engaged an identity service provider or legal counsel to help resolve the incident and prevent future fraud (Ponemon, 2015). It can be months before a victim uncovers an incident, and few achieve resolution, which places them at risk for future theft. Medical identity is used to obtain health care services, prescription pharmaceuticals, or medical equipment and to fraudulently receive government benefits such as Medicare or Medicaid. 
Some thieves access a victim’s medical records and/or modify the record. HHS, along with other organizations, addresses this problem and publishes resources that outline various approaches to prevention, detection, and remediation of medical identity theft. Medical Identity Theft AHIMA reported in 2008 a growing concern of general identity theft 2015 Medical Identity Fraud Alliance Annual Report Medical information more valuable than financial Costly to the victim Can be complex to solve over a long time HHS report outlines approaches to prevention, detection, and remediation (ONC & OCR, 2015)
33
Tools for Protecting Health Information
The next slides discuss tools for protecting health information. A good source to begin with is the Institute of Medicine (IOM) report “For the Record,” which addresses issues of protecting electronic health information. The report was commissioned by the National Library of Medicine and informed the HIPAA [hip-uh] legislation. It also made recommendations on immediate and future best practices. While some of the content in the book is dated, the framework provides a good way of thinking about the problem. As already mentioned, ONC, in coordination with OCR, created the Guide to Privacy and Security of Electronic Health Information. Many other industry activities, resources, and publications are publicly available and address various aspects of privacy and security practices, practices pertaining to specific technologies such as mobile devices, as well as training resources for security professionals. In 2013, the President of the United States issued Executive Order (EO) 13636, Improving Critical Infrastructure Cybersecurity, which directed NIST, the National Institute of Standards and Technology, to work with stakeholders to develop a voluntary framework for reducing cyber risks to critical infrastructure. The NIST website has many resources publicly available, and NIST’s work continues in the development of practices, guidelines, and tools to support effective cybersecurity efforts. The resulting Framework for Improving Critical Infrastructure Cybersecurity is available on the NIST website. SANS is an example of an industry resource that focuses on security training and certification of security professionals as well as on research. Tools for Protecting Health Information Brought to wider light by Institute of Medicine report For the Record (Committee on Maintaining Privacy and Security, 1997) Guide to Privacy and Security of Electronic Health Information (ONC & OCR, 2015) National Institute of Standards and Technology (NIST) Framework for Improving Critical Infrastructure Cybersecurity SANS And many more ….
34
Threats to Security Insider Accidental disclosure Curiosity
There are many different threats to security. According to the 2015 HIMSS survey, sixty-four percent of breaches were from external sources, and fifty-four percent were from internal sources. Outsider threats may come from organized crime, hacktivists, cyber thieves, and overseas hackers. When attacks originate from overseas, little can be done legally. The three most common attacks are spear phishing (when an email appears to be from a legitimate business but is actually from a hacker), Trojans (when malware or spyware is disguised as legitimate software), and malvertising (when malicious code is hidden in online advertisements). Insider threats and breaches may come from careless insiders who accidentally disclose or access information, the curious insider who snoops in records, or the malicious insider, a disgruntled or dissatisfied employee who accesses information inappropriately. Threats to Security Insider Accidental disclosure Curiosity Malicious/subornation Outsider Organized crime Hacktivists Cyber thieves
35
Technologies to Secure Information
A variety of technologies can be used to secure information. There are deterrents, which do not prevent people from breaching security but give them pause before doing so, such as putting up alerts when, for example, an employee’s medical record is about to be accessed. Another deterrent is the audit trail. System management precautions can also be taken. A number of software systems do not protect information as well as they should, and an analysis of vulnerability can reveal such risks. Here are some obstacles that can prevent individuals from getting to private information: Authentication: The user must provide credentials, usually a password, to access a system or file Authorization: The user is given (or denied) permission to specifically access, read, write, edit, create, move, and/or delete files Integrity management: The soundness of the overall system is assessed and maintained Digital signatures: A code is attached to electronically transmitted messages to validate that the sender is who he or she claims to be Encryption: Data are converted into “ciphertext” by the sender and can be read only if the recipient has a key (covered in the next two slides) Firewalls: Software, hardware, or both, designed to keep systems inaccessible from, say, Internet users Rights management: Digital “locks” are used to protect or restrict the use of proprietary hardware and software and copyrighted material—for example, a software program may have an embedded tag that allows the program to be installed only a limited number of times Technologies to Secure Information Deterrents Alerts Audit trails System management precautions Software management Analysis of vulnerability Obstacles Authentication Authorization Integrity management Digital signatures Encryption Firewalls Rights management
36
Encryption Necessary but not sufficient to ensure security
The next slides discuss encryption. Although encryption is a necessary precaution, it is not sufficient to ensure security. Any medical communication, whether an email or a transmission of a medical record, should be encrypted if it is being sent over a public network, because anyone with the right know-how could intercept that information. What is encryption? It is the scrambling of information using a key, which essentially is a randomly generated “secret code.” As an oversimplified example, a key might convert every A to a 9, every B to a $, and so on, before a message or document is sent. The recipient must possess a key to unscramble the message. There are different types of encryption. Symmetric encryption is when information is scrambled and unscrambled with the same key. Asymmetric encryption, sometimes called public-key encryption, is when a different key is used for scrambling than for unscrambling the information. Encryption Necessary but not sufficient to ensure security Is a “safe harbor” under federal and state laws when data loss occurs Should, however, be used for all communications over public networks, such as the Internet, and with mobile devices Information is scrambled and unscrambled using a key Types: Symmetric and asymmetric Asymmetric, also known as public-key encryption, can be used for digital certificates, electronic signatures, and so on
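The symmetric idea can be illustrated with a toy cipher. This XOR scheme is for illustration only and is not secure; real systems use vetted algorithms such as AES, covered on the next slide. The message text is made up:

```python
import secrets

def xor_cipher(data, key):
    # XOR each byte with the repeating key. Applying the same key a second
    # time restores the original bytes, which is what makes it symmetric.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = secrets.token_bytes(16)            # the randomly generated "secret code"
message = b"Lab result: glucose 95 mg/dL"
ciphertext = xor_cipher(message, key)    # scrambled; unreadable in transit
plaintext = xor_cipher(ciphertext, key)  # the recipient's copy of the key unscrambles it
assert plaintext == message
```

In asymmetric (public-key) encryption, by contrast, the unscrambling key differs from the scrambling key, so the scrambling key can be published without revealing how to decrypt.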
37
Standards for Encryption and Related Functions
A number of important standards related to encryption and other functions are listed on this slide. Not everyone in the informatics field needs to become an expert, but it is important to know how these standards are applied in different roles, for example, how they are mandated in the Health Insurance Portability and Accountability Act or in the HITECH [high-tech] criteria for the meaningful use of electronic health records. First, there is the encryption standard itself, the Advanced Encryption Standard, or AES [ay-ee-ess], which has been designated by the National Institute of Standards and Technology, or NIST [nihst], as a standard robust enough for encrypting and decrypting information such as health information in computer systems. Of course, information is not just encrypted and decrypted on individual machines; it moves across networks, so the movement of data from point to point also requires a process that not only encrypts the data but also makes sure that it stays secure as it moves across those connections. The emerging standard is Transport Layer Security, or TLS, which succeeds a standard that was very prominent in the early days of the World Wide Web, the Secure Sockets Layer, or SSL. Of course, information moves according to a protocol, such as IP [eye-pee], so there is Internet Protocol Security, or IPsec [eye-pee-sec]. This is part of the IP communications process that was developed for the new version of IP, version 6, but it has also been back-ported to version 4, which is what most people use when they connect to the Internet. In addition to making sure information is secure from one point to another across a network, the system needs to ensure the integrity of the information, that is, that it has not been altered either by transmission errors or by malicious users. Secure hash algorithms, or SHA [ess-aych-ay], ensure the integrity of transmitted information and documents. 
The original SHA protocol was found to have some security flaws, so SHA-2 [ess-aych-ay-two] has emerged as the more robust way of ensuring the integrity of data transmission across networks. Wikipedia provides a nice overview of these standards, as does the NIST website. Standards for Encryption and Related Functions Advanced Encryption Standard (AES): NIST-designated standard for encryption and decryption (Daemen & Rijmen, 2002) Transport Layer Security (TLS) and predecessor, Secure Sockets Layer (SSL): Cryptographic protocols that provide security for communications over all points on networks (Rescorla, 2001) Internet Protocol Security (IPsec): Protocol for securing Internet Protocol (IP) communications by authenticating and encrypting each IP packet of a data stream Part of IPv6 but also added as standalone on top of IPv4 Secure Hash Algorithm (SHA): Protocols that ensure integrity of transmitted information and documents (NIST, 2002) Security flaws have been identified in SHA-1, so SHA-2 family of protocols has been developed For more: Wikipedia’s Secure Hash Algorithm overview and NIST’s Cryptographic Toolkit
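Integrity checking with the SHA-2 family can be demonstrated with Python's standard library; the record text below is made up:

```python
import hashlib

record = b"Diagnosis: hypertension; Rx: lisinopril 10 mg"
digest = hashlib.sha256(record).hexdigest()  # SHA-256 is in the SHA-2 family

# The receiver recomputes the hash; a match means the bytes were not altered
# in transit, whether by transmission errors or by a malicious user.
assert hashlib.sha256(record).hexdigest() == digest

# Even a small change yields a completely different digest.
tampered = record.replace(b"10 mg", b"40 mg")
assert hashlib.sha256(tampered).hexdigest() != digest
```

The digest is sent or stored alongside the document; it does not hide the content (that is encryption's job), it only detects alteration.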
38
For the Record Best Practices
The “For the Record” report lists a number of best practices, divided into organizational and technical practices. In addition, other best practices for protecting information have emerged in the industry. This slide identifies just a few areas, divided into organizational and technical. Organizational practices encompass overall information and security governance for the organization. This includes policies and procedures regarding security, privacy, and confidentiality; education and training programs; and the all-important sanctions, which ensure that when an individual is caught breaching security, he or she faces appropriate penalties. Patients also need to be given access to the audit trail so they can see who has accessed their record and then determine whether it has been done appropriately. Management dashboards are tools for oversight of the organization’s performance. Privacy and security must be included in the organization’s risk management program, which includes overall compliance management of regulations and laws. Risk management involves ongoing risk (vulnerability) assessments, disaster planning and recovery processes, and remediation and mitigation. Technical best practices include securing information access, such as with user authentication, audit trails, identity management, and activity monitoring. Protecting the data assets encompasses cloud management; third-party outsource suppliers’ protection; data warehouses, repositories, databases, and storage security; and end-point device protection, including all mobile devices. Infrastructure management includes physical security management, security analytics, and infrastructure protection. 
For the Record Best Practices (Committee on Maintaining Privacy and Security, 1997) Organizational Information and security governance Confidentiality and security policies and committees Education and training programs Sanctions Patient access to audit trails Management dashboards Risk management and compliance Technical Authentication of users Audit trails Physical security and disaster recovery Protection of remote access points and external communications Software discipline Ongoing system vulnerability assessment Infrastructure management
39
Authentication and Passwords
The next slides elaborate on authentication and passwords. Authentication is the process of gaining access to a secure computer, for example, logging onto a computer. The usual approach for authentication is the password, which is a piece of information that the computer user “knows.” With more secure systems, organizations may require information about a physical characteristic, or what you “have,” such as a biometric device that registers thumbprints or the use of a smart card or some other physical key that enables the user to access the machine. Most of these systems have pros and cons that must be worked through by the organization for effective use. In terms of passwords, the ideal password is one that can be remembered but that no one else can guess. This is easier said than done, especially today, when the typical Internet user may interact with many different sites, each of which requires the use of a password. In many health care organizations, especially large organizations, single sign-on is used, where the user only has to authenticate once and then has access to the other systems that they need. Of course, the downside to single sign-on is that if an unauthorized user gains access through an authorized user’s sign-on, the unauthorized user gains access to every point that is open to the authorized user. Two-factor authentication, which is commonly used in health care, is a security process that requires the user to provide two means of identification from separate categories of credentials; one is typically a physical token, such as a card, and the other is typically something memorized, such as a security code or PIN (personal identification number). Three-factor authentication is the strongest authentication method but has proven difficult to implement in the provider environment. 
Authentication and Passwords Authentication: Process of gaining access to secure computer Usual approach is passwords (“what you know”), but secure systems may add physical entities (“what you have”) Biometric devices: Physical characteristic (e.g., thumbprint) Physical devices: Smart card or some other physical “key” Ideal password is one you can remember but no one else can guess Typical Internet user interacts with many sites for which he or she must use password “Single sign-on” is commonly used Two-factor authentication
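A sketch of how a system can check "what you know" without ever storing the password itself, using the standard library's PBKDF2 key-derivation function. The iteration count and salt size here are illustrative choices, not mandated values:

```python
import hashlib
import hmac
import secrets

ITERATIONS = 100_000  # slows down offline guessing; tuned to available hardware

def hash_password(password, salt=None):
    # A random salt makes identical passwords hash to different values,
    # so a stolen table cannot be attacked with precomputed hashes.
    salt = salt if salt is not None else secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest  # store both; the plaintext password is never kept

def verify_password(password, salt, stored_digest):
    # Recompute with the stored salt and compare in constant time.
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored_digest)

salt, stored = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, stored)
assert not verify_password("password123", salt, stored)
```

In two-factor authentication, a check like this ("what you know") is combined with proof of possession of a token or card ("what you have").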
40
Some Challenges with Passwords
There are a number of challenges with passwords. One approach that is commonly used is password aging: the password expires after a certain time—for instance, six months—and then the user has to create a new password. A number of security experts have written about password aging, and the foremost conclusion is that password aging isn’t a good approach to security, and it may induce counterproductive behaviors, such as writing passwords down or somehow making them easier to guess. One report argues that other measures are more effective. Session locking, for instance, allows only one or a small number of simultaneous logons, so a user can log on to only a limited number of places at the same time. There are also login failure lockouts—after a certain number of unsuccessful attempts, the individual is locked out. But clearly, passwords will continue to be an issue in terms of protecting the security of information, including health information. Some Challenges with Passwords Common approach to security is password “aging” (i.e., expiration), which is less effective than other measures (Wagner, Allan, & Heiser, 2005) Session-locking: One or small number of simultaneous logons Login failure lockout: After three to five attempts Password aging may also induce counterproductive behavior (Allan, 2005)
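The login failure lockout just described can be sketched as a simple counter. The five-attempt threshold and the in-memory store are assumptions for illustration; a real system would persist the counts and typically unlock after a timeout or an administrator review:

```python
MAX_ATTEMPTS = 5     # lock the account after five consecutive failures
failed_attempts = {}  # username -> consecutive failed logins (in-memory sketch)

def attempt_login(username, password_is_correct):
    # Returns "locked", "ok", or "denied"; a successful login resets the count.
    if failed_attempts.get(username, 0) >= MAX_ATTEMPTS:
        return "locked"
    if password_is_correct:
        failed_attempts[username] = 0
        return "ok"
    failed_attempts[username] = failed_attempts.get(username, 0) + 1
    return "denied"

for _ in range(MAX_ATTEMPTS):
    attempt_login("jdoe", False)  # five bad guesses in a row
assert attempt_login("jdoe", True) == "locked"  # even the right password is refused
```

Unlike password aging, the lockout targets the actual attack (repeated guessing) without pushing users toward writing passwords down.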
41
Health Information Security Is Probably a Trade-off
In the big picture of health information, security represents a trade-off. At one end of the spectrum, no security is in place, and a website shows the user any page requested, which is appropriate most of the time. At the other end of the spectrum is the extreme level of security employed by government agencies such as the CIA and the National Security Agency (NSA, enn-ess-ay). Neither of these extremes works well for health care security. Health information can’t be freely available for anyone to look at, but it also can’t be buried in the kind of total security that the CIA or NSA uses. For extremely high-level security, there is a price: a person can’t, for example, bring an ordinary laptop into a CIA building. In health care settings, many different people may be looking at information or may need to access it quickly in order to maintain the workflow and get the work done. For health information security, there has to be some kind of happy medium that protects information but still allows it to be quickly and easily accessed by authorized people. There must be a balance between the strength of the security and ease of access to clinical information, especially in critical situations. “Breaking the glass” refers to a quick means for a clinician who doesn’t have access privileges to certain patient information to gain access under circumstances such as a patient emergency. Health Information Security Is Probably a Trade-off Chart. Health information security is a trade-off (CC BY-NC-SA 3.0, 2012).
42
Privacy, Security, and Confidentiality
This concludes Lecture A of Privacy, Security, and Confidentiality. In summary, it’s important to distinguish privacy, which is the right to keep information to yourself, from confidentiality, which is the right to keep information about yourself from being disclosed to others. For many reasons, breaches and disclosures of patient information are increasing. In addition, de-identified information is not necessarily as secure as originally thought. Privacy, Security, and Confidentiality Summary Privacy is the right to keep information to ourselves, whereas confidentiality is the right to keep information about ourselves from being disclosed to others. For many reasons, breaches and disclosures of patient information are increasing. “De-identified” information is not necessarily more secure.
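One way to reason about why de-identified data may still be re-identifiable is k-anonymity (Sweeney, 2002, cited in this deck’s references): every combination of quasi-identifiers, such as ZIP code and age band, must be shared by at least k records. The dataset and column names below are made up purely for illustration.

```python
# Sketch of a k-anonymity check in the spirit of Sweeney (2002).
# The records and quasi-identifier columns are hypothetical.
from collections import Counter

def is_k_anonymous(rows, quasi_identifiers, k):
    """True if every quasi-identifier combination occurs at least k times."""
    counts = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return all(c >= k for c in counts.values())

records = [
    {"zip": "97201", "age_band": "40-49", "dx": "flu"},
    {"zip": "97201", "age_band": "40-49", "dx": "asthma"},
    {"zip": "97209", "age_band": "60-69", "dx": "diabetes"},
]
# The last record is unique on (zip, age_band), so the set is
# 1-anonymous but not 2-anonymous: that patient stands out.
```

A dataset that fails even k=2 shows how a “de-identified” release can still single out an individual, which is exactly the summary’s caution.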
43
Privacy, Security, and Confidentiality
This concludes Lecture B of Privacy, Security, and Confidentiality. In summary, there are many points where private patient information can “leak” out of the health care system. There are also many technologies for protecting security that must be used. One of these, encryption, is necessary but not sufficient by itself. Finally, issues of privacy and security apply to both electronic and paper-based information. Privacy, Security, and Confidentiality Summary There are many points where information can “leak” out of the system. Many technologies are available for protecting security. Encryption is necessary but not sufficient. Paper-based information has its own security problems.
44
Privacy, Security, and Confidentiality
No audio. Privacy, Security, and Confidentiality References Acluvideos. (2009). Scary Pizza [Video file]. Retrieved from Acquisti, A., & Gross, R. (2009). Predicting Social Security numbers from public data. Proceedings of the National Academy of Sciences, 106, 10975– Allan, A. (2005). Password aging can burden an already-weak authentication method. Stamford, CT: Gartner. American College of Physicians. (2011). Health information technology & privacy: Position paper. Philadelphia, PA: American College of Physicians. Retrieved from American Health Information Management Association. (n.d.). AHIMA consumer health information bill of rights. Retrieved from American Health Information Management Association. (2003). Flow of patient health information inside and outside the healthcare industry. Retrieved from Bowe, R. (2013). Identity crisis: Organizations are implementing medical identity theft teams to combat rising incidents. Journal of AHIMA, 84(1), 38–42. California Health Care Foundation (CHCF). (2005). National consumer health privacy survey. Oakland, CA: CHCF. Retrieved from
45
Privacy, Security, and Confidentiality
No audio. Privacy, Security, and Confidentiality References Cassa, C., Schmidt, B., Kohane, I., & Mandl, K. (2008). My sister's keeper?: Genomic research and the identifiability of siblings. BMC Medical Genomics, 1(1), 32. Retrieved from Cohn, S. (2006). Privacy and confidentiality in the nationwide health information network. Retrieved from presentations/june letter-to-the-secretary-recommendations-regarding- privacy-and-confidentiality-in-the-nationwide-health-information-network/ Cohn, S. (2008). Individual control of sensitive health information accessible via the Nationwide Health Information Network for purposes of treatment. Retrieved from Committee on Maintaining Privacy and Security in Health Care Applications of the National Information Infrastructure. (1997). For the record: Protecting electronic health information. Washington, DC: National Academies Press. Retrieved from electronic-health-information Conn, J. (2016, February 17). Hospital pays hackers $17,000 to unlock EHRs frozen in “ransomware” attack. Modern Healthcare. Retrieved from Daemen, J., & Rijmen, V. (2002). The design of Rijndael: AES—The advanced encryption standard. Berlin, Germany: Springer-Verlag.
46
Privacy, Security, and Confidentiality
No audio. Privacy, Security, and Confidentiality References Detmer, D. (2010). Activating a full architectural model: Improving health through robust population health records. Journal of the American Medical Informatics Association, 17, 367–369. El Emam, K., Neri, E., & Jonker, E. (2007). An evaluation of personal health information remnants in second-hand personal computer disk drives. Journal of Medical Internet Research, 9(3), e24. El Emam, K., Neri, E., Jonker, E., Sokolova, M., Peyton, L., Neisa, A., & Scassa, T. (2010). The inadvertent disclosure of personal health information through peer-to-peer file sharing programs. Journal of the American Medical Informatics Association, 17, 148–158. European Commission. (2016, February 29). European Commission unveils EU-U.S. privacy shield. Retrieved from protection/news/160229_en.htm European Commission. (2016, March 23). Protection of personal data. Retrieved from Gostin, L., & Hodge, J. (2002). Personal privacy and common goods: A framework for balancing under the national health information privacy rule. Minnesota Law Review, 86, 1439–1479. Retrieved from xt=facpub
47
Privacy, Security, and Confidentiality
No audio. Privacy, Security, and Confidentiality References Hall, M., & Schulman, K. (2009). Ownership of medical information. JAMA, 301, 1282–1284. HealthIT.gov. (2013). Federal-state healthcare coordination: Health information security and privacy collaboration (HISPC). Retrieved from security-privacy-collaboration-hispc HealthIT.gov. (2016). Your health information rights. Retrieved from Healthcare Information and Management Systems Society (HIMSS). (2015). HIMSS security survey. Chicago, IL: HIMSS. Retrieved from (full report limited to members only; executive summary available free to download). Herzig, T. (Ed.). (2010). Information security in healthcare—Managing risk. Chicago, IL: Healthcare Information Management Systems Society. Hodge, J., Gostin, L., & Jacobson, P. (1999). Legal issues concerning electronic health information: Privacy, quality, and liability. JAMA, 282, 1466–1471.
48
Privacy, Security, and Confidentiality
No audio. Privacy, Security, and Confidentiality References Joint NEMA/COCIR/JIRA Security and Privacy Committee. (2004). Break glass procedure: Granting emergency access to critical ePHI systems. Retrieved from access-critical-ephi-systems Keteyian, A. (2010, April 15). Digital photocopiers loaded with secrets. CBS News. Retrieved from Kutscher, B. (2016, March 3). Healthcare underspends on cybersecurity as attacks accelerate. Modern Healthcare. Retrieved from /healthcare-underspends-on-cybersecurity-as-attacks-accelerate Lee, C., & Goldfarb, Z. (2006, June 30). Stolen VA laptop and hard drive recovered. The Washington Post, p. A01. Retrieved from dyn/content/article/2006/06/29/AR html Lumley, T., & Rice, K. (2010). Potential for revealing individual-level information in genome-wide association studies. JAMA, 303, 859–860. Malin, B., & Sweeney, L. (2005). How (not) to protect genomic data privacy in a distributed network: Using trail re-identification to evaluate and design anonymity protection systems. Journal of Biomedical Informatics, 37, 179–192.
49
Privacy, Security, and Confidentiality
No audio. Privacy, Security, and Confidentiality References McGuire, A., & Gibbs, R. (2006). No longer de-identified. Science, 312, 370–371. Milken Institute School of Public Health. (2012). Who owns medical records: 50 state comparison. Retrieved from analysis/who-owns-medical-records-50-state-comparison McNabb, J., & Rhodes, H. B. (2014). Combating the privacy crime that can KILL. Journal of AHIMA, 85(4), 26–29. Munro, D. (2015, Dec. 31). Data breaches in healthcare totaled over 112 million records in 2015. Forbes, Pharma & Healthcare. Retrieved from healthcare-total-over-112-million-records-in-2015/#356bb7337fd5 National Academies Press. (1997). For the record: Protecting electronic health information. Retrieved from National Institute for Standards and Technology (NIST). (2015). Secure hash standard. Gaithersburg, MD: NIST. Retrieved from National Institute of Standards and Technology. (2014). Cryptographic toolkit. Retrieved from
50
Privacy, Security, and Confidentiality
No audio. Privacy, Security, and Confidentiality References National Institute of Standards and Technology (NIST). (2014). Framework for improving critical infrastructure cybersecurity. Gaithersburg, MD: NIST. Retrieved from pdf Office of the National Coordinator for Health Information Technology (ONC) & Office for Civil Rights. (2015). Guide to privacy and security of electronic health information. Washington, DC: ONC. Retrieved from guide.pdf Orcutt, M. (2015, Dec. 23). 2016 could be the year of the hospital hack. MIT Technology Review. Retrieved from hospital-hack Pabrai, A. (2008, January 23). The single sign-on solution. H&HN’s Most Wired Magazine. Patient Privacy Rights Foundation. (n.d.). Patient privacy rights. Retrieved from Perna, G. (2015, Feb. 5). Anthem hit by large data breach. Healthcare Informatics. Retrieved from item/anthem-hit-large-data-brach
51
Privacy, Security, and Confidentiality
No audio. Privacy, Security, and Confidentiality References Ponemon Institute. (2015). Fifth annual benchmark study on privacy and security of healthcare data. Retrieved from ponemon-study-on-privacy-security-incidents-of-healthcare-data Pritts, J., & Connor, K. (2007). The implementation of e-consent mechanisms in three countries: Canada, England, and the Netherlands. Washington, DC: Substance Abuse and Mental Health Services Administration. Retrieved from Privacy Rights Clearinghouse. (2005). Chronology of data breaches. Retrieved from Privacy Rights Clearinghouse. (2014). Fact sheet 8c: The HIPAA privacy rule: Patients’ rights. Retrieved from patients-rights Rescorla, E. (2001). SSL and TLS: Designing and building secure systems. Boston, MA: Addison Wesley. Rindfleisch, T. (1997). Privacy, information technology, and healthcare. Communications of the ACM, 40(8), 93–100. Rodwin, M. (2009). The case for public ownership of patient data. JAMA, 302, 86–88. Rojas-Burke, J. (2006). Providence critics push for safer records. The Oregonian. Rothfeder, J. (1992). Privacy for sale: How computerization has made everyone’s private life an open secret. New York, NY: Simon & Schuster.
52
Privacy, Security, and Confidentiality
No audio. Privacy, Security, and Confidentiality References Rothstein, M., & Talbott, M. (2006). Compelled disclosure of health information: Protecting against the greatest potential threat to privacy. JAMA, 295, 2882–2885. Sequoia Project. (2016). Data use and reciprocal support agreement (DURSA). Retrieved from Sweeney, L. (1997). Guaranteeing anonymity when sharing medical data: The Datafly System. Proceedings of the 1997 AMIA Annual Fall Symposium, Nashville, TN, 51–55. Sweeney, L. (2002). k-Anonymity: A model for protecting privacy. International Journal on Uncertainty, Fuzziness, and Knowledge-based Systems, 10(5), 557–570. TEDx Talks. (2014). Designing technology to restore privacy: Deborah C. Peel, MD at TEDxTraverseCity [Video file]. Retrieved from The SANS Institute. (2016). About (SANS). Retrieved from Tufts University School of Medicine. (2009). Declaration of health data rights. Retrieved from rights U.S. Department of Health and Human Services. (2016a). Breaches affecting 500 or more individuals. Retrieved from
53
Privacy, Security, and Confidentiality
No audio. Privacy, Security, and Confidentiality References U.S. Department of Health and Human Services. (2016b). Improper disclosure of research participants’ protected health information results in $3.9 million HIPAA settlement. Retrieved from disclosure-research-participants-protected-health-information-results-in-hipaa- settlement.html# Wagner, R., Allan, A., & Heiser, J. (2005). Eight security practices offer more value than password aging. Stamford, CT: Gartner. White House Office of the Press Secretary. (2015). Fact sheet: President Obama’s precision medicine initiative. Retrieved from office/2015/01/30/fact-sheet-president-obama-s-precision-medicine-initiative Wikipedia. (2016). Secure hash algorithm. Retrieved from Wright, A., & Sittig, D. (2007a). Encryption characteristics of two USB-based personal health record devices. Journal of the American Medical Informatics Association, 14, Wright, A., & Sittig, D. (2007b). Security threat posed by USB-based personal health records. Annals of Internal Medicine, 146, 314–315.
54
Privacy, Security, and Confidentiality
No audio. Health IT Workforce Curriculum Version 4.0 Privacy, Security, and Confidentiality References Charts, Tables, Figures Figure. Adapted from Sweeney, L. (1997). Guaranteeing anonymity when sharing medical data: The Datafly System. Proceedings of the 1997 AMIA Annual Fall Symposium, Nashville, TN, 51–55. Chart. Flow of information in health care (Rindfleisch, 1997). Chart. Health information security is a trade-off (CC BY-NC-SA 3.0, 2012).
55
No audio. This material was developed by Oregon Health & Science University, funded by the Department of Health and Human Services, Office of the National Coordinator for Health Information Technology under Award Number IU24OC. This material was updated in 2016 by Bellevue College under Award Number 90WT0002. This presentation was produced with the support of the United States Agency for International Development (USAID) under the terms of MEASURE Evaluation cooperative agreement AID-OAA-L. MEASURE Evaluation is implemented by the Carolina Population Center, University of North Carolina at Chapel Hill in partnership with ICF International; John Snow, Inc.; Management Sciences for Health; Palladium; and Tulane University. Views expressed are not necessarily those of USAID or the United States government.