Share icon

Case Studies: High-Profile Cases of Privacy Violation

Contributor.

Smith Gambrell & Russell weblink

Case Studies: Recent FTC Enforcement Actions - High-Profile Cases of Privacy Violation: Uber, Emp Media, Lenovo, Vizio, VTech, LabMD

Uber Technologies

The scenario: In August 2018, the FTC announced an expanded settlement with Uber Technologies for its alleged failure to reasonably secure sensitive data in the cloud, resulting in a data breach of 600,000 names and driver's license numbers, 22 million names and phone numbers, and more than 25 million names and email addresses.

The settlement: The expanded settlement is a result of Uber's failure to disclose a significant data breach that occurred in 2016 while the FTC was conducting its investigation that led to the original settlement. The revised proposed order includes provisions requiring Uber to disclose any future consumer data breaches, submit all reports for third-party audits of Uber's privacy policy and retain reports on unauthorized access to consumer data. 2

Emp Media Inc. (Myex.com)

The scenario: The FTC joined forces with the State of Nevada to address privacy issues arising from the "revenge" pornography website, Myex.com, run by Emp Media Inc. The website allowed individuals to submit intimate photos of the victims, including personal information such as name, address, phone number and social media accounts. If a victim wanted their photos and information removed from the website, the defendants reportedly charged fees of $499 to $2,800 to do so.

The settlement: On June 15, 2018, the enforcement action brought by the FTC led to a shutdown of the website and permanently prohibited the defendants from posting intimate photos and personal information of other individuals without their consent. The defendants were also ordered to pay more than $2 million. 3

Lenovo and Vizio

The scenario: In 2018, FTC enforcement actions led to large settlements with technology manufacturers Lenovo and Vizio. The Lenovo settlement related to allegations the company sold computers in the U.S. with pre-installed software that sent consumer information to third parties without the knowledge of the users. With the New Jersey Office of Attorney General, the FTC also brought an enforcement action against Vizio, a manufacturer of "smart" televisions. Vizio entered into a settlement to resolve allegations it installed software on its televisions to collect consumer data without the knowledge or consent of consumers and sold the data to third parties.

The settlement: Lenovo entered into a consent agreement to resolve the allegations through a decision and order issued by the FTC. The company was ordered to obtain affirmative consent from consumers before running the software on their computers and implement a software security program on preloaded software for the next 20 years. 4 Vizio agreed to pay $2.2 million, delete the collected data, disclose all data collection and sharing practices, obtain express consent from consumers to collect or share their data, and implement a data security program. 5

The scenario: The FTC's action against toy manufacturer VTech was the first time the FTC became involved in a children's privacy and security matter. The settlement: In January 2018, the company entered into a settlement to pay $650,000 to resolve allegations it collected personal information from children without obtaining parental consent, in violation of COPPA. VTech was also required to implement a data security program that is subject to audits for the next 20 years. 6

The scenario: LabMD, a cancer-screening company, was accused by the FTC of failing to reasonably protect consumers' medical information and other personal data. Identity thieves allegedly obtained sensitive data on LabMD consumers due to the company's failure to properly safeguard it. The billing information of 9,000 consumers was also compromised. The settlement: After years of litigation, the case was heard before the U.S. Court of Appeals for the Eleventh Circuit. LabMD argued, in part, that data security falls outside of the FTC's mandate over unfair practices. The Eleventh Circuit issued a decision in June 2018 that, while not stripping the FTC of authority to police data security, did challenge the remedy imposed by the FTC. 7 The court ruled that the cease-and-desist order issued by the FTC against LabMD was unenforceable because the order required the company to implement a data security program that needed to adhere to a standard of "reasonableness" that was too vague. 8

The ruling points to the need for the FTC to provide greater specificity in its cease-and-desist orders about what is required by companies that allegedly fail to safeguard consumer data.

1 15 U.S.C. § 45(a)(1)

2 www.ftc.gov/news-events/press-releases/2018/04/uber-agrees-expanded-settlement-ftc-related-privacy-security

3 www.ftc.gov/system/files/documents/cases/emp_order_granting_default_judgment_6-22-18.pdf

4 www.ftc.gov/news-events/press-releases/2018/01/ftc-gives-final-approval-lenovo-settlement

5 www.ftc.gov/news-events/press-releases/2017/02/vizio-pay-22-million-ftc-state-newjersey-settle-charges-it

6 www.ftc.gov/news-events/press-releases/2018/01/electronic-toy-maker-vtech-settlesftc-allegations-it-violated

7 The United States Court of Appeals for the Third Circuit has rejected this argument. See FTC v. Wyndham Worldwide Corp., 799 F.3d 236, 247-49 (2015).

8 www.media.ca11.uscourts.gov/opinions/pub/files/201616270.pdf

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.

Photo of Marcia M. Ernst

United States

Mondaq uses cookies on this website. By using our website you agree to our use of cookies as set out in our Privacy Policy.

A woman is seen shopping for clothes on a smartphone.

A bipartisan data-privacy law could backfire on small businesses − 2 marketing professors explain why

data privacy act case study

University of Colorado Distinguished Professor, University of Colorado Boulder

data privacy act case study

James M. Kilts Distinguished Service Professor of Marketing, University of Chicago

Disclosure statement

John Lynch served as Executive Director of the Marketing Science Institute, a nonprofit think tank founded in 1961 to support academic research that advances the science of marketing.

Jean-Pierre Dubé consults for Amazon.com.

University of Colorado provides funding as a member of The Conversation US.

View all partners

Orion Brown started Black Travel Box to serve Black female travelers who find hotel lotions and shampoos inadequate. Randel Bennett co-founded the insurance startup Sigo Seguros for underserved Spanish-speaking drivers. Bill Shufelt and John Walker founded Athletic Brewing Company so athletes and nondrinkers in social situations could drink tasty nonalcoholic beer .

What do these three successful businesses have in common? In each case, the entrepreneurs built their businesses on personalized digital advertising platforms such as Facebook and Instagram. They didn’t have the budgets for TV advertising campaigns to compete with bigger businesses. And all served markets that had previously been ignored.

A privacy bill that’s being eyed by Congress could unintentionally make it harder for similar ventures to take off in the future. We are professors of marketing who are experts in academic research on effects of public policy on marketing. We are concerned that the bipartisan bill – the American Privacy Rights Act – could undermine small entrepreneurs like these who rely on targeted digital advertising.

While Americans increasingly favor government taking a more interventionist approach to data privacy, a growing body of rigorous research shows that privacy regulations can have unintended consequences.

Privacy rights and wrongs

The American Privacy Rights Act – introduced by lawmakers in both the House and Senate in April 2024 – would, in the words of a Senate summary, create “ national consumer data privacy rights and set standards for data security .”

The bill would create a national standard for data collection and data use. A national standard would have the benefit of unifying a patchwork of state regulations. In a supportive editorial, The Washington Post described the bill “as tough as, if not tougher than, what states have mustered so far.” Tougher must be better, right?

Not necessarily.

The state bills at issue are generally modeled on the European General Data Protection Regulation, or GDPR . The European Union touts the GDPR as “ the strongest privacy and security law in the world .”

But a growing body of academic literature shows that privacy regulations such as the GDPR can have unintended consequences. In May, the nonprofit Marketing Science Institute released our report summarizing that work. In short, data privacy doesn’t come free – it requires trade-offs.

The price of privacy

For starters, there’s a trade-off between privacy and usefulness of information exchanges for firms and consumers. The 2006 book “ The Long Tail ” described how digital marketing changed our economy from a market focused on selling hit products to a market serving many smaller niches of consumers with diverse needs and tastes. Digital marketing makes it possible for small entrepreneurs and consumers with nonmainstream needs to find each other.

There’s also a trade-off between privacy and fairness. Just as consumers differ in their needs for products, they differ in whether, when and why they are willing to share data. Research shows that those most keen to minimize data-sharing are richer, more educated and older than those who are less keen. The goal of privacy regulation, we argue, should be to give consumers control of their data rather than to slow the flow of data for all.

Coarser personalization can exclude marginalized consumer segments. Some lower-income consumers and certain minority groups live in digital data deserts . The problem isn’t that companies know too much about them. Instead, they are so invisible that they are unwittingly excluded from the digital economy.

Privacy can be, in some sense, a problem of the privileged. We know of no rigorous study showing that toughened digital marketing privacy policies produced tangible economic benefits for anyone, let alone lower-income consumers.

A split-screen photograph of two congresspeople.

There’s also a trade-off between privacy and freedom from discrimination, particularly against marginalized groups. Algorithms have been known to inadvertently discriminate. For example, one study showed that women were less likely than men to be served ads for job opportunities in STEM careers. That seems unfair.

Regulators, including the framers of the American Privacy Rights Act, have prescribed that firms should limit the data they collect only to what is reasonable and necessary, minimizing information about race, gender or other protected class attributes. But without that information, how will regulators and firms audit data-based marketing algorithms for unintended discrimination?

Finally, there’s a trade-off between privacy and innovation by sellers in the marketplace. Many small brands exist because digital marketing allows them to create sustainable businesses at a small scale without giant media budgets. Digital advertising costs a fraction of what’s needed for traditional television campaigns, saving small U.S. entrepreneurs US$163 billion annually . Small brands benefit more from accurate targeting than the big brands with broader appeal.

A growing number of studies show that privacy regulations may slow innovation and reduce the competitiveness of markets . This is especially harmful to those same small businesses and entrepreneurs that benefit most from being able to accurately target diverse consumers.

Recently, privacy advocates started attaching the label “corporatists” to those who argue for benefits of personalized marketing. Ironically, it’s small businesses that benefit most from personalized marketing, as our report for the Marketing Science Institute shows.

Giants like Unilever and Nike gain competitive advantage from privacy regulation and changes to platform privacy policies that dramatically raise small businesses’ costs of acquiring new customers, and giants like Amazon and Walmart gain new appeal as ad platforms. Similarly, studies show that GDPR boosted Google’s and Facebook’s market dominance in Europe and disproportionately increased privacy compliance costs for smaller firms.

To be sure, we believe there’s value in the bill being crafted in Congress to protect consumers’ right to privacy. The May markup included carve-outs for small businesses, for example, but without considering how they rely on others’ data for customer acquisition. In June, divisions between Republicans and Democrats led to canceling a markup session .

In our view, Congress would be wise to use the current impasse to carefully consider how the proposed law would affect smaller sellers and disadvantaged consumer groups.

  • Small business
  • Privacy law
  • US Congress
  • Data privacy
  • General Data Protection Regulation
  • Privacy laws
  • EU General Data Protection Regulation
  • digital marketing

data privacy act case study

Manager, Centre Policy and Translation

data privacy act case study

Finance Business Partner - FBE/MLS/EDUCN

data privacy act case study

Newsletter and Deputy Social Media Producer

data privacy act case study

College Director and Principal | Curtin College

data privacy act case study

Head of School: Engineering, Computer and Mathematical Sciences

Page Tips

Home / Resources / ISACA Journal / Issues / 2016 / Volume 6 / An Ethical Approach to Data Privacy Protection

An ethical approach to data privacy protection.

Data Privacy Protection

Privacy, trust and security are closely intertwined, as are law and ethics. Privacy preservation and security provisions rely on trust (e.g., one will allow only those whom one trusts to enter one’s zone of inaccessibility; one will not feel secure unless one trusts the security provider). Violation of privacy constitutes a risk, thus, a threat to security. Law provides a resolution when ethics cannot (e.g., ethics knows that stealing is wrong; the law punishes thieves); ethics can provide context to law (e.g., law allows trading for the purpose of making a profit, but ethics provides input into ensuring trade is conducted fairly). Privacy breaches disturb trust and run the risk of diluting or losing security; it is a show of disrespect to the law and a violation of ethical principles.

Data privacy (or information privacy or data protection) is about access, use and collection of data, and the data subject’s legal right to the data. This refers to:

  • Freedom from unauthorized access to private data
  • Inappropriate use of data
  • Accuracy and completeness when collecting data about a person or persons (corporations included) by technology
  • Availability of data content, and the data subject’s legal right to access; ownership
  • The rights to inspect, update or correct these data

Data privacy is also concerned with the costs if data privacy is breached, and such costs include the so-called hard costs (e.g., financial penalties imposed by regulators, compensation payments in lawsuits such as noncompliance with contractual principles) and the soft costs (e.g., reputational damage, loss of client trust).

Though different cultures put different values on privacy or make it impossible to define a stable, universal value, there is broad consensus that privacy does have an intrinsic, core and social value. Hence, a privacy approach that embraces the law, ethical principles, and societal and environmental concerns is possible despite the complexity of and difficulty in upholding data privacy.

Data Privacy Protection

Indeed, protecting data privacy is urgent and complex. This protection is necessary because of the ubiquity of the technology-driven and information-intensive environment. Technology-driven and information-intensive business operations are typical in contemporary corporations. The benefits of this trend are that, among other things, the marketplace is more transparent, consumers are better informed and trade practices are more fair. The downsides include socio-techno risk, which originates with technology and human users (e.g., identity theft, information warfare, phishing scams, cyberterrorism, extortion), and the creation of more opportunities for organized and sophisticated cybercriminals to exploit. This risk results in information protection being propelled to the top of the corporate management agenda.

The need for data privacy protection is also urgent due to multidirectional demand. Information protection becomes an essential information security function to help develop and implement strategies to ensure that data privacy policies, standards, guidelines and processes are appropriately enhanced, communicated and complied with, and effective mitigation measures are implemented. The policies or standards need to be technically efficient, economically/financially sound, legally justifiable, ethically consistent and socially acceptable since many of the problems commonly found after implementation and contract signing are of a technical and ethical nature, and information security decisions become more complex and difficult.

Data privacy protection is complex due to socio-techno risk, a new security concern. This risk occurs with the abuse of technology that is used to store and process data. For example, taking a company universal serial bus (USB) device home for personal convenience runs the risk of breaching a company regulation that no company property shall leave company premises without permission. That risk becomes a data risk if the USB contains confidential corporate data (e.g., data about the marketing strategy, personnel performance records) or employee data (e.g., employee addresses, dates of birth). The risk of taking the USB also includes theft or loss.

Using technology in a manner that is not consistent with ethical principles creates ethical risk, another new type of risk. In the previous example, not every staff member would take the company USB home, and those who decide to exploit the risk of taking the USB may do so based on their own sense of morality and understanding of ethical principles. The ethical risk (in addition to technical risk and financial risk) arises when considering the potential breach of corporate and personal confidentiality. This risk is related partly to technology (the USB) and partly to people (both the perpetrator and the victims) and is, therefore, a risk of a technological-cum-social nature—a socio-techno risk. Hence, taking home a USB is a vulnerability that may lead to a violation of data privacy.

However, the problem of data privacy is not unsolvable. The composite approach alluded to earlier that takes into consideration the tangible physical and financial conditions and intangible measures against logical loopholes, ethical violations, and social desirability is feasible, and the method suggested in this article, which is built on a six-factor framework, can accomplish this objective.

Methods for Data Privacy Protection

The method is modeled on a framework originally perceived and developed to provide a fresh view to decision makers and is based on the following three major instruments:

  • The International Data Privacy Principles (IDPPs) 1 for establishing and maintaining data privacy policies, operating standards and mitigation measures
  • Hong Kong’s Data Protection Principles of personal data (DPPs) 2 for reinforcing those policies, standards and guidelines
  • The hexa-dimension metric operationalization framework 3 for executing policies, standards and guidelines

International Data Privacy Principles

Data privacy can be achieved through technical and social solutions. Technical solutions include safeguarding data from unauthorized or accidental access or loss. Social solutions include creating acceptability and awareness among customers about whether and how their data are being used, and doing so in a transparent and confidential way. Employees must commit to complying with corporate privacy rules, and organizations should instruct them in how to actively avoid activities that may compromise privacy.

Next to technical and social solutions, the third element of achieving privacy is complying with data protection laws and regulations, which involves two issues. The first concern is that legal regulation is slow and, thus, unable to keep up with the rapid developments of information technology. Legal solutions are usually at least one step behind technological developments. Data privacy by electronic means should, therefore, be based not only on traditional jurisdiction, but also on soft law, i.e., self-binding policies such as the existing data privacy principles. Soft law may be more effective than hard law. The reactions of disappointed customers, especially when those reactions are spread by social media, and the fact that noncompliance with corporate governance may result in unfair competition and/or liability toward affected customers (unfair competition by not complying with self-binding policies/liability toward customers by breach of contract) will often be more effective than mere fines or penalties.

The second problem of data protection has to do with the fact that these regulations are not internationally harmonized, causing severe complications (especially between the United States and the European Union) on a cross-border basis, which is the rule rather than the exception in modern business. To make data privacy rules work in a global environment, the principles outlined in this article consider US standards (e.g., the US Federal Trade Commission’s Fair Information Practices), European standards (e.g., Data Protection Directive 95/46/EC and the General Data Protection Regulation [GDPR]), Asian regulations (e.g., Hong Kong Personal Data Privacy Ordinance [PDPO]) and international benchmarks (e.g., the Organization for Economic Co-operation and Development [OECD] Privacy Framework Basic Principles).

This article also considers the fact that common data privacy regulations, especially in Europe, tend to focus on a traditional human rights approach, neglecting the fact that nowadays, data are usually given away voluntarily upon contractual agreement. When using sites such as Google, Baidu, Amazon, Alibaba or Facebook, users agree with the terms and conditions of these companies. Data privacy should consider not only mere data protection, but also contractual principles, among which one of the oldest and most fundamental is do ut des , meaning a contract in which there is a certain balance between what is given and what is received. That philosophy explains why companies such as Google or Facebook, for whose services the customer does not pay, have the right to use personal data. In other words, that tradeoff—data for services—is the balance. 4

The consumer being less protected when receiving free services is a basic element of the European E-Commerce Directive, which does not apply to services that are offered free of charge. But this consideration is only a first step. Applied to a modern data environment, a balance also has to be struck in relation to other parameters relevant to contractual aspects of data privacy. Since data are a contract matter, it is important to consider what kind of personal data are in consideration (e.g., sensitive and nonsensitive data have to be distinguished and treated differently), and since contracts are concluded by mutual consent, the extent of such consent also has to be taken into account. For example, does consent have to be declared explicitly or is accepting the terms of use sufficient?

The IDPPs approach takes into consideration the Asian, European, US and international data protection standards and focuses on personal data, but can apply to corporate data as well. These principles suggest that the three parameters (payment, consent and data category) should be balanced and combined with the previously mentioned, Asian, European, US and international standards, putting them into a set of privacy rules. Organizations in compliance with international data privacy standards should commit to the following 13 IDPPs: 5

  • Comply with national data protection or privacy law, national contract law, and other legal requirements or regulations relating to data privacy.
  • Comply with current security standards to protect stored personal data from illegitimate or unauthorized access or from accidental access, processing, erasure, loss or use.
  • Implement an easily perceptible, accessible and comprehensible privacy policy with information on who is in charge of data privacy and how this person can be individually contacted, why and which personal data are collected, how these data are used, who will receive these data, how long these data are stored, and whether and which data will be deleted or rectified upon request.
  • Instruct employees to comply with such privacy policies and avoid activities that enable or facilitate illegitimate or unauthorized access in terms of IDPPs.
  • Do not use or divulge any customer data (except for statistical analysis and when the customer’s identity remains anonymous), unless the company is obliged to do so by law or the customer agrees to such use or circulation.
  • Do not collect customer data if such collection is unnecessary or excessive.
  • Use or divulge customer data in a fair way and only for a purpose related to activities of the company.
  • Do not outsource customer data to third parties unless they also comply with standards comparable to these IDPPs.
  • Announce data breaches relating to sensitive data.
  • Do not keep personal data for longer than necessary.
  • Do not transfer personal data to countries with inadequate or unknown data protection standards unless the customer is informed about these standards being inadequate or unknown and agrees to such a transfer.
  • Inform the costumer individually and as soon as reasonably possible in the event of a data breach.
  • Inform the customer upon request about which specific data are stored, and delete such data upon request unless applicable laws or regulations require the company to continue storing such data.
  • Do not use or divulge content-related personal data.
  • Do not use or divulge any other personal data without the customer’s explicit, separate and individual consent.
  • Do not store, use or divulge any customer data, unless applicable laws or regulations require the company to continue storing such data.
  • Inform the customer as soon as reasonably possible in the event of data breaches.
  • Inform the customer upon request what types of sensitive data are stored and delete such data upon request when such data are outdated, unless applicable laws or regulations require the company to continue storing such data.
  • Do not use or divulge sensitive data without the customer’s explicit, separate and individual consent.

The Hong Kong Personal Data Privacy Ordinance

The 1980 OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (the OECD Privacy Guidelines) 6 are often the standard that data protection laws of many countries reference. 7

The OECD Privacy Guidelines have eight basic principles:

  • Collection Limitation Principle
  • Data Quality Principle
  • Purpose Specification Principle
  • Use Limitation Principle
  • Security Safeguards Principle
  • Openness Principle
  • Individual Participation Principle
  • Accountability Principle

Being a framework with the aim of providing guidelines to jurisdictions to enact their own privacy laws, the definitions of these principles are at a high level deliberately. When these high-level principles are converted to national laws, many jurisdictions take on the same principles-based approach. For example, the UK’s Data Protection Act uses eight DPPs, 8 Australia’s Privacy Act has 13 privacy principles, 9 and the Canadian Personal Information Protection and Electronic Documents Act has 10 principles. 10

For the purpose of illustration, the remaining part of this article will use Hong Kong’s PDPO, enacted in 1995 and Asia’s first privacy law, to highlight the salient points on how ethical considerations are built within the implementation of privacy legislation that is compatible with the OECD Privacy Guidelines.

The Six Data Protection Principles of PDPO

An explanation of the DPPs is provided by the Hong Kong, 11 Office of the Privacy Commissioner for Personal Data, and can be summarized as:

  • Personal data must be collected in a lawful and fair way for a purpose directly related to a function/activity of the data user (i.e., those who collect personal data).
  • Data subjects (i.e., individuals from whom personal data are collected) must be notified of the purpose and the classes of persons to whom the data may be transferred.
  • Data collected should be necessary, but not excessive.
  • Accuracy and Retention Principle —Personal data must be accurate and should not be kept for a period longer than is necessary to fulfill the purpose for which they are used.
  • Data Use Principle —Personal data must be used for the purpose for which the data are collected or for a directly related purpose, unless voluntary and explicit consent with a new purpose is obtained from the data subject.
  • Data Security Principle —A data user needs to take reasonably practical steps to safeguard personal data from unauthorized or accidental access, processing, erasure, loss or use, while taking into account the harm that would affect the individual should there be a breach.
  • Openness Principle —A data user must make personal data policies and practices known to the public regarding the types of personal data it holds and how the data are used.
  • Data Access and Correction Principle —Data subjects must be given access to their personal data and allowed to make corrections if the data are inaccurate.

The PDPO is principle-based and is not a piece of prescriptive law. Knowing the underlying ethical considerations for each principle will help an organization to better understand the spirit and the letter of the law when developing a compliance program. In particular, ethical relevance is clearly evident in the implications of PDPO privacy protection principles:

  • DPP1 explains that the collection of personal data must be fair and that personal data collected should not be excessive. Whether the collection is fair and excessive will have to be assessed under the circumstance. Given that fairness and excessiveness for one person may not be the same for another person, there is, inevitably, a judgment involved in the assessment. That relativistic judgement will, in turn, be influenced by the society’s acceptable behavior and value, i.e., its collective ethical belief.
  • DPP2 states that collected personal data are not to be kept for longer than is necessary. As there is also an element of judgment on necessity, it can be argued on utilitarian grounds that there could be an ethical dilemma in deciding on a short retention period that is protective of the individuals or a longer period that is protective of the interests (commercial or otherwise) of the organization that collects the personal data.
  • DPP3 states that data use that is not directly related to the original purpose may be carried out only with the consent of the individual. This may be translated as respecting the wishes of the individuals. Even if the organization thinks that the changed use would be beneficial to individuals, the organization has no right to take away the individual’s free will and choice.
  • DPP4 states that organizations should implement reasonable security protection on the collected personal data to prevent data leakage. While leaving aside the decision on how many resources and how much effort an organization should use to protect the personal data collected, DPP4 asks organizations to balance the resources and effort against the likely harm to individuals.

In 2010, ethical considerations related to data protection played a major role in testing existing laws. The Octopus card is an “electronic wallet” that many Hong Kong residents use for daily transportation and everyday purchases. In 2010, it was discovered that Octopus Cards Limited, the company that owned the cards, was selling card owners’ loyalty membership to insurance companies for direct marketing purposes. As a result of public outcry, the privacy commissioner investigated and concluded that while the sales of customer records was not prohibited by the law at the time, the company failed to make a meaningful effort to seek consent from customers when it informed them of this data use in a privacy policy statement.

The company denied contravening the law, but accepted that its actions felt short of customer expectations. Two major officers of the company stepped down during the investigation. 12 , 13

The heightened public awareness of personal data rights that arose in the wake of the incident changed expectations of organizational behavior. No longer will people accept companies doing only the bare minimum required by law; they must also act ethically. The chief executive officer (CEO) who took over Octopus Cards Limited after the incident captured the new expectations succinctly: “We need to do not just [what is] legal, but what is right.” 14

The Hexa-dimension Code of Conduct

A code of conduct serves a variety of functions, one of the most important of which is to serve as a guide for stakeholders based on a set of rules and standards. Despite official adoption, company policies and standards, which tend to be difficult for stakeholders, including employees to absorb, are not easy to enforce effectively and are probably ignored in the end. A code of conduct, if formulated and articulated well, should serve to communicate the policies and standards to stakeholders in a relatable way. While such codes may serve to deter potential offensive actions, they are limited in enforcing those rules or standards; they rely purely on the moral obligation of the stakeholders concerned, because violation by itself does not, in general, attract any criminal charges in the legal sense. However, despite the good intention and official adoption, the code by itself cannot guarantee more ethical behavior, and auxiliary measures must be in force to operationalize the rules and standards effectively.

Organizations of all varieties might have some kind of code of practice in place. However, the extant codes invariably tend to focus on technical, financial and legal issues and are insufficient when considering the ethical, social and ecological concerns that rapidly emerge and ascend to the top of corporate and IT management agendas. Different organizations have their own unique policies and a unique code of conduct; there can be no universal recipe, only a general guideline. As a general guideline for designing a code of conduct, the hexa-dimension framework is recommended. This framework comprises two major components: the theoretical hexa-dimension metric for measuring legal validity, social desirability, ecological sustainability, ethical acceptability, technical effectiveness and financial viability (the six requirements/factors) and a scheme for operationalizing the framework. The operationalization scheme is carried out in three major steps including:

  • Identify the relevant critical factors depending on the target end users (corporatewide or a functional unit or nature of operation). For example, environmental impact is critical for a mining company or a factory, but could probably be skipped for an information security unit.
  • Secure the support of the board of directors with respect to corporate policy aspects and the supporting infrastructures that include the organization’s human resources (HR) management, legal, finance, and information and communications technology functional units with respect to technical support and reference. An appraisal of ethical consistency in conduct should be included during annual performance reviews (by HR).
  • Determine a schedule for quantifying the elements of each factor for measuring, prioritizing and balancing the factors. The attributes/factors with help determine the steps to be taken to measure the effectiveness.

If properly and appropriately formulated and articulated, the code can be useful in disseminating the policies and standards throughout the organization and beyond, thus cultivating corporatewide ethical, professional conduct. While a code may deter potential offensive actions, it is limited in enforcing the rules or standards. The limitation exists because the code can rely only on the stakeholders’ sense of morality because violation of the code does not entail any criminal charges. Auxiliary measures must be put in place to arrive at desirable results, such as executive actions that provide rewards and impose punishment (e.g., discussing the hexa-dimensional code of conduct during an annual performance appraisal when those being appraised are asked to exemplify that their assigned duties were carried out in a manner consistent with data privacy protection policies, i.e., not breaking the law; not harmful to individuals and society at large; not wasteful of the resources available including the computer facilities, the workforce, the budget; and not harming the environment).

Information security professionals are in urgent need of effective and pragmatic guidance for developing data privacy protection standards for two major reasons. The first is that the information security function in a technology-driven information-intensive environment becomes more complicated due to new risk (e.g., socio-techno risk); the second is that data privacy protection becomes a primary concern to information security management as privacy infringement occurs frequently and attracts wide coverage in the media. Viewing privacy from the perspective of ethics can help enterprises establish and improve their code of conduct. Considering privacy from an ethical point of view and establishing a code of conduct makes all individuals in an organization, not just security personnel, accountable for protecting valuable data.

1 Zankl, W.; The International Data Privacy Principles, presented at Harvard University, Cambridge, Massachusetts, USA, October 2014, www.e-center.eu/static/files/moscow-dataprivacy-handout-russian.pdf 2 The Personal Data (Privacy) Ordinance, Chapter 486 , https://www.pcpd.org.hk/english/files/pdpo.pdf and https://www.pcpd.org.hk/english/data_privacy_law/ordinance_at_a_Glance/ordinance.html 3 Lee, W. W.; “A Hexa-dimension Metric for Designing Code of Ethical Practice,” Encyclopedia of Information Science and Technology, 4 th Edition , July 2017 4 It has been argued that companies using, for instance, their Internet platforms for advertising purposes are already being paid by the advertisers so there is no need for further payment with data. This point of view neglects the fact that do ut des refers to a balance between quid pro quo of the contracting parties and not between these parties and third parties. That is why readers have to pay for magazines despite the publisher receiving payments from third parties (advertisers). 5 Op cit , Zankl 6 Organization for Economic Co-operation and Development, Guidelines on the Protection of Privacy and Transborder Flows of Personal Data , www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm 7 Greenleaf, G; “Global Data Privacy Laws 2015: 109 Countries, with European Laws Now a Minority,” 133 Privacy Laws & Business International Report , February 2015, p. 14-17 8 Information Commissioner’s Office, Data Protection Principles, United Kingdom, https://ico.org.uk/for-organisations/guide-to-data-protection/data-protection-principles/ 9 Office of the Austrailian Information Commissioner, Australian Privacy Principles, https://oaic.gov.au/privacy-law/privacy-act/australian-privacy-principles 10 Office of the Privacy Commissioner of Canada, PIPEDA Fair Information Principles, September 2011, https://www.priv.gc.ca/en/privacy-topics/privacy-laws-in-canada/the-personal-information-protection-and-electronic-documents-act-pipeda/p_principle/ 11 Office of the Privacy Commissioner for Personal Data, Six Data Protection Principles, Hong Kong, https://www.pcpd.org.hk/english/data_privacy_law/6_data_protection_principles/principles.html 12 Ng, J.; “Octopus CEO Resigns Over Data Sale,” The Wall Street Journal , 4 August 2010 13 Chong, D.; “Second Octopus Boss Quits Amid Scandal,” The Standard , 20 October 2010 14 Cheung, S.; “The Challenges of Personal Data Privacy in A New Era,” International Conference on Privacy Protection in Corporate Governance, 11 February 2014, https://www.pcpd.org.hk/privacyconference2014/programme.html

Wanbil W. Lee, DBA, FBCS, FHKCS, FHKIE, FIMA Is principal director of Wanbil & Associates, founder and president of The Computer Ethics Society, and cofounder and Life Fellow of the Hong Kong Computer Society. He serves on committees of several professional bodies, editorial boards and government advisory committees. He has held professorial and adjunct appointments in a number of universities. His expertise is in information systems, and he has a strong interest in information security management, information systems audit and ethical computing.

Wolfgang Zankl, Ph.D. Is a professor of private and comparative law at the University of Vienna (Austria) and associate lecturer for social media law at the Quadriga University (Berlin, Germany). He founded and runs the European Center for E-commerce and Internet Law (e-center.eu) and is a board member of The Computer Ethics Society.

Henry Chang, CISM, CIPT, CISSP, DBA, FBCS Is an adjunct associate professor at the Law and Technology Centre, the University of Hong Kong. Chang is an appointed expert to the Identity Management and Privacy Technologies Working Group (SC27 WG5) of the International Organization for Standardization (ISO). His research interests are in technological impact on privacy, accountability and Asia privacy laws.

data privacy act case study

Animo Repository

  • < Previous

Home > FACULTY_WORKS > FACULTY_RESEARCH > 5993

Faculty Research Work

Data privacy act of 2012: a case study approach to philippine government agencies compliance.

Michelle Renee D. Ching , De La Salle University, Manila Bernie S. Fabito , De La Salle University, Manila Nelson J. Celis , De La Salle University, Manila

College of Computer Studies

Department/Unit

Information Technology

Document Type

Archival Material/Manuscript

Publication Date

The Philippine Data Privacy Act (DPA) of 2012 was enacted to protect the personal information of its citizens from being disclosed without its consent. The National Privacy Commission (NPC) was established in 2015 to promote, regulate, and monitor data privacy compliance of both Government and Private Institutions. This study sought to explore and explain how and why do the Philippine Government agencies comply with the DPA 2012. Additionally, it also tried to determine and understand the determinants of compliance as perceived by the government agencies. The Commission on Higher Education (CHED) and the Commission on Elections (COMELEC) were the focus of the interviews conducted by the researchers. The NPC was also included in the study to determine the status of the government’s compliance with the law. The study was a form of a qualitative case study following the context of (R. K. Yin, Case Study Research (2014)) study of research designs and methods. The case study is the recommended approach as the main question starts with how and why . As a result of the study, it was found out that there are three factors that somehow influence government agencies from hampering their compliance to the DPA 2012. These are (1) lack of awareness, (2) budget, and (3) time constraints. With regards to the determinants of compliance, (1) deterrence, and (2) legitimacy were the concluded causal factors on why they will comply with the DPA 2012. For future works, it is recommended that a follow-up study be conducted after the compliance deadline.

Recommended Citation

Ching, M. D., Fabito, B. S., & Celis, N. J. (2018). Data Privacy Act of 2012: A case study approach to Philippine government agencies compliance. Retrieved from https://animorepository.dlsu.edu.ph/faculty_research/5993

  • Disciplines

Civil Rights and Discrimination | Public Administration

Data privacy—Law and legislation—Philippines; Administrative agencies—Philippines—Rules and practice

Upload File

This document is currently not available here.

Since June 01, 2022

Advanced Search

  • Notify me via email or RSS
  • Collections
  • Colleges and Units
  • Submission Consent Form
  • Animo Repository Policies
  • Animo Repository Guide
  • AnimoSearch
  • DLSU Libraries
  • DLSU Website

Home | About | FAQ | My Account | Accessibility Statement

Privacy Copyright

Data privacy act of 2012 compliance performance of Philippine government agencies: a case study approach

New citation alert added.

This alert has been successfully added and will be sent to:

You will be notified whenever a record that you have chosen has been cited.

To manage your alert preferences, click on the button below.

New Citation Alert!

Please log in to your account

Information & Contributors

Bibliometrics & citations.

  • Walters R Novak M Walters R Novak M (2021) The Philippines Cyber Security, Artificial Intelligence, Data Protection & the Law 10.1007/978-981-16-1665-5_8 (197-220) Online publication date: 25-Aug-2021 https://doi.org/10.1007/978-981-16-1665-5_8
  • Presbitero J Ching M Ng V Park C Hou Y Huarng K Wollenberg A (2018) Assessing compliance of Philippine state universities to the data privacy act of 2012 Proceedings of the 2nd International Conference on E-commerce, E-Business and E-Government 10.1145/3234781.3234800 (90-94) Online publication date: 13-Jun-2018 https://dl.acm.org/doi/10.1145/3234781.3234800

Index Terms

Social and professional topics

Computing / technology policy

Privacy policies

Recommendations

Commitment on data privacy towards e-governance: the case of local government units.

The proliferation of ICT in the government sector is a crucial tactic in achieving different dimensions of public trust and services, especially that government offices and local government units (LGUs) are gearing toward e-governance as a way to manage ...

Performance compliance of Philippine national government agency on the data privacy act of 2012: a qualitative case study

The "Data Privacy Act of 2012" also known as the Republic Act No. 10173 of the Philippines aims to safeguard the personal data of its citizens, this gave rise to the creation of the National Privacy Commission (NPC) in 2015 and this agency is tasked to ...

Philippine SUCs compliance performance on RA 10173: a case study on Bukidnon State University

The advancements in technologies have accelerated educational institutions by improving service delivery while reducing costs. As a result, it provided avenues of learning, the use of online discussions, Virtual Learning Environments (VLEs) and mobile ...

Information

Published in.

cover image ACM Other conferences

  • Conference Chairs:

The Hong Kong Polytechnic University, Hong Kong

Korea University, South Korea

Tamkang University, Taiwan

  • Program Chairs:

Feng Chia University, Taiwan

St. George's University, Grenada

Association for Computing Machinery

New York, NY, United States

Publication History

Permissions, check for updates, author tags.

  • data privacy
  • data privacy compliance
  • e-governance
  • personal information
  • Research-article

Contributors

Other metrics, bibliometrics, article metrics.

  • 2 Total Citations View Citations
  • 118 Total Downloads
  • Downloads (Last 12 months) 3
  • Downloads (Last 6 weeks) 0

View Options

Login options.

Check if you have access through your login credentials or your institution to get full access on this article.

Full Access

View options.

View or Download as a PDF file.

View online with eReader .

Share this Publication link

Copying failed.

Share on social media

Affiliations, export citations.

  • Please download or close your previous search result export first before starting a new bulk export. Preview is not available. By clicking download, a status dialog will open to start the export process. The process may take a few minutes but once it finishes a file will be downloadable from your browser. You may continue to browse the DL while the export process is in progress. Download
  • Download citation
  • Copy citation

We are preparing your search results for download ...

We will inform you here when the file is ready.

Your file of search results citations is now ready.

Your search export query has expired. Please try again.

Accessibility Links

  • Skip to content
  • Skip to search IOPscience
  • Skip to Journals list
  • Accessibility help
  • Accessibility Help

Click here to close this panel.

Purpose-led Publishing is a coalition of three not-for-profit publishers in the field of physical sciences: AIP Publishing, the American Physical Society and IOP Publishing.

Together, as publishers that will always put purpose above profit, we have defined a set of industry standards that underpin high-quality, ethical scholarly communications.

We are proudly declaring that science is our only shareholder.

National Government Agency's Compliance on Data Privacy Act of 2012 a Case Study

V Pitogo 1,2

Published under licence by IOP Publishing Ltd Journal of Physics: Conference Series , Volume 1201 , International Conference on Electronics Representation and Algorithm (ICERA 2019) 29–30 January 2019, Yogyakarta, Indonesia Citation V Pitogo 2019 J. Phys.: Conf. Ser. 1201 012021 DOI 10.1088/1742-6596/1201/1/012021

Article metrics

7735 Total downloads

Share this article

Author affiliations.

1 College of Computing and Information Sciences, Caraga State University, Butuan City, 8600 Philippines

2 De La Salle University, Taft Avenue, Manila, 1004 Philippines

Buy this article in print

The Republic Act (RA) No. 10173 of the Philippines, or also known as "Data Privacy Act of 2012" (DPA of 2012) was established to protect and to safeguard the individual personal data in an information and communications systems within the government and the private sector while keeping the fundamental human right to privacy of communication. The creation of the National Privacy Commission (NPC) as an independent body mandated to administer the DPA of 2012; to monitor and to ensure compliance with privacy, and to provide proper data protection procedures. This empirical qualitative research using case study approach aims to discover and describe why and how the country's frontline agency ( Agency A ) complies with DPA of 2012. This paper also seeks to determine the challenges and better practices encountered by the agency relative to compliance and how far the level of commitment does Agency A has taken on. Results further showed that Agency A is Partially Compliant with the Act. It also encountered some challenges such as (a) lack of awareness, (b) wait-and-see attitude, and (c) time and resource constraints. Determinants of compliance like: (i) general deterrence, (ii) legitimacy of regulations, and (iii) reputation and publication had a compelling casual factor why it has conformed to the law. It is further recommended to conduct a follow-up study once the recent Information Systems Strategic Plan (ISSP) is available, vis-a-vis to March 8, 2018, extended deadline.

Export citation and abstract BibTeX RIS

Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence . Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.

DPO Centre

GDPR & Data Protection Act Case Studies

At The DPO Centre, we understand that every organisation is different, as even organisations in the same industry may not have the same requirements. We offer help and tailored support to organisations across a wide variety of sectors ,  assisting them to comply with data protection laws such as the UK and EU GDPR.  

On this page, we’ve featured case studies from a cross section of our client base of over 900 organisations. These case studies feature clients from a range of industry sectors, including tech, consumer products and services, health and medical, life sciences and clinical trials, charities and education.  They highlight our work with these clients where we’ve provided outsourced DPO consultancy, UK and EU Representation services, Data Subject Access Requests (DSARs) support , and staff training and awareness.  

data privacy act case study

Apreo Health

Sector : Medical & Health

Key Challenges : EU Representation, GDPR Understanding, Special category data

Services : Outsourced DPO

Leaf Case Study

Sector : Tech

Key Challenges : Complex Data Processes, Data Minimisation, Policies & Documentation

Reveal Media

Reveal Media

Key Challenges : Data Minimisation, Large volumes of information, Transparency

data privacy act case study

Copleston High School

Sector : Education

Key Challenges : Data sharing agreements with external third parties, FOI request, Large number of complex DSARs

Services : DSAR support

Eaton House Schools - Case Study

Eaton House Schools

Key Challenges : Complex Application of Exemptions, DSAR Timeline, Extensive Data & Documentation

data privacy act case study

Key Challenges : Complex Data Handling, Special category data

data privacy act case study

Sector : Charities & Not-For-Profits

Key Challenges : No Retention Policies in Place, Special category data

data privacy act case study

Birmingham & Solihull Mental Health NHS Foundation Trust

Sector : Government OrganisationsMedical & Health

Key Challenges : Complex DSAR, High number of data subjects, Large volumes of information

data privacy act case study

London Borough Barking & Dagenham Council

Sector : Government Organisations

Key Challenges : Complex DPIAs, Implementation of "Privacy by Design", Public Interest

data privacy act case study

Portman Dental Care

Key Challenges : Complex DSAR, Significant redaction, Special category data

data privacy act case study

Sector : Manufacturing & Engineering

Key Challenges : Complex records of processing activities (RoPA), Policies & Documentation, Staff upskilling

data privacy act case study

Sector : Professional Services

Key Challenges : Rapid business expansion, Staff upskilling

Services : Outsourced DPO, Staff Training & Awareness

data privacy act case study

Sector : Consumer Products & Services

Key Challenges : Complex Data Handling, Data security, Policies & Documentation, Unique Retail Environment

Services : Consultancy, Outsourced DPO

data privacy act case study

Key Challenges : Complex Data Processing Agreements, Implementation of "Privacy by Design", Staff upskilling

data privacy act case study

Key Challenges : Data subject rights requests, High number of data subjects, Large volumes of information

Services : Staff Training & Awareness

data privacy act case study

Clinisupplies

Key Challenges : Complex DPIAs, Data Processing Agreements, DSPT toolkit submission

data privacy act case study

Sector : Finance & Insurance

Key Challenges : Providing services in an FCA regulated industry, Rapid business expansion, Varying client size

data privacy act case study

Spencer Private Hospitals

Key Challenges : Complex records of processing activities (RoPA), DSPT toolkit submission, Special category data

data privacy act case study

Shard Capital

Key Challenges : Complex Data Handling, Large volumes of information, Rapid business expansion

data privacy act case study

Unbar Rothon

Key Challenges : Complex DSAR, Large volumes of information, Significant redaction

Services : Consultancy, DSAR support, Staff Training & Awareness

data privacy act case study

Key Challenges : Complex Data Handling, Data retention, Global Company

data privacy act case study

Key Challenges : EU Representation, Policies & Documentation, Special category data

Services : EU Representation, Staff Training & Awareness

data privacy act case study

NSPCC Fundraising

Key Challenges : Complex Data Handling, Handling sensitive data, Managing Consent, Special category data

NSPCC Child protection

NSPCC Children’s Services

Key Challenges : Complex Data Handling, Handling sensitive data, Special category data

What our clients say about us

The DPO Centre’s Data Protection Officers are proud to have worked with all of our clients, but what do our clients think of working with us? The testimonials below have been provided by the organisations featured in our case studies. Our clients continue to emphasise our expertise, systematic , risk-based approach, pragmatic, solution-oriented advice and transparent communication.  

quote

We are really pleased with our DPO from The DPO Centre, who understood our needs and was able to translate them into a workable plan that has greatly assisted our business’s compliance journey. The DPO Centre’s advice and support has assisted us in ensuring that our compliance level has remained high despite the challenges that rapid growth presents.

The DPO Centre’s help in dealing with a particularly complex DSAR that we received was invaluable. The support and advice that they provided throughout the entire process was extremely helpful… Overall, working with The DPO Centre greatly reduced the significant challenge of dealing with this DSAR

Professional Case Management

Jenifer mcintosh.

The DPO I had the pleasure of working with on that project is one of the best DPO/counsels I have worked with when it came to thoughtfully negotiating through a clinical trials-DPA, given his great working knowledge of the GDPR and the crossover with clinical trials regulations in both EU & UK.

Drew Davies

The DPO Centre’s team are always on hand to answer any queries we may have and to help us respond to any Data Subject Access Requests from any trial member across the EU.

Ufford Park

Josie hopps.

I cannot recommend The DPO Centre enough; from start to finish the process has been simple and the whole team here at Ufford Park Hotel have felt informed and supported with the suggested changes and improvements.

Hughes Electrical

Henrico doward.

We have had a positive working relationship with Rob and the team, from our first meeting and the insightful workshop – from review to implementation – the process has been straightforward and hassle-free.

MACC International Ltd

John morrison.

The work with our staff has been conducted in a thoroughly clear, concise and systematic way, with no stone left unturned.

West Suffolk College

Jules bridge.

With the excellent ongoing help, support and guidance from the DPO Centre team, we now have the comprehensive and prioritised action plan that we need to work toward and implement – meaning we feel we are GDPR ready.

Positive Steps PT

Kevin marshall.

It’s been really cost-effective for my business to enlist the help of the DPO Centre – not only was I able to get a full impact assessment of my business and resolutions to the issues identified but I have received training on so many aspects of the GDPR.

Data Protection Services

Do any of our case studies sound like your organisation? At The DPO Centre, we help organisations of all types to comply with UK and EU GDPR and the other UK, EU and global data protection laws. Our services will help your organisation to better understand your data and current level of compliance.  We provide tailored advice, expertise and resources that are backed up by the support, shared best practices, and model documentation we’ve developed from working with over 900 organisations worldwide. If your organisation could benefit from these services, please get in touch using the form below.  

Enquire Today

Fill in your details below and we’ll get back to you as soon as possible

  • DOI: 10.1166/ASL.2018.12404
  • Corpus ID: 117270177

Data Privacy Act of 2012: A Case Study Approach to Philippine Government Agencies Compliance

  • M. Ching , Bernie S. Fabito , Nelson J. Celis
  • Published in Advanced Science Letters 1 October 2018
  • Law, Political Science, Computer Science

13 Citations

Data privacy act of 2012 compliance performance of philippine government agencies: a case study approach.

  • Highly Influenced

Philippine SUCs compliance performance on RA 10173: a case study on Bukidnon State University

  • 25 Excerpts

RA 10173 and its challenges to Philippine state universities and colleges' compliance performance: the case of Mindanao State University - General Santos City

Understanding philippine national agency's commitment on data privacy act of 2012: a case study perspective, evaluating the usability and effectiveness of the ggo painfree telehealth system (beta version) for musculoskeletal pain management in the philippines, plights of tenured teachers in integrating technology in classroom instruction - a phenomenological inquiry, ethics in ai governance: comparative analysis, implication, and policy recommendations for the philippines, ad-gency: hospital patients' admission management information system and analytics development on health emergency situation, national government agency’s compliance on data privacy act of 2012 a case study, commitment on data privacy towards e-governance: the case of local government units, related papers.

Showing 1 through 3 of 0 Related Papers

The ICO exists to empower you through information.

Case studies and examples

Share this page.

  • Share via Reddit
  • Share via LinkedIn
  • Share via email

Our data sharing code provides real-world examples and case studies of different approaches to data sharing, including where organisations found innovative ways to share data while protecting people’s information.

Here are some case studies additional to those in the code.

Data sharing to improve outcomes for disadvantaged children and families

Sharing with partners in the voluntary or private sector, landlord and tenant data sharing, sharing medical records of care home residents, ensuring children’s welfare: data sharing by local authorities with ofsted, the regulator of social care and early years provision in england, effective information sharing between the police and ofsted in england.

  • Sharing medical records between GP practice and hospital trust

Improving data sharing processes and practices at an NHS trust

Improving health services with responsible data sharing, sharing health data for research purposes.

Social workers frequently need access to information about children and their families when deciding whether there is a safeguarding risk and what support is most appropriate.

Two councils in different areas of the UK partnered with a not-for-profit organisation to find a data sharing solution where social workers would have all the information they need from the start.

After extensive user research and workshops with stakeholders and families, they found that social workers needed access to the contact details of the lead practitioner of a case from other services (police, housing, schools and adult social care), and basic information about when the service was last involved with the family. The research found that sharing such data would:

  • reduce the amount of time social workers spend looking for information;
  • enable more joint working among services (eg children’s social care working more closely with adult social care);
  • ensure social workers have access to all the information they need when assessing safeguarding risk and making support decisions for children and their families; and
  • allow children and families to access better, more timely services.

At the same time, the two councils and the not-for-profit organisation explored the information governance and ethical implications of accessing and using sensitive personal data within social care. They ran ethics workshops with the project team and conducted user research with those most likely to be affected by the data sharing (residents who have had contact with social care and social workers).

The research enabled the two councils to design, build and embed a digital data sharing solution that empowers social workers, enables professional judgement, protects privacy, and ultimately enables children and their families to access the right support and reach their potential.

A group of voluntary sector organisations worked with health and social care partners (both private and public sectors) on a project to deliver improved outcomes for older people in the community and in hospital.

The project team recognised that it needed to establish a culture of shared information, along with a phased, proactive approach to seeking individuals’ consent. It also recognised that the involvement of volunteers could have implications for the sharing of data within the project team, as they have a different legal status to the agencies’ employees and might not have received the same level of training as employees in the work of the organisation.

The project was set up as follows:

  • The volunteers signed contracts setting out their roles, responsibilities and standards - including those for information security - equivalent to those of the agencies’ employees. The contracts were intended to formalise and support the volunteers’ responsibilities for gathering and sharing information. Training and ongoing support were provided to the volunteers.
  • GPs asked their elderly patients whether they would like to take part in the project. They were asked specifically whether they agreed to relevant information from their health record being shared with a multi-disciplinary project team consisting of health, social care and voluntary sector practitioners.
  • At the initial home visit, the volunteer explained the information-sharing aspects of the service and asked for written consent.
  • All of the organisations and GP practices involved in the project entered into a single data sharing agreement. This built accountability and trust between the agencies involved.

Note it was important to consider whether the necessary legal power or ability to share personal data was in place. The legal power is separate from the lawful basis for data processing.

A housing association occasionally received requests from organisations such as utility companies, debt collectors and councils for information about current and former tenants. However it was considered not to be appropriate to enter into a data sharing agreement as the sharing was not on a regular basis.

On one occasion, a utility company contacted the housing association and asked for the forwarding address of a former tenant who was in arrears on his gas and electricity account. The housing association disclosed the information because they had advised tenants at the start of their tenancy that they would make such disclosures because of the contractual relationship between tenants and the utility company. All tenants had agreed to this.

On another occasion, a debt collection company acting for a third party contacted the housing association for the forwarding address of a former tenant. The housing association decided that it could not disclose the information because it had no lawful basis for the disclosure. It withheld the tenant’s new address from the debt collection company.

The housing association dealt with requests for information effectively because it had put a system in place which required a senior person or group of people, trained in data protection, to decide whether or not to release personal information on a case-by-case basis.

This involved verifying the identity of the requester, insisting that all requests were in writing and ensuring that the requester provided enough information to make a proper decision. If the housing association decided to share the information, they only provided relevant, necessary information and, in every case, they made a record of the disclosure decision.

Staff in a privately-owned care home did not have access to the recent medical history of residents. Instead, the home used to phone the GP practice or call out a GP every time they needed more information. This could be a risk, as the staff might need to check quickly what medicines residents were taking and at what dosages.

To make the process more efficient, the care home and the local GP practice signed up to a formal data sharing agreement, so the care home staff would have access to their residents’ electronic medical records when necessary.

The GP practice and local Clinical Commissioning Group made potential residents aware that if they were admitted to the care home there was a possibility that their medical record would be accessed. In addition, when patients were admitted to the care home, their explicit consent - or that of their representatives – was sought before their electronic medical record was accessed. Where consent was not provided, the former system of contacting a GP would continue to be used.

Other key features of the data sharing agreement were:

  • access to residents’ records could only take place while they were under the care of the home;
  • access was restricted to the clinical and professional nursing staff at the care home;
  • access was only allowed where this was necessary to provide treatment and for residents’ safety;
  • access was restricted to information relevant to the provision of care to residents;
  • access to the information was by secure means; and
  • the information obtained was held securely and in accordance with good information management practice.

A formal data sharing agreement can put in place effective safeguards for residents and can ensure the various parties involved in data sharing are working to a common set of rules. An agreement can also help to deal with the ethical and confidentiality issues that can arise in health and social care.

Even if there is a data sharing agreement in place, organisations still need to make sure that individuals whose data may be shared are aware of what is taking place. This can be done through the privacy information they provide, using various methods. In the circumstances outlined here, it might be more effective to talk to individuals to explain the situation and to find out whether they agree to their information being shared. Their decision needs to be documented.

Data sharing can help ensure the welfare of children and other vulnerable individuals.

This example concerns the sharing of personal data by staff in local authorities with Ofsted, in its role as the regulator of social care and early years provision in England.

The example focuses in particular on the role of the Local Authority Designated Officer (LADO), who is responsible for managing child protection concerns or allegations made against staff and volunteers who work with children and young people.

Data protection enables fair and proportionate data sharing. That means that LADOs should be confident they can share relevant information with other local authorities and with Ofsted. The information shared by LADOs helps Ofsted to build a complete picture about an individual’s suitability to provide services to children.

Mr D wants to register with Ofsted to provide a holiday play scheme for children in the Westtown Borough Council area. He has previously worked in a setting providing social care to children in the Easttown Borough Council area. His home is in the Northtown Borough Council area.

In order for Ofsted to reach a properly informed judgement on the suitability of anyone to provide services to children, it needs all relevant information about them. It is essential for the LADOs in Easttown and Northtown to share the information about Mr D with Ofsted when requested. This is the case irrespective of where Mr D lives or works.

This data sharing is vital, in order for Ofsted’s registration system to be effective in ensuring the safety of children.

The Chief Constable in Barsetshire police force promotes a culture where the safety of children is paramount. That includes officers in the force alerting authorities and sharing information appropriately to protect children from harm.

Officers are familiar with the role of Ofsted as the regulator in England of early years settings including childminders and nurseries, and of children’s social care services including children’s homes. Because of this, officers know that the information they share can be used by Ofsted to make children safe.

The force has provided a named contact that Ofsted staff can get in touch with, if they need to talk about concerns at any institution that Ofsted inspects or regulates. The police have been given a regional contact in Ofsted that they can get in touch with about any new information.

Police receive a call-out to a children’s home because a child has gone missing. This is not the first occasion that this child has gone missing. The child has a history of unexplained absences and is found hanging around in a local park with older young people, some of whom are known to police as gang members.

The police officers have two linked concerns that lead to them sharing information with authorities: the safety of the child who went missing, and the safety of the children’s home.

Actions taken by the police officers:

1. To safeguard the child, they contact the children’s social care team in the local authority and share information with social workers about the child’s involvement with the gang.

2. The police also contact Ofsted to tell them they are concerned that there have been multiple police call-outs to this children’s home because of children going missing. The children are vulnerable and the police consider they are at a high risk of involvement with a local gang.

This information is valuable to Ofsted who can use it to help the young people concerned. The children’s home had notified Ofsted about the child going missing, but they did not include information about the child being at risk of gang involvement.

Ofsted now considers the intelligence from police and, under its regulatory role, decides to visit the children’s home to find out what the manager and staff are doing to keep children safe and to reduce the risk of children being groomed by local gangs.

An inspector from Ofsted visits the home and finds that staff were unaware of the possible gang involvement by children in the home. Staff had not talked to children to find out where they were going or what they were doing, and although they had noticed some changes in the behaviour of the child who went missing, they had not recorded this or notified the child’s social worker. The inspector’s view is that safeguarding arrangements in the home do not appear to comply with the relevant regulations.

Because of the information shared by the police and the findings of the inspector, Ofsted is able to take regulatory action to ensure that safeguarding arrangements at the children’s home are improved. Ofsted schedules further visits to monitor practice at the home and to check that improvements have been made. The inspector continues to liaise with police to monitor the welfare of children in the home.

Sharing medical records between a GP practice and hospital trust

These scenarios apply to England only, but the general principles are relevant in  Northern Ireland, Scotland and Wales where health services are a devolved matter.

A GP practice received a request for the records of one of their patients. They are receiving care in a hospital in another part of the country. This is outside of the local shared care record initiative, which is a system that governs patient records sharing locally. The practice is confused about whether they require the patient’s consent to share the data.

Health and care settings often use the concept of consent. However, it is often misunderstood due to the use of the term in different contexts. In this case, the consent required to view and share confidential medical information is different from the consent that the data protection legislation defines as a lawful basis for processing personal data.

To help the GP practice, the hospital directs them to information available on the NHS IG Portal , a service that provides specific information governance advice to organisations that provide care services. The hospital also reminds the care setting of their responsibilities under the Health and Social Care (Quality and Safety) Act 2015 and Caldicott Principle 7 . This allows them to share someone’s personal data where it is likely to enable them to receive health or social care, and this is in their best interests.

After reading the guidance, the practice understands how this separate legal requirement for consent in a health and social care context interacts with consent as a lawful basis under data protection legislation. In this circumstance (they are sharing data for direct care purposes), they can share the data without the explicit consent of the patient. Their consent is implied due to the provision of health and care (ie, it is within the reasonable expectation of the patient for the care home to share information for these purposes). In addition, health and care staff have a legal duty to share information to support direct care.

Through the use of sector-specific guidance, organisations can reach a shared understanding of the data protection requirements for sharing data. This can reduce the friction that occurs between organisations as they consider their separate obligations under data protection law.

After receiving criticism that their procedures are hindering data sharing, an NHS trust’s information governance department establishes a new process within the organisation. This ensures people consult them in good time as part of any new processing activity that requires personal data.

In order to do this, they:

  • seek senior or executive level support for the proposal eg by the Senior Information Risk Owner (SIRO) or board where applicable;  
  • identify and review the points within the organisation where they establish new data processing activities and build information governance into business case and procurement checklists;
  • ensure timescale allocation for setting up required legal and governance documents such as data sharing agreements;
  • devise new template data protection impact assessments and data sharing agreements for organisations to use to simplify their processes;
  • provide training to the relevant staff and issue further communications across the organisation to highlight the new processes;
  • build professional networks with information governance colleagues in local organisations to learn best practice approaches and improve the information governance culture;
  • establish a review process to help understand occasions where they could not share data and apply the lessons learnt to future data sharing plans; and
  • hold a drop-in ‘meet the team’ session or issue an information sheet about their work and how their early participation will benefit colleagues.

Following this review and process redesign, the information governance team are now informed in good time about any new processing. They can ensure the team takes the appropriate governance steps before new processing takes place.

A healthcare care provider is looking to improve the services they offer their patients. By sharing appropriate levels of data with other care organisations in the area, the organisation realises they can improve services. However, the organisation traditionally avoids risk when it comes to sharing data. This adversely impacts the quality of care they can provide.

As the organisation looks to improve their data sharing practices, they decide to find ways they can assure themselves that whenever they shares data, they are doing so responsibly. They want to make sure they are adhering to the requirements of data protection law, common law and their responsibilities to their service users.

They refer to the considerations the ICO lays out in the data sharing checklist in the data sharing code of practice . The organisation builds on this by adding the following checks:

  • The status of the organisation (with respect to the legal powers provided by the Health and Social Care Act 2015 etc).
  • The nature of the processing and purpose for which the organisation needs to share data.
  • The status of the organisation they plan to share the data with, which could include reviewing the information in the NHS Data Security and Protection Toolkit (DSPT).
  • Other appropriate due diligence checks such as the NHS Digital Technology Assessment Criteria (DTAC).
  • The amount of data being requested for the purpose or purposes they are using it for.
  • The necessity for the data sharing (does it need to happen, or can the organisation achieve the purpose another way, for example using anonymised data?)
  • Ensuring the organisation has suitably informed the patients or service users of the proposed sharing and of their data subject rights.

After implementing this approach, the organisation feels more confident about sharing data. By keeping a record of their decisions, they are also demonstrating their accountability for their actions.

A hospital trust is preparing to trial a medical device that they are developing to support clinical decision-making for patients suffering from heart disease. The device is a data-driven app that applies a risk model based on details from the patient’s medical history. Although members of the trust’s clinical team developed the risk model, a third-party private company are developing the app itself.

The trust wishes to use patient data to support the research phase of the app development, which is part of the approval process for medical devices. This involves sharing patient data with the app’s developers for research purposes. As the app developer will need health information, which is capable of identifying people for this research, the hospital trust needs a legal basis for lifting the common law obligation of confidentiality to disclose and use the information for the purposes of this research programme. Before the trust shares the data, they consider a number of questions as part of their data protection impact assessment (DPIA), which include the following:

  • What is the lawful basis under UK GDPR to process this data?
  • What can they do to minimise the amount of data they need to process to effectively perform this task?
  • Will the trust be able to get explicit consent (common law) from each patient to view their medical information for this purpose? Is this practical? Are there other ways to satisfy the common law?
  • What approvals do they require in order to carry out the research?

Following a review of guidance relating to confidentiality and consent available on the NHS IG Portal , the Trust understands that they can identify a lawful basis under UK GDPR. However, for common law purposes they need to make an application to the Confidentiality Advisory Group (CAG) under section 251 of the NHS Act 2006 for advice on whether the research group can access the data without the patients’ explicit consent. This is because the purpose of the processing is not direct care, and they do not have the implied consent of the patient to access this data (under common law).

Following a successful CAG application and approval, the trust could share the information from their patient records in order to carry out this research. Analysis of the confidential patient information meant that the trust could confirm the effectiveness of their risk model and seek approval for their medical device.

Academia.edu no longer supports Internet Explorer.

To browse Academia.edu and the wider internet faster and more securely, please take a few seconds to  upgrade your browser .

Enter the email address you signed up with and we'll email you a reset link.

  • We're Hiring!
  • Help Center

paper cover thumbnail

Perceptions of Students, Faculty and Administrative Staff on the Data Privacy Act: An Exploratory Study

Profile image of David Paul Ramos

2019, JPAIR Multidisciplinary Research

The Data Privacy Act of 2012 was enacted to “protect the fundamental human right of privacy of communication while ensuring a free flow of information to promote innovation and growth.” Data privacy pertains to the right of an individual not to disclose his or her information. Since privacy is a universal human right, it is the responsibility of the government to protect the rights of its people to privacy and provide measures to protect their data. Given that the Data Privacy Act’s implementation is a relatively recent development in the Philippines, little is known about the various stakeholders’ perceptions towards it. A qualitative study which utilized semi-structured interviews was conducted to explore selected students’, faculty members’ and administrative staffs’ perceptions of the Data Privacy Act. Non-probability, purposive sampling was used to recruit six respondents. An interview guide was developed to help in the facilitation of the interviews. Data were analyzed through the 6-step thematic analysis by Braun & Clarke (2006). Four themes emerged: 1) Limited awareness of the law, 2) Somewhat familiar with the purpose/ functions of the law, 3) Issues in the implementation of the law in the academe, and 4) Ambiguity in the necessity of the law. Recommendations to improve compliance with the Data Privacy Act, such as the designation of personal information controllers or data privacy officers (DPO) to ensure that security measures are in place to protect personal and sensitive information, were also discussed.

Related Papers

Carlo Zamora

Information and communication technology (ICT) has been making its way into our lives since the invention of Internet and its applications, including the daily usage of internet social media. In recent years, it has conquered the education industry, providing school administrators and teachers a more challenging, yet effective and practical way of managing school operations. Teachers have been using technologyenhanced data collection and analysis as tools to aid their schools in planning, and implementing personalized, student-centered learning experiences for their students. While there are numerous positive effects, it goes without notice that privacy of students is being sacrificed. The Philippines enacted its privacy law, the Data Privacy Act of 2012 to protect its people from the growing use of data. As the law is relatively new, the researchers investigated the perceptions of high school teachers from public and private schools in Manila, Philippines towards data privacy and i...

data privacy act case study

International Journal of Contemporary Applied Researches

Rex Flejoles

Higher Education Institutions in the Philippines take initiatives to comply with the data privacy act. Among others, transparency principle is observed in their different processes. To provide additional literature on data privacy research, this study was conducted. This covered the assessment of Iloilo Science and Technology University’s data privacy implementation, particularly on the transparency principle. It focused on the registration and admission process in Miagao Campus. A 15-item questionnaire, in which items were extracted from the toolkit prepared by the National Privacy Commission, was prepared by the researcher for data gathering. The prepared questionnaire was distributed to the 30 employees and 480 students, identified through quota sampling method. The data were analyzed using mean, standard deviation, one-way ANOVA, Scheffe, and Pearson’s correlation. Although significant difference was found when students were grouped according to department and year level, their level of awareness on the data privacy implementation was “good”. Also, employees’ perception was “good”. Direct and strong, significant relationship was found between students’ awareness and employees’ perception.

David Cabonero

Journal of Physics: Conference Series

arif ridho lubis

Hawaii International Conference on System Sciences

Fay Cobb Payton

The right to privacy is not absolute and is often established by context and the need to know. The nature of the university environment sometimes distorts the sanctity of privacy because the &quot;need to know&quot; is so profuse. Although students are guaranteed the right to keep essential but confidential information private under the Family Educational Rights and Privacy Act of

Proceedings of the 5th UPI International Conference on Technical and Vocational Education and Training (ICTVET 2018)

Ronny Palilingan

International Journal of e-Education, e-Business, e-Management and e-Learning

Renata Mekovec

Varia Justicia

Edelweiss Putri

The disclosure of digital development and the openness of many online transactions often lead to data leakage. Furthermore, digital development on the one hand, provides benefits to the digital economy and at the same time also led to the new impact or threat to the conventional economy from the aspect of cyber-security vulnerabilities to harm customer information and challenge the concept of privacy. The lack of government consents the data protection against the 1945 Constitution. This study aims to propose accelerating the Indonesian Personal Data Protection Bill by The House of Representative Council (DPR). This study uses a normative juridical method with a statute approach, the data used is secondary data consisting of primary and secondary legal material. The result shows the urgency of designing new regulation prior to tackling the issue on data leakage and maintaining the confidentiality of the personal data of Indonesian citizens. Through the enacting PDP Law will benefit ...

Journal of Malaysian and Comparative Law

Dr. Md. Toriqul Islam

The world goes through diverse privacy dilemmas, particularly after the discovery of Information Communication Technologies (ICTs) in the 1960s. It can be argued that such a scenario shall continue in the coming days, as the vast majority of our works are done online using personal data. Perceivably, over the years, our online activities are expanded being facilitated by the pace, efficiency, accuracy, borderless connectivity, and commercial engagement of the ICTs. Eventually, we are always being captured, monitored, and identified by numerous public-private actors, and all these lead privacy to tremendous threats. There are no one-size-fits-all privacy problems due to the inefficiency of the regulatory measures, and the pace of the ICTs. These realities lead researchers across the globe to revisit the notion of 'privacy and data protection' to strike a balance between privacy invasion and enforcement, and Malaysia is not an exception. Nonetheless, there is no in-depth analysis of the adequacy of the current data protection regime of Malaysia. This article aims to fill that gap by revisiting the concepts of 'privacy' and 'data protection' and analyzing extensive literature in the field, keeping the Malaysian data protection regime in a special focus. The findings of this study reveal that in some respects, the data protection regime of Malaysia falls short of the global data protection standard. This study suggests that to strengthen the data protection regime of Malaysia, the policymakers may consider amending the Personal Data Protection Act 2010 (PDPA) in line with the international data protection standard, and especially, the General Data Protection Regulation (GDPR).

Loading Preview

Sorry, preview is currently unavailable. You can download the paper by clicking the button above.

RELATED PAPERS

European Journal of Privacy Law & Technologies

Lucilla Gatt

Jurnal Perspektif

sophie cockcroft

Shirley Ou Yang

Maor Weinberger

Ján Valášek

International Review of Law, Computers and Technology

Ali Alibeigi

The Daily Observer

DATA PRIVACY IN SCHOOLS: Higher learning institutions personnel’s data privacy concerns while using learning management systems in Tanzania

Godefroid karake (Godfrey)

Universitas-XX1: Revista de Ciencias Sociales y Humanas

International Journal of Electronic Governance

Eleni Tzortzaki

Nurul Azurah Mohd Roni

INTERNATIONAL CONFERENCE ON BIOMEDICAL ENGINEERING (ICoBE 2021)

tony dwi susanto

Moses Fabiyi

arXiv (Cornell University)

venessa darwin

Zenodo (CERN European Organization for Nuclear Research)

Sejarah: Journal of History Department, University of Malaya

Kudakwashe Maguraushe

Bangladesh Journal of Bioethics

Nader Ghotbi

IGI Global eBooks

Regina Connolly

Computer Law & Security Review

Rebecca Ong

Muhammad Firdaus

Alan Peslak

RELATED TOPICS

  •   We're Hiring!
  •   Help Center
  • Find new research papers in:
  • Health Sciences
  • Earth Sciences
  • Cognitive Science
  • Mathematics
  • Computer Science
  • Academia ©2024

U.S. flag

An official website of the United States government

The .gov means it’s official. Federal government websites often end in .gov or .mil. Before sharing sensitive information, make sure you’re on a federal government site.

The site is secure. The https:// ensures that you are connecting to the official website and that any information you provide is encrypted and transmitted securely.

  • Publications
  • Account settings

Preview improvements coming to the PMC website in October 2024. Learn More or Try it out now .

  • Advanced Search
  • Journal List
  • Sensors (Basel)

Logo of sensors

A Case Study on the Development of a Data Privacy Management Solution Based on Patient Information

Arielle verri lucca.

1 Laboratory of Embedded and Distribution Systems, University of Vale do Itajaí, Rua Uruguai 458, C.P. 360, Itajaí 88302-901, Brazil; rb.ilavinu.ude@elleira (A.V.L.); se.lasu@sotsuguasiul (L.A.S.); rb.ilavinu.ude@logirdor (R.L.); [email protected] (L.G.)

Luís Augusto Silva

2 Expert Systems and Applications Lab, Faculty of Science, University of Salamanca, Plaza de los Caídos s/n, 37008 Salamanca, Spain; se.lasu@oamgnezux

Rodrigo Luchtenberg

Leonardo garcez, raúl garcía ovejero.

3 Expert Systems and Applications Lab., E.T.S.I.I of Béjar, University of Salamanca, 37008 Salamanca, Spain; se.lasu@jevoluar

Ivan Miguel Pires

4 Instituto de Telecomunicações, Universidade da Beira Interior, 6200-001 Covilhã, Portugal; tp.ibu.ti@seripmi

5 Computer Science Department, Polytechnic Institute of Viseu, 3504-510 Viseu, Portugal

6 UICISA:E Research Centre, School of Health, Polytechnic Institute of Viseu, 3504-510 Viseu, Portugal

Jorge Luis Victória Barbosa

7 Applied Computing Graduate Program, University of Vale do Rio dos Sinos, Av. Unisinos 950, São Leopoldo, RS 93.022-750, Brazil; rb.sonisinu@asobrabj

Valderi Reis Quietinho Leithardt

8 Departamento de Informática da Universidade da Beira Interior, 6200-001 Covilhã, Portugal

9 COPELABS, Universidade Lusófona de Humanidades e Tecnologias, 1749-024 Lisboa, Portugal

10 VALORIZA, Research Center for Endogenous Resources Valorization, Instituto Politécnico de Portalegre, 7300-555 Portalegre, Portugal

Data on diagnosis of infection in the general population are strategic for different applications in the public and private spheres. Among them, the data related to symptoms and people displacement stand out, mainly considering highly contagious diseases. This data is sensitive and requires data privacy initiatives to enable its large-scale use. The search for population-monitoring strategies aims at social tracking, supporting the surveillance of contagions to respond to the confrontation with Coronavirus 2 (COVID-19). There are several data privacy issues in environments where IoT devices are used for monitoring hospital processes. In this research, we compare works related to the subject of privacy in the health area. To this end, this research proposes a taxonomy to support the requirements necessary to control patient data privacy in a hospital environment. According to the tests and comparisons made between the variables compared, the application obtained results that contribute to the scenarios applied. In this sense, we modeled and implemented an application. By the end, a mobile application was developed to analyze the privacy and security constraints with COVID-19.

1. Introduction

Internet of Things (IoT) devices can be applied in various sectors, acting as a facilitating tool [ 1 ]. Devices may help monitor health conditions without the presence of healthcare professionals [ 2 ]. There are also wireless technologies that monitor older adults and remotely send data such as heart rate and blood pressure to their caregivers [ 3 ]. In addition to monitoring, other devices have auxiliary functions, such as automatic insulin injection devices [ 4 ]. These are directly linked to sensitive patient data and provide additional control in critical situations by, for example, setting the dose to be injected into the insulin pump. Both privacy settings and control information must have an extreme level of security.

For hospital environments, IoT devices are distributed not only for patient use but also for other functionalities. According to Farahani et al. [ 5 ], some of the IoT applications used in hospital settings collect patient data, such as heart rate, blood pressure, or glucose level. As far as the environment is concerned, some sensors detect temperature changes or control the air conditioning; cameras are used to detect intruders and send alerts. In this context, the devices’ scope ranges from patient monitoring to evaluate the environment and the equipment used by health professionals. Thus, the data is recorded from the moment that patients are registered at the reception until they are discharged.

When the patient is registered for admission, basic information is collected and complemented after screening. In a first-aid environment, to ensure all patients’ safety, many hospitals use a screening technique known as the Manchester Protocol [ 6 ]. After screening, the information is added to the patient’s record. Next, the person is given a classification according to their condition; this varies from non-urgent cases to emergency intervention cases. Sensitive information is added to the user record, whose preservation and confidentiality level must be treated as critical. There is information that should not be disclosed or related to the patient, as is the case with a patient suspected of having viral and infectious diseases.

The current pandemic of Severe Acute Respiratory Syndrome Coronavirus 2 (COVID-19 SARS-CoV-2) causes the patient to be identified as a possible carrier even during the screening process, based on certain symptoms. According to Rothan and Siddappa [ 7 ], those infected usually show symptoms after approximately five days, the most common signs of illness being fever, cough, and fatigue; the patient may also present headaches, phlegm, hemoptysis, diarrhea, shortness of breath, and lymphopenia. These symptoms are identifiable without specific examinations that are directly documented in the patient’s medical record. Liang et al. [ 8 ] mention that for most patients diagnosed with COVID-19, 85.7% had fever, 42.9% had cough, 33.3% had expectoration, 57.1% had fatigue and 38.1% had headache and dizziness. For this reason, one can see that fever is a common symptom. Thus, this condition must be checked as soon as the patient is admitted to the hospital. Due to COVID-19’s high rate of contagion, the patient’s referral to medical care and subsequent isolation should be done quickly and strictly in confirmation.

When it is confirmed that the patient has a COVID-19 infection, this information is directly linked to their record, which should remain confidential. Soares and Dall’Agnol [ 9 ] comment that privacy is considered an individual right that includes the protection of the intimacy of the subjects, respect for dignity, limitation of access to the body, intimate objects, family and social relationships. In addition, in this same bias, the concern also covers the complete information collected during the patient care process. Even though patients’ data must be confident among all parties in general, due to the current pandemic situation and contagion rate, an extra precaution must be taken to join the statistics without having their information revealed. The application of privacy on patient data must be given to all levels with access to any information, be it registration, device, or image.

The main purpose of this work is to apply privacy constraints in patients with suspected COVID-19. The basis for the application of privacy is the same for patients in general, but using as a basis the fact that it is a pandemic situation, and the discretion in handling data of a suspected patient is crucial. Also, as it is a highly contagious virus, the process from admission to the emergency room to the patient’s referral must be done quickly. In this way, a taxonomy was proposed that covers four topics and five subtopics regarding the entities/environments participating in the hospital admission process.

The scientific contribution of this paper is a system to support the privacy constraints related to COVID-19. It started with the study of the state-of-the-art in hospital environment. Next, we defined a taxonomy, and a mobile application was implemented to test and validate the use of the mobile application to cover the privacy constraints defined in the taxonomy.

The main results of this study are related to the identification of the users. Cryptography methods were implemented control the users according to the diagnosis of COVID-19. As these data are related to health, it must be secure and anonymous. The data collected included reliable data related to temperature parameters for the detection of the symptoms, such as fever.

For a better understanding of the matter and a clearer overview of the relevant details, this work is organized as follows: Section 2 lists the related works; Section 3 describes the taxonomic definition developed for this project and the attributes of the user parameter, environment, privacy, and device; Section 4 demonstrates the modeling of the project, including the use cases, sequence and context diagrams; in Section 5 , we present the prototype with the application developed to be validated. Section 6 presents experiments and results. Finally, in Section 7 , we conclude and discuss the future work.

2. Related Work

Studies on the application of privacy in hospital settings cover different aspects. Various studies were selected to identify privacy targeting, including encryption, profile privacy, device privacy, and taxonomic definitions. The focus among the related papers vary from studies on security over mobile application to systems conceived to protect user privacy.

Barket et al. [ 10 ] present a broad study on the context of privacy, developing a taxonomy meant to connect privacy and technology based on the following aspects: purpose, visibility, and granularity. According to the authors, the aim is related to why the information is requested; depending on the cause, more or fewer details about the user are passed on. Visibility refers to who is allowed to access the user data. Granularity designates the data transfer required for the type of access and purpose for that particular request.

The work of Asaddok et al. [ 11 ] involves mobile devices in the area of health (Mobile Health (mHealth) and the parameters: usability, security, and privacy. The authors propose a taxonomy that involves the three parameters mentioned, and, for each, it branches into taxonomies. One taxonomy is defined by usability, effectiveness, efficiency, satisfaction, and learning. Next, for security, confidentiality, integrity, and availability is restricted to another taxonomy. Finally, for privacy, identity, access, and disclosure, the last taxonomy is defined.

Coen-Porisini et al. [ 12 ] describe a conceptual model for defining privacy policies that cover the user, the user’s profile, the information, and the action that will be taken by a third party to request the information. The authors revealed the link between the three topics mentioned in a Unified Modeling Language (UML) format. The user is divided into personnel—the person to whom the data is referred; processor—the person who will request the data; controller—the person who controls the actions requested by the processor. Data is divided into: identifiable—in situations when it is clear who the data refers to, such as the name; sensitive—it refers to information, processing, and purpose. We can also observe that there is an interaction between the medical user and the controller, along with the processes of access (processing), treatment (purpose), and communication (obligation). The diagram demonstrates how information is delivered to the medical user through requests, based on their access profile.

Silva et al. [ 13 ] use a notification management system focused on user privacy in this context. It contributed to the development of an application that can handle different types of notifications. Moreover, the network made it possible for those involved to ensure that the messages sent and received followed the rules defined earlier. If applied to health notifications or to alert cases of COVID-19, this is a strategic tool, addressing messages with defined priorities while also linking privacy in the traffic sent. Therefore, this work contributes to finding a link between IoT requirements and definitions. In [ 14 ], the authors implemented a system for monitoring and profiling based on data privacy in IoT. From the results obtained in the tests, they identified different profiles assigned to random situations. In this case, the health system user’s profile priorities would apply and determine which profiles would be authorized to receive data. In this work, it was also possible to address the evolution and reduction of the hierarchy based on factors that identify users’ frequency in the environments tested.

Concerning the relationship between data privacy and its use in situations such as the COVID-19 crisis, Zwitter et al. [ 15 ] deals with the basic concept of human rights that relates data privacy with the need to use certain information, such as someone’s location. The authors mention features of applications developed by China, South Korea, and the United States that use tracking techniques to indicate close contact with virus carriers or identify specific individuals or groups’ movements. The study concludes that location data is important in the fight against the spread of the virus, but other relevant information, such as genetic data, should be considered. It is necessary to use this information correctly, as stipulated by the law. It also states that data sensitivity classification is contextual; data protection and privacy are important and must be maintained even in crisis. Information leaks are inevitable, so organizations should always protect themselves; ethics in data manipulation is mandatory for more efficient analysis.

Yesmin et al. [ 16 ] deal with the privacy of patients’ data in terms of the interoperability of systems and the employees’ access to information. Also, they tell us that there is no framework for evaluating privacy audit tools in hospitals yet. The application of a framework would help identify any trend in accessing the data and allow the hospital to improve its performance in detecting possible data leaks. According to the authors, the literature reveals that the most significant leakage of information occurs through employees (nurses, doctors, sellers, and others). An evaluation framework was then developed and tested using the black box concept, which uses usability testing information. The following must be monitored through machine learning or artificial intelligence tools: employee access to information, validation of entry and non-standard behavior, and unexplained access to files.

The work of Islam et al. [ 17 ] deal with a survey on the application of IoT devices in the health system. The authors deal with the IoT network’s topology for health, which facilitates the transmission and reception of medical data and enables data transmission on demand. They also mention features of wearable devices, which capture and store patient data. These may include blood sugar levels, cardiac monitoring, body temperature, and oxygen saturation. The authors explain that the security requirements applied to healthcare IoT equipment are similar to those of other communication scenarios. Therefore, the following must be considered: confidentiality, integrity, authentication, availability, data update, non-denial, authorization, resilience, fault tolerance, and fault recovery.

Sun et al. [ 18 ] designed the HCPP (Healthcare System for Patient Privacy) system to protect privacy and enable patient care in emergency cases. The entities defined for the system are the patient, the doctor, the data server, the family, the personal device, and the authentication server. According to the authors, the system meets the following security criteria: privacy, data preservation by backup, access control, accountability, data integrity, confidentiality, and availability.

Samaila et al. [ 19 ] developed a survey in which information was collected regarding work on security and privacy in IoT in general. The study’s scope ranges from security, encryption, communication protocols, authentication to privacy, among others. The authors also collected information on applications, reliability, and other technical issues, combining ten related works. Additionally, the authors claim that the work covers a system model, a threat model, protocols and technologies, and security requirements. The work discusses the IoT architecture considering nine application domains: home automation, energy, developed urban areas, transport, health, manufacturing, supply chain, wearables, and agriculture. Security measures and system and threat models were defined for each application domain, including protocols and communications. The security properties covered were confidentiality, integrity, availability, authenticity, authorization, non-repudiation, accountability, reliability, privacy, and physical security. These also describe mechanisms that can be applied to achieve the desired security requirements: authentication, access control, encryption, secure boot, security updates, backup, physical security of the environment, and device tampering detection.

Plachkinova, Andrés and Chatterjee [ 20 ] elaborated a taxonomy focused on privacy over mHealth apps. Downloadable apps through the app store do not have a unified way to provide terms of use or privacy policies for the user. Apps mostly communicate between patients and doctors, access to patient medical records, self-diagnosis based on symptoms, etc. The management of user data after the app is installed may not be precise. The authors elaborated a taxonomy that embraces the following three dimensions: mHealth app (patient care and monitoring; health apps for the layperson; communication, education and research; physician or student reference apps), mHealth security (authentication; authorization; accountability; integrity; availability; ease of use; confidentiality; management; physical security) and mHealth privacy (identity threats; access threats; disclosure threats).

Alsubaei, Abuhussein, and Shiva [ 21 ] proposed a taxonomy aiming to enhance security among IoT medical devices, as it has life-threatening risks when a device is not secure. According to the authors, since security and privacy are becoming challenging due to the sensitivity of data in healthcare, it is crucial to enhance these measures. The taxonomy is based on the following topics: IoT layer, intruders, compromise level, attack impact, attack method, CIA compromise, attack origin, attack level, and attack difficulty. For each topic, some subsections embrace items from that topic. Since new attacks are always being created, this taxonomy can be updated, according to the authors. The related works we have selected cover the topics that we cited as critical to privacy. Some applied cryptography in the study as a reference of types of attacks, and others used cryptography to prevent data from being accessed from third parties. Most of them applied user profile privacy to prevent any unauthorized access or mitigate when it happens.

Data encryption is necessary so that in the event of an attack, a third party cannot gain access to information [ 22 ]. Cryptography is part, directly, from [ 17 , 18 ]. Islam et al. [ 17 ] mentioned cryptography among security threats, where cryptographic keys can be stolen to collect user sensitive data. The work of Sun et al. [ 18 ] mentioned encryption as a way to protect health information and applied identity-based cryptography for encryption, authentication, and deriving shared keys for their Healthcare system for Patient Privacy (HCPP) protocols. Also, they made use of searchable symmetric encryption to return encrypted documents to the owner.

The application of private profile was mentioned in all works, except by [ 21 ]. The user’s profile privacy serves to protect any information from being used by third parties [ 23 ]. A security layer should be applied at the device level to prevent third parties from accessing information or even gaining control of it [ 24 ]. The work of Alsubaei, Shiva, and Abuhussein [ 21 ] mentions about attacks that influences on Confidentiality, Integrity and Availability (CIA) triad, which is a basic thread on privacy, but does not explore ways to protect user privacy concerning data access based on authorization. Barker et al. [ 10 ] are concerned about private profile through who can access the data and which data can be accessed, based on the purpose of this access request.

Asaddok and Ghazali [ 11 ] defined data access based on access to patient identity information, personal health information, and personal health records, moreover defined in their taxonomy as identity, access, and disclosure. Coen-Porisini et al. [ 12 ] say that data access must be based on access control based on the users and their roles. Thus, data access must be granted based on a consent given by the patient. Silva et al. [ 13 ] defined their privacy requirements based on the user permissions, environment, and hierarchy. Leithardt et al. [ 14 ] proposed a middleware in which the user’s permission can be changed due to the environment and the frequency in which the user frequent it. This way, the given information will vary based on this environment, and the rules of its context.

Zwitter and Gstrein [ 15 ] say that data collection and its use must be done concerning the principle of proportionality and individual’s interests. Their work is based on data collected over the individual’s location and genetic data. Thus, the authors exposed user data principles as: sensitivity, privacy and protection, breaches precaution, ethics. The study of Yesmin and Carter [ 16 ] was concerned about the patient data through authorized and unauthorized access. The authors developed a framework that audits this access, although the study was limited as real patient information could not validate the tool. Instead, they used real data and could evaluate the amount of unauthorized/unexplained accesses to the patient’s data.

Islam et al. [ 17 ] treated data with CIA triad, so that confidentiality is related to the medical information and its protection against unauthorized users. Their study gathered information on various aspects related to the use of IoT devices in medical care. Thus, they say that policies and security measures must be introduced for data protection when sharing data with users, organizations, and applications. Sun et al. [ 18 ] combined cryptography with user privacy and their trust relationship with entities, such as family members, physicians, or his device. Thus, these entities are allowed to access the patient’s protected health information. In Plachkinova, Andrés, and Chatterjee [ 20 ], the authors studied mHealth apps and the concern about the use of information, terms of use, and privacy policies. The authors mentioned that it is not clear how the data is managed, neither who gets access to it. They developed a taxonomy in which user data is part of the identity threats, access threats, and disclosure threats.

The concern for privacy regarding the device was found in most papers. In Alsubaei, Shiva, and Abuhussein [ 21 ], the IoT device is part of the proposed security taxonomy. As their work concerns about mHealth devices, it is part of the proposed taxonomy’s wearable devices, which embraces numerous sensors. The authors describe potential attacks for these devices, as side-channel, tag cloning, tampering devices, and sensor tracking. In the work of Asaddok and Ghazali [ 11 ], the authors classified mobile devices as part of the application dimension of the taxonomy, present in the topic ’patient care and monitoring’, as they are used for observation of the patient.

The work of Silva et al. [ 13 ] applies privacy over mobile devices regarding aspects such as the environment. Thus, privacy used on mobile devices is part of their taxonomy and a base point of their study. Leithardt et al. [ 14 ] are guided on device privacy. This topic is the central part of their work. Zwitter and Gstrein [ 15 ] mention mobile devices, although their concern focuses on apps and location data, not the device itself. Islam et al. [ 17 ] treat devices like mobile, connected to the Internet through IoT providers. Thus, they are vulnerable to security attacks, which may originate within or outside the network. The authors mention that IoT health devices are part of an attack taxonomy, including information, host, and network. Sun et al. [ 18 ] define the Private Device (P-device) as an entity involved in the HCPP system, such as smartphones or wearable devices. The patient uses the P-device to manage privileges on access to his health data. In Plachkinova, Andrés, and Chatterjee [ 20 ], the device must be secured, as it can leak data about the location or sensor of the patient. As the apps mentioned in their work fail to provide accurate data management information, the device can be a tool for misusing information.

The use of the data acquired from different sensors needs the implementation of several privacy and security rules. In [ 25 ] is presented a low-cost system that embeds the measurement of temperature, heart rate, respiration rate and other parameters to define the health state of the person. This system performs the networking with the healthcare professional to prevent several situations. In addition to these sensors’ data, it includes the tracking of the location of the user to present several contagious. This system may be used for a preliminary diagnosis. Mobile devices are capable of acquiring different types of data in several conditions. Spain was one of the fustigated countries with this pandemic’s situation, and the authors of [ 26 ] proposed the implementation of online sensing networks to provide social quarantine and reduce the contagious with the virus.

The monitoring of the COVID-19 needs the use of secured technologies, and the IEEE 802.11ah technology was used in [ 27 ] to support the prevention of the contamination with COVID-19. It can be implemented in telemonitoring technologies to provide reliable information and prevent the contact. The network should previously know which are the persons that are contaminated with the virus. The tacking of the location and movements may be performed with location, inertial, and proximity sensors that communicates the data to social networks to reduce the social contact with infected individuals. The authors of [ 28 ] studied different privacy constraints related to the real-time monitoring with the mobile devices. The monitoring with mobile devices can be considered to be a digital vaccine that help in the reducing number of contagious with massive sharing of the data.

The creation of a taxonomy was proposed by most of the related works. Alsubaei, Shiva, and Abuhussein [ 21 ] proposed a taxonomy regarding IoT layer, intruder type, compromise level, impact, attack method, CIA compromise, attack origin, attack level, and attack difficulty. As can be seen, the taxonomy embraces the security and privacy aspects of medical IoT devices. Barker et al. [ 10 ] explored three dimensions to develop a taxonomy, based on visibility, granularity, and purpose. These three dimensions focus on privacy aspects, where visibility deals with who is permitted to access the data. Granularity is focused on the characteristics of that data to direct it to the appropriate use and a dimension that deals with the data’s purpose. In the work of Asaddok and Ghazali [ 11 ], the authors developed a taxonomy containing usability, security, and privacy aspects to mHealth applications. Each item of the taxonomy is derived in three or more sub-items. Silva et al. [ 13 ] developed a taxonomy or notifications on mobile devices, including communication protocols, message transmission technologies, privacy, and criteria. Plachkinova et al. [ 20 ] proposed a taxonomy for mHealth apps regarding security and privacy. The items involve app dimension, security dimension, and privacy dimension.

Table 1 presents a comparison with the related works concerning the application of the privacy aspects described above with the additional taxonomy application.

Scope of related works

WorkCryptographyPrivate ProfileDevicesTaxonomy
[ ] (2007)
[ ] (2009)
[ ] (2015)
[ ] (2015)
[ ] (2015)
[ ] (2017)
[ ] (2017)
[ ] (2019)
[ ] (2020)
[ ] (2020)
[ ] (2020)
[ ] (2020)
[ ] (2020)
[ ] (2020)
[ ] (2020)
Proposal

Even though cryptography is one of the main concerns when dealing with data privacy, descriptions of how to apply it were found explicitly only in the works of Islam et al. [ 17 ], Sun et al. [ 18 ] and Riza and Gunawan [ 27 ]. As Horst Feistel [ 29 ] said almost 50 years ago: “personal data needs protection, which can be achieved when enciphering the material”. Cryptography will prevent the plaintext from being accessible to people who are not authorized to have it, whereas it is an important tool when dealing with personal data. The work of Islam et al. [ 17 ] comprises a survey of IoT in health care, including analysis regarding the security and privacy aspects. However, the authors did not expose how cryptography can be applied, instead, mentioned that some parts of the flow can be tampered by attackers to obtain the cryptographic secrets. This way, IoT systems should be designed with protections against stealing of cryptographic keys.

The work of Sun et al. [ 18 ] is focused on cryptography, as it describes a system based on this aspect. The authors designed protocols for a healthcare system in which the security aspect leverages on cryptographic tools. The HCPP allows the patient to store their medical record even on public servers, where only the patient can retrieve the information. The patient’s medical record is encrypted to ensure privacy, and its content can only be retrieved by the patient and the physician when some treatment is being carried out. If by any means the patient is unable to retrieve the medical record, the system can provide the relevant information to the physician without compromising the patient’s secret key. In our work, cryptography is used to prevent unauthorized access to the patient’s medical records. As it can be seen in our proposed taxonomy in Figure 1 , cryptography is part of the User’s items, as it is a critical tool to protect the patient data. The patients’ medical records should be stored and transmitted in encrypted ways, in a way that only the personnel who has the authorization and, therefore, the secret keys, can decrypt the data. Therefore, patients’ medical records are encrypted and can only be accessed by the authorized staff.

An external file that holds a picture, illustration, etc.
Object name is sensors-20-06030-g001.jpg

Proposed taxonomy.

In comparison to the selected works, ours stands out because it includes the indication of encryption, profile privacy, concerns on device, and the definition of the taxonomy meant to define the theme and scenario of the application more clearly. Our taxonomic definition aims to embrace the necessary aspects to be covered to enhance security measures throughout the patient’s sensitive data. We developed a mobile application to validate the data flow of information from the moment patients are being admitted in the hospital until they are discharged.

The use of a mobile application that implements data privacy parameters related to the data of patients infected with COVID-19 is another contribution of this study. The data of patients may be its location, temperature, history of navigation, among others. Therefore, we consider that contagion can be identified in the first moments spent in the emergency room using basic information on the health status and the monitoring of the feverish state with the use of IoT devices. The degree of privacy applied in each user’s registration process should enable identifying infected patients without the exposure of sensitive data.

To this end, we have developed a taxonomy that highlights how important it is for confidential information to be handled with care. We have included examples of privacy applications in the use of IoT devices to receive, screen, and providing patient care with a focus on the COVID-19 pandemic.

3. Taxonomy

We have developed a taxonomic definition for a better classification of the items related to the privacy parameters. A taxonomy is necessary to identify the critical aspects where security measures and policies need to be applied. Based on the goals of this paper and the comparisons made with the related works, we selected the principal parameters to manage privacy, which are divided into other levels to better embrace the desired security aspects.

As presented in works [ 13 , 30 , 31 ], a taxonomy allows the systematic organization of relevant data in the form of a hierarchy. The keywords and concepts used to define a taxonomy establish parameters throughout the information production cycle, in which distributed professionals can participate in the knowledge creation process in an organized way. This definition covers four parameters for managing privacy standards in hospital settings within the previously defined context. The selected parameters with five attributes were considered necessary for this scenario. Figure 1 shows the taxonomic definitions proposed in this paper.

3.1. User Parameter

The user parameter designates the person who provides, controls, or operates the sensitive data used in privacy handling. This parameter refers not only to the patient but also to the participants in the data’s provision or control. For this parameter, we set the following attributes: profile, collaborative, hierarchy, cryptography, data. The profile attribute covers several items that will be part of the process. According to Fengou et al. [ 32 ], six entities participate in interactions taking place in the hospital environment:

  • the patient himself/herself;
  • the clinical network that will care for the patient, including doctors, family members, volunteers, health insurance provider, among other things;
  • the hospital;
  • smart home as an environment with ubiquitous equipment’s capable of providing security and quality of life;
  • the environment in which the patient works, the vehicle with which the patient is transferred to the clinical center.

Based on the entities listed, it can be observed that the user profile is one that must be substantiated, along with the profiles of other entities. The patient’s cooperativeness in providing their registration data is fundamental for a better experience in the given setting. According to Leithardt [ 33 ], the user must provide access to their information and services, thus favoring both their expertise in using the service and the system’s improvement whole. The hierarchy enables proper separation of the levels and permissions of each user type. Viswanatham and Senthilkumar [ 34 ] proposed the so-called hierarchy-based user privacy, where the information is encrypted and decrypted based on access levels and releases.

The General Data Protection Regulation (GDPR) deals with the need to protect confidential data and the inevitable risk of data theft. Encryption reinforces that all sensitive information must be covered by an acceptable security level, either at its source or at its destination. Ibraimi et al. [ 35 ] said that patient confidentiality is one of the significant obstacles in obtaining medical data, as some information is not shared for fear of it being saved in databases that do not comply with security regulations. The protection of sensitive patient information is an essential task. The Department of Health and Human Services, 2002 (HIPAA) Privacy Standard [ 36 ] deals with the security of sensitive patient information in the medical field. It is a US federal law created in 1996 to impose standards for protecting such information and preventing it from being shared without the patient’s consent. Cooper et al. [ 37 ] deal with privacy and security in data mining in the medical field and cites HIPAA in information privacy matters. In 2002, they suggested that protective measures be imposed by health plans, clinical centers, and other entities involved.

3.2. Environment Parameter

The environment parameter represents the smart physical location where user data will flow between different systems and devices. For this parameter, we define the following attributes: topology, interoperability, policies, risks, hierarchy. Topology refers to the architecture of a hospital environment. Costa [ 38 ] comments that hospitals used to be built with an emphasis on the utility of the building and the technique used. The health field’s processes and dynamics are often determined by how the wards, sectors, and departments that house distinct functions are arranged. In many of the methods that occur during the patient’s journey through the emergency room, one or more systems are used.

Interoperability between systems is strongly present in the medical field presently. According to Lopes [ 39 ], strategies used to be designed and developed from an internal perspective of organizations, with no motivation for integration with other systems. In all the smart environments that people transit, data is shared between information systems and IoT devices. The data are a vital part of the operation of a health institution. Several policies need to be established to apply access security to these environments and define what data will be exchanged between systems and devices. According to Yildirim et al. [ 40 ], information security management is an activity that aims to implement a set of policies that help to define an acceptable level of security in these environments, minimizing the potential risks inherent in the exploitation of this information.

Risk management in hospital settings is a crucial activity for the proper functioning of the operation. According to Florence et al. [ 41 ], the risk is an estimated value that considers the probability of occurrence of damage and the severity of said damage. Therefore, procedures are meant to minimize those factors that need to be mapped, controlled, and defined. The dimension in the patient’s care is large and complex. It occurs at various times and in multiple environments in the course of service, along with several interactions between the patient, other participants, and technologies. Soares et al. [ 9 ] emphasize that due to its characteristics and complexity, the hospital environment favors establishing power and asymmetrical relationships between the nursing team and patients. The asymmetry results from the patients’ fragility and vulnerability in the face of health-diseases processes.

3.3. Privacy Parameter

The privacy parameter designates how each piece of information will be handled according to its characteristics. For this parameter, we define the following attributes: communication, applicability, controller, consent, operator. The transmission is linked to the type of user profile and will usually involve unsafe transmitting the information. According to Machado [ 42 ], anonymization or encryption in particular pass through the means of communication, i.e., the very existence of communication drives the need to apply security measures to data. It is a basic human right to have one’s sensitive data handled with care. Thus, its applicability is significant. The General Data Protection Law (LGPD) [ 43 ], as the Brazilian Data Protection Law, aims to apply standards and laws regulating and protecting individuals’ data. Without this application of standards and regulations, sensitive information could easily be used by those who should not have access to it in the first place.

A categorization determines who has the authority to decide the type of treatment that personal data will be submitted. As mentioned in the LGPD [ 43 ], the controller must obtain the consent of the individual owner or holder of the concerned data. The user may, in turn, deny or grant access to their information by a third party. The user must give their consent, a manifestation by which they agree that their information be used in a specific way for a particular purpose. As mentioned in the LGPD [ 43 ], if the controller wishes to use this data at another time, consent will be requested once more. The operator shall be responsible for carrying out the data processing determined by the controller. As mentioned in the LGPD [ 43 ], the operator is jointly and severally liable for the damages caused by data handling if the strategy does not comply with legal provisions or is not in line with the controller’s instructions. The user provides their consent, and the operator is responsible for processing the information made available when for personal use or transfer to third parties.

3.4. Device

The device represents either the IoT equipment present in the smart environment that will interact with the patient’s data or the wearable IoT device that will be set to monitor the patient’s temperature. There may be devices that are fixed in the environment, such as surveillance cameras or devices that can be used to monitor the patient, which can be fixed or mobile. For this parameter, we define the following attributes: function, location, communication, accessibility, interactivity. The device must meet the needs of the process to which it will be directed. According to Lupiana and O’Driscolle Mtenzi [ 44 ], one of the relevant requirements for devices is their storage and processing capacity. The location attribute refers to the location where the device is installed. For Leithardt [ 14 ], the attribute that controls location must be linked to a database where all user data must be included. This database will be accessible only for updating and validating some data. The other information should be processed from the point where the user has accessed the system to provide greater security and reliability. Figure 2 shows both fixed and wearable IoT devices and how the parameters are applied. For both fixed and wearable IoT devices, all five parameters are used. The last column on the Figure 2 shows some of the possible options for each attribute.

An external file that holds a picture, illustration, etc.
Object name is sensors-20-06030-g002.jpg

Devices with its parameters.

The way the device communicates with the user is addressed through the communication attribute and fits in heterogeneity, a feature that ensures information is handled evenly. According to a study presented by Pradilla, Esteve, and Palau [ 45 ], the devices are responsible for taking data acquisition through sensors, supporting data treatment with processing units, and acting in conjunction with IoT. Therefore, it is necessary to use heterogeneity in the communication protocols handled by the device and the number of services and types available. This attribute is associated with the protocols of the device, providing security in data transfer. The possibility to access the device whenever necessary is crucial, and interactivity between the device and the client must be ensured. With this in mind, we have developed a model based on the characteristics and functionalities defined in the described taxonomy.

4. Project Modeling

The model consists of use case diagrams, sequence diagrams, and context diagrams. All these notations are based on UML. The described model refers to the process from the patient’s arrival at the hospital until his discharge.

4.1. Use Cases Diagrams

The first use case represents the entry of a patient into the emergency room. The patient interacts with the receptionist and performs some procedures. This use case includes some of the attributes of the proposed taxonomic definition: privacy, represented by the data which the patient grants access to and is registered in the systems; user, represented by the patient and the receptionist; environment, represented by the emergency room, shown in Figure 3 .

An external file that holds a picture, illustration, etc.
Object name is sensors-20-06030-g003.jpg

Reception at the Emergency Room.

After first care and registration, the transfer of the patient to the screening area is demonstrated in the use case pictured in Figure 4 . The screening process aims to establish the urgency of the case and the risk classification. This use case includes some of the attributes present in the proposed taxonomic definition: privacy, represented by the data which the patient grants access to and is registered in the systems and the wearable IoT device; user, represented by the patient and the nurse; environment, represented by the screening room; device, represented by the wearable IoT device that will receive an identification to record the data and the classification of this patient.

An external file that holds a picture, illustration, etc.
Object name is sensors-20-06030-g004.jpg

Screening Room.

And the last use case represents the patient being attended to by the doctor in the office after going through the screening process. The wearable IoT device identifies the patient so that the data is made available, and the doctor proceeds with the consultation. The doctor performs the anamnesis and records the data in the Electronic Health Record (EHR). This use case uses some of the attributes of our taxonomy as follows: privacy, represented by the data which the patient grants access to and is registered in the systems; user, represented by the patient and the doctor; environment, represented by the office; device, represented by the wearable IoT device used by the patient. This case is illustrated in Figure 5 .

An external file that holds a picture, illustration, etc.
Object name is sensors-20-06030-g005.jpg

Reception at the Office.

Sequence diagrams of each use case were also developed. Sequence Diagram is a UML tool used to represent interactions between objects in a scenario, performed through operations or methods.

4.2. Sequence Diagrams

The sequence diagram displayed in Figure 6 represents the entry of a patient into the emergency room. It demonstrates the arrival of the patient (user) to the emergency room (environment), where they request assistance from the receptionist (user). The receptionist provides a password to the patient waiting to be called on. Upon being called on, the patient offers data for registration updates (privacy) recorded by the receptionist in the hospital system. The receptionist checks if the patient has a health plan and then records how this service’s billing issue will be managed. After this procedure, the patient will be referred to as screening.

An external file that holds a picture, illustration, etc.
Object name is sensors-20-06030-g006.jpg

Sequence—Reception at the Emergency Room.

The sequence diagram displayed in Figure 7 represents the patient’s entry into the screening room after completing the first stage in the emergency room. It means the arrival of the patient (user) to the screening room (environment), where they will convey their data as requested by the nurse (user). The nurse records the hospital system’s data and the entry into the system that configures the wearable IoT device that will monitor the patient (device). The receptionist then hands the wearable IoT device over to the patient and starts the assessment. The patient answers the questions (privacy), and the nurse records all the information in the hospital system. All patient data is in the system, and the hospital from their wearable IoT device can track it.

An external file that holds a picture, illustration, etc.
Object name is sensors-20-06030-g007.jpg

Sequence—Screening Room.

The sequence diagram displayed in Figure 8 shows the patient’s entry into the office after going through the screening process. It represents the arrival of the patient (user) to the office (environment), where they will convey their identification data as requested by the doctor (user). The latter records the electronic record data and refers to the patient’s wearable IoT device in the hospital system. The doctor performs the anamnesis on the patient, who must answer the questions (privacy). The doctor also records this information in the patient’s electronic record. The patient has already been attended to, so they are drugged and released or referred to another hospital ward based on the clinical condition’s evolution.

An external file that holds a picture, illustration, etc.
Object name is sensors-20-06030-g008.jpg

Sequence—Office.

4.3. Context Diagrams

The Context Diagram is a UML tool that represents the entire system as a single process. It consists of data streams that show the interfaces between the system and external entities [ 46 ]. The diagram illustrates the object of the study, the project, and its relationship to the environment. Figure 9 represents the context diagram of this project.

An external file that holds a picture, illustration, etc.
Object name is sensors-20-06030-g009.jpg

Context Diagram.

The patient (user) requests assistance from the receptionist (user), who will fill in the data (privacy) in the hospital system. The hospital system interacts with the operator’s system that is outside the physical environment of the hospital. In the screening process, the nurse (user) conducts the questionnaire with the patient (user), entering the basic health data in the central hospital system, which interacts with the wearable IoT devices system. Finally, the doctor (user) performs the anamnesis, entering the central hospital system’s consultation information. These information registration processes are focused on privacy determinations, and all processes occur in a clinical setting, explicitly represented within the context diagram.

5. Prototype

A mobile application was developed as a prototype to illustrate the basic principles, from the admission of the patient to the emergency room and referral to the office or discharge indication. The goal of the developed mobile application is to validate some taxonomy items, for it embraces the environment (interoperability among the system and the wearable IoT device). The application was developed using NodeJS.

The application comprises an initial customer registration screen, which simulates the process of filling out the registration form upon admission to the emergency room. The prototype only contains the primary fields: name, gender, age, and address. The ’encrypt data?’ checkbox has been included to select the encryption/hash algorithm. Since it is merely a prototype for demonstrating the flow of information and its security application, the hashes SHA-256 and SHA-512 were made available. In the real application, they would not serve to encrypt data because hashes are not reversible and are considered a one-way function [ 47 ]; the prototype also includes the Advanced Encryption Standard (AES) symmetric encryption algorithm. Figure 10 illustrates the first registration screen of the application with the fields mentioned above.

An external file that holds a picture, illustration, etc.
Object name is sensors-20-06030-g010.jpg

Application Flow.

As shown in Figure 10 , the application’s flow is as follows: initially, the patient fills out a form with personal data. The data is encrypted and sent to the systems through the hospital network, as necessary. The patient, then, is sent to the screening room to answer more questions and thus help medical personnel assess their situation. He receives a wearable device to monitor his health status. All the information collected about the patient and their health status is included in their digital record. If communication with other systems is required, the information to be sent is encrypted.

The link between this device and the patient’s file allows the information to be collected without a health professional’s intervention. Based on the information provided by the wearable, the system makes a temperature analysis. If the patient remains in a feverish state, they are referred to the doctor’s office. Since fever is one of the symptoms that prevail in detecting COVID-19, its absence can prompt a discharge. However, the lack of fever is not a guarantee that there is no infection with the virus [ 48 ], so careful monitoring is needed. In addition to the factors described, comparative tests were performed to validate the application based on the initially defined requirements in the taxonomy.

Pseudo-Code

The algorithms applied in the development of the prototype application are described below in pseudo-code format. Pseudo-code covers the generation of a service number, temperature monitoring, and referral in case of emergency.

Algorithm 1 deals with the generation of the service number, where the patient’s data will be saved in an encrypted form and forwarded to the monitoring room. In the monitoring room, the service number to be linked to the customer will be generated.

POST New medical care
1 Service Number;
  : Attendance number
2 save encrypted packet data;
3 send to monitoring room

Algorithm 2 deals with the process of monitoring temperature. The wearable IoT device collects the patient’s temperature during the period defined by the medical team and sends it to the server for the monitor process. First, if it is higher than 38.5 °C, the patient is referred to the ICU. If it is equal to or above 37 °C for five minutes, the patient is referred to another ward for medical assistance. Finally, if it is less than 37 °C for ten minutes, the patient can be released.

Monitoring

Algorithm 3 deals with the alert generated for the ICU in cases where the patient is classified as an emergency. If there is no emergency, the alert is generated for the doctor, informing that the patient will be referred for care.

Alert

6. Tests and Results

The flow of controlled information in the application starts after the registration data has been filled in; it is also possible to apply other requirements such as encryption to the patient’s data. Figure 11 illustrates the integration of basic patient information and reports that the patient was sent for temperature control. The temperature was captured and sent to the system, which will classify the feverish state, suggesting different referrals for each scenario. If the patient exhibits a feverish state and has other symptoms that may characterize COVID-19, their care must be provided in a differentiated way.

An external file that holds a picture, illustration, etc.
Object name is sensors-20-06030-g011.jpg

Saved Information.

When choosing the type of encryption in the data registration process, the data security level increases, and the information should only be made available to those who have permission. For prototype demonstration purposes, we use the AES symmetric key encryption method. The encryption application aims to secure data while transferring it to other devices. Figure 12 shows the encrypted patient registration data.

An external file that holds a picture, illustration, etc.
Object name is sensors-20-06030-g012.jpg

AES Encryption.

After the patient has been registered, and the information is stored safely, the data is sent to a system that continually gets updates on body temperature. With this prototype’s application, we also tested the hypothesis that an IoT device can monitor the patient for changes in temperature. To test the idea, we implemented a set of random values read by the program to simulate this monitoring process. Every minute, the device will check the temperature of the patients who have entered the system and are waiting at the emergency room’s reception. If their temperature can be characterized as feverish, then they are taken to the office with priority. Figure 13 describes the monitoring of a patient whose temperature remains stable, and hospital discharge is suggested.

An external file that holds a picture, illustration, etc.
Object name is sensors-20-06030-g013.jpg

Patient record simulating discharge.

If the patient’s state remains feverish for five minutes, a message will be sent to the doctor in charge, as shown in Figure 14 . If the temperature remains stable for ten minutes, the patient will be released.

An external file that holds a picture, illustration, etc.
Object name is sensors-20-06030-g014.jpg

Patient registry simulating medical care admittance.

After testing and validating the application, it was possible to observe that the information flows through different devices. For the simulation environment, we experimented with only one system that communicates with a wearable device. In real applications, there could be more than one device interacting with more than one method. However, the information’s fluidity would be similar: the patient’s registration at the time of admission to the emergency room, the system being accessed by the screening sector to insert health status data, and the information is received from monitoring devices. At the medical consultation time, the system would receive more details regarding anamnesis, referrals for exams, or hospital discharge.

Given that the feverish state is strongly associated with a COVID-19 diagnosis, the patient should be monitored continuously and receive adequate care as long as the symptoms persist. The high contagion of the virus makes such care essential. The monitoring interval parameters, indicative of medical discharge or a possible disease carrier, are defined according to medical protocols. We emphasize that the interval and discharge suggestion present in this work are meant to simulate features.

7. Conclusions

The COVID-19 scenario requires particular solutions for providing the emergency care process and security in the data generated in all environments. In this sense, this work proposed a taxonomy that was designed to support the development of privacy mechanisms for health environments.

The taxonomy is branched into four items containing five attributes each; all the items and their respective attributes are justifiable. For the information flow tests, we developed a prototype and application that addresses the main questions about data privacy despite being simple. The application was developed with registration data inputs and different encryption/hash to be applied according to environmental criteria. The application communicates with a wearable that monitors the patient’s temperature and provides treatment in line with the patient’s feverish state, guiding the referral to the doctor’s office or the possibility of discharge. With the application of taxonomic definitions and the agility of medical professionals in the care of patients with suspected COVID-19, the registration data is kept confidential through encryption and privacy requirements. Temperature monitoring should be continuously done; in the case of feverish states that persist for a period defined by the entity and other symptoms suggestive of the disease, the system suggests the patient’s referral without exposing personal data.

The main contribution of this research consists of the analysis of different privacy parameters with a mobile application that considers the different rules proposed in our taxonomy. There is no concrete analysis previously performed that analyzes the privacy constraints with a mobile application. Mobile technologies are commonly used by people, and it may help in the prevention of COVID-19. In addition, more search should be performed, and the taxonomy developed may be improved to be adapted with the real world.

We believe that the research we have carried out contributes to several other studies currently in progress in several countries, which propose monitoring without consent and put forward definitions of use and data privacy criteria. For future work, we are developing improvements for privacy requirements that can be adapted to different countries, thus expanding variable monitoring features to identify patients with COVID-19 and obtain new tests and results.

Acknowledgments

This work was partially supported by CAPES—Financial code (001). The authors also like to acknowledge the collaboration of Computer Laboratory 7 of Instituto de Telecomunicações—IT Branch Covilhã—Portugal. Furthermore, we would like to thank the Politécnico de Viseu for their support.

Abbreviations

The following abbreviations are used in this manuscript:

AESAdvanced Encryption Standard
CIAConfidentiality, Integrity and Availability
CPFCadastro de Pessoa Física
COVID-19Coronavirus 2
COVID-19 SARS-CoV-2Severe Acute Respiratory Syndrome Coronavirus 2
EHRElectronic Health Record
GDPRGeneral Data Protection Regulation
HIPAADepartment of Health and Human Services, 202l
HCPPHealthcare system for Patient Privacy
ICUIntensive Care Unit
IoTInternet of Things
LGPDGeneral Data Protection Law
mHealthMobile Health
P-devicePrivate Device
UMLUnified Modeling Language

Author Contributions

Conceptualization, A.V.L., L.A.S., L.G.; Investigation, A.V.L., L.A.S. and V.R.Q.L. Methodology, L.A.S. and A.V.L.; Project Administration, V.R.Q.L.; Resources, I.M.P.; Supervision, V.R.Q.L.; Validation, V.R.Q.L. and L.A.S.; Writing—original draft, A.V.L., R.L. and L.G. Writing—review and editing, V.R.Q.L., L.A.S., X.M., J.L.V.B. and I.M.P., Financial R.G.O. and V.R.Q.L. All authors have read and agreed to the published version of the manuscript.

The project Smart following systems, Edge Computing and IoT Consortium, CONSORCIO TC_TCUE18-20_004, CONVOCATORIA CONSORCIOTC. PLAN TCUE 2018-2020. Project managed by Fundación General de la Universidad de Salamanca and co-financed with Junta Castilla y León and FEDER funds. This work was partially supported by Fundação para a Ciência e a Tecnologia under Project UIDB/04111/2020. This work was partially funded by FCT/MEC through national funds and co-funded by FEDER–PT2020 partnership agreement under the project UIDB/50008/2020. This work was partially funded by National Funds through the FCT—Foundation for Science and Technology, I.P., within the scope of the project UIDB/00742/2020.

Conflicts of Interest

The authors declare no conflict of interest.

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

data privacy act case study

Tags: ATTY. EPHRAIM GARNET M. SALEM , RA 10173 , DATA PRIVACY ACT OF THE PHILIPPINES , BPO INDUSTRY , FIRST CONVICTION

data privacy act case study

First Conviction Under the Data Privacy Act RA10173

First conviction in the philippines for ra 10173 by the benitez salem baldonado law firm.

Recently, the Benitez Salem Baldonado Law Firm secured the country’s first-ever conviction for a crime involving R.A. No. 10173 otherwise known as the “ Data Privacy Act of 2012 .” On February 6, 2017, Presiding   Judge Hon. Carlito B. Calpatura   of Branch 145 of the Regional Trial Court (RTC) of Makati City handed down the judgment against the female accused in criminal case no. 16-01376 after the latter pleaded guilty to the charge.

The Complaint

According to the Complaint filed by the complainant BPO on June 4, 2015, it was alleged that the accused accessed several credit card accounts of a client credit card company without a call or actual request from their real owners. Furthermore, according to the Complaint, the accused also illegally accessed personal identification cards and changed them into temporary PINs, and subsequently, a consistent amount of $500.00 was withdrawn as cash advances from all the said credit cards.

It was stated in the Information that the accused, “being a   customer care professional ” of a   multinational BPO company in the Philippines   “unlawfully, willfully and feloniously accessed and processed without authority” the account of one of said company’s American client account “by enrolling it to express cash and issuing a temporary PIN for the said account, for the unauthorized purpose of withdrawing $500 from the said account,” which was in violation of Section 28 of Republic Act (R.A.) No. 10173 otherwise known as the “Data Privacy Act of 2012”.

According to the dispositive portion of the Judgment, the accused was sentenced to suffer   imprisonment for one (1) year and six (6) months minimum and five (5) years as maximum , and   a fine of Five Hundred Thousand Pesos (PhP 500,000.00)   pursuant to Sec. 28 of the R.A. 10173.

Philippine BPO Industry

According to   Atty. Ferdinand S. Benitez , one of the founding partners of   Benitez Salem Baldonado Law Firm  who  actively handled the prosecution of the case, this is a great development not only for our client but more importantly the entire local BPO industry. Indeed, the Philippines has become a major hub for international business process outsourcing companies (BPO).   Industry leaders have projected its continuous expansion in the coming years. Information technology, being the backbone of the industry, is dynamic and fast-paced in all aspects. Unfortunately, these same characteristics make the industry vulnerable to   online fraud ,   hacking, and other cybercrimes , which greatly affects the trust of multinational corporations in investing in the Philippines.

Link to full article:

http://www.bsblaw.ph/philippines-first-conviction-under-the-data-privacy-act-of-2012-by-atty-ephraim-garnet-m-salem/

Archive One   is a document management system designed to help companies easily classify, store, secure, and retrieve essential company documents that are needed for retention and audits. With the help of our partners, we   provide an end-to-end document management solution from scanning to document storage, at a highly competitive price.

data privacy act case study

Achieve documentation compliance with Archive One! Book a 1:1 appointment now.

Featured Article

The biggest data breaches in 2024: 1 billion stolen records and rising

Thanks to unitedhealth, snowflake and at&t (twice).

render of a data breach

We’re over halfway through 2024, and already this year we have seen some of the biggest, most damaging data breaches in recent history. And just when you think that some of these hacks can’t get any worse, they do.

From huge stores of customers’ personal information getting scraped, stolen and posted online, to reams of medical data covering most people in the United States getting stolen, the worst data breaches of 2024 to date have already surpassed at least 1 billion stolen records and rising. These breaches not only affect the individuals whose data was irretrievably exposed, but also embolden the criminals who profit from their malicious cyberattacks.

Travel with us to the not-so-distant past to look at how some of the biggest security incidents of 2024 went down, their impact and. in some cases, how they could have been stopped. 

AT&T’s data breaches affect “nearly all” of its customers, and many more non-customers

For AT&T, 2024 has been a very bad year for data security. The telecoms giant confirmed not one, but two separate data breaches just months apart.

In July, AT&T said cybercriminals had stolen a cache of data that contained phone numbers and call records of “nearly all” of its customers, or around 110 million people , over a six-month period in 2022 and in some cases longer. The data wasn’t stolen directly from AT&T’s systems, but from an account it had with data giant Snowflake (more on that later).

Although the stolen AT&T data isn’t public (and one report suggests AT&T paid a ransom for the hackers to delete the stolen data ) and the data itself does not contain the contents of calls or text messages, the “metadata” still reveals who called who and when, and in some cases the data can be used to infer approximate locations. Worse, the data includes phone numbers of non-customers who were called by AT&T customers during that time. That data becoming public could be dangerous for higher-risk individuals , such as domestic abuse survivors.

That was AT&T’s second data breach this year. Earlier in March, a data breach broker dumped online a full cache of 73 million customer records to a known cybercrime forum for anyone to see, some three years after a much smaller sample was teased online.

The published data included customers’ personal information, including names, phone numbers and postal addresses, with some customers confirming their data was accurate . 

But it wasn’t until a security researcher discovered that the exposed data contained encrypted passcodes used for accessing a customer’s AT&T account that the telecoms giant took action. The security researcher told TechCrunch at the time that the encrypted passcodes could be easily unscrambled, putting some 7.6 million existing AT&T customer accounts at risk of hijacks. AT&T force-reset its customers’ account passcodes after TechCrunch alerted the company to the researcher’s findings. 

One big mystery remains: AT&T still doesn’t know how the data leaked or where it came from . 

Change Healthcare hackers stole medical data on “substantial proportion” of people in America

In 2022, the U.S. Justice Department sued health insurance giant UnitedHealth Group to block its attempted acquisition of health tech giant Change Healthcare, fearing that the deal would give the healthcare conglomerate broad access to about “half of all Americans’ health insurance claims” each year. The bid to block the deal ultimately failed. Then, two years later, something far worse happened: Change Healthcare was hacked by a prolific ransomware gang; its almighty banks of sensitive health data were stolen because one of the company’s critical systems was not protected with multi-factor authentication .

The lengthy downtime caused by the cyberattack dragged on for weeks, causing widespread outages at hospitals, pharmacies and healthcare practices across the United States. But the aftermath of the data breach has yet to be fully realized, though the consequences for those affected are likely to be irreversible. UnitedHealth says the stolen data — which it paid the hackers to obtain a copy — includes the personal, medical and billing information on a “substantial proportion” of people in the United States. 

UnitedHealth has yet to attach a number to how many individuals were affected by the breach. The health giant’s chief executive, Andrew Witty, told lawmakers that the breach may affect around one-third of Americans , and potentially more. For now, it’s a question of just how many hundreds of millions of people in the U.S. are affected. 

Synnovis ransomware attack sparked widespread outages at hospitals across London 

A June cyberattack on U.K. pathology lab Synnovis — a blood and tissue testing lab for hospitals and health services across the U.K. capital — caused ongoing widespread disruption to patient services for weeks. The local National Health Service trusts that rely on the lab postponed thousands of operations and procedures following the hack, prompting the declaration of a critical incident across the U.K. health sector.

A Russia-based ransomware gang was blamed for the cyberattack, which saw the theft of data related to some 300 million patient interactions dating back a “significant number” of years. Much like the data breach at Change Healthcare, the ramifications for those affected are likely to be significant and life-lasting. 

Some of the data was already published online in an effort to extort the lab into paying a ransom. Synnovis reportedly refused to pay the hackers’ $50 million ransom , preventing the gang from profiting from the hack but leaving the U.K. government scrambling for a plan in case the hackers posted millions of health records online. 

One of the NHS trusts that runs five hospitals across London affected by the outages reportedly failed to meet the data security standards as required by the U.K. health service in the years that ran up to the June cyberattack on Synnovis.

Ticketmaster had an alleged 560 million records stolen in the Snowflake hack

A series of data thefts from cloud data giant Snowflake quickly snowballed into one of the biggest breaches of the year, thanks to the vast amounts of data stolen from its corporate customers. 

Cybercriminals swiped hundreds of millions of customer data from some of the world’s biggest companies — including an alleged 560 million records from Ticketmaster , 79 million records from Advance Auto Parts and some 30 million records from TEG — by using stolen credentials of data engineers with access to their employer’s Snowflake environments. For its part, Snowflake does not require (or enforce) its customers to use the security feature, which protects against intrusions that rely on stolen or reused passwords. 

Incident response firm Mandiant said around 165 Snowflake customers had data stolen from their accounts, in some cases a “significant volume of customer data.” Only a handful of the 165 companies have so far confirmed their environments were compromised, which also includes tens of thousands of employee records from Neiman Marcus and Santander Bank , and millions of records of students at Los Angeles Unified School District . Expect many Snowflake customers to come forward. 

(Dis)honorable mentions

Cencora notifies over a million and counting that it lost their data:

U.S. pharma giant Cencora disclosed a February data breach involving the compromise of patients’ health data, information that Cencora obtained through its partnerships with drug makers. Cencora has steadfastly refused to say how many people are affected, but a count by TechCrunch shows well over a million people have been notified so far. Cencora says it’s served more than 18 million patients to date. 

MediSecure data breach affects half of Australia:

Close to 13 million people in Australia — roughly half of the country’s population — had personal and health data stolen in a ransomware attack on prescriptions provider MediSecure in April. MediSecure, which distributed prescriptions for most Australians until late 2023, declared insolvency soon after the mass theft of customer data.

Kaiser shared health data on millions of patients with advertisers:

U.S. health insurance giant Kaiser disclosed a data breach in April after inadvertently sharing the private health information of 13.4 million patients, specifically website search terms about diagnoses and medications, with tech companies and advertisers. Kaiser said it used their tracking code for website analytics. The health insurance provider disclosed the incident in the wake of several  other telehealth startups, like Cerebral , Monument and Tempest , admitting they too shared data with advertisers.

USPS shared postal address with tech giants, too:

And then it was the turn of the U.S. Postal Service caught sharing postal addresses of logged-in users with advertisers like Meta, LinkedIn and Snap, using a similar tracking code provided by the companies. USPS removed the tracking code from its website after TechCrunch notified the postal service in July of the improper data sharing, but the agency wouldn’t say how many individuals had data collected. USPS has over 62 million Informed Delivery users as of March 2024.

Evolve Bank data breach affected fintech and startup customers:

A ransomware attack targeting Evolve Bank saw the personal information of more than 7.6 million people stolen by cybercriminals in July. Evolve is a banking-as-a-service giant serving mostly fintech companies and startups , like Affirm and Mercury. As a result, many of the individuals notified of the data breach had never heard of Evolve Bank, let alone have a relationship with the firm, prior to its cyberattack.

More TechCrunch

Get the industry’s biggest tech news, techcrunch daily news.

Every weekday and Sunday, you can get the best of TechCrunch’s coverage.

Startups Weekly

Startups are the core of TechCrunch, so get our best coverage delivered weekly.

TechCrunch Fintech

The latest Fintech news and analysis, delivered every Tuesday.

TechCrunch Mobility

TechCrunch Mobility is your destination for transportation news and insight.

Made by Google 2024: A few AI features you might’ve missed

We rounded up some of the more intriguing AI-related announcements that didn’t get a ton of play, like Pixel Studio.

Made by Google 2024: A few AI features you might’ve missed

Thiel’s Gawker takedown could be coming to a theater near you

Ben Affleck and Matt Damon have acquired a screenplay called “Killing Gawker,” which presumably delves into billionaire VC Peter Thiel’s campaign to bury the media outfit for posting excerpts from…

Thiel’s Gawker takedown could be coming to a theater near you

Gemini Live first look: Better than talking to Siri, but worse than I’d like

Google launched Gemini Live during its Made by Google event in Mountain View, California, on Tuesday. The feature allows you to have a semi-natural spoken conversation, not typed out, with…

Gemini Live first look: Better than talking to Siri, but worse than I’d like

Texas sues GM, saying it tricked customers into sharing driving data sold to insurers

Texas filed a lawsuit Tuesday against GM over years of alleged abuse of customers’ data and trust. New car owners were presented with a “confusing and highly misleading” process that…

Texas sues GM, saying it tricked customers into sharing driving data sold to insurers

Chinese robotaxi startup WeRide gets approval to carry passengers in California 

Chinese autonomous vehicle company WeRide has received the green light to test its driverless vehicles with passengers in California.  The step comes as WeRide begins the process to go public…

Chinese robotaxi startup WeRide gets approval to carry passengers in California 

Winning a gold medal is a lot like being a VC, according to Olympic champion Kristen Faulkner

Kristen Faulkner astonishing Olympic success of two gold medals stems from lessons learned from her former career as a venture capitalist, she says.

Winning a gold medal is a lot like being a VC, according to Olympic champion Kristen Faulkner

California AI bill SB 1047 aims to prevent AI disasters, but Silicon Valley warns it will cause one

SB 1047 has drawn the ire of Silicon Valley players large and small, including venture capitalists, big tech trade groups, researchers and startup founders.

California AI bill SB 1047 aims to prevent AI disasters, but Silicon Valley warns it will cause one

The first post-quantum cryptography standards are here

For many companies, this also means that now is the time to start implementing these algorithms.

The first post-quantum cryptography standards are here

UAW files federal labor charges against Trump, Musk for intimidating workers at X Spaces event

The United Auto Workers union said Tuesday that it filed federal labor charges against Donald Trump and Elon Musk. The union alleges that Trump and Musk attempted to “threaten and…

UAW files federal labor charges against Trump, Musk for intimidating workers at X Spaces event

Pixel Watch 3 adds a life-saving ‘loss of pulse’ detection feature

With the introduction of the Pixel Watch 3 smartwatches, which now come in two sizes, Google is also introducing a new, potentially life-saving feature: loss of pulse detection. At the…

Pixel Watch 3 adds a life-saving ‘loss of pulse’ detection feature

Google’s Made You Look uses Pixar characters to trick kids into smiling for the camera

Made You Look will be available on the Pixel 9 Pro Fold when it launches next month.

Made by Google 2024: How to watch Google unveil the Pixel 9, a new foldable and more

Made by Google 2024 kicks off at 10 a.m. PT on August 13. Get ready for a slew of new hardware, including the Pixel 9 and a new foldable.

Made by Google 2024: How to watch Google unveil the Pixel 9, a new foldable and more

Pixel phones get an AI-powered weather app

With the new app, users “won’t have to scroll through a bunch of numbers to get a sense of the day’s weather,” according to Google.

Pixel phones get an AI-powered weather app

Made by Google 2024: All of Google’s reveals, from the Pixel 9 lineup to Gemini AI’s addition to everything

Let’s dive right into what the Google Pixel 9 lineup looks like, how Google’s Gemini AI will be incorporated in the devices, and more.

Made by Google 2024: All of Google’s reveals, from the Pixel 9 lineup to Gemini AI’s addition to everything

Threads may offer its own take on Fleets with a disappearing posts feature

Meta has been spotted working on a new feature that would allow a post — as well as all its replies — to disappear in 24 hours.

Threads may offer its own take on Fleets with a disappearing posts feature

Tesla posts job listing for the 1950s-style diner Elon Musk has proposed building

In line with Elon Musk’s increasing rhetoric on X of going back to the good old days, Tesla looks like it’s finally getting ready to open its 1950s-style diner.  Tesla…

Tesla posts job listing for the 1950s-style diner Elon Musk has proposed building

Google adds new AI-powered features for photo editing and image generation

Google is adding features for photo editing, plus new apps for storing and searching through screenshots on-device and an AI-powered studio for image generation.

Google adds new AI-powered features for photo editing and image generation

Google’s Pixel Buds Pro 2 bring Gemini to your ears

The $229 Pixel Buds Pro 2 start shipping September 26.

Google’s Pixel Buds Pro 2 bring Gemini to your ears

Google Gemini is the Pixel 9’s default assistant

At Made by Google 2024, the company announced that the new Pixel 9 phones are the first devices to ship with Gemini as the assistant by default.

Google Gemini is the Pixel 9’s default assistant

Google’s Pixel Watch 3 comes in two sizes

In addition to the Pixel Watch 3’s 41mm model, the smartwatch will also be available in 45mm.

Google’s Pixel Watch 3 comes in two sizes

Gemini Live, Google’s answer to ChatGPT’s Advanced Voice Mode, launches

Gemini Live lets users have “in-depth” voice chats with Gemini, Google’s generative AI-powered chatbot, on their smartphones.

Gemini Live, Google’s answer to ChatGPT’s Advanced Voice Mode, launches

Google’s Pixel 9 line offers more size options, better cameras and Gemini by default

The Pixel 9 starts at $799, the Pixel Pro at $999 and Pixel 9 Pro XL at $1,099. Preorders open Tuesday.

Google’s Pixel 9 line offers more size options, better cameras and Gemini by default

Google’s $1,799 Pixel 9 Pro Fold arrives with 8-inch inner display and Gemini

Time will tell whether the new “Pro” bit in Pixel 9 Fold presages the arrival of a lower-cost foldable, or if it’s simply a nod to the high-end pricing.

Google’s $1,799 Pixel 9 Pro Fold arrives with 8-inch inner display and Gemini

US appeals court rules geofence warrants are unconstitutional

The U.S. Appeals Court for the Fifth Circuit said geofence search warrants are “categorically prohibited” under the Fourth Amendment.

US appeals court rules geofence warrants are unconstitutional

Payoneer scoops up Skuad, Robinhood’s strong Q2, and X is making progress on payments

Welcome to TechCrunch Fintech! This week, we’re looking at Payoneer’s $61 million acquisition of Skuad, Robinhood and Dave’s second-quarter results, X’s progress on its payments and more. To get a…

Payoneer scoops up Skuad, Robinhood’s strong Q2, and X is making progress on payments

Encord lands new cash to grow its data dev tools for AI

Labeling and annotation platforms might not get the attention flashy new generative AI models do. But they’re essential. The data on which many models train must be labeled, or the…

Encord lands new cash to grow its data dev tools for AI

Singaporean investment app Syfe pulls in $27M to hasten growth in Asia Pacific

The Asia Pacific region has long been an important market for wealth management firms with the plethora of developing economies and a burgeoning retail investment market. But there’s still a…

Singaporean investment app Syfe pulls in $27M to hasten growth in Asia Pacific

Twitch rolls out video stories to challenge Instagram

After launching photo and text stories last year, Twitch is now introducing video stories. Streamers can now film 60-second videos in the Twitch mobile app or upload a video from…

Twitch rolls out video stories to challenge Instagram

Flipboard users can now follow anyone in the fediverse, including those on Threads

With the update, any Flipboard user can follow user profiles from any other federated service.

Flipboard users can now follow anyone in the fediverse, including those on Threads

ArborXR secures $12M to boost its management platform for AR and VR devices

ArborXR, a startup that helps companies remotely manage AR and VR devices, believed that enterprise customers would be the primary targets for AR and VR devices. Now, that bet is…

ArborXR secures $12M to boost its management platform for AR and VR devices

data privacy act case study

IMAGES

  1. 1 Data Privacy Act Presentation.pptx

    data privacy act case study

  2. Data Privacy

    data privacy act case study

  3. Data Privacy Act (Pertinent Codal Provisions)

    data privacy act case study

  4. Learn More About the Data Privacy Act of 2012

    data privacy act case study

  5. (PDF) 2019 Bar Notes on Data Privacy Act Data Privacy Act of 2012

    data privacy act case study

  6. Data privacy pro and con

    data privacy act case study

COMMENTS

  1. Top 10 Privacy and Data Protection Cases of 2021: A selection

    Inforrm covered a wide range of data protection and privacy cases in 2021. Following my posts in 2018, 2019 and 2020 here is my selection of most notable privacy and data protection cases across 2021:. Lloyd v Google LLC [2021] UKSC 50 In the most significant privacy law judgment of the year the UK Supreme Court considered whether a class action for breach of s4(4) Data Protection Act 1998 ...

  2. Case Studies: High-Profile Cases of Privacy Violation

    The settlement: On June 15, 2018, the enforcement action brought by the FTC led to a shutdown of the website and permanently prohibited the defendants from posting intimate photos and personal information of other individuals without their consent. The defendants were also ordered to pay more than $2 million. 3.

  3. A bipartisan data-privacy law could backfire on small businesses − 2

    The goal of privacy regulation, we argue, should be to give consumers control of their data rather than to slow the flow of data for all. Coarser personalization can exclude marginalized consumer ...

  4. Data Privacy Act of 2012: A Case Study Approach to Philippine

    The study was a form of a qualitative case study following the context of (R. K. Yin, Case Study Research (2014)) study of research designs and methods. The case study is the recommended approach ...

  5. Understanding Philippine national agency's commitment on data privacy

    Understanding Philippine national agency's commitment on data privacy act of 2012: a case study perspective Authors : Vicente A. Pitogo , Michelle Renee D. Ching Authors Info & Claims ICEEG '18: Proceedings of the 2nd International Conference on E-commerce, E-Business and E-Government

  6. An Ethical Approach to Data Privacy Protection

    Data privacy (or information privacy or data protection) is about access, use and collection of data, and the data subject's legal right to the data. This refers to: Freedom from unauthorized access to private data; Inappropriate use of data; Accuracy and completeness when collecting data about a person or persons (corporations included) by ...

  7. PDF Data Privacy Act of 2012: A Case Study Approach to Philippine

    Based on the study conducted by Bonsol, Carillo, Ching M., Ching K, Duran, and Tatel (2011), the top three information systems that are prioritized for development were the (1) Automated Election ...

  8. Data Privacy Act of 2012: A case study approach to Philippine

    The NPC was also included in the study to determine the status of the government's compliance with the law. The study was a form of a qualitative case study following the context of (R. K. Yin, Case Study Research (2014)) study of research designs and methods. The case study is the recommended approach as the main question starts with how and ...

  9. Data privacy act of 2012 compliance performance of Philippine

    Data privacy act of 2012 compliance performance of Philippine government agencies: a case study approach Authors : Michelle Renee D. Ching , Nelson J. Celis Authors Info & Claims ICEEG '18: Proceedings of the 2nd International Conference on E-commerce, E-Business and E-Government

  10. Data privacy act of 2012 compliance performance of Philippine

    Understanding Philippine national agency's commitment on data privacy act of 2012: a case study perspective. Conference Paper. Jun 2018; Vicente Pitogo; Michelle Renee Domingo Ching;

  11. PDF ODC, Complainant, NPC Case No. 17-001

    ODB & AE, Respondents. ------------xRESOLUTIONLIBORO, P.C.For this Commission's resolution is the Motion for Reconsideration dated 20 December 2017 assailing the Commissio. er 2017.The facts are the following:On 3 February 2017, Complainant filed a formal complaint before this Commission alleging that Respondent ODB, without consent, deducted ...

  12. The Changing Wind of Data Privacy Law: A Comparative Study of the

    as a case study to determine whether the U.S. data privacy environment is veering away from its hands-off approach and drawing closer to the comprehensive approach of the EU data privacy regime.

  13. The Cadajas case and data privacy

    This perspective lends itself to the view that not all privacy issues are data privacy issues. Meanwhile, even if it had adopted a narrower view and applied the DPA's provisions to the case, it was still possible for the Court to uphold the data processing activities that went on.

  14. National Government Agency's Compliance on Data Privacy Act of 2012 a

    This empirical qualitative research using case study approach aims to discover and describe why and how the country's frontline agency (Agency A) complies with DPA of 2012. This paper also seeks to determine the challenges and better practices encountered by the agency relative to compliance and how far the level of commitment does Agency A has ...

  15. National Government Agency's Compliance on Data Privacy Act of 2012 a

    Purpose-led Publishing is a coalition of three not-for-profit publishers in the field of physical sciences: AIP Publishing, the American Physical Society and IOP Publishing.. Together, as publishers that will always put purpose above profit, we have defined a set of industry standards that underpin high-quality, ethical scholarly communications.

  16. GDPR & Data Protection Act Case Studies

    At The DPO Centre, we help organisations of all types to comply with UK and EU GDPR and the other UK, EU and global data protection laws. Our services will help your organisation to better understand your data and current level of compliance. We provide tailored advice, expertise and resources that are backed up by the support, shared best ...

  17. Data Privacy Act of 2012: A Case Study Approach to Philippine

    This qualitative research using case study technique aimed to evaluate, explore and explain the level of compliance of Bukidnon State University (BukSU) with RA 10173, finding out that BukSU, just like most government agencies in the Philippines, is qualitatively described as Partial Compliant.

  18. Case studies and examples

    Here are some case studies additional to those in the code. Data sharing to improve outcomes for disadvantaged children and families. Sharing with partners in the voluntary or private sector. Landlord and tenant data sharing. Sharing medical records of care home residents. Ensuring children's welfare: data sharing by local authorities with ...

  19. Perceptions of Students, Faculty and Administrative Staff on the Data

    Specifically, some of the functions of the NPC include: 1) ensuring the compliance of personal information controllers with the provisions of the Act, 2) receiving complaints, instituting investigations, facilitating or enabling settlement of complaints through the use of alternative dispute resolution processes, 3) monitoring the compliance of ...

  20. Perceptions of Students, Faculty and Administrative Staff on the Data

    The study was a form of a qualitative case study following the context of (R. K. Yin, Case Study Research (2014)) study of research designs and methods. The case study is the recommended approach ...

  21. A Case Study on the Development of a Data Privacy Management Solution

    This use case includes some of the attributes present in the proposed taxonomic definition: privacy, represented by the data which the patient grants access to and is registered in the systems and the wearable IoT device; user, represented by the patient and the nurse; environment, represented by the screening room; device, represented by the ...

  22. Top 10 Privacy and Data Protection Cases of 2018: a selection

    The British Broadcasting Corporation [2018] EWHC 1837 (Ch). This was Sir Cliff Richard's privacy claim against the BBC and was the highest profile privacy of the year. The claimant was awarded damages of £210,000. We had a case preview and case reports on each day of the trial and posts from a number of commentators including Paul Wragg ...

  23. First Conviction Under the Data Privacy Act RA10173

    It was stated in the Information that the accused, "being a customer care professional" of a multinational BPO company in the Philippines "unlawfully, willfully and feloniously accessed and processed without authority" the account of one of said company's American client account "by enrolling it to express cash and issuing a ...

  24. California's Invasion of Privacy Act: A New Frontier for Website

    A frequently cited case in these lawsuits is Greenley v. Kochava. In this case, the court denied the defendant's motion to dismiss and rejected the argument that a privacy company's surreptitiously embedded software did not constitute a "pen register."

  25. The biggest data breaches in 2024: 1 billion stolen ...

    Evolve Bank data breach affected fintech and startup customers: A ransomware attack targeting Evolve Bank saw the personal information of more than 7.6 million people stolen by cybercriminals in July.

  26. (PDF) National Government Agency's Compliance on Data Privacy Act of

    Thus, this study aimed to create new and simple management security standards utilizing the PDCA model and ISO/IEC 27001 framework with the guidance of the Republic Act 10173, also known as the ...

  27. Microsoft Power BI and Microsoft Defender for Cloud

    Ensure that the necessary permissions are in place for accessing the required ARG data. Import Data: The Power BI report is set up to query ARG data and import the full dataset, bypassing the 1000-record limit. You can modify the queries if needed to suit your specific requirements. Review the imported data to ensure completeness and accuracy.

  28. Data Beta: Statement on Financial Data Transparency Act Joint Data

    See Release at 31 ("For the joint standard for data transmission and schema and taxonomy formats, the Agencies propose to establish that the data transmission or schema and taxonomy formats used have, to the extent practicable, four properties, derived from the requirements listed in section 124(c)(1)(B) of the Financial Stability Act.").."). Section 5811 of the FDTA amends subtitle A of ...