Healthcare Data Privacy

Senator Calls for FTC, SEC to Hold Data Broker Accountable for Misuse of Geolocation Data

U.S. Senator Ron Wyden (D-OR) has written to the Federal Trade Commission (FTC) and the Securities and Exchange Commission (SEC) calling for action to be taken to protect consumers and investors from “the outrageous conduct” of the publicly traded data broker, Near Intelligence Inc. Sen. Wyden launched an investigation of Near Intelligence in May 2023 after a report in The Wall Street Journal revealed that the Wisconsin-based nonprofit anti-abortion group, The Veritas Society, had used geolocation data obtained from Near Intelligence to conduct a misinformation campaign targeting women suspected of seeking abortions.

Geolocation data is collected through code incorporated into mobile phone apps. The code receives location data and transmits it, along with other information, from the user’s device. The data collected reveals a person’s movements, including visits to sensitive locations such as reproductive health clinics, places of worship, and healthcare providers. The geolocation data can be tied to an individual and reveals how long they were present at a particular location, with the data accurate to within a few meters.

The Veritas Society’s advertising agency, Recrue Media, used Near Intelligence to obtain the geolocation data of individuals who visited Planned Parenthood clinics and used that data for the advertising campaign. Recrue Media conducted the campaign for The Veritas Society from November 2019 through the summer of 2022, when Roe v. Wade was overturned by the Supreme Court’s decision in Dobbs v. Jackson Women’s Health Organization.

Sen. Wyden spoke with Steven Bogue, Co-Founder and Managing Principal of Recrue Media, on May 19, 2023, who revealed that to conduct the targeted campaign, his employees used the Near Intelligence website to geofence Planned Parenthood clinics and parking lots. Individuals who visited any of the 600 Planned Parenthood clinics in 48 states were then targeted. The Veritas Society said that in 2020 alone, it conducted a campaign that served 14.3 million ads to women who had visited abortion clinics, with the ads pushed out to their social media pages on Facebook, Instagram, and Snapchat.
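Geofencing of this kind typically reduces to a point-in-radius test against a target’s coordinates. A minimal sketch in Python of how such a check works (the coordinates, radius, and ping format here are hypothetical illustrations, not Near Intelligence’s actual implementation):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_geofence(ping, fence_lat, fence_lon, radius_m):
    """True if a device location ping falls inside a circular geofence."""
    return haversine_m(ping["lat"], ping["lon"], fence_lat, fence_lon) <= radius_m

# Hypothetical 100 m geofence around a clinic, and two device pings
fence = (43.0731, -89.4012, 100)
inside = {"lat": 43.0733, "lon": -89.4010}   # roughly 25 m away
outside = {"lat": 43.0831, "lon": -89.4012}  # roughly 1.1 km away
print(in_geofence(inside, *fence), in_geofence(outside, *fence))  # -> True False
```

With device identifiers attached to each ping, matching pings against such fences is enough to build an audience list for targeted advertising, which is why the precision of the data matters.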

A second investigation by The Wall Street Journal into Near Intelligence revealed in October 2023 that the company had also sold geolocation data to the U.S. government. Near Intelligence had provided the data to a defense contractor, which sold the data to the Defense Department and U.S. intelligence agencies. Sen. Wyden spoke with Near Intelligence’s Chief Privacy Officer, Jay Angelo, who explained that the company did not have the technical capabilities to prevent customers from targeting individuals who visited sensitive locations. He also confirmed that Near Intelligence had been providing location data to the defense contractor, AELIUS Exploitation Technologies, for three years and that the geolocation data had been collected without user consent. The Near Intelligence website stated that the data collected would not be provided to governments. Angelo joined Near Intelligence in June 2022 and conducted a review of the company’s practices, which revealed the company was facilitating the sale of geolocation data to the U.S. government. When the review was concluded, those statements were removed from the website.

Near Intelligence had a particularly bad financial year and has filed for bankruptcy. A statement provided in its December 11, 2023 bankruptcy hearing confirmed that former executives are under criminal investigation and that the SEC has initiated an investigation of the company related to a data breach in France, which involved transferring the data of E.U. citizens to the U.S. government.

The Federal Trade Commission is cracking down on the collection and sale of geolocation data that has been obtained without consent and has recently settled a complaint with the data broker X-Mode Social/Outlogic. Sen. Wyden requested that FTC Chair Lina Khan prevent Near Intelligence from selling off the data it has collected to another company or data broker during the bankruptcy proceedings and ensure that the geolocation and device data it holds is permanently deleted. Sen. Wyden explained that in this instance, The Veritas Society conducted a misinformation campaign, but the same geolocation data could be used by right-wing prosecutors in states with abortion bans to prosecute women who visit abortion clinics in states where abortion is legal.

Sen. Wyden also requested that SEC Chair Gary Gensler expand the SEC’s investigation of Near Intelligence and investigate whether the misleading statements Near Intelligence provided to Congress about whether geolocation data was obtained with users’ consent violated securities laws. “Federal watchdogs should hold [Near Intelligence] accountable for abusing Americans’ private information,” said Sen. Wyden. “And Congress needs to step up as soon as possible to ensure extremist politicians can’t buy this kind of sensitive data without a warrant.”

The post Senator Calls for FTC, SEC to Hold Data Broker Accountable for Misuse of Geolocation Data appeared first on HIPAA Journal.

California AG Agrees $5 Million Settlement with Quest Diagnostics Over Improper Disposal of Waste and Patient Data

California Attorney General Rob Bonta has announced that a $5 million settlement has been reached with Quest Diagnostics to resolve allegations that it illegally dumped hazardous and medical waste and disposed of the unredacted personal health information of patients in regular trash dumpsters. The investigation into Quest Diagnostics’ business practices involved 30 inspections at four Quest Diagnostics laboratories and several of its patient service centers in the state to determine whether Quest Diagnostics was complying with California’s Hazardous Waste Control Law, Medical Waste Management Act, Unfair Competition Law, and civil laws that prohibit the disclosure of the personal health information of Californians.

The inspections included reviews of the contents of compactors and dumpsters at Quest facilities, which found hundreds of containers of chemicals, including reagents and bleach, as well as electronic waste and batteries. The dumpsters also contained medical waste, such as specimen containers holding blood and urine; hazardous waste, such as flammable liquids, solvents, and batteries; and unredacted medical information.

Quest Diagnostics was notified about the findings of the inspections and hired an independent environmental auditor to review its waste disposal policies and procedures, which have now been modified. Staff training on the updated policies and procedures has been provided across its four laboratories and more than 600 patient service centers in the state to ensure full compliance with California laws.

“Quest takes patient privacy and the protection of the environment very seriously and has made significant investments to implement industry best practices to ensure hazardous waste, medical waste, and confidential patient information are disposed of properly,” said a spokesperson for Quest Diagnostics. “These include investing in technologies for treatment of biological waste, secured destruction of patient information, programs to maximize recycling efforts and minimize waste-to-landfill disposal, waste-to-energy recovery of non-recyclable wastes, and enhanced waste audit and inspection measures to ensure continued compliance with applicable laws.”

The settlement includes $3,999,500 in civil monetary penalties, $700,000 in costs, and $300,000 for a Supplemental Environmental Project to support environmental training and enforcement in California, and injunctive relief requiring Quest Diagnostics to maintain an environmental compliance program and hire a third-party waste auditor to conduct annual audits and report on its status. The civil monetary penalties will be divided between 10 California counties. The investigation was a collaboration between the office of Attorney General Bonta and the District Attorney’s offices in Alameda, Los Angeles, Monterey, Orange, Sacramento, San Bernardino, San Joaquin, San Mateo, Ventura, and Yolo counties.

“Quest Diagnostics’ illegal disposal of hazardous and medical waste and patient information put families and communities at risk and endangered our environment,” said Attorney General Rob Bonta. “Let today’s settlement send a clear message that my office will hold corporations, including medical services providers, accountable for violations of state environmental and privacy laws. I appreciate the partnership of the district attorneys’ offices across our state that led to this critical settlement.”

Kaiser Foundation Health Plan, Inc. and Kaiser Foundation Hospitals were also investigated over their waste disposal practices and were similarly found to have improperly disposed of hazardous waste, medical waste, and patient information in violation of state laws. That case was settled for $49 million last September.


HHS Issues Final Rule Modifying the Confidentiality of Substance Use Disorder (SUD) Patient Records Regulations

The U.S. Department of Health and Human Services (HHS) has finalized the proposed modifications to the Confidentiality of Substance Use Disorder (SUD) Patient Records regulations at 42 CFR part 2 (Part 2). “The Final Rule strengthens confidentiality protections while improving care coordination for patients and providers. Patients can seek needed treatment and care for substance use disorder knowing that greater protections are in place to keep their records private, and providers can now better share information to improve patient care,” said OCR Director Melanie Fontes Rainer.

The Part 2 regulations have been in effect since 1975 and protect “records of the identity, diagnosis, prognosis, or treatment of any patient which are maintained in connection with the performance of any program or activity relating to substance use disorder [SUD] education, prevention, training, treatment, rehabilitation, or research, which is conducted, regulated, or directly or indirectly assisted by any department or agency of the United States.” These records are subject to strict protections due to the sensitivity of the information they contain and to avoid deterring people from seeking treatment for SUD due to fears about discrimination and prosecution.

The bipartisan Coronavirus Aid, Relief, and Economic Security Act (CARES Act) called for the Part 2 regulations to be more closely aligned with the Health Insurance Portability and Accountability Act (HIPAA) Privacy, Breach Notification, and Enforcement Rules. On December 2, 2022, the HHS, via the Office for Civil Rights (OCR) and the Substance Abuse and Mental Health Services Administration (SAMHSA), published a Notice of Proposed Rulemaking (NPRM) to implement the changes required by the CARES Act. The comments received from industry stakeholders in response to the NPRM have been considered and appropriate modifications have been made before finalizing the changes.

The modifications include permitting the use and disclosure of Part 2 records based on a single patient consent. Once that consent has been given by a patient it covers all future uses and disclosures for treatment, payment, and health care operations. The final rule also permits disclosure of records without patient consent to public health authorities, provided the records are first deidentified using the methods stated in HIPAA. Redisclosure of Part 2 records by HIPAA-covered entities and business associates is permitted, provided those disclosures are in accordance with the HIPAA Privacy Rule, with certain exceptions. Separate consent is required for the disclosure of SUD clinician notes, which will be handled in the same way that psychotherapy notes are handled under HIPAA.
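For context, the HIPAA de-identification methods referenced above include the Safe Harbor approach, under which specified identifiers must be removed or generalized before records are considered de-identified. A toy sketch of that removal step (the field names and record layout are hypothetical, and the real Safe Harbor standard covers 18 identifier categories, more than shown here):

```python
# Illustrative Safe Harbor-style redaction: strip direct identifiers and
# generalize quasi-identifiers before a record may be disclosed.
DIRECT_IDENTIFIERS = {"name", "email", "phone", "ssn", "mrn"}

def deidentify(record):
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    # Safe Harbor: dates more specific than the year must be removed
    if "admission_date" in out:
        out["admission_year"] = out.pop("admission_date")[:4]
    # Safe Harbor: ZIP codes generalized to the first three digits
    # (and zeroed entirely for sparsely populated ZIP3 areas)
    if "zip" in out:
        out["zip"] = out["zip"][:3] + "00"
    return out

record = {"name": "Jane Doe", "mrn": "12345", "zip": "97210",
          "admission_date": "2023-06-14", "diagnosis": "F10.20"}
print(deidentify(record))
# -> {'zip': '97200', 'diagnosis': 'F10.20', 'admission_year': '2023'}
```

The point of the generalization steps is that the remaining clinical data can still serve public health purposes while no longer pointing to an identifiable patient.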

Patients’ SUD treatment records were already protected and could not be used to investigate or prosecute the patient unless written consent was obtained from the patient or a court order meeting Part 2 requirements was issued. Prohibitions on the use and disclosure of Part 2 records in civil, criminal, administrative, and legislative proceedings have also been expanded in the final rule. The final rule clarifies the steps that investigative agencies must follow to be eligible for safe harbor: before any request for records is made, the agency is required to search the SAMHSA treatment facility directory and check the provider’s Notice of Privacy Practices to determine whether the provider is subject to Part 2.

The final rule gives patients new rights to obtain an “accounting of disclosures,” request restrictions on certain disclosures, and opt out of receiving fundraising communications, as is the case under the HIPAA Privacy Rule. Patients will also be able to file a complaint about Part 2 violations directly with the Secretary. In the event of a breach of Part 2 records, the notification requirements are now the same as those of the HIPAA Breach Notification Rule. The HHS has also been given enforcement authority, including the ability to impose civil monetary penalties for Part 2 violations. The criminal and civil penalties for Part 2 violations will be the same as those for violations of the HIPAA Rules.

Other changes introduced based on comments received on the NPRM include a statement confirming that Part 2 records do not need to be segregated, and a clarification that patient consent for the use and disclosure of records in civil, criminal, administrative, or legislative proceedings may not be combined with patient consent for any other use or disclosure.

“Patient confidentiality is one of the bedrock principles in health care. People who are struggling with substance use disorders must have the same ability to keep their information private as anyone else. This new rule helps to ensure that happens, by strengthening confidentiality protections and improving the integration of behavioral health with other medical records,” said HHS Secretary Xavier Becerra. “The Biden-Harris Administration has made it a priority to end the stigmatization of those living with substance use disorders and give health care providers the tools they need so they can treat the whole patient while continuing to protect patient privacy. We will not rest until behavioral health is fully integrated into health care and those struggling with behavioral health challenges get the best treatment available.”

The final rule is due to be published in the Federal Register in mid-February. The compliance date has been set as 2 years from the date of publication. A fact sheet has been published by the HHS summarizing the changes that have been made in the Final Rule.


1.3 Million-Record Database of Netherlands COVID-19 Testing Lab Exposed Online

A medical laboratory in the Netherlands that served as a COVID-19 testing facility has left a database exposed on the Internet that contained the sensitive data of almost 1.3 million individuals including names, dates of birth, appointment details, email addresses, COVID-19 testing information, and passport numbers.

The exposed database was found by Jeremiah Fowler, co-founder of Security Discovery and security researcher at vpnMentor. The database did not require any authentication to access and the entire database could be accessed by anyone who knew the path name. The database included an estimated 1,285,277 records, including 118,441 certificates, 506,663 appointments, 660,173 testing samples, and a small number of internal application files. The database also contained thousands of QR codes that linked to web pages that included appointment details and email addresses.

The documents had the name and logo of a now-inaccessible website, Coronalab.eu, which belongs to Coronalab. Coronalab is owned by the Amsterdam-based ISO-certified laboratory Microbe & Lab, one of the top two commercial medical test providers in the Netherlands. Fowler tried to contact Coronalab on several occasions to inform the company about the exposed database but received no response. The database remained exposed online for three weeks until Fowler contacted the cloud hosting company, Google, which secured the database to prevent further unauthorized access. It is unclear how long the database had been exposed before its discovery or how many people found it.

Since names, dates of birth, testing information, and email addresses were present in the database, the information could be used by cybercriminals in phishing attacks impersonating Coronalab employees. As Fowler explained, phishing emails could be crafted with information known only to the individuals concerned and Coronalab, increasing the chance of a response. “In my professional opinion, now that the pandemic is mostly behind us, it is time for organizations to review the massive amounts of data they have stored and determine if these records are still needed,” said Fowler. “If they are, organizations must ensure the data is secured from unauthorized access. The records should be encrypted or anonymized to prevent unwanted data exposures or threats from malicious actors.”


White House Announces New Actions in Response to Roe v. Wade

To mark what would have been the 51st anniversary of Roe v. Wade, the White House Task Force on Reproductive Healthcare issued a fact sheet announcing new actions to strengthen access to contraception and medication abortions, and ensure that patients receive the emergency medical care they need.

The Task Force explained that the overturning of Roe v. Wade resulted in extreme state abortion bans. “These dangerous state laws have caused chaos and confusion, as women are being turned away from emergency rooms, forced to travel hundreds of miles, or required to go to court to seek permission for the health care they need,” wrote the Task Force.

The fact sheet explains some of the actions that have been taken by federal agencies in response to President Biden’s three Executive Orders and Presidential Memorandum on access to reproductive health care: strengthening access to contraception and affordability for women with health insurance, reinforcing obligations to cover affordable contraception, educating patients and care providers about rights and obligations for emergency medical care, and protecting access to safe and legal medication abortion.

The Task Force has confirmed that while the overturning of Roe v. Wade removed the Federal right to abortion, it did not prohibit women from traveling to another state to seek the care they need. The Alabama Attorney General had threatened to prosecute people who provided assistance to women seeking lawful out-of-state abortions, and in November 2023, the Department of Justice filed a statement of interest in two lawsuits challenging the Alabama Attorney General’s threats, stating that “prosecutions infringed the constitutional right to travel and made clear that states may not punish third parties for assisting women in exercising that right.”

The HHS has written to U.S. governors to invite them to apply for Section 1115 waivers to expand access to care under the Medicaid program to women who are prohibited from receiving abortion care in the states where they live and may be denied care under the Medicaid program. The HHS continues to encourage state leaders to consider and develop new waiver proposals to support access to reproductive health care services.

In April 2023, the HHS issued a notice of proposed rulemaking that strengthened reproductive health privacy under HIPAA. The proposed rule prevents an individual’s information from being disclosed to investigate, sue, or prosecute an individual, a health care provider, or a loved one simply because that person sought, obtained, provided, or facilitated legal reproductive health care, including abortion. The new rule will strengthen patient-provider confidentiality and help healthcare providers give complete and accurate information to patients.

The Federal Trade Commission (FTC) is taking steps to prevent the illegal use and sharing of sensitive health information, such as reproductive health information, and has already taken action against companies that are alleged to have disclosed sensitive data without consumers’ consent, including precise geolocation information that could indicate a visit to a reproductive health center. In 2022, the FTC sued Kochava over the collection and sale of precise location data and settlements have recently been proposed that prohibit the data companies X-Mode Social/Outlogic and InMarket Media from selling precise location data.

The Federal Communications Commission (FCC) has recently published a new guide for consumers on best practices for protecting personal data, including geolocation data, on mobile phones. The HHS has also published guidance for consumers on how to protect data on personal cell phones or tablets when using mobile health apps, such as period trackers, which are generally not protected by HIPAA.

Guidance has also been issued by the HHS that affirms that doctors and other medical providers can take steps to protect patients’ electronic health information, including reproductive health care information, and confirms that patients have the right to ask that their electronic health information generally not be disclosed by a physician, hospital, or other health care provider. The HHS has also launched a website – ReproductiveRights.gov – that provides individuals with timely and accurate information about their rights concerning reproductive healthcare.

The Department of Education has issued guidance to school officials reminding them of their obligations to protect student privacy under the Family Educational Rights and Privacy Act (FERPA) and that they must obtain written consent from eligible students or parents before disclosing personally identifiable information from students’ educational records, including student health information. The department has also created a new resource for students to explain their rights with respect to health information privacy.


FTC Proposes Settlement Prohibiting InMarket from Selling Consumers’ Precise Location Data

The Federal Trade Commission (FTC) has proposed a settlement with the digital marketing platform provider and data aggregator InMarket Media LLC that resolves allegations that the company’s business practices violated the FTC Act.

According to the FTC complaint, InMarket Media obtains vast amounts of consumer data, including information from mobile devices about consumers’ movements, purchasing habits, demographics, and socioeconomic background. InMarket Media retains consumer data for 5 years and uses that data to facilitate targeted advertising on consumers’ mobile devices through its InMarket Software Development Kit (SDK). InMarket Media categorizes consumers into advertising audiences and allows its clients to target consumers on third-party advertising platforms. The FTC alleges that InMarket Media failed to notify consumers that their personal data would be used to serve targeted advertisements and did not verify that mobile applications incorporating the InMarket SDK had notified consumers about such uses of their personal data.

Apps that incorporate the InMarket SDK request access to location data from the mobile device’s operating system. If the user gives the app those permissions, their precise latitude and longitude will be collected and transmitted back to InMarket Media along with a timestamp and a unique mobile device identifier. When a user is moving, the location data is sent every few seconds. According to the FTC, between 2016 and the present, around 100 million unique devices have transmitted location data to InMarket Media each year.

The location data reveals where the user lives and works, where their children go to school or obtain child care, and where medical treatment is provided, which can reveal the existence of medical conditions. The location data can also reveal other sensitive information such as where they go to rallies, demonstrations, or protests, which can reveal political affiliations. The location data can also be used to determine how long an individual is present in a particular location.
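Determining how long someone was present at a location typically amounts to grouping consecutive timestamped pings at the same place and measuring the span. A simplified sketch with synthetic data (the ping format and place labels are hypothetical illustrations, not InMarket’s actual data model):

```python
from datetime import datetime, timedelta

def dwell_seconds(pings, at_location):
    """Estimate time spent at a location from timestamped pings.

    `pings` is a time-ordered list of (timestamp, place) tuples;
    consecutive pings at `at_location` are treated as one visit.
    """
    total, start = 0, None
    for ts, place in pings + [(None, None)]:  # sentinel flushes the last run
        if place == at_location:
            if start is None:
                start = ts
            last = ts
        elif start is not None:
            total += (last - start).total_seconds()
            start = None
    return total

t0 = datetime(2024, 1, 15, 9, 0)
pings = [(t0 + timedelta(minutes=m), loc) for m, loc in
         [(0, "home"), (30, "clinic"), (35, "clinic"), (55, "clinic"), (70, "home")]]
print(dwell_seconds(pings, "clinic") / 60)  # -> 25.0
```

Even this crude grouping shows why frequent pings are so revealing: a 25-minute stay at a medical facility is a very different inference than a single drive-by ping.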

The FTC alleges InMarket Media misled consumers by providing “misleading half-truths” about its data uses. For instance, the consent screens for the CheckPoints and ListEase apps state that consumers’ data will be used for the app’s functionality such as earning points and keeping lists, but the consent screens do not state that users’ precise location will be collected and transmitted along with data collected from multiple other sources and that the data will be used to build extensive profiles on users to precisely target them with advertising.

While InMarket Media states in its privacy policy that consumer data will be used for targeted advertising, the consent screen does not link to the privacy policy language, and misleading prompts do not inform consumers of the apps’ data collection and use practices. InMarket is alleged to do very little to verify that third-party apps incorporating its SDK obtain informed consumer consent before granting InMarket access to their sensitive location data and does not require apps that incorporate the SDK to obtain informed consumer consent.

Consequently, InMarket does not know whether users of the hundreds of third-party apps that incorporate the InMarket SDK have been informed that their data is being collected and used for targeted advertising. The FTC alleges InMarket Media violated Section 5(a) of the FTC Act, 15 U.S.C. § 45(a), which prohibits unfair or deceptive acts or practices affecting commerce: misrepresentations or deceptive failures to disclose a material fact constitute deceptive or unfair practices under Section 5(a), and the alleged acts are likely to cause substantial injury to consumers that they cannot reasonably avoid.

The complaint alleges four counts of FTC Act violations: unfair collection and use of consumer location data; unfair collection and use of consumer location data from third-party apps; unfair retention of consumer location data; and deceptive failure to disclose InMarket’s use of consumer location data. A settlement has been proposed that prohibits InMarket Media from selling, licensing, transferring, or sharing any product or service that categorizes or targets consumers based on sensitive location data. “All too often, Americans are tracked by serial data hoarders that endlessly vacuum up and use personal information. Today’s FTC action makes clear that firms do not have free license to monetize data tracking people’s precise location,” said FTC Chair Lina M. Khan. “We’ll continue to use all our tools to protect Americans from unchecked corporate surveillance.”

A spokesperson for InMarket Media said the company disagrees with the FTC’s allegations and is expanding its existing sensitive location protections. Also, in December 2023, the company engaged a nonprofit to identify location information close to reproductive healthcare clinics to remove that information from its databases. InMarket Media also confirmed that it is working with its partners to ensure that their notice and consent processes are clear.

The FTC has recently proposed a similar settlement with the data broker X-Mode Social (Outlogic) that also prohibits the sale of precise location data that could be used to track people’s visits to sensitive locations such as medical and reproductive health clinics. The FTC also sued the data broker Kochava for selling geolocation data that could identify visits to sensitive locations.


FTC Prohibits Data Broker from Selling Sensitive Location Data

The Federal Trade Commission (FTC) has announced its first settlement with a data broker over the sale of the precise geolocation data of consumers. Under the terms of the settlement, X-Mode Social is prohibited from selling or sharing sensitive location data with third parties unless it obtains consent from consumers or de-identifies the data.

Virginia-based X-Mode Social, now Outlogic LLC, works with app developers and provides a software development kit (SDK) that can be integrated into smartphone apps, allowing data to be collected via the apps, including precise geolocation data. Precise geolocation data can identify where an individual lives and works, the residences of friends and family members, and other locations they visit. Some of those locations may be highly sensitive, such as places of worship, domestic violence centers, addiction treatment centers, places offering services to the LGBTQIA+ community, and reproductive health facilities. If precise geolocation data is collected that confirms consumers’ visits to sensitive locations such as reproductive health clinics and places of worship, they could face discrimination, physical violence, emotional distress, and other harms. Sen. Ron Wyden determined that X-Mode had sold sensitive location data to U.S. military contractors in 2020, and another customer, a private clinical research company, paid X-Mode for access to consumer information that included visits to medical facilities, pharmacies, and specialty infusion centers across Columbus, Ohio, according to the FTC complaint.

FTC Alleges X-Mode Social Engaged in Unfair and Deceptive Practices

The FTC launched an investigation to determine whether the data broker had engaged in unfair or deceptive acts or practices. The FTC alleged that X-Mode sold raw data to third parties that did not have sensitive locations removed. X-Mode is also alleged to have failed to implement reasonable and appropriate safeguards against downstream use of that data. In addition to purchasing geolocation data from third-party apps, X-Mode also has its own apps – Drunk Mode and Walk Against Humanity. The FTC alleges users of those apps were not fully informed about how precise geolocation data would be used.

According to the FTC, X-Mode did not have policies and procedures in place to remove sensitive locations from its raw data before it was sold. Users of its own apps were not informed about who would receive their data, and safeguards were not put in place to ensure that requests by users to opt out of movement tracking and personalized advertising could be honored. The FTC alleged these failures constituted violations of Section 5 of the FTC Act.

“With this action, the commission rejects the premise so widespread in the data broker industry that vaguely worded disclosures can give a company free license to use or sell people’s sensitive location data,” said FTC Chair Lina M. Khan.

Settlement Reached to Resolve FTC Complaint

Under the terms of the settlement, X-Mode and Outlogic are required to implement a program for maintaining a comprehensive list of sensitive locations, and data relating to those locations cannot be shared, sold, or transferred unless consent is obtained from consumers. X-Mode and Outlogic are also prohibited from using location data when they cannot determine whether a consumer has provided consent.

X-Mode and Outlogic must develop a supplier program to ensure that all companies they purchase data from obtain consumer consent covering the collection, sale, and use of the data. Any precise geolocation data indicating visits to sensitive locations that was collected without consent must be deleted or destroyed, unless it has been de-identified.

X-Mode and Outlogic are also required to implement procedures to ensure that recipients of their location data do not associate the data with locations that provide services to LGBTQ+ people, such as bars or service organizations, or with locations of public gatherings such as political or social demonstrations or protests, and do not use location data to determine the identity or location of a specific individual.

X-Mode and Outlogic must also provide consumers with a simple, easy-to-find method of withdrawing consent to the collection and use of their location data and of requesting that the data be deleted, as well as a clear and concise way for consumers to request that any businesses or individuals that have received their personal data remove location data from commercial databases.

Outlogic’s public relations firm provided a statement in response to the FTC complaint and settlement. “We disagree with the implications of the FTC press release. After a lengthy investigation, the FTC found no instance of misuse of any data and made no such allegation. Since its inception, X-Mode has imposed strict contractual terms on all data customers prohibiting them from associating its data with sensitive locations such as healthcare facilities. Adherence to the FTC’s newly introduced policy will be ensured by implementing additional technical processes and will not require any significant changes to business or products.”

The agreement will be published in the Federal Register and comments will be accepted for 30 days, after which the FTC will decide whether to make the proposed consent order final.

The post FTC Prohibits Data Broker from Selling Sensitive Location Data appeared first on HIPAA Journal.

Michigan Attorney General Calls for New Data Breach Notification Law

Michigan Attorney General Dana Nessel has called for legislative changes to hold companies in the state more accountable for data breaches after Corewell Health failed to promptly disclose a data breach. Corewell Health has been affected by two massive data breaches this year, both of which occurred at vendors and each of which affected more than a million Corewell Health patients. The first breach occurred at Corewell Health vendor Welltok, which had data stolen in May 2023 when the Clop hacking group exploited a vulnerability in Progress Software’s MOVEit Transfer solution. Corewell Health patients were notified about the breach on December 1, 2023, more than 6 months after the breach occurred.


AG Nessel’s comments came in response to a second such breach, which occurred at HealthEC, a vendor used by Corewell Health for analyzing patient data. HealthEC discovered the breach in July 2023 and notified Corewell Health in October that the data of its patients had been compromised. AG Nessel explained that the department in the state that is responsible for consumer protection did not hear about the breach until December 27, 2023, more than 5 months after the breach was detected.

It often takes several months for individual data breach notification letters to be issued, but when sensitive data is stolen it can be misused immediately. Individuals need to know quickly that their data has been stolen so they can take steps to protect themselves against identity theft and fraud. In both cases, complimentary credit monitoring and identity theft protection services have been offered, but some of the affected individuals have already fallen victim to identity theft and fraud. Had those individuals been made aware of the breaches sooner, losses could have been prevented. Nessel is advocating for legislation that requires companies to notify the state immediately when a data breach is discovered.

Currently, 34 U.S. states have laws requiring timely notification of the state Attorney General or state agencies about data breaches that exceed certain thresholds, but there are no such requirements in Michigan. Without mandatory data breach reporting to improve transparency, there is little the state can do regarding enforcement.

“What we would like to be able to do is to say, ‘You know, look, if you don’t properly secure and store data, or if you don’t report a data breach, you’re going to be subjected to significant fines.’ That’s what they do in other states, but not here in Michigan,” said Nessel. “Michigan residents have been subjected to a surge of healthcare-related data breaches and deserve robust protection.”

Michigan could, however, take action over data security failures that result in data breaches by fining companies found to have violated the Health Insurance Portability and Accountability Act (HIPAA). Several state Attorneys General, including those of Connecticut, Indiana, Massachusetts, Minnesota, New York, and New Jersey, have imposed financial penalties for HIPAA violations.


FTC Prohibits Rite Aid from Using Facial Technology System for Surveillance for 5 Years

Rite Aid has been banned from using facial recognition technology for security surveillance for five years as part of a settlement with the Federal Trade Commission (FTC), which determined the pharmacy chain failed to mitigate potential risks to consumers from misidentification.

Between 2012 and 2020, Rite Aid used artificial intelligence-based facial recognition technology in hundreds of its stores to identify customers who may have been engaged in shoplifting or other problematic behaviors. While the system correctly identified many such individuals, it also recorded thousands of false positives, incorrectly matching customers with others who had previously been identified as shoplifters or as having engaged in other problematic behaviors. The misidentified individuals were then erroneously accused of wrongdoing by Rite Aid employees.

The FTC found that the facial recognition technology was more likely to record false positives in communities that were predominantly Black or Asian than in plurality-White communities, indicating bias in the technology and heightened risks to certain consumers because of race or gender.

According to the FTC, Rite Aid contracted with two technology firms to build a database of images and videos of “persons of interest” thought to have engaged in shoplifting or other problematic behaviors in Rite Aid stores, and that database was used for the AI-based facial recognition system. Tens of thousands of images and videos were collected along with names and background information, including criminal history data. Many of the images in the database were of low quality, having been collected from store security cameras, the mobile devices of employees, and, in some cases, news stories. “The technology sometimes matched customers with people who had originally been enrolled in the database based on activity thousands of miles away, or flagged the same person at dozens of different stores all across the United States,” according to the FTC.
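The disparity the FTC described is the kind of thing a basic audit can surface: compare the share of matches that turn out to be false positives across community types. The sketch below is a hypothetical illustration of that audit math — the sample records and group labels are invented, not FTC data:

```python
from collections import defaultdict

# Hypothetical audit records: (community_type, was_false_positive).
# A real audit would draw on per-store match logs with ground-truth review.
matches = [
    ("predominantly_black", True), ("predominantly_black", False),
    ("predominantly_black", True),
    ("plurality_white", False), ("plurality_white", False),
    ("plurality_white", True),
]

def false_positive_share(records):
    """False positives as a share of all recorded matches, per community type."""
    totals, fps = defaultdict(int), defaultdict(int)
    for group, is_fp in records:
        totals[group] += 1
        fps[group] += is_fp  # bool counts as 0 or 1
    return {group: fps[group] / totals[group] for group in totals}

rates = false_positive_share(matches)
# A large gap between groups is a red flag for disparate impact and
# should trigger the kind of pre- and post-deployment testing the FTC
# said Rite Aid never performed.
```

The point of the example is that measuring this disparity requires nothing exotic — only that match outcomes be logged and reviewed, which is part of what the FTC's order now mandates for any future biometric system.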

“Rite Aid’s reckless use of facial surveillance systems left its customers facing humiliation and other harms, and its order violations put consumers’ sensitive information at risk,” said Samuel Levine, Director of the FTC’s Bureau of Consumer Protection. “Today’s groundbreaking order makes clear that the Commission will be vigilant in protecting the public from unfair biometric surveillance and unfair data security practices.”

Rite Aid was alleged to have failed to consider and mitigate the risks to consumers from misidentification, and to have ignored the limitations of the technology, including the high risk of misidentifying Black and Asian individuals. The company did not properly test, assess, measure, document, or inquire about the accuracy of the technology before deployment, failed to prevent low-quality images from being fed into the system, and did not monitor or test its accuracy after deployment. Rite Aid also failed to adequately train the employees tasked with operating the technology and to flag that it could generate false positives.

The FTC also said Rite Aid violated a 2010 FTC data security order, which resolved a complaint that Rite Aid had failed to protect the medical privacy of customers and employees and required the company to implement a comprehensive information security program. For example, the FTC alleged that Rite Aid conducted many security assessments of service providers orally and did not obtain or retain backup documentation of those assessments, including assessments of providers Rite Aid itself considered high-risk.

Rite Aid has been ordered to delete or destroy all photos and videos of consumers used in connection with the operation of the facial recognition or analysis system within 45 days, and within 60 days, to identify all third parties that received photos or videos as part of the facial recognition and analysis and instruct them to also delete the photos and videos.

In addition to the ban on facial recognition technology, Rite Aid is prohibited from using any automated biometric security or surveillance system not otherwise prohibited by the order unless it establishes and maintains a comprehensive monitoring program to identify and address risks that could result in physical, financial, or reputational harm to consumers, stigma, or severe emotional distress.

Rite Aid must also notify consumers when their biometric information is enrolled in a database used in connection with a biometric security or surveillance system and when Rite Aid takes action against them based on an output generated by such a system, and must investigate and respond to consumer complaints about actions taken against them based on an automated biometric security or surveillance system.

Rite Aid said it is pleased to have reached an agreement with the FTC that allows the company to put the matter behind it; however, it said, “We fundamentally disagree with the facial recognition allegations in the agency’s complaint.” Rite Aid also explained that the allegations related to a facial recognition technology pilot program deployed in a limited number of stores. “Rite Aid stopped using the technology in this small group of stores more than three years ago, before the FTC’s investigation regarding the Company’s use of the technology began.” All parties have agreed to the consent order, but it has yet to be approved by a judge.
