Preparation is key to HIPAA compliance for health IT vendors

Health IT vendors are not breach-proof but should be “breach ready,” according to a Health Care Compliance Association webinar entitled “HIPAA: Marketing and Contracting Solutions for Health IT Vendors.” William J. Roberts, partner at Shipman & Goodwin LLP, discussed strategies for vendors to incorporate compliance with the Health Insurance Portability and Accountability Act (HIPAA) (P.L. 104-191) into negotiations, agreements, and policies.

HIPAA landscape

HIPAA privacy continues to grow in importance for the health care sector, for both covered entities and their vendors. Roberts said that health IT vendors face two challenges: managing covered entity customers that have concerns about HIPAA compliance, a “major undertaking” when a vendor has thousands of such customers, and a regulatory and enforcement landscape that is shifting its focus from covered entities to vendors (see 2017 OCR resolution agreements off to a strong start, June 30, 2017; Business associates no longer second to covered entities as OCR increases focus, November 22, 2016). He pointed out that 60 percent of business associates have suffered a data breach and that in 2016 HHS imposed a $650,000 penalty in the first HIPAA enforcement action against a business associate (see $650K payment, 6 year CAP resolve nursing home ePHI loss, July 1, 2016).

Pitches

A vendor should have developed a formal HIPAA compliance program before reaching out to potential customers, and HIPAA compliance should be at the forefront of a vendor’s pitch or response to a request for proposals. The vendor should provide a summary of its HIPAA compliance policies, including their establishment, review, security measures, and training. A policy summary, said Roberts, is preferable to disclosing the policies themselves, which would be a “roadmap to being hacked.” Roberts also advised vendors to highlight certifications and to set forth clear expectations for the privacy aspects of the proposed relationship.

Business associate agreements

The business associate agreement is a vendor’s first opportunity to make a good impression regarding its commitment to privacy. Vendors should have at least one template agreement, and possibly several tailored to different types of customers. Roberts advised knowing what the vendor can and cannot agree to before a negotiation begins and educating the sales team to avoid later back-pedaling on a promise. He also suggested empowering the customer by providing a “menu” of choices acceptable to the vendor, such as a bare-bones breach notice within five days or a more thorough notice within 15 days.

If customers are, or might someday be, substance abuse treatment providers, the vendor should consider the same approach for qualified service organization agreements. The vendor should review its customers and prospective customers for the application of the “Part 2” confidentiality rules (42 C.F.R. Part 2) and include a provision in the agreement requiring the customer to notify the vendor of the customer’s status as a Part 2 program.

Data breach response

No human or service is perfect, and a vendor will probably have a data breach at some point, said Roberts, which makes a detailed data breach response plan “vital.” He identified the following elements of a breach response plan:

  • Develop an incident intake procedure.
  • Identify the leaders and members of the response team.
  • Rely on standard templates and standard works.
  • Consider a “playbook” and/or a breach reporting decision tool.
  • Develop a customer relations strategy before the breach occurs.
  • Have support vendors ready to act.

The vendor should not simply notify the customer that a breach has occurred; it should have a plan and proposal that it can offer the customer. The process should:

  • provide the covered entity the information it needs to fulfill its own legal obligations;
  • reassure the customer that the situation is under control and being handled properly;
  • inform the customer of steps the vendor has taken and is willing to take on behalf of the covered entity;
  • provide a “menu” of services available to the customer; and
  • create a plan for the future—a holistic look at what the company is doing, not just boilerplate language.

Health apps need regulation; the question is, how much?

Although the health app market is exploding, with more than 165,000 health and wellness apps available for download, the apps are not necessarily achieving the goal of keeping people healthy. It is undisputed that health apps present significant promise for innovation and the integration of health and technology. However, in the current, largely unregulated health app market, innovation is outpacing oversight, and in many cases the result is apps that are not helpful or, worse, that harm users. In some cases, as University of Michigan professor Dr. Karandeep Singh put it, “It’s like having a really bad doctor.”

Potential

The potential uses for health apps are broad. Developers have designed apps for health uses ranging from the identification of skin cancer to the detection of early-onset dementia. Other apps (some useful, others fraudulent) remind users to drink water, track heart rate, measure sun exposure, treat acne, test urine samples, and monitor sleep. While the benefit of a reminder to drink water is debatable, the lifesaving potential of some apps is unquestionably dramatic. For example, apps that allow continuous, remote heart rhythm monitoring can help doctors identify whether someone is having a heart attack, in effect turning a smartphone into an electrocardiogram (EKG).

Usefulness

A Commonwealth Fund study authored by Singh evaluated the usefulness of 1,046 health care-related, patient-facing apps and determined that 43 percent of iOS apps and 27 percent of Android apps appeared likely to be useful. The study evaluated the apps for usefulness in terms of patient engagement, quality, and safety. While some apps were deemed helpful, many were not, and the worst cases have alarmed physicians and regulators. For example, Nathan Cortez, a medical technology law and regulation expert at Southern Methodist University’s law school in Dallas, warned, “There’s just no plausible medical way that some of these apps could work.”

Regulation

There is some regulation of apps. For example, those that perform higher-risk functions, such as EKGs and blood glucose measurement, require FDA approval before they can be marketed. In some cases, however, there are concerns that the current regulatory protections aren’t enough. Some diabetes apps, for example, don’t prompt users to call 911 if their blood sugar drops dangerously low (low enough to cause a diabetic coma) and instead reward users for entering data. The emphasis on data entry as opposed to treatment is common. Other apps devoted to depression and post-traumatic stress disorder ask users to log mood states but do not take steps to encourage users to access a suicide hotline if they report feeling suicidal. The consequences can be dire; as Cortez cautioned, “If you’re diabetic and your app is misreading your blood glucose levels, you may give yourself more insulin than you need and go into diabetic shock.” Regulators have stopped some fraudulent app developers: in 2011, the FTC fined the developer of AcneApp, who claimed that his app could treat acne with the light from an iPhone screen.

Stifling

At the same time that regulation seems necessary to prevent harm and stop fraud, there is concern that too much regulation would be worse than the status quo because it would stifle important, and increasingly significant, innovation. The Mental Indicator App (MIa), developed by Virginia Tech students, is a prime example of the pace of progress. The app seeks to replace traditional paper-based mental aptitude tests for dementia with a test that a user can self-administer at any time and send remotely to a physician, allowing a more comprehensive, day-to-day analysis of a patient’s mental health. The concern is that if innovation becomes too bogged down in regulation, students like MIa’s developers could be discouraged from undertaking similar groundbreaking efforts.