Mobile application (“app”) development is the new boom for technology companies of all sizes, and the phrase “There’s an app for that” tells the story of just how much this market has grown and matured.  Most early app development focused on low-risk opportunities—free or low-cost social media or gaming apps.  While protecting the privacy and security of personally identifiable information is generally important, privacy and security concerns typically do not rank as high priorities when developing these types of apps.

By contrast, some developers have focused on creating apps that promote healthcare.  Current estimates suggest that about 13,000 healthcare apps exist today, and recent research suggests the healthcare app marketplace will be valued at nearly $12 billion by 2018.  Due to the sensitive nature of health information, privacy and security have become important considerations throughout the lifecycle of an app, from financing through end-user adoption.  Yet it is unclear to what extent technology companies have factored health-related privacy and security standards into their development and marketing plans.  Simply saying that you are a technology company, and not a healthcare company, does not insulate you from risk if you decide to work in the space between the two sectors.

There are four main reasons why technology companies should evaluate whether a healthcare app adequately safeguards the privacy and security of health information.

1.      Tech companies must build trust with healthcare entities to facilitate health information sharing. 

The healthcare sector currently accounts for approximately $2 trillion in spending in the United States.  Even in the current down economy, healthcare entities continue to invest in health information technologies in search of greater efficiency.  However, healthcare companies are acutely aware of the importance of privacy and security related to health information.  Partnering with healthcare entities as their business associates to develop apps requires that technology companies take privacy and security just as seriously.

2.      Technology companies must build trust with end users of apps to increase adoption. 

Recent financial incentives, such as those offered by the federal government’s Investing in Innovations (i2) Initiative and by private entities, have drawn technology companies into the healthcare sector, but consumers remain concerned about the privacy and security of their information.  For example, non-governmental organizations have recently published best practices to protect privacy, recognizing that end users of health apps are acutely sensitive to the privacy and security of their health information.  The Department of Health and Human Services (“HHS”) Office of the National Coordinator for Health Information Technology (“ONC”), in cooperation with the HHS Office for Civil Rights (“OCR”), has launched a Privacy & Security Mobile Device project that aims to develop best practices to help developers better protect health information on mobile devices.  Thus, a technology company seeking one of the many financial incentives should prioritize privacy and security issues, as these are top of mind for consumers.

3.      Numerous government agencies have regulatory jurisdiction over health apps.

A consequence of creating health apps is that a developer may become subject to scrutiny by numerous government agencies, including the FDA, FCC, FTC, CMS, OCR, and ONC.  In addition to OCR, which typically enforces privacy and security rules, agencies that do not usually focus on privacy or security have also exercised their authority to require certain safeguards.  For example, CMS has included privacy and security compliance in its requirements for achieving meaningful use of an electronic health record.  Further, the FDA has recently been authorized to issue regulations governing certain medical apps.  We have yet to see what these rules will say, but the FDA has already been criticized for failing to ensure adequate privacy and security safeguards before approving medical devices for the marketplace.  As such, it is possible the FDA will include privacy and security requirements in its rulemaking as well.

4.      State laws are sometimes more aggressive than federal laws.

Some states have taken steps to expand the scope of privacy and security protections beyond what federal standards require.  Texas and California, for example, impose requirements that go beyond federal privacy and security rules.  Technology companies should therefore assess whether the states in which they operate or sell their technologies have more stringent privacy and security standards.

In short, technology companies looking to succeed in the healthcare space should seriously consider building privacy and security into the app lifecycle as a way to build trust with healthcare partners and end users, comply with applicable federal and state legal requirements, and meet industry best practices.

Follow me on Twitter: @HealthITLawyers