Law360 (April 17, 2020, 5:43 PM EDT) --
Two large technology companies recently announced a partnership to develop a contact tracing application in the U.S., while many other developers are working on a variety of COVID-19-related applications.
This new technology requires detailed personal information such as geolocation, proximity to others, biometrics and health care diagnosis in order to be useful and effective. Accordingly, developers must keep privacy and security at the forefront and implement privacy-by-design measures when developing this technology.
The Federal Trade Commission's Fair Information Practice Principles address the privacy of individuals' personal information and provide the foundation for many U.S. state and federal privacy laws, including the California Consumer Privacy Act and the Health Insurance Portability and Accountability Act, and international privacy laws (e.g., the General Data Protection Regulation).
While compliance with FIPPs does not equate to compliance with specific privacy laws, it will give developers working under significant time constraints invaluable guidance and direction toward developing effective, safe and secure applications that are consistent with the spirit of many privacy laws in the U.S.
Among those countries that have already developed or are in the process of developing applications to combat COVID-19, the level of sensitivity to the privacy rights of individual citizens tends to depend upon societal preferences and traditions of governance.
In general, Eastern countries and countries with more authoritarian governments tend to be less protective of the privacy rights of individual citizens and require citizens to use the applications.
Western countries and more democratic governments are developing applications that reflect the strict privacy laws of those jurisdictions; use of the applications in those regions is mostly voluntary, and data is anonymized before use.
The applications used to combat COVID-19 work in much the same way around the world. A user downloads an application to his or her smartphone and grants access to his or her location data and Bluetooth. As the user travels about, the application tracks the user's location and stores the information. When a user comes within six feet of another person's device, the Bluetooth technology notes the proximity and the specific interaction is recorded.
If a user is diagnosed with COVID-19 the data collected by the user's smartphone is uploaded to a central location where it is processed. Notification is automatically sent to the smartphones of individuals who crossed paths with or came within six feet of the infected user.
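In code, the flow described above might look like the following simplified sketch. The class names, the six-foot threshold, and the 14-day retention window are illustrative assumptions for this article, not any real application's implementation:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

EXPOSURE_DISTANCE_FEET = 6.0    # proximity threshold observed via Bluetooth
RETENTION = timedelta(days=14)  # assumed window for keeping encounter records

@dataclass
class Encounter:
    other_device_id: str
    distance_feet: float
    timestamp: datetime

@dataclass
class ContactLog:
    encounters: list = field(default_factory=list)

    def record(self, other_device_id: str, distance_feet: float) -> None:
        # Record only interactions within the exposure threshold.
        if distance_feet <= EXPOSURE_DISTANCE_FEET:
            self.encounters.append(
                Encounter(other_device_id, distance_feet, datetime.utcnow()))

    def contacts_to_notify(self) -> set:
        # On a positive diagnosis, the log is uploaded for processing and
        # every device seen within the retention window is notified.
        cutoff = datetime.utcnow() - RETENTION
        return {e.other_device_id
                for e in self.encounters if e.timestamp >= cutoff}
```

Real deployments differ in the details (e.g., whether the log is processed centrally or on-device), but the basic record-then-notify structure is common to the applications described below.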
Some of the specific applications in use or development around the world include the following:
Alipay Health Code (China)
Use of the application is required by the government. Built on Alibaba Group Holding Ltd.'s payment processing platform, the app uses quick response codes to track infected citizens as they pass checkpoints in order to regulate movement within quarantine zones. Citizens are given a color code each day that informs them whether they are authorized to travel. There are concerns that the application reports individual data directly to the government for use in public surveillance measures.
HaMagen (Israel)
The Israeli Health Ministry requires all citizens to use the application, which tracks a user's location and alerts the user if he or she has been in the vicinity of someone who has been diagnosed with COVID-19. A user hopes to receive the message: "no points of intersection have been found with coronavirus patients."
Israel is also developing an application that will listen to the user's voice and identify unique sound patterns that only occur in those with COVID-19 to provide quick diagnosis. It is not clear whether the diagnostic application will be linked directly with the location application to automatically alert other users of potential infection.
Aarogya Setu (India)
India's application is still in development. It is likely that use of the application will be voluntary. It works by tracking a user's location and alerts the user in real time if the user is near an infected individual. If the user is not near any infected individuals, the application displays the following message: "You are in a safe zone with low risk of infection." The app primarily uses Bluetooth technology to identify when other devices come within six feet of the user's smartphone.
Trace Together (Singapore)
A voluntary application that tracks a user's interactions by allowing smartphones to connect via short-distance Bluetooth signals. The phone stores detailed records of those who came within six feet of the user for 21 days at a time. If a user receives a positive diagnosis, the user can voluntarily allow authorities to access the data on his or her application to trace who else might have been exposed, and those users receive notices on their phones that they should self-isolate.
PrivateKit: Safe Paths (U.S.)
Developed privately by the Massachusetts Institute of Technology and Harvard University with assistance from engineers from large tech companies, the application is still a prototype but will be fully deployed in the coming weeks.
The application uses encrypted location data between phones in the network without any centralized authority. If a person shares his or her positive diagnosis, users will be alerted if they have crossed paths with the person but will not learn the person's identity. The application may also be used to show areas of high risk.
Coronavirus Symptom Tracker (U.K./EU)
Developed by the private sector, the app currently has more than 2 million voluntary users in the U.K. Users report their symptoms in the app.
The anonymized information is then uploaded to the government so it can determine where clusters of infection are located, how many may be infected, how far along the infections have progressed and how critical the symptoms are. The government uses the information to direct resources to the locations that are most in need.
The technology has the potential to perform many important tasks required to manage and control the spread of the novel coronavirus. Most importantly, the time- and resource-consuming process of contact tracing can be carried out automatically and with much greater accuracy.
The technology may also be used to alert users to the presence of a sick person in real time or let users know if they are in a danger zone for risk of infection. Daily activities like commuting and going to the grocery store could be rationed to maximize social distancing and minimize the risk of infection while allowing individuals the ability to move around with less restriction.
Certain locations that have been trafficked by infected individuals can be flagged for disinfecting. The technology could also be used to ensure those with a positive diagnosis for the virus maintain their quarantines. Of course, the efficacy of these applications depends entirely on users carrying their devices with them and not sharing devices with others.
While there is no question that development of these applications is important, in the U.S., navigating the myriad federal, state and local privacy laws that are potentially implicated by such technology presents a significant legal challenge.
In order to streamline privacy law considerations and establish a foundation for broad compliance, developers should look to FIPPs to implement privacy by design when engineering COVID-19 related applications.
Below we highlight a few core privacy principles every developer should consider as they work on applications to combat the effects of COVID-19.
Notice
Notice is a fundamental privacy law principle. When provided before or at the point of collection (i.e., a just-in-time notice), notice allows individuals to make informed decisions as to whether they want to provide the information requested. Many privacy laws mandate these types of notices including, for example, the CCPA, Illinois' Biometric Information Privacy Act and HIPAA.
Choice and Consent
Another core privacy principle is individual choice and consent. Developers must, at a minimum, provide adequate notice of what information they may access and how such information will be used. Because some privacy statutes (e.g., BIPA) require written consent, developers should ensure they obtain written consent from all users. Notice and consent forms should be clear, complete, concise and simple to understand without being misleading or omitting any materially relevant information.
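A consent requirement like the one described above is often enforced in software as a gate that refuses to collect data until an affirmative, recorded consent exists. The following is a minimal sketch; the class and method names are hypothetical and nothing here is drawn from any statute's implementation requirements:

```python
from datetime import datetime

class ConsentRequiredError(Exception):
    """Raised when collection is attempted without recorded consent."""

class ConsentGate:
    """Blocks data collection until the user has affirmatively consented.

    Illustrative only: a real application would also persist consent
    records and tie them to the notice text the user actually saw.
    """
    def __init__(self):
        self._consents = {}  # purpose -> timestamp of recorded consent

    def record_consent(self, purpose: str) -> None:
        # Record the user's affirmative consent, timestamped for audit.
        self._consents[purpose] = datetime.utcnow()

    def collect(self, purpose: str, collector):
        # Refuse to collect data for any purpose lacking recorded consent.
        if purpose not in self._consents:
            raise ConsentRequiredError(f"No consent on record for: {purpose}")
        return collector()
```

Structuring consent per purpose (location, proximity, diagnosis) also supports the data minimization principle discussed next, since each data stream must be separately justified.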
Data Minimization

Limiting the scope of information collected by the application is in line with the FIPPs data minimization principle, which provides that the collection of personal information should be limited, and any such collection should be directly relevant and necessary to accomplish the specified purpose(s).
In other words, COVID-19 related applications should not scrub a user's social media accounts, download a user's browser history, review a user's purchase information or scan a user's personal photo album. This principle also applies to the duration during which the information is stored. Data minimization requires that developers consider only retaining the information as long as necessary or as otherwise required by law.
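The retention side of data minimization reduces to routinely purging records older than the retention period. A minimal sketch, assuming a simple list of (timestamp, payload) records (this schema is illustrative, not any real application's format):

```python
from datetime import datetime, timedelta

# Assumed retention period; TraceTogether, for example, is described
# above as keeping records for 21 days at a time.
RETENTION = timedelta(days=21)

def purge_expired(records, now=None):
    """Drop records older than the retention period (data minimization).

    `records` is a list of (timestamp, payload) tuples; anything
    collected before `now - RETENTION` is discarded.
    """
    now = now or datetime.utcnow()
    cutoff = now - RETENTION
    return [(ts, payload) for ts, payload in records if ts >= cutoff]
```

Running a purge like this on a schedule, rather than retaining data indefinitely, keeps the stored data proportionate to the application's stated purpose.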
Security Safeguards

The security safeguards principle provides that personal information should be protected by reasonable security procedures to protect against unauthorized access, destruction, use, modification or disclosure of data. While what constitutes reasonable security during a pandemic is fluid, developers need to assess what safeguards are needed when handling sensitive personal information — and a large quantity of it.
From a technical perspective, this may include ensuring that data is encrypted, stored locally and only uploaded when required to fulfill the purpose of the application (i.e., after a positive diagnosis for the virus). Information should be deidentified before it is shared. Data should be stored in a decentralized manner. Applications should be developed with the latest security measures to prevent interference or access by unauthorized parties.
From an individual user's perspective, developers must ensure that the personal identity of the user is secure from inadvertent disclosure. Measures must be taken to ensure that individual diagnoses, location and interactions cannot be reverse engineered or reidentified for any purpose.
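One common building block for the deidentification goals described above is replacing a stable device identifier with a salted one-way hash before data leaves the device. This is a sketch of the idea only — real schemes (such as rotating Bluetooth identifiers) are more involved, and a salted hash alone does not guarantee reidentification is impossible:

```python
import hashlib
import secrets

def deidentify(device_id: str, salt: bytes) -> str:
    """Replace a stable identifier with a salted one-way hash.

    The secret salt (held by the data processor, never shared with the
    data) prevents simple dictionary attacks that re-link hashes to a
    known list of identifiers. Names here are illustrative assumptions.
    """
    return hashlib.sha256(salt + device_id.encode()).hexdigest()

# A per-deployment secret salt, generated once and kept confidential.
salt = secrets.token_bytes(16)
shared_token = deidentify("device-123", salt)
```

The same identifier always maps to the same token under one salt (so exposure matching still works), while the raw identifier never appears in the shared data set.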
Accuracy

Finally, developers must consider what safeguards are necessary to verify the accuracy of the information collected, especially information provided voluntarily and anonymously, such as a positive diagnosis for the virus.
The application could be used by a malicious actor to trick others into needlessly entering a lengthy period of self-isolation, cause public panic by falsely indicating a large number of infections at a specific location, or shut down a competitor's business by making it appear as if it were heavily trafficked by infected users. It will be important to identify ways to eliminate swatting-type outcomes.
Enforcement

Core privacy principles can only be effective if there is a mechanism in place to enforce them. Given how quickly new practices and procedures are being rolled out across all levels of government, developers should work with attorneys in real time to ensure best practices and compliance with the many privacy laws that could be implicated by this technology.
Developers will need to ensure that they continue to track developments, even after launch, and be flexible in implementing changes as required by law.
Developers in the U.S. and abroad must be cognizant of and sensitive to U.S. data privacy laws when attempting to develop applications that collect, store, use and transmit the personal information of citizens. The legal and regulatory hurdles are significant. Following the FIPPs outlined above should give developers a head start on compliance.
Ron Raether, Ashley Taylor and Stephen Piepgrass are partners, and Sadia Mirza and Daniel Waltz are associates at Troutman Sanders LLP.
The opinions expressed are those of the author(s) and do not necessarily reflect the views of the firm, its clients, or Portfolio Media Inc., or any of its or their respective affiliates. This article is for general information purposes and is not intended to be and should not be taken as legal advice.