Regulating Health Apps to Comply with Health Rights

Lyla Latif

Digital health apps can play a crucial role in fulfilling core components of the right to health: availability, accessibility, acceptability, and quality (AAAQ) of health services. Their use in the progressive realization of health rights could be significant in regions where resources are scarce, especially in the Global South, where gaps in healthcare access and quality are acute. Innovations such as telemedicine platforms and mobile health apps can enhance the availability and accessibility of health services.

In Africa, where there are 228.05 million app users (in a total population of nearly 1.5 billion), there is a noticeable absence of specific (‘sui generis’) regulations governing health apps. Ideally, these apps should fall under data protection laws that regulate their terms and conditions, including what can be done with the data they collect. However, 17 African countries have yet to establish any data protection laws or effective privacy frameworks, and six others are still drafting such legislation. This regulatory void has significant repercussions, leaving no governance in place to control the privacy, collection, tracking, and transmission of user data.

A study by Tangari et al. (2021), which analyzed 20,000 health-related apps, revealed the potential for serious privacy breaches. Non-compliance with existing regulations is also widespread; for example, only 43% of medication management apps adhere to them.

Most users are unaware of the privacy risks posed by health apps, and only about one-third of these apps implement basic encryption measures, further jeopardizing data security. This situation threatens not only the security of health data but also the right to privacy and other health rights.

Balancing technological advancement with rights protection

When health apps gather sensitive health information without adequate data protection or regulatory compliance, the result can be misuse of, or unauthorized access to, personal health data, undermining the trust and integrity of health services and potentially causing harm.

The UN Special Rapporteur’s report on digital innovation, technologies and the right to health (April 2023) issues a cautionary note on the use of technology. For instance, biased algorithms or data practices that overlook the varied needs of different user groups can lead to discriminatory outcomes. When app owners fail to address or prevent biases in data handling, and their terms lack provisions mandating inclusive and equitable practices, that discrimination is perpetuated. Even in countries such as Kenya and South Africa, where legislation to protect user data is progressing, the challenge often lies in translating these laws into actionable and enforceable clauses within the terms of health apps.

The Special Rapporteur’s report emphasizes the critical need for technological governance to align with human rights principles so that digital health technologies enhance rather than hinder the right to health. Health regulators can struggle to penetrate the often-opaque terms of apps, terms that may violate medical ethics or expose users to rights violations. Research into the design practices and data stewardship of mobile health apps has identified significant deficiencies. Notably, the unauthorized sharing of personal health data collected by the apps with insurance companies can lead to increased premiums or denial of coverage. Mental health apps have been found to share sensitive details, such as mood states or therapy sessions, with third-party advertisers, resulting in targeted ads that could harm the user’s mental well-being. Additionally, fitness apps that inadvertently share location data can compromise user safety by exposing daily routines.

The current lax approach to data privacy and security in mobile health apps highlights the urgent need for stronger protective measures and ethical guidelines. This urgency is compounded by the research gap in how the data collected by these apps is actually being used, particularly in low- and middle-income countries.

The principles of non-discrimination, equality, and privacy emphasized in the Special Rapporteur’s report provide a crucial lens through which to evaluate data governance. Terms of use, as the legal foundation of the user-provider relationship, are paramount in defining and upholding users’ rights within digital health services. Ensuring non-discrimination requires that these terms be crafted with inclusivity at their core, considering the diverse needs of users and aiming for fair service for all. However, addressing discrimination in both the design of an app and its use involves more than inclusive terms; it calls for proactive design ethics and broader efforts to improve access to technology, especially in regions with low app accessibility, such as Africa. Transparent and user-friendly data management policies that prioritize privacy, consent, and data protection are needed. Regulatory frameworks should enforce privacy and data security standards that apps must meet before entering the market. This approach would shift responsibility from users to governing bodies, ensuring that all health apps provide a secure and private service.

Health apps must be mandated to incorporate standards for informed consent in their terms and conditions. This should include presenting data usage policies in a clear, concise format that is easily comprehensible to all users. It is crucial that these explanations are not buried within lengthy documents but are highlighted so that the key points regarding data use and user rights are immediately apparent and understandable. This approach shifts the onus from users deciphering complex terms to developers providing straightforward, transparent consent procedures, thereby fostering a trust-based user experience.

Conclusion

Although the Special Rapporteur’s report does not specifically address the terms of health apps (and therefore the data they gather and own), it lays a foundation for a human rights-based approach to their governance. However, expecting developers to adopt these principles in the absence of regulatory requirements is unrealistic. The imperative lies with governments to create and enforce data protection laws that are grounded in human rights, thereby mandating app developers to meet these standards. Advocacy efforts should focus on influencing policy and legislation, ensuring that privacy and equality are embedded within the digital health landscape. Only with such targeted regulation will health apps uphold the rights and dignity of users.

Lyla Latif, PhD, is a research fellow, Warwick Law School, University of Warwick, UK.