Corresponding Author
IT & Systems, Indian Institute of Management Raipur, Raipur, Chhattisgarh, India
Sourya Joyee De, IT & Systems, Indian Institute of Management Raipur, Atal Nagar, Kurru (Abhanpur), Raipur, Chhattisgarh 493 661, India.
Email: [email protected]
Economic Environment and General Management, Indian Institute of Management Raipur, Raipur, Chhattisgarh, India
Governments around the world are utilizing their digital ecosystems to respond to the COVID-19 pandemic. To raise beneficiaries' awareness of privacy risks, they must proactively publish the data handling practices of their digital initiatives through appropriate privacy policies. This study analyzes the privacy policies of public COVID-support mobile applications (apps) in the context of India. We found a total of 63 government-initiated COVID-support apps in India, of which 38% had an app-specific privacy policy. These policies were analyzed further to assess their coverage of key principles, such as "Purpose," "Data Categories," and "Data Retention," derived from legal requirements. We also analyzed the specificity of policies with a high coverage. In India, only one nation-wide app stood out as having both considerable coverage of key principles and a high level of specificity. Other national/regional apps fail to display the desired levels of coverage and/or specificity. The broader policy recommendations of this study are that the government should better address privacy concerns regarding its existing and future disaster management apps, as well as its other digital initiatives, by (a) establishing and enforcing a comprehensive legislative framework for data protection and (b) increasing privacy awareness among the beneficiaries.
On March 11, 2020, the World Health Organization (WHO) declared COVID-19 a pandemic (WHO, 2020). It stressed that early detection, testing, putting infected persons under quarantine, social distancing, and tracing every contact of infected individuals are essential to contain the disease transmission (UN News, 2020). Owing to the highly infectious nature of the disease, contact tracing has been considered to play a very important role in controlling the spread of COVID-19. Digital contact tracing is faster, more efficient, and more accurate than its highly labor-intensive traditional counterpart (Rowe, 2020). To this end, countries around the world have deployed national-level mobile applications such as COVIDSafe (Australia), TraceTogether (Singapore), Aarogya Setu (India), Bluezone (Vietnam), Corona-Warn-App (Germany), and TousAntiCovid (France). Over time, many other types of support apps have been released to serve various needs of citizens during the pandemic. These include providing information related to the risk status of a person, the infection status of a region, the number of fatalities, and important services such as helplines and teleconsultation, obtaining e-passes for interstate travel, and vaccination registration and related information. Amidst fresh waves of infection and the emergence of more virulent strains of the virus (Harris & Moss, 2021), the role of these apps remains critical.
To fulfill their functions, COVID-support apps collect and process a lot of personal data (such as name, phone number, and gender), some of which are highly sensitive (such as location, health, and contact traces). In the absence of appropriate data protection measures, such data processing activities can lead to a variety of privacy harms (Calo, 2011; De & Métayer, 2016; de Montjoye et al., 2013, 2018; Pyrgelis et al., 2017; Solove, 2005) such as surveillance (Fahey & Hino, 2020; Rowe, 2020) and discrimination. Thus, technologists and governments across the world are considering privacy-first approaches (Fahey & Hino, 2020; Ingram & Ward, 2020), meaning that the personal data of app users remain protected even as they enjoy the benefits of the app.
Both transparency and privacy are important democratic and societal values (Janssen & van den Hoven, 2015). With some exceptions, governments (as institutions) should generally not operate in secrecy and should ensure that their decisions and actions are transparent and auditable (Janssen & van den Hoven, 2015). When apps are open and upfront with users about what personal data they collect and process, for what purposes, how long these data will be stored, and so on, they foster transparency and increase user control over personal data by enabling users to decide whether to use a service and which service to use, to rectify their data, and to give consent to any secondary uses of their information (Reidenberg et al., 2015; Waldman, 2017, 2018). A well-designed privacy policy with appropriate content, following the Fair Information Practices (FIPS) principle of notice and choice, is a common way to convey this information to users (Reidenberg et al., 2015). Privacy policies also enable an important accountability function by leading organizations to a path of self-examination and professionalization and by helping motivated outsiders, such as researchers, consumer advocates, and the press, to assess and criticize an organization's privacy practices (Calo, 2011). Privacy policies of e-government services can help citizens understand how their personal data are used by the government and hence empower them to hold the government accountable (De & Shukla, 2020).
In India, close to 81 COVID-support apps at both national and regional levels were deployed within a short span of time (based on search results for the keywords "COVID" and "Corona" in the Google Play Store at the time of the study). A national-level initiative, India's flagship COVID-support app Aarogya Setu, has played a crucial role in digital contact tracing. It also offers services such as the provision of important information, self-assessments, and vaccination status and enrolment. First launched in April 2020, it has been downloaded more than 100 million times to date. As privacy debates around contact tracing apps emerged worldwide, many privacy concerns were voiced about this and other Indian apps as well, especially regarding the collection of too much personal and sensitive information compared to other apps used for similar purposes, violating the principle of data minimization (Deb, 2020; Parathasarathy et al., 2020). It was claimed at the time that the vague language in the app's privacy policy not only leaves it open for the government to share data across departments, leading to secondary usage, but also allows it to retain data well beyond the duration of the crisis (Kapur, 2020), indicating the emergence of a surveillance state.
Recently, there have been several studies assessing the impacts of various socioeconomic dimensions of COVID-19 in the Indian context (Aneja & Ahuja, 2020; Debata et al., 2020; Rakshit & Basistha, 2020), including the privacy implications of digital contact tracing (Deb, 2020; Parathasarathy et al., 2020). While much of the literature on privacy policies has discussed the requirements for private-sector businesses, e-government initiatives have received quite sparse attention. In this paper, for the first time, we present a comprehensive analysis of Indian public (i.e., government-initiated) COVID-support apps to understand the status of their privacy policies.
Our contributions, from a practical perspective, are:
(a) Comparison of privacy policies with requirements in the EU GDPR, India's PDP Bill, and the IT Rules, 2011: We identify 10 key principles from the requirements set forth by the European Union General Data Protection Regulation (EU GDPR), India's proposed Personal Data Protection Bill, 2019 (henceforth referred to as the PDP Bill), and the existing Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011 (henceforth referred to as the IT Rules 2011). We then explore how many of these principles are covered (referred to as "coverage" in this paper) by the privacy policies of Indian public COVID-support apps. In the process, we discover that only 38% of these apps have app-specific privacy policies. Where privacy policies are present, the coverage of critical principles is often limited. None of the privacy policies covers all 10 key principles.
(b) Exploring the design and readability of privacy policies: We also investigate the design and readability of these policies. We checked the length and accessibility of the privacy policies and conclude that, among the studied policies, shorter policies are not necessarily better, in contrast to what is generally advocated by privacy experts. This finding agrees with that of Bowers et al. (2017). To check whether privacy policies consider the varying literacy rates across the Indian population, we measured the readability of the policies using readability tests from the linguistics community. We calculated six readability scores: the Flesch Reading Ease score, the Flesch–Kincaid Grade Level score, the Gunning-Fog score, the Coleman–Liau Index, the SMOG Index, and the Automated Readability Index. We found that none of the policies uses plain English and that, on average, the reading skills of a high school or college student or above are required to read them. Thus, many users may find these policies difficult to read.
(c) Discussion on the specificity of privacy policies: While a privacy policy may cover a certain data protection principle, it may make very generic statements along that principle and therefore not be specific. So, we also investigate the extent of specificity for those privacy policies that exhibit a relatively high coverage of the key principles, since inclusion of a large number of key principles does not imply high specificity. Interestingly, only about 8% of the app-specific privacy policies show an acceptable specificity in addressing each of the covered principles. We further benchmark the privacy policy that we found to be the best in terms of specificity against those of two international apps. We found it to be almost comparable with one of them and slightly weaker than the other.
Figure 1 depicts the flow of our analysis. Our results uncover some critical problem areas for Indian public COVID-support apps. In most cases, privacy policies are not available. Where they exist, they suffer from readability issues. In many cases, the coverage is also quite poor. In addition, in the majority of cases where there is high coverage, specificity is very poor. We believe that the findings of this study will help the government and regulators to better address these problems in future disaster management apps and in other digital initiatives.
Building an effective privacy policy has long been identified as a challenge. Research has shown that, in practice, privacy policies are verbose, lengthy, purposefully vague and hence difficult to read and understand (Cate, 2010; Gluck et al., 2016; Jensen & Potts, 2004), are not appealing enough to users to be read, often do not meet user privacy expectations (Reidenberg et al., 2015) and, overall, do not really support rational decision making (McDonald & Cranor, 2008). They often contain legal or technical terms that users are not familiar with, and the level of education needed to understand them is high (Milne et al., 2006; Reidenberg et al., 2015; Waldman, 2017, 2018). Even privacy experts may find them to be misleading (Reidenberg et al., 2015). Not only the content but also the design of privacy policies is crucial, as poor design may discourage users from reading privacy policies altogether (Waldman, 2017, 2018). Sometimes, privacy policies may be designed to nudge users into revealing more personal data than necessary (Reidenberg et al., 2015).
Privacy researchers have made considerable efforts to aid designers and developers in creating effective privacy policies, and to help users easily understand privacy practices, by analyzing available privacy policies and recommending improvements. Privacy policies have been analyzed both manually (Bowers et al., 2017; Rao et al., 2016) and through automated means. The latter include techniques such as supervised learning (Constante, Sun, et al., 2012), rule-based extraction (Constante, den Hartog, & Petković, 2012; Zimmeck & Bellovin, 2014), deep learning (Harkous et al., 2018), automated semantic analysis (Linden et al., 2020), and topic modeling (Chundi & Subramaniam, 2014). While automated means allow the processing of a large number of privacy policies in much less time, the automatic extraction of data practices from complex and vague privacy policies can be difficult (Wilson et al., 2016) and can be done more effectively through manual processing. Machine learning text classifiers (Linden et al., 2020; Zimmeck et al., 2017) used for the analysis of a large number of privacy policies are often trained on human annotations. The criteria used to evaluate privacy policies include readability (Ermakova et al., 2015; Massey et al., 2013; Meiselwitz, 2013; Zhang et al., 2020), usability (Jensen & Potts, 2004), compliance issues and coverage of key principles such as purpose specification and retention duration (Bowers et al., 2017; Cranor et al., 2015; Linden et al., 2020; Zimmeck et al., 2017), presentation (Linden et al., 2020), specificity with respect to describing data practices (Linden et al., 2020), the extent of unexpected data practices (Rao et al., 2016), and within-policy contradictions (Andow et al., 2019).
Major regulations in the European Union and the USA specify certain requirements on the content, and in some cases the design, of privacy policies (Davis, 2020). The EU GDPR specifies the information that must be provided to data subjects in its Articles 13 and 14. The Article 29 Working Party (now replaced by the European Data Protection Board, EDPB), in its guidelines on transparency, further discusses the requirements for privacy policies from both content and design perspectives. The Federal Trade Commission (FTC) in the USA requires companies to disclose mainly what information they collect, why they do so, to whom the information would be sold, and how customers can access their information and opt out. Federal laws such as the Health Insurance Portability and Accountability Act (HIPAA), the Gramm-Leach-Bliley Act (GLBA), the Children's Online Privacy Protection Act (COPPA), and the E-Government Act all focus on the content of privacy policies. In India, the IT Rules 2011 require "body corporates" to provide privacy policies including information on the types of personal data collected and the purposes of collection and usage. India's proposed Data Protection Bill, 2019 puts forth much more enhanced requirements related to privacy policies, closely following the norms set by the USA and the EU in this context.
The three main research questions we address are: First, do Indian public COVID-support apps have privacy policies? If yes, what do they cover? Second, where do these privacy policies stand in terms of their design? Third, what is the readability of each privacy policy? In this section, we describe the methodology we adopted to address each question.
We started by compiling a list of 81 Indian COVID-support applications based on search results for the keywords "COVID" and "Corona" in the Google Play Store. Although some of the apps have been released for iOS, we focus only on Android apps because, as per the latest statistics, Android-based smartphones held a share of 95.23% of mobile operating systems in India as of 2020 (Statista Research Department, February 17, 2021). Of these, 63 apps were of public initiative, that is, developed by a government initiative at either the national, state/Union Territory, district, town, or municipality level. Only these apps were considered for further study. In the rest of the work, we refer to these as public apps. The remaining 18 apps were of private initiative and were left out of the study. The information on whether an app is the result of a government initiative or a private initiative was obtained from its Google Play Store page. In the entire paper, we use pseudonyms to refer to the apps (the detailed description will be made available on request).
To check whether the 63 apps identified above have their own privacy policies, we first define an "app-specific privacy policy". By app-specific privacy policy, we mean a privacy policy that describes the data practices of the app and not the generic data practices of the website of a government department or a developer. For example, for some apps where only a website link could be found, the website was that of a private entity involved in the app development and had the privacy policy of that entity, but not specifically for the app in question (i.e., it did not mention the name of the app, the data collected by the app, etc.). Such a privacy policy is not an app-specific privacy policy. We counted all apps that had a link to an app-specific privacy policy on their respective Google Play Store pages.
Subsequently, we conducted a manual coding analysis to determine whether the app-specific privacy policies adhere to these 10 key principles. During the coding analysis, we first performed an initial keyword search and subsequently did a manual analysis. Before beginning the coding process, we generated a codebook (see Appendix A: Table A1) with codes related to the key principles. For example, keywords for the "Data Retention" principle included retention, retain, continue to share, store, delete after, termination, and uninstall. We consider that a privacy policy qualifies as fulfilling a certain principle if it simply mentions the principle or one or more codes corresponding to the principle. That a privacy policy does not qualify for one or more principles is a strong indicator that crucial privacy issues have not been considered.
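The keyword-based first pass described above can be sketched as follows. This is a hypothetical illustration: the `CODEBOOK` below contains only two principles with example keywords, not the full codebook of Appendix A (Table A1), and `code_policy` is an illustrative helper name, not the authors' tooling.

```python
# Illustrative sketch of the keyword-based first-pass coding.
# The codebook entries below are examples only, not the full codebook.
CODEBOOK = {
    "Data Retention": ["retention", "retain", "continue to share", "store",
                       "delete after", "termination", "uninstall"],
    "Purpose": ["purpose", "use your data to", "in order to"],
}

def code_policy(text, codebook):
    """Return a 0/1 score per principle: 1 if any keyword appears in the text."""
    lower = text.lower()
    return {principle: int(any(kw in lower for kw in keywords))
            for principle, keywords in codebook.items()}

policy = "We retain your data until you uninstall the app."
print(code_policy(policy, CODEBOOK))
# {'Data Retention': 1, 'Purpose': 0}
```

A hit in this pass only flags candidate text; per the methodology, each flagged (and unflagged) policy was then read manually, with coders using their judgment for synonyms and similar phrasing.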
Both the authors acted as coders. They had access to digital copies of all the privacy policies, methodically noted their observations in a spreadsheet and did not refer to any other resources while checking the privacy policies. The authors agreed to use their best judgment when they could not find the exact keyword but only found similar text or synonyms in the privacy policy corresponding to a certain principle. If a privacy policy neither had the keywords nor similar text corresponding to a specific principle, the authors assigned a 0 to the privacy policy for that principle implying that the principle had not been discussed at all, otherwise they assigned 1. Once all documents were analyzed individually, the authors compared the results of the independent trials and mutually reconciled to arrive at the data presented in Table A2 (Appendix A). During the reconciliation process, if the results of the coders were different for any privacy policy and/or principle, the authors extensively discussed the idea behind the scores given by them to arrive at a final score.
To measure inter-rater reliability on the coding of each policy, we computed Cohen's Kappa, κ (Landis & Koch, 1977), using IBM SPSS Version 26. Using the nomenclature of labels for describing the relative strength of agreement provided by Landis and Koch (1977), the level of coder agreement we obtained can be described as follows. Coders agreed "almost perfectly" (κ in the 0.81–1.00 band) on six principles: "Data Controller," "Purpose," "Data Subject Rights," "Grievance," "Security," and "Children." For the three principles "Data Source," "Recipients," and "Data Retention," the agreement can be classified as "substantial" (κ in the 0.61–0.80 band). For the principle "Data Categories," the agreement can be classified as "moderate" (κ in the 0.41–0.60 band). For reconciliation, the authors agreed to use the broadest reasonable interpretation of a principle (Bowers et al., 2017).
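For readers unfamiliar with the statistic, a minimal implementation of Cohen's kappa for two coders' 0/1 scores might look like the sketch below. The score vectors are invented for the example (the study itself computed κ with IBM SPSS); κ = (p_o − p_e) / (1 − p_e), where p_o is observed agreement and p_e is the agreement expected by chance from each coder's marginal label frequencies.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length label sequences."""
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labeled identically
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal label frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[l] * freq_b[l] for l in set(rater_a) | set(rater_b)) / n**2
    return (p_o - p_e) / (1 - p_e)

# Invented 0/1 codes for one principle across eight policies
a = [1, 1, 0, 1, 0, 1, 1, 0]
b = [1, 1, 0, 1, 1, 1, 1, 0]
print(round(cohens_kappa(a, b), 2))
# 0.71, "substantial" on the Landis and Koch (1977) scale
```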
Poor design may discourage users from reading privacy policies altogether (Waldman, 2017, 2018) and users may end up making privacy choices that are not in their best interest. From the design perspective, we wanted to understand whether the information and communication in each privacy policy is concise, whether the privacy policy is easily accessible and clearly demarcated from all other non-privacy related information. To this end, we first checked the number of words and sentences in each privacy policy using their digital copies in Microsoft Word. We also checked whether a privacy policy is directly accessible in one click from the privacy policy URL available on the corresponding app’s Google Play Store page and whether the policy is clearly demarcated from other text.
A document’s readability can be measured through readability scores created by the linguistics community (Bowers et al., 2017). Flesch Reading Ease (FRE) and Flesch–Kincaid grade-level scores (Flesch & Gould, 1949) have been popularly used to measure the readability of privacy policies and in general, of complex written passages (Milne et al., 2006). FRE is based on the average number of syllables per word and the average number of words per sentence and ranges from 0 to 100, 0 being unreadable and 100 being most readable. A FRE score of 60–70 has been characterized as “plain English” which translates into a grade level of eighth to ninth grade (Flesch & Gould, 1949). Since a single, “gold-standard” technique is yet to be identified, the readability of documents such as privacy policies and medical documents is commonly measured using more than one scoring mechanism (Bowers et al., 2017). Readability scores can be calculated using Microsoft Word Office Package (Microsoft Corp, Redmond, WA), Readability Calculations (Micro Power & Light Co., Dallas, TX), Readability Studio 1.1 (Oleander Solutions, Vandalia, OH; Badarudeen & Sabharwal, 2010), and Python libraries like TextStat (Gao, 2020; Gao et al., 2020; Kumar et al., 2017; Park et al., 2017; Redmiles et al., 2018) and Readability (Lei & Yan, 2016; Su et al., 2019).
To measure the readability of app-specific privacy policies, we first manually converted them into text-only documents to rule out the effects of formatting and typography (Bowers et al., 2017). We then calculated six readability scores—Flesch–Kincaid Grade Level score, Flesch Reading Ease, the Gunning-Fog score, the Coleman–Liau Index, the SMOG Index, and the Automated Readability Index—of all the app-specific privacy policies using the TextStat Python library. The different scores consider the word count, sentence count, difficult words, and syllable counts in the document. Table A3 (Appendix A) shows the formulae used to calculate the different scores and brief interpretations of the scores.
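To make the scoring concrete, the Flesch Reading Ease formula (FRE = 206.835 - 1.015 x words/sentence - 84.6 x syllables/word) can be approximated as below. This is a rough sketch with a naive vowel-group syllable counter; libraries such as TextStat use more careful syllable counting and will give somewhat different values on real policies.

```python
import re

def count_syllables(word):
    """Approximate syllables as runs of vowels (a crude heuristic)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    """Flesch Reading Ease: higher is easier; 60-70 is 'plain English'."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

simple = "We collect your name. We store it safely."
print(round(flesch_reading_ease(simple), 1))
# 65.3, inside the 60-70 "plain English" band for this simple text
```

Long sentences and polysyllabic legal vocabulary drive the score down, which is why verbose policies score far below the plain-English band.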
In this section, we describe the results we obtained for our research questions as identified in Section 3.
In our first analysis, we looked at the availability of app-specific privacy policies for the 63 public apps. We found that only 38% of these apps (24 out of 63) had an app-specific privacy policy, whereas 11% did not have any privacy policy whatsoever. This means that, for the latter apps, it was impossible to do any further analysis of their personal data handling. Neither users nor privacy experts would have any knowledge of what data are collected and how these data are used by these apps, let alone of any data subject rights. For about 51% of the apps, some sort of privacy policy URL was present but did not lead to any app-specific privacy policy page.
Table A2 (Appendix A) presents the summary of findings. The percentage of privacy policies that mention the different principles is depicted in Figure 2. The overall coverage of privacy policies is shown in Figure 3. We found that none of the apps covered all 10 principles we studied. Only 3 apps with app-specific privacy policies cover 9 of the 10 principles. All 24 apps cover at least one principle. Interestingly, one particular app (with pseudonym A24) covers only one principle.
The longest privacy policy had 2124 words and 78 sentences, while the shortest had 240 words and 8 sentences. The average numbers of words and sentences in a policy are about 988 (standard deviation, SD = 570.4) and 41 (SD = 20.3), respectively. Although privacy experts have generally advocated that privacy policies should be short, our analysis of key principle coverage indicates that shorter privacy policies are not necessarily better. This finding agrees with that of Bowers et al. (2017) from their study of the privacy policies of banking and mobile money services apps. All 24 app-specific privacy policies were directly accessible in one click from the privacy policy URL available on the respective app's Google Play Store page and were clearly demarcated from other text such as Terms of Use or website privacy policies.
The Flesch–Kincaid grade level scores for the different apps are depicted in Figure 4. The range (i.e., minimum and maximum values), average, and standard deviation of all six readability scores for the 24 apps are summarized in Table 1. The average readability of the privacy policies is poor. None of the privacy policies has an FRE score within the range characterized as "plain English." All other scores also indicate that, on average, the reading skills of a 12th grade student or above (i.e., high school, college level, or above) are required to read the policies. This implies that only those who have these skills can read and comprehend these policies. To extend transparency to the rest of the population, the policies must be made simpler to read.
Simply evaluating whether a privacy policy covers a certain set of principles does not conclusively indicate that a desired level of transparency has been achieved. It is also important to understand how specific or general a privacy policy is when fulfilling a principle. For example, a phrase such as “we collect your personal data” in a privacy policy is very generic for the principle “Data Categories.” On the other hand, the phrase “we collect your health data…” is relatively more specific along this same principle. Similarly, a phrase that mentions “we process your health data to provide you with our services…” is generic along the principle “Purpose” whereas one that mentions “we process your health data to find out about your risk of contracting disease A” is relatively more specific.
To determine the specificity of coverage with respect to the principles, we further analyzed all the privacy policies that had a coverage of 7 or more out of the 10 principles. Two distinct types of privacy policies emerged: (1) High-specificity privacy policies: Privacy policies of two of the apps, as compared to the others we studied, stood out in terms of specificity. One of the main functions of these apps was contact tracing. (2) Low-specificity privacy policies: Some apps use common templates for their privacy policies, irrespective of their functions, and have very low specificity despite a considerably good coverage.
Among all 24 apps, A1 clearly has the most specific privacy policy, although it has a lower coverage (8 out of 10 principles, see Table A2, Appendix A) than a few other apps. For example, A1 describes specific retention periods for the different types of data it collects, gives a more detailed description of the types of data collected and how they would be used, and provides the name and e-mail address of a Grievance Officer. It also avoids words and phrases that increase ambiguity, such as "including but not limited to" and "we may collect." This high level of specificity, compared to other apps, may be due to several reasons: A1 is a nation-wide initiative by the government that has been at the center of wide-spread privacy-related criticism, and without enough public confidence, the initiative would probably have failed.
The only state-level app with high specificity, though not as high as A1's, is A11. Its privacy policy clearly specifies who introduced the app, the purposes of data use, the rights of the data subjects, who must be contacted in case users cannot exercise the rights themselves, and the e-mail address of the Grievance Officer. A11 is also among the few apps with the highest coverage (9 out of 10 principles, see Table A2) and, like A1, one of its main functionalities is contact tracing.
Apps A2, A4, A7, A9, A12, A13, A15, A17, A18, A19, A20, and A22 follow a common template (e.g., see Figures B1 and B2 in Appendix B) for their privacy policies, with only slight differences (e.g., the name of the government department that introduced the app was changed in each case). Some other apps, such as A6 and A8, use a more modified version of the template. The number of principles that these privacy policies cover is in the range of 7 to 8. Out of these 12 template-following apps, 5 belong to the same state. App A13 mentions "This privacy policy page was created at and modified/generated by App Privacy Policy Generator," implying that very little attention was actually paid to creating the privacy policies of these apps. None of these policies provides an exhaustive list of the data being collected. Moreover, the purposes specified for data collection are generic ("for a better user experience," "used for providing and improving the Service"), even when the app is performing contact tracing. App A6 has a "Contact Us" section in its privacy policy, but no contact details are provided in this section. The use of ambiguous phrases is also noticeable, for example, "including but not limited to" and "we may employ third-party companies."
We further investigate how A1 fares in comparison with its international counterparts. The MIT Technology Review Covid Tracing Tracker Project (2020) has evaluated 25 automated contact tracing apps backed by national governments globally along five measures: (1) whether citizens can voluntarily choose to use the apps; (2) whether there are limitations on how the collected data can be used; (3) whether the data will be destroyed after a certain period of time; (4) whether the data collection is minimized; and (5) whether the effort is transparent, such as through the availability of clear policies to the public. From this list, we selected two Bluetooth-based apps (pseudonymized as AI1 and AI2; the details will be made available on request) for which privacy policies were available in English, which had a high rating (i.e., satisfied all five evaluation criteria) or a medium rating (i.e., satisfied only three evaluation criteria), and which satisfied the transparency criterion. These apps were among the earliest to be launched (around the same time as A1) and the privacy aspects of both have been widely debated by experts. Incidentally, A1 scored a low rating (i.e., satisfied only two evaluation criteria) and did not satisfy the transparency criterion according to the project.
Whereas the privacy policy of AI1 satisfies 7 out of 10 principles, that of AI2 has full coverage. So, in terms of coverage, A1’s privacy policy, covering 8 out of 10 principles, is better than AI1’s but weaker than AI2’s. The privacy policies of both international counterparts are also highly specific. For example, they clearly describe the types of data collected and their specific uses (avoiding phrases like “including but not limited to”), specify exactly how many days the data will be retained, state with whom and under what circumstances the data will be shared, and even provide links describing how users can request deletion of their data. AI2 also provides contact details, including an email address, postal address, and phone number, for making a privacy-related enquiry or lodging a complaint, and specifies that a complaint can be lodged with the office of the information commissioner or the federal police. Overall, the specificity of A1 and AI1 is similar, whereas that of AI2 is slightly better than A1’s.
Worldwide, digital contact tracing apps have come under increased scrutiny due to the amount and sensitive nature of the location, contact, and health data they collect. The apps have also triggered privacy concerns such as state-sponsored surveillance. Evidence suggests that during moments of crisis, governments have introduced mass surveillance systems and have rarely done away with them even after the crisis is over (Biddle, 2020). According to the European Data Protection Board (EDPB), when personal data processing is necessary for managing a disaster like the pandemic, data protection is indispensable to build trust, create the conditions for social acceptability of any solution, and thus guarantee the effectiveness of these measures.
Introducing and enforcing a comprehensive data protection regulation and improving user awareness about the privacy harms of personal data processing are two vital ways to improve the privacy disclosures of Indian disaster management apps and other digital initiatives.
In India, the IT Rules 2011 specify that a “body corporate” that “collects, receives, stores, deals or handles information of provider of information, shall provide a privacy policy for handling of or dealing in personal information including sensitive personal data or information and ensure that the same are available for view by such providers of information who has provided such information under lawful contract.” They also require that these privacy policies be clear, readable, and easily accessible. Some apps, such as A5 and A11, have cited these rules, along with others such as the IT Act 2000 and the National Disaster Management Act, 2005, as their legal basis. Providing no privacy policy, providing only a link to a website or some other privacy policy not specific to the app, or providing a privacy policy that takes too much effort to retrieve and is not easily accessible is not appropriate. Yet, our results show that more than 50% of the apps fall into one of these categories, indicating a lack of proper enforcement of the existing regulations.
The IT Rules 2011 also require that privacy policies specify the types of personal or sensitive personal data or information collected (“Data Categories” principle), the purpose of collection and usage of such information (“Purpose” principle), disclosure of information including sensitive personal data or information (“Recipients” principle), and reasonable security practices and procedures (“Security” principle). Six out of the 24 apps we studied do not cover one or more of these principles. Although the existing requirements may be the reason why many apps have privacy policies with high coverage, many of these apps also show low specificity, including cases where the privacy policy was simply replicated from other apps irrespective of the actual functionality and/or generated by an online policy generator. Privacy policies thus seem to have been created only for superficial compliance with the requirements, not to serve the real purpose of informing data subjects/users about the data controller’s practices in a comprehensible way. This, too, indicates a lack of suitable enforcement of existing regulations.
To date, the EU GDPR is one of the most demanding and comprehensive privacy regulations (Linden et al., 2020). In the context of the pandemic, the EDPB has adopted guidelines on the proportionate use of location data and contact tracing tools that promote high levels of transparency to favor the social acceptance of such apps. Additionally, the EDPB has adopted a statement on the data protection impact of the interoperability of contact tracing apps that emphasizes transparency about additional data processing and the disclosure of data to additional third parties arising from interoperability. Studies have shown that the GDPR has been the catalyst for an overhaul of privacy policies (Zaeem & Barber, 2020), including changes in text, greater coverage of data practices, better compliance, and better specificity levels on average, both within and outside the EU (Linden et al., 2020; Zaeem & Barber, 2020). It has also triggered an overall reduction in the permissions used by apps (Momen et al., 2019).
In India, the coverage requirements in the IT Rules 2011 are not sufficient compared to the EU GDPR. In fact, the PDP Bill 2019, which draws its inspiration from the EU GDPR, requires a much wider coverage and includes all 10 principles that we use in this study. Therefore, privacy policies of disaster management apps must aim to cover all 10 principles. The Data Access and Knowledge Sharing Protocol, introduced during the pandemic by the Empowered Group on Technology and Data Management under the Disaster Management Act, 2005 to provide legal safeguards for the operation of A1, seems to have helped resolve many privacy concerns that had arisen regarding the data collection and processing by A1. The privacy policy of A1 has also subsequently undergone several updates, resulting in a comprehensive statement. In India, the privacy policies of mobile money services have been found to have among the best coverage compared with those of other countries across the world, and they also meet the criteria set out by the IT Act (Bowers et al., 2017).
The above evidence suggests that appropriate regulations could lead to better privacy policies, although further exploration in this direction is required. It is therefore possible that the quick enactment and implementation of the PDP Bill, which is currently under Parliamentary review, could have led to better coverage and specificity in the privacy policies of COVID-19 apps. Moreover, all the apps, and not just A1, could have been brought under the legal safeguards provided by the Data Access and Knowledge Sharing Protocol for a stronger approach to data protection during the pandemic.
As a complement to the above recommendations, ongoing, visible, and easily accessible initiatives to educate the public about personal data collection and use will empower them to better navigate today’s ubiquitous data ecosystem. Public awareness can foster a better understanding of how data can be collected and used not only by e-governance apps (such as UMANG) but also by widely used applications including Facebook, Twitter, YouTube, and hundreds of other apps from small and large private players, as well as of their benefits and risks to individuals and society at large (De & Shukla, 2020). Such awareness gives users the ability to recognize privacy harms when they occur, clear guidance on their rights, knowledge of how to exercise those rights, and the means to obtain recourse when data has been misused. In India, the Reserve Bank of India (RBI) regularly undertakes multilingual campaigns, such as “Suno RBI Kya Kehta Hai” (roughly translated as “Listen to what RBI is saying”), through SMS, advertisements in print and visual media, and other means to educate users on the safe and secure use of digital payments in order to combat increasing payment fraud. Specific to the pandemic, the Regional Outreach Bureau (ROB) of the Ministry of Information and Broadcasting, Government of India, has undertaken several campaigns on maintaining personal hygiene, abiding by lockdown rules, maintaining social distancing, avoiding rumor mongering, and even downloading A1. Similar persistent campaigns to spread awareness of data collection and use can be launched under the Digital India initiative to prepare the public for a risk-aware digital future.
In conclusion, the results of our study point toward the necessity of a comprehensive data protection regulation in India and its strict enforcement, along with the need to increase privacy awareness among users of the digital ecosystem. The lessons learnt from this study will help governments and regulators better address privacy concerns in future disaster management apps and other government digital initiatives.
Dr. Sourya Joyee De is an Assistant Professor in the IT and Systems area at the Indian Institute of Management Raipur, India. She is a Fellow of the Indian Institute of Management Calcutta (Ph.D.). Prior to joining IIM Raipur, Sourya held research positions at INRIA Grenoble Rhone-Alpes and LORIA-CNRS-INRIA Nancy Grand-Est, France, for close to four years. Her research has been funded by the French ANR project BIOPRIV, CISCO San Jose, U.S.A., a Samsung GRO Grant, the INRIA Project Lab CAPPRIS, and the Grand-Est Region, France. Sourya was also a Visiting Scientist at the Indian Statistical Institute Kolkata, India. Her research interests include information privacy and security. Her research has been published in various reputed journals and conference proceedings. She has also published two books on privacy with Morgan & Claypool Publishers, San Rafael, CA, U.S.A.
Dr. Rashmi Shukla is currently working as an Assistant Professor in the Economic Environment and General Management area at the Indian Institute of Management (IIM) Raipur. Prior to this, she was with NMIMS (School of Business Management), Mumbai, as an Assistant Professor (Economics). She holds a doctoral degree in Economics from the Indian Institute of Management (IIM) Indore. She is a computer science engineer and holds an MBA degree (Finance and International Business). Her current research interests include economic growth and development, the digital transformation process, innovation in the digital economy landscape, and the cashless economy. Her work has been presented at various international conferences and published in various peer-reviewed international journals.
The data that support the findings of this study are available from the corresponding author upon reasonable request.
Early View
Online Version of Record before inclusion in an issue