How Do Messaging Apps Respond to Privacy?
Messaging apps are increasingly being used for education as well as communication with family and friends. Like many of the products we reviewed in 2020 and 2021, these apps and services are being used in new and diverse ways. Whether the companies like it or not (or even know it or not), parents and educators are using these messaging services not only for entertainment and personal communications, but also to support distance learning. Are the companies keeping up with this use of their technology? Do they have privacy in mind as the technology advances and the use of that technology expands? Let's have a look at the state of privacy for messaging apps.
Messaging apps have become critical in daily life. Under these circumstances, children and students might use a parent's mobile device and parent's account to message teachers or fellow students. This may result in the collection of sensitive information about their private communications, which could lead to privacy risks and harms affecting children, students, and families. In fact, many families use messaging apps to communicate with each other even when they're all at home, and kids might text each other while sitting next to each other on a couch. Messaging isn't just a substitute for letters or emails; it's a fundamental reconstruction of communication, from face-to-face conversations and nonverbal cues like gestures to a remote, online, and visual system of text and symbols like emojis.
The privacy risks of message interception by law enforcement or authoritarian governments are significant, even if they are limited to certain people or specific countries. The understanding that some messages might be read both chills speech and, when a product's privacy and security are lax, limits who can safely use it. Therefore, encryption is extremely important. Encryption, or even the assumption that encryption exists, allows speech to flow freely without concern that messages will be intercepted, read, and used for purposes other than what the sender intended.
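To make the idea concrete, here is a minimal, purely illustrative sketch of end-to-end encryption using the open source PyNaCl library. This is an assumption made for the example only; it is not how iMessage, WhatsApp, Signal, or any other app reviewed here actually works, and real messaging protocols add key exchange, forward secrecy, and much more.

```python
# Illustrative sketch only (assumes the PyNaCl library: pip install pynacl).
# Real messaging apps use far more elaborate protocols than this.
from nacl.public import PrivateKey, Box

# Each person holds a private key; only the matching public key is shared.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts using their private key and the recipient's public key.
sender_box = Box(sender_key, recipient_key.public_key)
ciphertext = sender_box.encrypt(b"See you at pickup at 3pm")

# Anyone intercepting the message in transit sees only unreadable bytes.
print(ciphertext.hex()[:48], "...")

# Only the intended recipient, holding the matching private key, can decrypt.
recipient_box = Box(recipient_key, sender_key.public_key)
print(recipient_box.decrypt(ciphertext).decode())
```

The point of the sketch is simply that the readable message exists only at the two endpoints; everything in between is ciphertext.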
For this article, we rated the top 10 messaging apps that we believe are representative of most types of messaging apps available across different platforms today. We chose messaging apps based on the features, type of content provided, and popularity in the Apple App Store and on Google Play. We also chose messaging apps used by children and students in every major age group at home, on the go, and in the classroom. We chose Apple iMessage, Google Messages, Skype, WhatsApp, Slack, Snapchat, Facebook Messenger, Telegram, Discord, and Signal.
How we rate privacy
When evaluating whether to have children use messaging apps at home or in the classroom, parents and teachers need to understand both the privacy policies and terms of service. To create a truly comprehensive evaluation process, the Common Sense Privacy Program completes a full, in-depth, 150-point inspection of a product's privacy policies in order to offer privacy ratings that are easy to understand.
What we found: The overall ratings
From the privacy rating chart below, you can see that Apple iMessage was the only product to earn a "pass" rating for better privacy practices that protect everyone. Signal received the lowest overall score, with a "warning" rating. Google Messages received the highest overall score even though it also carries a "warning" rating: Google Messages had the most comprehensive policy, but it engages in some worse privacy practices, which earned it the warning. How did this split occur? We give points for transparency.
Google's comparatively higher score, in other words, speaks to its transparency in telling us that it uses our data for several purposes, but also shares it for advertising. Apple is less comprehensive in its transparency (and could raise its score if it addressed more issues in its policies), but Apple's policy earns it our highest "pass" rating because it only engages in better practices that protect its users' privacy.
Product | Privacy Rating |
---|---|
Apple iMessage | 79% |
Google Messages | 81% |
Skype | 69% |
WhatsApp | 66% |
Slack | 57% |
Snapchat | 56% |
Facebook Messenger | 53% |
Telegram | 51% |
Discord | 50% |
Signal | 38% |
What we found: The breakdown
The following ratings further break down our concerns about each of the top-10 messaging apps' privacy evaluation results. The chart illustrates a range of privacy practices from "best" to "poor" based on our privacy ratings and evaluation concerns. Products that score a "poor" are not necessarily unsafe, but they have a higher number of privacy problems than the "average" product. Similarly, products that score "best" are not necessarily problem-free, but have relatively fewer problems than other products.
Top-10 messaging apps
Product | Data Collection | Data Sharing | Data Security | Data Rights | Data Sold | Data Safety | Ads and Tracking | Parental Consent | School Purpose |
---|---|---|---|---|---|---|---|---|---|
Apple iMessage | Good | Best | Good | Best | Average | Average | Good | Good | Poor |
Google Messages | Good | Best | Best | Best | Average | Good | Average | Good | Average |
Skype | Fair | Average | Average | Fair | Fair | Average | Average | Poor | Poor |
WhatsApp | Average | Good | Fair | Good | Poor | Average | Good | Fair | Poor |
Slack | Fair | Good | Best | Good | Fair | Average | Average | Fair | Poor |
Snapchat | Average | Average | Fair | Best | Fair | Good | Average | Fair | Poor |
Facebook Messenger | Average | Good | Fair | Best | Fair | Average | Average | Poor | Poor |
Telegram | Average | Average | Good | Fair | Poor | Good | Fair | Poor | Poor |
Discord | Good | Good | Fair | Best | Good | Fair | Fair | Fair | Poor |
Signal | Fair | Average | Average | Fair | Poor | Average | Poor | Poor | Poor |
Compare privacy ratings
The following chart compares the privacy practices of all the messaging apps we rated, as described in their privacy policies. These practices can put children's and students' privacy at risk if companies sell personal data to third-party companies or use personal information for third-party marketing, targeted advertising, tracking, or ad-profiling purposes. In the chart below, "yes" indicates a worse practice that puts children's, students', and consumers' privacy at risk. "Unclear" means the messaging app did not transparently disclose whether it engages in a worse practice that can put children's or students' privacy at risk, so there is no clear expectation of how the app will use personal data.
Product | Sell Data | Third-party Marketing | Targeted Ads | Third-party Tracking | Track Users | Ad Profile |
---|---|---|---|---|---|---|
Apple iMessage | No | No | No | No | No | No |
Google Messages | No | No | Yes | Yes | Yes | Yes |
Skype | Yes | Yes | Yes | Yes | Yes | Yes |
WhatsApp | Unclear | Yes | Yes | Yes | Yes | Yes |
Slack | Yes | Unclear | Yes | Yes | Yes | Yes |
Snapchat | Yes | Yes | Yes | Yes | Yes | Yes |
Facebook Messenger | No | Yes | Yes | Yes | Yes | Yes |
Telegram | Yes | No | No | Unclear | Unclear | No |
Discord | Yes | Yes | Unclear | Yes | Yes | Yes |
Signal | No | Unclear | Unclear | Unclear | Unclear | Unclear |
Apple's iMessage policy explains that personal information will be shared by Apple to provide or improve their products, services, and advertising by third-party apps. Personal information will not be sold or shared with third parties for their own marketing purposes. However, Apple may use, transfer, and disclose nonpersonal information for any purpose.
On a positive note, Google Messages' terms say Google does not sell users' data to third parties. However, the terms say Google may display targeted advertisements to users based on their personal information and app activity or activity on other Google services, and that Google allows third-party partners to track a user's browser or device for advertising on other sites across the internet. It gets tricky when the app you're using is owned by Google, a company that many other companies rely on for tracking and advertising services. Even if Google keeps information about you to itself (first-party tracking), it could still use that information for a wide variety of purposes.
The same holds true for Microsoft-owned Skype: Microsoft provides a privacy dashboard that lets users with a Microsoft account control some of the data Microsoft processes about them. And when WhatsApp joined the Facebook family of companies, it said it would protect existing users by letting them choose not to have their WhatsApp account information shared with Facebook to improve Facebook's ads and products experiences.
The question of who is a third party gets even more complicated when a messaging app has divided loyalties between corporate and individual users. For example, Slack's policy calls corporate users "customers." Customers or individuals granted access to a workspace by a customer ("Authorized Users") routinely submit customer data to Slack when using the services. Slack also collects, generates, and/or receives other information, including workspace and account information. For example, in order to create or update a workspace account, the user or their customer (e.g., the user's employer) supplies Slack with an email address, phone number, password, domain, and/or similar account details. On the upside, Slack's policy says they do not sell the personal information they collect and will not sell it without providing a right to opt out. At least it's clear who a person is -- so far so good.
Crucially, it's not just what data a messaging app company collects, but how they use it, and for how long. For Skype, Microsoft's privacy policy states they may use personal information for marketing and targeted advertising purposes, but Microsoft says they do not use a user's emails, chats, video calls, or voicemail messages, or their documents, photos, or other personal files, to target ads. WhatsApp's policy says that they don't store user messages once they've been delivered, and that WhatsApp and third parties can't read user messages when they are end-to-end encrypted. Snapchat's terms say they do not sell personal information to third parties, but the terms do state that Snapchat and third-party partners may put advertising on the services. Of course, nothing is perfectly temporary, and Snapchat says users should keep in mind that other users can view their Snaps, Chats, and any other content, and those users could always save that content or copy it outside the app.
What are the risks?
Given the amount of data shared from these messaging apps, and the limited protections offered by privacy policies, we next consider the possible risks and harms of participating in these practices. Just because the data is secure and encrypted does not mean that it's entirely private from other users or the public. For example, Signal's terms say calls and messages between users are always encrypted and can never be shared or viewed by anyone but the user and the intended recipients. However, using the messaging service may still allow personal information or content to be made publicly visible to others by the intended recipient. Users should also be aware that even when the content of their messages is protected with end-to-end encryption and blocked from being intercepted by the messaging app company or law enforcement, that doesn't mean there are no risks. Even for end-to-end encrypted communications, companies may still collect communication metadata, such as the platform used to send a message, the time the message was sent and received, the identities of the sender and receiver, and the amount of message data sent and received.
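To illustrate the point, here is a hypothetical sketch of the kind of metadata record a provider might hold about an end-to-end encrypted message. The field names and values are invented for this example and do not reflect any specific app's actual data schema.

```python
# Hypothetical illustration: field names are invented and do not reflect
# any specific messaging app's actual data schema.
from datetime import datetime, timezone

metadata_record = {
    "platform": "ExampleChat",                     # which service carried the message
    "sender_id": "user-4821",                      # who sent it
    "recipient_id": "user-9377",                   # who received it
    "sent_at": datetime(2021, 5, 3, 14, 2, 7, tzinfo=timezone.utc).isoformat(),
    "delivered_at": datetime(2021, 5, 3, 14, 2, 9, tzinfo=timezone.utc).isoformat(),
    "payload_bytes": 2048,                         # size of the encrypted content
    "ciphertext": "<unreadable encrypted bytes>",  # the company cannot read this part
}

# Everything except the ciphertext is still visible to whoever holds the record:
# who talked to whom, when, and how much -- no decryption required.
for field, value in metadata_record.items():
    if field != "ciphertext":
        print(f"{field}: {value}")
```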
We especially draw attention to the heightened vulnerability of children and younger students and the corresponding danger these practices pose to them. An adult might not mind having their name shared with co-workers on Microsoft's Skype, but when children are signed in, some Microsoft products may display a child's name or username and profile photo in their communications, social interactions, and public posts. The terms state that many Microsoft products require some personal data before users can access their services. Microsoft services can collect data including first and last name, email address, postal address, phone number, password credentials, and user data such as age, gender, country, and preferred language. That's a lot of information traded to use a service, especially for a child.
In addition to the obvious dangers, some companies are vague about what happens to the personal information of the children who use their products. In particular, Slack is a business communication service. According to Slack's policy, when an authorized user submits information, it may be displayed to other authorized users in the same or connected workspaces. However, it is unclear what obligations Slack has to children younger than 13 or 16 when their use is not prohibited by law, or when Slack itself has knowledge that a child is using its service.
Children and data privacy
When it comes to their children and students, parents and educators value the ability to understand and control what personal information the messaging apps they use collect. But do they know how to control what information is collected, or whether their child's or students' personal data is being used to deliver personalized or targeted ads?
Messaging apps can request access to a mobile device's location, which, unlike encrypted message content, can be collected and shared along with other metadata with third parties for their own purposes. Parents need to be aware that their child's mobile device -- and the messaging apps running on it -- can continuously collect the device's location and proximity to nearby cellular towers, which can determine their child's precise GPS location and be shared with other companies or law enforcement.
What about age-inappropriate content?
Messaging apps can also introduce age-appropriate or age-inappropriate media, and link to different third-party app content providers. For example, for Apple's iMessage product, parents should be aware that their child may share information with others depending on the Apple features and services that he or she uses, and that may include their child's name and contact information. Parents and educators may also feel like they don't have the ability to make a meaningful choice when it comes to privacy because these communication tools are already being used or are the only available ones in a certain environment like a school. Children younger than 13 are not permitted to use the WhatsApp product at all, according to its policy, but that may not be the whole story.
Let's not forget that even if an app states up and down it does not allow children to use its product, it does happen. Snapchat's terms also state no one under 13 is allowed to create an account and the services are not intended for -- and Snapchat does not direct them to -- anyone under 13. However, a general audience product like Snapchat may be considered directed to children if the product would appeal to children under 13 years of age, which takes several factors into consideration such as: the subject matter, visual content, the use of animated characters or child-oriented activities and incentives, music or other audio content, the age of models, the presence of child celebrities or celebrities who appeal to children, language or other characteristics of the product, or whether advertising promoting or appearing on the product is directed at children. Therefore, a general audience application or service that collects personal information from users to chat with friends with animated cartoon characters and images would likely be a child-directed product.
What is going on now?
- The facts: Messaging apps are treated as trusted services and can collect a significant amount of behavioral viewing data and personal information. Some apps have a kids version, and for Facebook Messenger Kids, parents can exercise controls available on the parent dashboard to decide whether their child's contacts see additional account details from their child's account. Parents can also choose to approve each connection request individually, or they can choose to allow their child to make and accept connection requests on their own without a parent or guardian approving each one.
- The law: Federal law imposes certain requirements on operators of websites or online services directed to children under 13 years of age, and on operators of other websites or online services that have actual knowledge that they are collecting personal information online from a child younger than 13. One requirement is obtaining parental consent prior to the collection of personal information. California law also prohibits the sale of personal information of California consumers under the age of 16 without appropriate consent. Similar to COPPA, the CCPA requires the consent of a parent or guardian to sell the personal information of children under the age of 13. For users in the European Union, the General Data Protection Regulation restricts processing the data of a child under 16 years of age unless valid consent is provided.
- The feelings: Parents and educators may have feelings about messaging apps always collecting data from their children and students while they are using the apps to create a personalized profile. This is often referred to as the "creepiness" factor and could include collecting metadata or location data without express permission, or using the data for purposes other than what the app was initially used for. For example, a person might send a message on a messaging service and get an email or advertisement elsewhere selling them merchandise related to who they are talking with, based on the time and location the message was sent.
- The future: Beyond what is currently collected and how it is used, messaging apps may store metadata indefinitely. At some point, companies may use the data in ways that no one has yet imagined -- such as changing default interactions on other unrelated apps and services based on what types of people the user has sent messages to and received messages from, and what the recipients of those messages are interested in. In addition, data brokers could combine metadata in the future with other data collected from the apps and services you use to try to re-identify data about you that you thought was anonymous or de-identified, as the sketch after this list illustrates.
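As a purely hypothetical illustration of that last point, the sketch below shows how "anonymous" messaging metadata could be joined with other purchased data through a shared advertising identifier. All of the records, identifiers, and field names are invented for this example.

```python
# Hypothetical sketch: all records, identifiers, and field names are invented.
# "De-identified" metadata from a messaging app -- no names, just a device
# advertising ID, timestamps, and coarse locations.
message_metadata = [
    {"ad_id": "A1B2-C3D4", "sent_at": "2021-05-03T14:02", "city": "Oakland"},
    {"ad_id": "A1B2-C3D4", "sent_at": "2021-05-03T21:45", "city": "Oakland"},
]

# A second data set, bought from a broker, that another app collected at signup:
# the same ad ID is now tied to a real, reachable contact.
broker_profiles = {
    "A1B2-C3D4": {"email": "parent@example.com", "age_range": "35-44"},
}

# Joining the two sets on the shared advertising identifier turns "anonymous"
# metadata into a timeline attached to a specific person.
for record in message_metadata:
    profile = broker_profiles.get(record["ad_id"])
    if profile:
        print(f"{profile['email']} messaged from {record['city']} at {record['sent_at']}")
```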
What should parents and educators do?
Parents and educators have several options when deciding whether to use messaging apps. Some may be thinking about which messaging app they should use, and others may have already made up their mind to use one or more services, but aren't sure which one is best for privacy. Some may want to know how to change their messaging app's privacy settings to best protect their children or students. Parents and educators may also want to know how to exercise their data rights and tell companies not to sell their data.
Below are some suggestions for managing this process to better protect child and student users:
- Check the privacy settings. Messaging apps have settings that allow different data collection features to be turned on or off. If it's not necessary to collect content or analytics data on how the app is used, then these extra features can be turned off to minimize the amount of sensitive information collected. For iMessage, Apple offers a few different sets of controls to help parents manage their child's access to Apple ID services and features, including Restrictions, Screen Time, Family Sharing, and Communication Limits. As Telegram explains in its privacy policy, the data a user posts in public communities is encrypted -- both in storage and in transit -- but everything a user posts in public will be accessible to everyone.
- Encourage supervision. Younger children and students should use messaging apps only when an adult is present to supervise use. Anonymity can work both ways, for good and bad: Telegram's policy, for example, notes that a screen name does not have to be a user's real name. So who is the child talking to?
- Check which apps are installed. Remove unwanted third-party messaging apps to limit information collection. While using iMessage, Apple's terms remind parents to be aware that third-party apps may be collecting data about children. Some kids may not even realize they are using a different product, such as when communications happen through a messaging app while using another gaming program. The terms of Discord Chat for Gamers state they collect personal information such as first and last name, mailing address, email address, or telephone number, and other information collected automatically when users visit the service. The terms indicate collected information (including data collected with automated tracking or usage analytics) is shared with third parties for advertising or marketing purposes.
- Ask companies not to sell your data. Use free online resources, like donotsell.org, to request that companies not sell your personal data for profit.
- Make your preferences known to companies and legislators. Many parents have taken (or wanted to take) steps to limit data collection -- and some think they have. Half want to, but they don't know how. This is the jumping-off point for action. The next step is to empower parents and educators so that they actually have this control and use it. Legislators can support this practice by mandating features allowing parental controls, and when that doesn't fully protect kids, allowing the information to be deleted from devices and databases.
- Make informed decisions about which apps to use. This article is a snapshot of messaging apps right now. Business practices change rapidly as companies think creatively about how to gather, process, and sell data. Look for safer defaults, like in the case of iMessage, which does not allow children younger than 13 to create their own Apple IDs, unless:
- Their parent provides verifiable consent.
- Their Apple ID is part of the child account creation process in Family Sharing.
- They have obtained a managed Apple ID account through their school.
In addition, while using iMessage, Apple's Limit Ad Tracking control is enabled by default for all devices associated with a child's Apple ID to ensure they do not receive targeted advertising from third-party apps.
In deciding whether to log in to or use messaging apps, consider the impact on the children who use the service and the amount of screen time involved. Factor into your decision whether the messaging app uses end-to-end encryption, as well as the potential for the company -- and any third-party companies -- to misuse your personal information, location, or metadata over time.