Transparency, which is covered in more detail in the Transparency chapter, is a core part of building trust, both with users directly and with civil society, regulators, and other stakeholders. Increasingly, regulators are introducing transparency requirements that companies must meet in order to comply with the evolving legal landscape. However, what transparency entails and what constitutes enough transparency is still a keenly debated topic, and depends greatly on the circumstances in which privacy is being protected, oversight is being conducted, and accountability is being sought.
The Integrity Institute produced a guide to meaningful transparency in the context of transparency reporting, which also has relevance in the context of privacy. The guide highlights that meaningful transparency supports the following objectives:
- Informing users about the use, storage, and sharing of their data as well as their rights;
- Building trust with users so that they feel their data and privacy are protected, and that this is actually the case;
- Broadly aiding users in understanding platforms, their impact, and privacy issues;
- Providing accountability for actions taken;
- Empowering people inside of companies to do the right thing and strengthen the health of platforms for the long term, from both business and user points of view.
Transparency in a privacy context is normally transparency to the user: how a user can know that their data is safe and that their privacy is being respected. By contrast, “The purpose of having a transparency report is to communicate to the wider public the policies and procedures a company has in place, how and to what extent those policies are enforced, and thereby more clearly speak to accountability and progress” (GIFCT’s Transparency Working Group, 2021).
While this paper does not go into detail about transparency reports, these two topics are linked as both aim to inform about the use of data and actions taken, both seek to build trust with stakeholders, and both are increasingly mandated in law (e.g., Chapter 3 of GDPR and DSA Article 13).
In 2022, GIFCT conducted a survey on transparency as part of the Transparency Working Group and identified the following principles to ensure meaningful transparency. The information provided in transparency reports should be:
- User-friendly and concise
- Accurate and clear:
  - Use illustrative examples
  - Use tables, charts, and infographics
  - Explain trends or data interpretation to avoid misinterpretation
  - Consider which data may need additional context or interpretation
- Accessible:
  - Downloadable
  - Machine-readable/API
- Timely
- Relevant
- Verifiable:
  - Replicable/auditable
  - Research access to data/information sources provided (under controlled circumstances)
This section will delve into various aspects of transparency in privacy, focusing on accountability; data protection and privacy impact assessments; anonymization and de-identification; incident response and recovery; data retention; privacy risks; and mitigations. Each area plays a crucial role in safeguarding user privacy and fostering a privacy-centric culture within organizations.
Accountability: Demonstrating Commitment to Privacy
Accountability serves as the cornerstone of any effective privacy program. It involves a proactive commitment to privacy by implementing robust policies and practices to protect user data. Communicating effectively about these policies and practices is integral to building trust with users. Below are some essential elements of accountability:
A Clear Privacy Policy
A well-crafted privacy policy is the foundation of transparency. It should be written in clear, concise language accessible to all users. The policy must outline how user data is collected, used, stored, and shared, and provide information on the purposes for which the data is processed. Transparency can be achieved by avoiding vague language and using concrete examples that help users understand how their data will be treated. Making this complex topic as easy to understand as possible and the information as easy to find as possible is critical to demonstrating a platform’s commitment to privacy.
Building User Awareness Through Privacy Statements
Example: Zoom’s Privacy Statement
Here, Zoom provides a good example of bringing all the information together in one place while organizing it in a way that enhances the user’s experience and makes the information accessible.
User Consent, Control of Data, and Privacy Setting
Obtaining user consent is a fundamental aspect of privacy compliance. A robust consent management system is needed to ensure users are informed about the specific data processing activities that require their consent, as well as to manage records of the consent given. Records of consent need to be maintained to demonstrate compliance with regulations such as the General Data Protection Regulation (GDPR). In addition, users need to be informed about the consent that they are giving. Under the California Consumer Privacy Act (CCPA), they have the right to know about the personal information a business collects about them and how it is used and shared, as well as the right to opt out of the sale or sharing of their personal information. This is often accomplished by giving users control over privacy settings.
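Below is a minimal sketch of what a consent record store might look like, assuming an append-only log so that past consent states remain auditable. The `ConsentRecord` fields and the `ConsentStore` API are illustrative assumptions, not drawn from any particular platform or regulation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str            # e.g., "analytics", "personalized_ads"
    granted: bool
    policy_version: str     # which privacy policy text the user saw
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

class ConsentStore:
    """Append-only log of consent decisions, so past states can be audited."""

    def __init__(self) -> None:
        self._log: list[ConsentRecord] = []

    def record(self, record: ConsentRecord) -> None:
        # Never overwrite: regulators may ask "what had the user
        # consented to, and when?"
        self._log.append(record)

    def current_consent(self, user_id: str, purpose: str) -> bool:
        """Latest decision wins; the default is no consent."""
        for rec in reversed(self._log):
            if rec.user_id == user_id and rec.purpose == purpose:
                return rec.granted
        return False

store = ConsentStore()
store.record(ConsentRecord("u123", "analytics", granted=True, policy_version="2024-05"))
store.record(ConsentRecord("u123", "analytics", granted=False, policy_version="2024-05"))
assert store.current_consent("u123", "analytics") is False
```

The append-only design matters here: when a user withdraws consent, the earlier grant is superseded but not erased, so the organization can still demonstrate what the user had agreed to at any point in time.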
Building User Flows for Privacy
Example: Google Privacy Checkup
Google has a Privacy Checkup built into its account management system to help users adjust the key privacy settings that they need to be aware of.
Third-Party Verification
Many organizations collaborate with third-party partners for various services that involve data sharing. To maintain transparency, it is essential to conduct due diligence on third-party partners to ensure that they adhere to similar privacy standards. The involvement of third parties in data processing should be clearly stated in the privacy policy. This verification is a complex topic in itself. One way to start is to compare actual business processes with the stated ones, and validate that everything is in order before data is transferred to and from vendors. This applies to both data in transit and data at rest (especially as many services now use streaming data processes). There is a clear need for automating not only the process of documenting but also the monitoring and validation of how data moves, what data is moving, and whose data it is. More details on this and the specific application to the CCPA can be found on the IAPP website.
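As one illustration of this kind of automation, the sketch below checks an outbound payload against the data categories a vendor has been approved to receive. The manifest format, vendor names, and `validate_transfer` helper are hypothetical; a real system would derive the approved categories from its data-sharing agreements and apply the check inside the data pipeline itself.

```python
# Declared data-sharing manifest: which fields each vendor may receive.
APPROVED_SHARING = {
    "email-vendor": {"email_address", "display_name"},
    "analytics-vendor": {"hashed_user_id", "event_type", "timestamp"},
}

def validate_transfer(vendor: str, payload: dict) -> None:
    """Raise before any data leaves if the payload exceeds what was declared."""
    approved = APPROVED_SHARING.get(vendor)
    if approved is None:
        raise ValueError(f"No data-sharing agreement on file for {vendor!r}")
    undeclared = set(payload) - approved
    if undeclared:
        raise ValueError(
            f"Refusing transfer to {vendor!r}: undeclared fields {sorted(undeclared)}"
        )

# A payload limited to declared fields passes; adding a raw user ID would raise.
validate_transfer(
    "analytics-vendor",
    {"hashed_user_id": "ab12", "event_type": "login", "timestamp": 1717000000},
)
```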
Regular Internal Audits
Internal audits are essential to assess and maintain privacy compliance. Trust and safety professionals should conduct periodic audits to identify any discrepancies between privacy policies and actual practices. Any identified gaps or issues should be addressed promptly, and the findings should be used to continuously improve privacy practices. In their 2023 Privacy Risk assessment, IAPP noted that only about 21% of organizations have “empowered the third line of defense to undertake privacy audits.”
Data Protection and Privacy Impact Assessments (DPIA)
Data Protection Impact Assessments (DPIAs) are systematic processes used to identify and assess potential privacy risks associated with data processing activities. DPIAs are particularly important when launching new projects or making significant changes to existing systems, and are critical tools for identifying and mitigating risk and demonstrating compliance with the GDPR. DPIAs are, in fact, a requirement introduced by the GDPR:
“Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data.”
The UK’s Information Commissioner’s Office (ICO), which is responsible for enforcing the GDPR in that country, has prepared a Data Protection Impact Assessment template, which is a useful starting point for designing, conducting, and recording a DPIA. The template helps simplify the process: it crystalizes why a DPIA is needed, what data processes are being assessed, which stakeholders should be consulted, and how to consider the necessity and proportionality of current data processing practices (essentially, is there another route to achieving the same outcome with lower privacy risk, and is the risk to privacy reasonable in light of team or organizational goals?), and finally how to assess and manage risks.
Once privacy risks are identified through DPIAs, it is crucial to develop and implement appropriate mitigation and risk management strategies.
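As a concrete illustration of the "assess risks" step, the sketch below combines likelihood and severity ratings, similar in spirit to the ratings the ICO template asks for, into an overall low/medium/high risk level. The scoring function and thresholds are assumptions for illustration only, not an official methodology.

```python
# Illustrative likelihood and severity scales for a DPIA risk register.
LIKELIHOOD = {"remote": 1, "possible": 2, "probable": 3}
SEVERITY = {"minimal": 1, "significant": 2, "severe": 3}

def overall_risk(likelihood: str, severity: str) -> str:
    """Combine the two ratings into a low/medium/high overall risk level."""
    score = LIKELIHOOD[likelihood] * SEVERITY[severity]
    if score <= 2:
        return "low"
    if score <= 4:
        return "medium"
    return "high"

risks = [
    ("re-identification of shared research data", "possible", "severe"),
    ("retention beyond stated period", "probable", "significant"),
]
for description, likelihood, severity in risks:
    print(f"{description}: {overall_risk(likelihood, severity)} risk")
```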
More details on DPIAs can be found on the EU GDPR website as well as on the UK ICO website and the Irish Data Protection Commission’s website.
Transparency and Data Retention and Processing
As we hold and use data, we can further enhance or degrade users’ privacy. Being clear with users about how data held about them is stored and processed helps build trust. By contrast, poor data handling practices can damage not only users’ privacy but also broader trust in the organization. It is therefore important to consider both the policies and the methods used for data storage and processing.
Storage and Purpose Limitation
Data should be retained only for as long as necessary to fulfill the purposes for which it was collected. Defining specific retention periods for different types of data helps prevent unnecessary data storage. While it is often not straightforward, or helpful to users, to explain exactly how long specific data will be held, explaining the purposes for which data is stored and processed, and how decisions are made about when data should be deleted, will enhance transparency.
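Retention periods defined per data category can then be enforced mechanically. The categories, durations, and `is_expired` helper below are illustrative assumptions, not recommendations for any particular data type.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical per-category retention periods.
RETENTION_PERIODS = {
    "access_logs": timedelta(days=30),
    "support_tickets": timedelta(days=365),
    "account_profile": None,  # retained while the account is active
}

def is_expired(category: str, collected_at: datetime,
               now: Optional[datetime] = None) -> bool:
    """True if the record has outlived its defined retention period."""
    period = RETENTION_PERIODS[category]
    if period is None:
        return False
    now = now or datetime.now(timezone.utc)
    return now - collected_at > period

collected = datetime(2024, 1, 1, tzinfo=timezone.utc)
print(is_expired("access_logs", collected))  # True once 30 days have passed
```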
Disclosing How Platforms Retain Metadata
Example: Snap’s statement about how they retain metadata
Example: X’s (formerly Twitter) explanation of their data retention
Consider these two example statements, which describe the types of data retained, why they are retained, general parameters for how long, and what happens if the data has to be held for longer.
Secure Disposal
When data reaches the end of its retention period, it should be securely disposed of. This not only prevents potential data leaks or unauthorized access, but is also a requirement under both the GDPR and the CCPA. Users also have the right to delete personal information collected from them (with some legal exceptions). This means that tools that gather personal information should also be designed in such a way that users can remove their data. In addition to telling users about secure deletion, this is also an opportunity to remind them of the platform’s commitment to preventing accidental loss of data, both in how the platform protects the data and in ensuring that user-initiated deletion requests are clear and not susceptible to accidental triggering.
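One common way to balance user-initiated deletion against accidental requests is a cancellable grace period before data is irrecoverably purged. The sketch below illustrates that pattern; the 14-day window and the `DeletionRequest` class are assumptions for illustration, not a legal standard.

```python
from datetime import datetime, timedelta, timezone

# Illustrative grace period before user data is irrecoverably removed.
GRACE_PERIOD = timedelta(days=14)

class DeletionRequest:
    def __init__(self, user_id: str):
        self.user_id = user_id
        self.requested_at = datetime.now(timezone.utc)
        self.cancelled = False

    def cancel(self) -> None:
        # The user changed their mind within the window.
        self.cancelled = True

    def ready_to_purge(self) -> bool:
        """Only purge after the grace period, and only if not cancelled."""
        if self.cancelled:
            return False
        return datetime.now(timezone.utc) - self.requested_at >= GRACE_PERIOD

request = DeletionRequest("u123")
print(request.ready_to_purge())  # False until the grace period has elapsed
```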
Building User Flows for Privacy
Example: How Google Retains Collected Data
Google provides a very accessible explanation of how they enable safe and complete deletion.
Other Rights
Users have the right to correct inaccurate personal information, as well as rights to security and confidentiality. Being transparent about each of these rights, and about how users are empowered to exercise them with regard to data management and privacy, is an opportunity to showcase the platform’s commitment to privacy as well as to build trust with users.
Exercising User Rights Over Their Data
Example: Meta’s Guide on deleting information and exercising rights
Meta provides a guide on how to exercise these rights, with links to the places in their tools where these actions can be taken.
Anonymization and De-identification
Anonymization Techniques
Building on the definitions of anonymization and de-identification introduced in the “What is Privacy?” section of this chapter, it is important to note how they relate to transparency work. Anonymization is the process of removing or altering identifiable information from data, making it impossible to associate the data with specific individuals. This is particularly relevant when sharing data for research or analysis while preserving privacy, but it may also be required at other times when handling sensitive data, especially PII, or when developing products and services.
There are many approaches to anonymization, but they generally fall into one of the following categories:
- Masking: removing, encrypting, or otherwise obscuring private identifiers.
- Differential privacy: adding mathematical noise to the data so that individual records cannot be singled out while aggregate statistics remain useful.
- Pseudonymization: replacing private identifiers with pseudonyms or artificial values.
- Generalization: replacing a specific identifier value with a more general one.
- Swapping: shuffling the attribute values of the dataset so that they no longer match the original records.
In addition to these techniques, sensitive information may first need to be extracted from the wider data so that it can be anonymized. This is common in unstructured datasets.
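To make a few of these techniques concrete, the sketch below applies masking, pseudonymization, and generalization to a single record. The helper functions are illustrative only; production systems would typically use keyed hashing or a tokenization service rather than a truncated salted hash.

```python
import hashlib

def mask_email(email: str) -> str:
    """Masking: obscure the local part of an email address."""
    local, _, domain = email.partition("@")
    return f"{local[0]}***@{domain}"

def pseudonymize(user_id: str, salt: str) -> str:
    """Pseudonymization: replace the identifier with a stable pseudonym.
    Note: a weakly salted hash can be reversed by brute force."""
    return hashlib.sha256((salt + user_id).encode()).hexdigest()[:16]

def generalize_age(age: int) -> str:
    """Generalization: replace the exact value with a broader bucket."""
    lower = (age // 10) * 10
    return f"{lower}-{lower + 9}"

record = {"user_id": "u-48213", "email": "jane.doe@example.com", "age": 34}
anonymized = {
    "user_id": pseudonymize(record["user_id"], salt="per-dataset-secret"),
    "email": mask_email(record["email"]),
    "age": generalize_age(record["age"]),  # 34 -> "30-39"
}
print(anonymized)
```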
How data is anonymized is extremely important, as it determines how much protection the anonymization actually provides. Despite anonymization efforts, there may still be risks of re-identification or de-anonymization. New methods of re-identifying data, including methods that use AI, are being developed all the time, so it is important to review anonymization processes regularly, reassess the risk of re-identification, and consider the ethical implications of sharing potentially sensitive data.
As with accountability measures, being clear with users about how anonymization is used to protect their data is key to building trust. For example, Google’s Policy center has a technology section that lays out how they approach anonymization.
Incident Response and Recovery
When working with user data, it is common to experience some level of data loss or a cybersecurity breach that impacts users personally and compromises their sense of privacy.
A well-defined incident response plan is critical to handling data breaches or privacy incidents effectively. An incident response plan should be proactive: it should be put in place so that when an incident occurs, teams know how to respond and recover. This plan should outline the steps to be taken in the event of a data breach, including:
- Incident identification: Detecting and verifying a potential incident promptly.
- Containment and mitigation: Taking immediate action to contain the incident and prevent further damage.
- Notification: Timely communication with affected users and regulatory authorities.
- Investigation and remediation: Conducting a thorough investigation to understand the root cause and implementing measures to prevent future incidents.
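Notification timing is often driven by hard regulatory deadlines; under GDPR Article 33, for instance, the supervisory authority must generally be notified within 72 hours of the controller becoming aware of a breach. The sketch below tracks that clock; the helper and data values are illustrative, and other regimes impose different windows.

```python
from datetime import datetime, timedelta, timezone

# GDPR Article 33: notify the supervisory authority within 72 hours of
# becoming aware of a breach (other regimes differ).
GDPR_NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(became_aware_at: datetime) -> datetime:
    """The clock starts when the controller becomes aware of the breach,
    not when the breach actually occurred."""
    return became_aware_at + GDPR_NOTIFICATION_WINDOW

aware = datetime(2024, 6, 1, 9, 30, tzinfo=timezone.utc)
deadline = notification_deadline(aware)
remaining = deadline - datetime(2024, 6, 2, 9, 30, tzinfo=timezone.utc)
print(f"Notify supervisory authority by {deadline:%Y-%m-%d %H:%M} UTC "
      f"({remaining} remaining)")
```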
How T&S professionals communicate during an incident is central to establishing and maintaining trust with users. When this communication goes badly (normally because of a lack of transparency or because the impact of an incident is minimized), trust can be irreparably damaged. Teams need to have a solid communications and transparency plan as part of their incident response.
Building User Flows for Privacy
Example: The FTC’s Guide for Business’ Data Breach Response
The FTC provides helpful resources on incident response and recovery.
Conclusion
People have the right to privacy, and protecting that right is central to safeguarding human rights, such as freedom of expression, both online and offline. At the same time, protecting privacy rights is increasingly challenging in the ever more complex landscape of online platforms. Thankfully, resources and practices that support the fight for privacy are more abundant than ever. Responsiveness to legislative developments and policy commitments, advancing technologies that govern data use and keep it secure, and building supporting infrastructure can all come together to create a better, safer online experience for users and trust and safety employees alike.
Acknowledgements
Authors│Michael Swenson, Aki Nakanishi, Bethany Smith, Rhett King, Roberta Savera, Tom Thorley
Contributors│Numa Dhamani, Talita Pessoa, Mackenzie Tudor
Special Thanks│Harsha Bhatlapenumarthy, Amanda Menking