6 Privacy Engineering Skills Teams Need In 2023
Overview
A Case For Privacy Engineering
It’s hidden in plain sight. In order to successfully manage data privacy, companies must successfully manage data. Engineering teams must extend the technical workflows and processes they already use for other realms of data operations, such as ETL and analytics, to privacy management within the development life cycle. In doing so, they practice privacy engineering. Per Wikipedia, privacy engineering is:
“…an emerging field of engineering which aims to provide methodologies, tools, and techniques to ensure systems provide acceptable levels of privacy.”
Here at Ethyca, we like to think of privacy engineering as the act of baking privacy into systems rather than bolting it on after the fact.
From an engineering perspective, building capacity in privacy engineering will relieve operational pains and allow teams to focus on product-building with confidence that their work is compliant.
From a regulatory standpoint, privacy engineers bring deep value to an organization. By interfacing with engineers and Governance, Risk, & Compliance (GRC) experts, they ensure that technical operations align with legal requirements through elegant workflows. They help an organization work smarter, not harder, on privacy.
Here, we discuss six skills—three technical skills and three GRC competencies—that, once acquired by dev team members, will harmonize engineering and compliance needs. Throughout, we refer to engineering teams, but the guidance applies equally to data and product teams.
Privacy Engineering Technical Skills
De-Identifying Data For Erasure
One of the hallmarks of modern privacy ops is the DSR: the data subject request, sometimes called a data subject access request (DSAR). From GDPR to CCPA and beyond, more individuals are being granted rights to request that companies access, update, or erase the personal data they hold on them. While GRC teams might oversee the DSR fulfillment process, Engineering teams are the ones with boots on the ground, handling tasks like data discovery and data deletion. Skills required to execute the full suite of DSR types are vital, but we focus on erasure here as an illustrative example.
Building an erasure pipeline requires working knowledge of contemporary data architecture: data warehouse platforms like Snowflake, caches such as Redis stores, among others. Backend engineers tasked with building these pipelines should be equipped with knowledge of SQL and NoSQL databases such that instances of the requester’s personal information are correctly identified for erasure.
After the requester’s personal information is identified, the appropriate erasure strategy should be applied. Simply putting a null value in the database is not always the prudent move. From a technical standpoint, certain entries must retain specific data types or structures. Otherwise, the erasure could disrupt downstream data operations. Informed by existing data operations, erasure strategies should include some form of each of the following:
- Hashing data with a consistent salt
- Supplying fixed or random values in accordance with existing restrictions on data types or structures
- Setting values to NULL
For instance, the most appropriate erasure strategy might be to erase certain categories of personal data using a fixed string “MASKED.”
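The strategies above can be sketched in code. This is a minimal illustration, not a production pipeline: the field names, the `STRATEGIES` mapping, and the salt value are all hypothetical choices a team would make based on its own schema and downstream requirements.

```python
import hashlib
from typing import Optional

# Assumed to be a per-deployment secret stored securely, not hard-coded.
SALT = b"per-deployment-secret-salt"

def hash_mask(value: str) -> str:
    """Hash with a consistent salt so joins on the field still resolve."""
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()

def fixed_mask(value: str) -> str:
    """Replace with a fixed string, preserving the column's string type."""
    return "MASKED"

def null_mask(value: str) -> Optional[str]:
    """Set the value to NULL where downstream jobs tolerate missing data."""
    return None

# Per-field strategy choices, informed by existing data operations.
STRATEGIES = {
    "email": hash_mask,       # preserved for deduplication joins
    "full_name": fixed_mask,  # free text, no structural constraint
    "phone": null_mask,       # nullable column, safe to clear
}

def erase_record(record: dict) -> dict:
    """Apply the configured erasure strategy to each personal-data field;
    fields with no strategy (e.g. non-personal IDs) pass through unchanged."""
    return {
        field: STRATEGIES.get(field, lambda v: v)(value)
        for field, value in record.items()
    }
```

Keeping the strategy choice in one mapping makes it easy for GRC and Engineering to review, per field, why a given masking approach was chosen.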
Conducting Privacy Product Research
Usability is a bottleneck on impact in privacy engineering. Beyond developing products that respect users’ data, engineers must evaluate the uptake of those innovations by their intended audiences. In collaboration with product designers, engineers should develop quantitative and qualitative metrics and modify the product accordingly. Depending on the nature of the product, this responsibility can involve A/B testing, questionnaire design, or backend analytics on site performance. Because of this, privacy engineers work on both customer-facing and internal challenges.
It’s important to note: some of the very tactics for improving usability can introduce privacy risk! Make sure you weigh those risks while planning usability testing.
Externally, engineers should identify where end-users are experiencing obstacles in implementing the privacy product. They should understand and root out dark patterns in UX: any design patterns that cause users to act against their own best interests. For instance, obscure language or presentation of privacy controls can prevent users from making an informed choice.
Internally, engineers might collaborate with in-house data scientists to develop a privacy-preserving toolkit for analyzing users’ product usage in aggregate. In consulting with the data scientists, the engineers will adapt the toolkit to ensure that it is as low-friction as possible and compatible with existing system requirements, without compromising technical privacy protections such as differential privacy or k-anonymity.
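As a concrete example of one such protection, a k-anonymity check verifies that no combination of quasi-identifiers (like ZIP code and age band) singles out fewer than k users in an aggregate dataset. The sketch below uses illustrative field names; real toolkits apply generalization and suppression on top of a check like this.

```python
from collections import Counter

def is_k_anonymous(rows, quasi_identifiers, k=5):
    """Return True if every combination of quasi-identifier values
    appears in at least k rows of the dataset."""
    groups = Counter(
        tuple(row[q] for q in quasi_identifiers) for row in rows
    )
    return all(count >= k for count in groups.values())
```

Running this check before releasing aggregate analytics helps catch small groups that could be re-identified.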
Validating and Verifying at Scale
Data is moving faster than ever, in greater volumes than ever, and to more places than ever. At the same time, international data flows are complex and shifting. On the technical side, engineers are responsible for developing workflows in cloud-computing platforms like AWS that abide by international requirements while being ready to adapt to amended data-transfer agreements. To do so, they validate against frameworks like access control lists (ACLs) to ensure that governance models are translated into entity-level access controls.
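One way to picture this validation step is a check that compares entity-level ACL grants against the governance model. The policy table, service names, and region identifiers below are hypothetical; a real implementation would pull grants from the cloud provider's APIs.

```python
# Governance model: data category -> regions whose services may read it.
# Illustrative only; in practice this would be derived from the
# organization's data-transfer agreements.
POLICY = {
    "eu_user_profiles": {"eu-west-1"},
    "us_user_profiles": {"us-east-1", "eu-west-1"},
}

def validate_acl(acl_entries):
    """Return the grants that violate the governance policy.
    Each entry is a tuple: (principal, data_category, region)."""
    violations = []
    for principal, category, region in acl_entries:
        allowed = POLICY.get(category, set())
        if region not in allowed:
            violations.append((principal, category, region))
    return violations
```

Because the policy lives in one declarative structure, updating it when a data-transfer agreement changes is a configuration edit rather than a code rewrite.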
Privacy Engineering GRC Competencies
Staying Informed On Emerging Regulations
Depending on the size and strengths of the GRC team, a GRC specialist may already keep close tabs on the upcoming privacy regulations relevant to the company. However, the privacy-informed engineer plays an indispensable role as a liaison between GRC and Engineering teams. They should be familiar with the business requirements of landmark regulations like the European Union’s General Data Protection Regulation (GDPR). Even though the regulations that have followed in GDPR’s wake are not identical, understanding GDPR provides a widely applicable framework for parsing new regulations worldwide.
The concept of extraterritoriality is essential for anyone in privacy to understand. By and large, it does not matter where a business is headquartered, but where its end-users reside. Because of this, GDPR applies to companies with EU users, not just companies with headquarters in the EU. By monitoring regulations and understanding how best to update technical systems, engineers can reduce friction throughout not only their team but also GRC and product teams. The news section from the International Association of Privacy Professionals offers a great starting point for getting up to speed on the latest state-level and global regulatory news.
De-Identifying Data For Erasure, Part 2: Business Requirements
While data erasure is an involved technical process, it also requires a GRC sensibility regarding if, when, and for how long to retain specific data. Global privacy regulations carve out exceptions for erasure, where some data should not be erased in order to meet business requirements. For example, customers’ purchasing information is generally considered personal information, so it seems to fall within the scope of an erasure request. But tax law generally requires such data to persist for a set period before deletion is permitted.
An engineer supports a healthy erasure workflow—in which the data erased is precisely what’s needed according to the law—by interfacing with the GRC team to understand what business requirements limit the extent of data erasure. As with much of privacy engineering, the engineer must be ready to translate across disciplines in these GRC conversations. They must know, for instance, how to apply legal counsel’s guidance on tax compliance into fields in distributed data systems.
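That translation can be as simple as encoding retention rules as data the erasure pipeline consults before acting. The categories and retention periods below are illustrative stand-ins for guidance from legal counsel, not actual legal advice.

```python
from datetime import date, timedelta

# Hypothetical retention rules per data category; real values come
# from legal counsel and vary by jurisdiction.
RETENTION_RULES = {
    "purchase_history": timedelta(days=365 * 7),  # e.g. tax retention
    "marketing_preferences": timedelta(days=0),   # erasable immediately
}

def erasable_on(category: str, created: date) -> date:
    """Earliest date on which records in this category may be erased."""
    return created + RETENTION_RULES.get(category, timedelta(days=0))

def may_erase(category: str, created: date, today: date) -> bool:
    """True if the retention period for this category has elapsed."""
    return today >= erasable_on(category, created)
```

An erasure job can then skip (and queue for later) any records still inside their retention window, rather than failing the whole request.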
Understanding The Legalities Of Consent
Consent is a cornerstone of modern privacy, and violating it has led to some of the industry’s largest fines. For instance, the largest GDPR fine to be finalized, over $56M against Google, arose from consent violations. The largest GDPR fine to be levied, not yet finalized, also related to consent violations: $877M against Amazon. Engineering teams should have personnel who understand the basic principles of consent in data processing contexts. For instance, the personnel should know the distinction between opt-in and opt-out approaches to consent. Any systems that perform operations requiring users’ consent should request consent that is in line with Recital 32 of GDPR: it is “a freely given, specific, informed and unambiguous indication of the data subject’s agreement.”
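An opt-in model can be made concrete in how consent is recorded: one purpose per record (specific), an explicit grant flag that is never inferred (unambiguous), and a pointer to the notice the user actually saw (informed). The schema below is a minimal sketch with hypothetical field names, not a standard.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str          # specific: one processing purpose per record
    granted: bool         # unambiguous: explicit opt-in, never inferred
    timestamp: datetime   # when the affirmative act occurred
    notice_version: str   # informed: version of the notice shown

def has_consent(records, user_id, purpose):
    """Opt-in check: absence of a record means no consent, and the most
    recent record for a purpose wins (so revocations take effect)."""
    latest = None
    for r in records:
        if r.user_id == user_id and r.purpose == purpose:
            if latest is None or r.timestamp > latest.timestamp:
                latest = r
    return latest is not None and latest.granted
```

Note that the default answer is "no consent": an opt-out model would invert that default, which is exactly the distinction regulators care about.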
Privacy Engineering: Finally Coming To The Fore
Privacy engineering is younger than the related field of security engineering, which has become a core competency for teams shipping enterprise-grade software in recent years. Privacy engineering is on track to follow a similar path, as organizations recognize it as the scalable, sustainable means to build respectful and compliant products. In the years to come, Engineering teams will likely see a greater number of specialized programs for training in privacy engineering.
In the meantime, building capacity in privacy engineering not only reduces friction in workflows—it can also confer a competitive edge. In other words, privacy engineering isn’t just about avoiding fines; by proactively embedding respect into data systems today, a company can build a genuinely trustworthy brand that will stand the test of time.