News & Insights

Melanie Ensign

Mastering the Art of Privacy Engineering

Julia Child’s TV appearances built a mainstream and lucrative community around French cooking. PrivacyCode is here to do the same thing for privacy engineering.

Eric Lybeck, Director of Privacy Engineering

Photo by @ellaolsson on Unsplash

Books that were just recipes or knitting patterns were not very exciting. But when Julia Child appeared on television, she brought those recipes to life and inspired a lot of people to cook French food. Her books became best sellers.

So it was with the early internet. The first websites were like static recipes or knitting patterns, and not very engaging. Once there was a community of cooks, pots and pans, food merchants, and eventually competing restaurants, things became interesting and lucrative for everyone involved.

These communities couldn’t exist, or find their subject matter, until Vint Cerf helped give the internet standard addresses, turning websites into little interactive portals and, eventually, into independent companies.

We’ve seen the same pattern with early efforts to apply technology to the problems of privacy. Some thought their AI solutions would get us there: mark up the “data” and machine learning would automatically inventory and classify it. But there just wasn’t enough context or change over time, and the context and the actor could not be understood.

Then there are the solutions that scan the code, again looking for the magical context.

Code scanning not working? You need to tag the code with the right context.

With PrivacyCode, we have Privacy Objects, or Tasks. The task markup language lets us analyze the behavior of the Task itself. A Task is inherently a rule, and the language can recognize who performs an operation, and why, where, and when it is performed on the rule, in relation to the rule, or in combination with the rule.

In other words, context. 

The synergy among Tasks, Rules, and Actors is correlated to business outcomes. How do your software engineering activities help you achieve your corporate goals? PrivacyCode provides the answer.
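To make “context” concrete, here is a minimal, hypothetical sketch in Python of what a context-tagged Task might carry. The class name, the fields, and the business_goal link are illustrative assumptions for this post, not PrivacyCode’s actual task markup language.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class PrivacyTask:
    """A hypothetical Privacy Object: a rule plus the context of its use."""
    rule: str               # what the task requires
    actor: str              # who performs the operation
    purpose: str            # why the operation is performed
    location: str           # where the operation happens
    performed_at: datetime  # when the operation happens
    business_goal: str      # the corporate outcome this task supports

# Example: a Task tagged with enough context to be analyzed and
# correlated to a business outcome.
task = PrivacyTask(
    rule="Mask customer email addresses in analytics exports",
    actor="data-platform-team",
    purpose="product analytics",
    location="EU region",
    performed_at=datetime(2024, 1, 15),
    business_goal="Grow EU market share without regulatory friction",
)
print(task.business_goal)
```

The point is simply that once who, why, where, and when travel with the rule, the rule can be analyzed and correlated to an outcome.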

The AI standards bodies are all knitting very lovely patterns, as are the authors of individual data protection compliance rules. PrivacyCode creates an interactive community that brings together lawyers, developers, and marketing folks to make the unhappy, happy.

We have the platform and the swarm intelligence branch of AI has the right math. The future of PrivacyCode is irresistible. 

Melanie Ensign

The Need For Intelligence Privacy in the Intelligence Age

Intelligence privacy is the protection of an individual from the potential negative impacts of Artificial Intelligence (AI) and other substantive privacy threats arising from the use of digital information by synthetic intelligence.

Eric Lybeck, Director of Privacy Engineering

Digital neural networks

As a privacy engineer, I am increasingly concerned about the threat to privacy in the digital age. It’s one of the reasons I work at PrivacyCode, where we built a flexible platform that enables our customers to manage privacy, security, and AI engineering.

While important, we know the answer to the threat to privacy is not always more laws and government regulation. The GDPR, the California Privacy Rights Act, the EU AI Act, and other regulatory protections are not, and will not be, enough to protect privacy in this new Intelligence Age. We need to think about privacy in a more holistic way, encompassing both substantive privacy and informational privacy, and consider a new type of privacy for this new age: intelligence privacy.

Substantive and informational privacy are like the two interfaces of a software system: inseparable and mutually dependent. Our ability to live in the information age would collapse without our computers presenting information to us through their screens and communicating with the rest of the world through their network connections.

Just as our mobile phones cannot function without both of these interfaces, respect for both substantive and informational privacy is essential for human society to live freely and without fear.

We need informational privacy to protect our substantive privacy when we are making decisions about our health, finances, or relationships. We also need substantive privacy to protect our data privacy, for example, by preventing companies from collecting and selling our personal data without our consent. We also need intelligence privacy.

Intelligence privacy is the protection of an individual from the potential negative impacts of Artificial Intelligence (AI) and other substantive privacy threats arising from the use of digital information by synthetic intelligence. Through the use of AI, threats once possible only through the analysis of an individual’s personal information are now possible without any knowledge of that individual.

AI systems built without sufficient controls can be used to conduct particularly insidious discrimination against individuals. Perhaps the outputs are even disguised as concern or solicitude. A fantasy? China’s social credit system is a national credit rating and blacklist system that assesses the trustworthiness of individuals based on their behavior. Individuals given higher social credit scores are eligible for rewards, while individuals with low scores may be punished, for example by being banned from traveling or from staying in certain hotels.

It is easy to imagine other systems creating other discriminatory behaviors that can be normalized and accepted through social conditioning. A society might imagine it could predict the lifetime capabilities of five-year-old children using advanced artificial intelligence algorithms. The same society might promote such a system because it would allow the most gifted children to be given special opportunities. The impact, however, may be to deny those children the opportunity to make their own choices for the rest of their lives. Such a system operating at a state level will likely benefit a few at the expense of many, and those left behind may never benefit; history is full of examples of authoritarian rulers and societies that committed many horrors.

Privacy, security, and AI engineering are essential knowledge for software engineers, who must take a leadership role in helping to protect society against threats from intelligence privacy failures.

PrivacyCode’s platform helps the software engineer - and the AI engineer.

One use of the platform is to consider threats at the design stage. These threats can be non-obvious and are best addressed in system and algorithm design. Well-designed systems protect intelligence privacy and protect users from the serious harms that can arise from the use of personal information by an adversary.

Here are some additional, tangible actions that you, as a software and AI engineer, can take:

  1. Advocate in your company for establishing guiding principles for responsible AI development. To help develop these guiding principles, consider the NIST AI Risk Management Framework, the Hiroshima Process International Guiding Principles for Organizations Developing Advanced AI Systems, and the Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence. 

  2. Consider the potential risks and ethical implications of AI systems during design. Perform security, privacy, and artificial intelligence threat modeling in a software platform, such as PrivacyCode, that allows threats to be identified and mitigations tracked through the system lifecycle (a minimal sketch of such a threat record follows this list).

  3. Document the AI system engineering activities so that decisions and actions are improved upon over time.
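As a minimal illustration of items 2 and 3, here is a hypothetical Python sketch of a threat record whose mitigation status and decisions are tracked through the lifecycle. The class, enum, and field names are illustrative assumptions, not PrivacyCode’s data model.

```python
from dataclasses import dataclass, field
from enum import Enum

class MitigationStatus(Enum):
    IDENTIFIED = "identified"
    MITIGATION_PLANNED = "mitigation planned"
    MITIGATED = "mitigated"
    ACCEPTED = "risk accepted"

@dataclass
class AIThreat:
    """A hypothetical record for one threat surfaced during design-stage threat modeling."""
    description: str          # e.g. "model outputs enable discriminatory scoring"
    category: str             # security, privacy, or AI-specific (bias, misuse, ...)
    mitigation: str           # the agreed-upon control
    status: MitigationStatus  # tracked as the system moves through its lifecycle
    decision_log: list[str] = field(default_factory=list)  # documents decisions over time

# Example: identify a threat at design time, then update it as work progresses.
threat = AIThreat(
    description="Model outputs could enable discriminatory scoring of applicants",
    category="AI fairness",
    mitigation="Add bias evaluation to the release checklist and require sign-off",
    status=MitigationStatus.IDENTIFIED,
)
threat.decision_log.append("Threat identified during design review")
threat.status = MitigationStatus.MITIGATION_PLANNED
```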

There has been no more interesting time to be a software engineer than the present. Through your efforts, we can build solutions that protect intelligence privacy.

Melanie Ensign

Customer Case Study

PrivacyCode’s platform for Privacy Optimization and Integration provides this global Digital Services, Document Management, and Financing Company a centralized System of Record to capture roles, responsibilities, decisions, actions, and insights so teams can build quickly and confidently.

A global Digital Services, Document Management, and Financing Company tapped PrivacyCode to create a centralized System of Record to capture roles, responsibilities, decisions, actions, and insights so teams can build quickly and confidently.

This multibillion-dollar multinational operates in 160 countries and maintains a longstanding reputation for quality and service, delivered by its 25,000+ employees and supported by a large partner ecosystem.

The company serves the vast majority of the Fortune 500 and enjoys the confidence of approximately 250,000 customers worldwide. With nearly 15,000 patents owned and in use today, it is regarded as one of the world’s greatest innovation engines.

The Situation

Like most companies, this PrivacyCode customer continues to redefine its post-pandemic relationships with its stakeholders.  Given its recognition as one of the world’s most reputable companies and its ongoing Environmental, Social and Governance commitments, the company’s leadership wanted more certainty that its data and privacy controls were comprehensive enough to support its corporate strategy to grow market leadership, monetize innovation and optimize operations. 

PrivacyCode Delivers

This customer turned to the PrivacyCode Platform to establish new and relevant Data Privacy approaches and controls.

PrivacyCode’s cloud-native, AI-driven platform is purpose-built for actionable insights and results.  It curates, orchestrates and manages the avalanche of new and changing requirements across the complex and nuanced areas of Data Privacy, Data Governance, Data Quality, and the rapidly increasing need for Responsible AI Builds, Implementation and Governance.  

Leveraging PrivacyCode’s NLP and LLM AI capabilities to ingest the customer’s information into a number of PrivacyCode’s Privacy Object Libraries (themselves based on globally-recognized standards and frameworks) not only addressed the Data Privacy Controls; it also created formal methodologies for staying on top of, and ahead of, such challenges in the future.

Within 90 days, the PrivacyCode platform and focused advisory services for baseline analysis, alignment and implementation guidance addressed this core challenge, while positioning the broader and deeper use of PrivacyCode across this global entity. 

What’s Next

The pace of change in this space is unrelenting, and the multitude of regulations and oversight demands, further fueled by generative AI, is growing at a near-exponential rate.

When leaders invariably move or take on new responsibilities, the corporate knowledge that has been amassed goes with them. The PrivacyCode Platform not only ensures that this doesn’t happen; it also creates a System of Record where roles, responsibilities, decisions, actions, and insights are captured so that organizations can build on what they’ve learned and captured, as opposed to effectively starting over again.

PrivacyCode’s platform for Privacy Optimization and Integration goes far beyond serving the important needs of data practitioners; at the same time, it powerfully raises their strategic value to the business.

This customer intends to use the PrivacyCode platform and its Privacy Object Libraries to create real, always-current privacy automation and controls, while building a deep and lasting System of Record which, crucially for this business, reinforces to their stakeholders that privacy is everyone’s job.

Melanie Ensign

Extending PrivacyCode to Address Responsible AI

PrivacyCode’s AI/ML engine and Privacy Object Library enables any organization to manage Responsible AI challenges by connecting them to business goals.

By Eric Lybeck, Director of Privacy Engineering

AI is now ubiquitous and has already changed the way we live and work. We buy an increased amount of goods and services online, sometimes not even realizing AI is providing us recommendations. Social networks use advanced algorithms to keep us engaged. In the field of medicine, AI offers incredible progress in the early detection and treatment of disease. 

As with any new technology, it’s important for organizations to prioritize their investments, address threats and risks posed by AI, and measure results in an effective manner.

PrivacyCode, working with input from our design partners and leveraging our AI/ML engine, created our Privacy Object Library that enables any organization to manage Responsible AI challenges by connecting them to business goals.

Link Responsible AI to Business Goals

The first step in building a Responsible AI program is identifying your desired outcomes. Whether the outcome is to improve customer retention, increase efficiency, or accelerate innovation, as long as you know the outcomes, you can start to track the AI initiatives and their impact. 

For example, the goal may be to accelerate innovation in a specific market or vertical, so projects that incorporate AI-powered or AI-defined capabilities into your products may align to this corporate goal. Tracking, measuring, and proving that impact is what PrivacyCode.ai was built for.

Use a Common Enterprise-wide Framework

Despite popular headlines, AI is not an unregulated “Wild West.” Existing regulations already govern AI’s use cases and derivatives. Consider that all internal corporate policies apply as well, so you need to keep in mind all of these cross-disciplinary requirements related to security, privacy, ethics, and non-discrimination, to name a few. 

The commonly-cited uncertainty of AI regulation often comes from new or emerging laws and frameworks that either add to or intersect with these existing requirements. For example, there are new frameworks, such as the NIST AI Risk Management Framework, and proposed new laws, such as the EU AI Act. This makes it increasingly important, yet difficult, for organizations to stay up to date on the latest developments. PrivacyCode.ai was built for this too.

We use AI and machine learning technology to quickly update our Privacy Object Library with new and emerging frameworks and requirements. Then we distill them into repeatable, reusable tasks that business teams can own and implement. Our Responsible AI library, Ethical and Responsible AI Essentials, provides the foundation of an enterprise-wide framework.

Design, Build, and Maintain Responsible AI Systems

Our customers use PrivacyCode.ai to manage Responsible AI projects and solve problems such as validating AI training dataset compliance, communicating how AI systems work, and proving fair and non-discriminatory results.

• • •

If you are interested in more information about how you can improve your outcomes with Responsible AI, you can contact our team here.

Melanie Ensign

“She Said Privacy / He Said Security” podcast interview

Michelle addresses the privacy and security risks companies face in regard to AI, the current state of tech regulations, and how PrivacyCode.ai enables customers to build and prove the business value of global privacy programs.

Privacy AI: The Future of Building Smart Privacy Programs

Join hosts Jodi and Justin Daniels in this episode of the She Said Privacy/He Said Security podcast, where they again interview Michelle Finneran Dennedy, CEO of PrivacyCode.ai, about the surge in the privacy tech stack.

Additionally, Michelle addresses the privacy and security risks companies face in regard to AI, the current state of tech regulations, and how PrivacyCode.ai enables customers to build and prove the business value of global privacy programs.

Listen to the full episode.

Melanie Ensign

“Tech Seeking Human” podcast interview

Podcast host and Tech Evangelist Dave Anderson recently spoke with PrivacyCode.ai CEO Michelle Finneran Dennedy about data strategies for entrepreneurs, including how to properly collect it, process it, and piece it all together.

Privacy and Tech: Is it time to freak out yet?

PrivacyCode.ai is the privacy engineering platform for business-focused teams, even startup business teams.

Podcast host and Tech Evangelist Dave Anderson recently spoke with PrivacyCode.ai CEO Michelle Finneran Dennedy about data strategies for entrepreneurs, including how to properly collect it, process it, and piece it all together.

In his own words, Dave says this episode is for:

…people that are interested in the ins and outs of privacy from an expert that not only wrote the book on the topic and started the chief privacy functions of some of the largest companies, but has gone out on a limb to start her own company, hell-bent on educating people about privacy and ensuring the next generation of leaders get their privacy strategies documented, ethically tested, and sustainably sound.

She makes Privacy sound cool. I really didn’t think that was possible.

Catch the full discussion with Michelle and Dave above.

Melanie Ensign

Customer Case Study

With PrivacyCode, this Global Relationship Intelligence Automation provider can now pursue new business opportunities while supporting critical needs for proactively incorporating new regulations and privacy insights to meet demanding customer expectations.

PrivacyCode curates, orchestrates, and manages the avalanche of new and changing requirements for a Global Relationship Intelligence Automation Provider.

Photo of lights by Joshua Sortino

A leader in Relationship Intelligence Automation leverages AI and an analytical understanding of the power of relationships, for deeper team collaboration and deal flow acceleration.  Business relationships today have been profoundly and forever affected by communications channels and technology such that there is great unrealized value and opportunity to be found in extracting new insights from how these relationships manifest and interconnect.

The company is privately funded, raising $100+ million through multiple funding rounds.  Its cloud-native platform has driven thousands of organizations to new levels of insights, collaboration, and an ability to standardize and accelerate deal flow processes.

While its focus is on deriving deep value from business relationships and networks of all kinds, it operates with customers in markets where trust and privacy are paramount for everything from regulatory reporting to competitive advantage.

The Situation

As a private and rapidly growing company serving data intense and data sensitive markets, this PrivacyCode customer had an ongoing and critical need to remain fully aware of data protection requirements for its global customer base.  

The need to address these requirements on the near-immediate timeline expected in the markets they serve was creating new growth for the company, while existing customers’ expectations continued to expand. The company needed to proactively cover and address these requirements and eliminate potential blind spots in its relationship intelligence platform.

The company’s leadership scanned the market and looked to its own legacy providers, only to realize that none of its vendors, or others in the space, offered the depth and breadth the requirements demanded. At best, they were privacy-adjacent and could not offer centralized and accountable capabilities built on a highly responsive and flexible platform.

PrivacyCode Delivers

As a result of key gaps across the legacy offering landscape, this customer turned to the PrivacyCode Platform.  The customer realized that they could now pursue new opportunities, while supporting critical needs for proactively incorporating new regulations and privacy insights in a manner that met demanding customer expectations.  

They also came to realize that, in order to maximize the promise and potential of Relationship Intelligence, and given the tremendously high trust expectations, they too needed a platform that would hold them to their own privacy accountabilities.

PrivacyCode’s cloud-native, AI-driven platform is purpose-built for actionable insights and results.  It curates, orchestrates and manages the avalanche of new and changing requirements across the complex and nuanced areas of Data Privacy, Data Governance, Data Quality and the rapidly increasing need for Responsible AI Builds, Implementation, and Governance.

Within the first 45 days, the PrivacyCode implementation, leveraging PrivacyCode’s NLP and LLM AI capabilities to ingest the customer’s data and regulatory requirements into a number of PrivacyCode’s Privacy Object Libraries (themselves based on globally-recognized standards and frameworks), addressed the missing pieces while establishing a centering capability for new levels of collaboration and accountability.

What’s Next

The Relationship Intelligence Automation market, while specialized, is significant and can positively impact deal flow unlike any other type of relationship management platform or service.  Bringing together disparate data sources to uniquely maximize value demands a systematic/system-wide appreciation for the data being disseminated.  It also demands a clear understanding of the regulatory requirements having a direct and material effect on the value of that synthesized data.

The PrivacyCode Platform for Privacy Optimization and Integration creates a lasting System of Record where roles, responsibilities, decisions, actions and insights are captured such that organizations can build on what they’ve learned and captured.

With their immediate challenges addressed, this customer is now planning a larger PrivacyCode platform rollout, finding themselves able to aggressively pursue even larger and more strategic opportunities as their rapid growth trajectory continues.

Melanie Ensign

Video Interview: Managing privacy in constant motion

In this interview with Information Security Media Group, PrivacyCode CEO Michelle Dennedy explains how organizations can look at privacy at a strategic and "almost cellular level" that is in constant motion.

In this video interview with Information Security Media Group at the RSA Conference 2023, PrivacyCode CEO Michelle Dennedy discusses:

  • Post-pandemic privacy challenges

  • The privacy law landscape globally

  • Evolving privacy issues involving TikTok, social media and younger users

Watch her full interview here!

Melanie Ensign

Customer Case Study

For a global digital experience and account-based marketing leader, PrivacyCode delivers a foundational platform where the business, operations and technology teams find a shared understanding of tasks and actionable data insights, and where they can connect the value of their work to the business overall.

Global leader in digital experience and account-based marketing turns to PrivacyCode for a foundational platform where business, operations, and technology teams find shared understanding and actionable data insights, and connect the value of their work to the business overall.

Photo of the word "Together" carved into wood by Nick Fewings

This global leader in Digital Experience and Account-Based Marketing is focused on redefining and repositioning the go-to-market function for B2B companies across geographies, segments and verticals. 

The company, and its $1B+ valuation, depends on bringing disparate data sources and AI together to create highly personalized engagement across the increasingly complex B2B Buyer Journey.

Its platform supports thousands of customers around the world in nearly 150 languages, making it both globally-engaged and globally-sensitive to regional data and regulatory requirements.  

The Situation

As a private and rapidly growing company, this customer had expectations placed on it by its investors, customers, and partners. The range of data demands from different jurisdictions had created layers of unanticipated business challenges.

The company’s leadership evaluated point offerings and could not derive either lasting value or resource efficiency.  These privacy-adjacent products did not and could not meet the company’s requirements for flexibility, responsiveness, cost-avoidance and risk mitigation.

So as not to hinder its growth aspirations or cause irreparable harm to the business, the company knew it had to address the challenge, particularly as its technology and business teams had neither clear accountability nor a comprehensive appreciation for the data requirements that quite literally underpin the company’s business.

PrivacyCode Delivers

After failed attempts with legacy offerings, this customer turned to the PrivacyCode Platform to support multiple needs. First, the customer needed to respect the cost-avoidance requirements for adding personnel to this function. The time, training, and multi-jurisdictional expertise required were rightly seen as untenable.

Second, and in keeping with its own focus, the customer wanted to leverage something comprehensive and revolutionary, but also based on proven experience, knowledge, and global awareness.

They also needed a foundational platform where the business, operations, and technology teams could find a shared understanding of tasks and actionable data insights, and where they could clearly connect the value of the work to the business overall.

PrivacyCode’s cloud-native, AI-driven platform is purpose-built for actionable insights and results.  It curates, orchestrates and manages the avalanche of new and changing requirements across the complex and nuanced areas of Data Privacy, Data Governance, Data Quality and the rapidly increasing need for Responsible AI Builds, Implementation and Governance.

Within the first 60 days, the PrivacyCode implementation, leveraging PrivacyCode’s NLP and LLM AI capabilities to ingest the customer’s data and regulatory requirements into a number of PrivacyCode’s Privacy Object Libraries (themselves based on globally-recognized standards and frameworks), addressed their range of core challenges and allowed them to remain judicious on resource utilization, while helping the company maintain its focus on its customers, partners, and strategy execution.

What’s Next

Particularly owing to its multilingual capabilities, the company’s leadership understands that the work needed to protect its business and “digital crown jewels” has historically been bespoke. As its digital leaders move or take on new responsibilities, the leadership is deeply committed to retaining such strategic corporate knowledge.

The PrivacyCode Platform for Privacy Optimization and Integration creates a lasting System of Record where roles, responsibilities, decisions, actions and insights are captured such that organizations can build on what they’ve learned and captured, as opposed to effectively starting over again.

This customer is planning a larger PrivacyCode platform rollout, serving more parts of their business, while enhancing stakeholder engagement and expectations.

Melanie Ensign

The Privacy Paradox. What’s a CISO to Do?

Although in some instances, the privacy function may indeed report into the CISO, the intention here is for CISOs to shift their mindset and turn privacy integration and collaboration into an advantage for the security function writ large. Here are some ways a CISO might want to approach the new realities of the privacy-security partnership.

By Kristy Edwards, co-founder and Technical Advisor to PrivacyCode

People expect CISOs to be superheroes.  

They are expected to manage a highly complex tech stack while staying ahead of a perennially expanding volume of cyber threats. They often must do this while being understaffed, under-budgeted and under a lot of stress. 

And now, data privacy is also increasingly on the CISO’s mind - and in some cases, directly on their plate. This means that CISOs and their teams will need to get comfortable with different models and focus areas.

I’m going to explain why this is a good thing - for the security and privacy teams, and for the organizations they support. And how CISOs can work effectively with privacy teams - or in the absence of one - to increase their overall security posture.

First, it’s helpful to understand how privacy and security have (or have not) worked together in the past, and how they interact today. 

Photo courtesy of #WOCinTech Chat

The Privacy Function: Moving From Siloed to Symbiotic

Historically, CISOs have been mostly insulated from the privacy function, and vice-versa. Privacy teams usually resided in the legal department, where they crafted policies and stayed abreast of new regulations. Meanwhile, the CISO and his or her team focused on strategic approaches to preventing breaches, engaging everyone from the Board to sysadmins and monitoring endpoints for indicators of compromise. If they engaged with the privacy team at all, it was when a new policy was handed off to them to “figure out” how to implement it appropriately in products or systems.

That started to change when the California Legislature approved SB 1386, which mandates that companies with customers in California must inform them if they believe their personal information has been breached. Arguably, SB 1386 helped usher in a whole new generation of privacy regulations. I was in a security executive role at the time and saw firsthand how this new regulation was a forcing function for the C-Suite and Boards to recognize that security controls are a business imperative.

In other words, a privacy law helped to elevate the posture of cybersecurity inside the enterprise.

Fast forward to now, when an explosion of cyber threats - ransomware, malware, nation-state attacks, you name it - and a patchwork of new privacy laws have emerged concurrently to make the CISO’s job highly fragmented. They can barely keep up with stopping breaches, let alone understand and implement protections for the different categories of personal data that these new privacy regulations demand.

As this threat landscape has evolved and the amount of data in the hands of businesses has exploded, the structure of security and privacy teams has also changed.

Today, I see three scenarios for how privacy and security teams work together (with nuances within each).

  • Closely adjacent: In this model, privacy and security teams are “peer” groups that work together regularly. However, they have well-defined roles and boundaries. For instance, the privacy team might tell the security team that stored data encryption is required for sensitive personal data, but they have a very clear handoff point to security, who specifies encryption algorithms and key lengths and leads the security review to execute this requirement. This model is more symbiotic - each team understands the value and role of the other and has processes and guardrails around who does what.

  • Combined:  Here the two teams work as one - the operational and technical elements of the privacy program are within the CISO’s domain and report into him/her, often with a dotted line to the CPO. I personally have run global privacy programs within this structure, which tends to exist in companies that are more “security forward” and understand the critical role of the privacy engineer within the security function.

  • Cobbled Together: In this scenario, privacy responsibilities are doled out across various company stakeholders. This is most common in small to medium sized businesses that have not invested in full time privacy staff. Various aspects of privacy may be an added responsibility for a variety of stakeholders, from a commercial attorney to a privacy engineer to an organization’s CISO. For these folks, privacy may be their 2nd (or 3rd!) job, and so keeping track of how new regulations and policies are deployed and enforced in their business can be particularly challenging. 

I share this background because it can help CISOs who are struggling to “figure out how to work with privacy” envision a potential model. However, internal structure and division of labor only go so far. To succeed, CISOs need to understand that no matter how the function is assembled, today they sit at the intersection of data privacy and cybersecurity. And rather than being overwhelmed by that or wishing it would go away (it won’t), they should understand that collaborating with privacy teams and finding workable technical solutions can actually, paradoxically, make a CISO’s life easier.

Actions CISOs Can Take to Make Privacy Work for Them

When I say “make privacy work for them,” I mean that figuratively. Although in some instances, the privacy function may indeed report into the CISO, the intention here is for CISOs to shift their mindset and turn privacy integration and collaboration into an advantage for the security function writ large.  Here are some ways a CISO might want to approach the new realities of the privacy-security partnership.

  • Accept the new reality. The protection of personal information is a security problem. While it’s true that data privacy focuses on how personal data is collected, used, and shared, and data security refers to the measures and technologies used to protect data from threats, the reality is more complex.

For instance, employee monitoring of email and other communication channels has employee privacy implications for the enterprise. How this monitoring is done in terms of tools and technology often lands with the security team - even though the notice and oversight requirements are the domain of privacy teams. 

And there’s the rub, for CISOs at least. Security leaders may have a larger team or have a seat at tables where the privacy leaders do not, but security itself is downstream of privacy. A new regulation or policy is born first, and then the security team needs to figure out the implications of that policy for a whole slew of scenarios such as this one. And if something goes wrong in the protection of data, or in how a privacy law or policy is implemented, the buck will often stop with the security team.

By understanding, accepting and leveraging this current dynamic, CISOs can begin to find ways to even the playing field, so that accountability is shared between the functions, and the CISO’s posture is elevated to being more strategic and business-outcome focused.

  • Embrace collaboration - or pay the price. As mentioned above, security has been downstream of privacy. Today, they need to meet in the middle. Not only to protect an organization’s most valuable assets - its data and reputation - but because the C-suite and the Board expect it. Frankly, most Boards don’t really understand the difference between the respective roles and responsibilities of privacy and security. What they really care about is risk exposure.  

To manage that risk, a spirit of cooperation and collaboration is essential - as equal, respected partners (regardless of fancy degrees or technical prowess). There are significant benefits to both teams, but for the CISO it enables them to get off defense and take a more proactive role in managing risk in concert with their privacy counterparts. Even more important, by working together, the chance of something falling through the cracks is minimized. When these teams work in silos, it’s often at cross-purposes and even in competition with one another - for budget, attention, and credibility. And we know how well that goes over within an enterprise.

So what does “collaboration” look like? As mentioned before, enterprises structure their teams in different ways. Whether yours is closely adjacent, combined, or cobbled together will often determine how easy or challenging it will be to work together. But don’t let org structure be an obstacle. There are many opportunities to collaborate independent of where a function resides.

Incident response planning is a good place to start. Often breaches (which the security team manages) involve personal data (which the privacy team is charged with protecting). Collaborating on your incident response plan - and doing tabletop exercises so that both entities are appropriately involved - is a great in-the-trenches-together way to team up and identify gaps, overlap, and weak spots.

Another opportunity is to share strategies, even philosophies, of how you each approach your mission. This does not involve cross-training each other - privacy and security are still very different animals when it comes to expertise and daily responsibilities - but rather ensures there is an awareness and understanding of each other’s operating framework. For instance, the National Institute of Standards and Technology (NIST) has developed frameworks with both privacy and security considerations. If your organization adopts the NIST frameworks, coming together to understand the shared principles across functions may be beneficial.

The cost of not collaborating can be high. Financial penalties, reputational harm and loss of business can all occur when personal data is compromised. By structuring teams and processes in a way that enables an intersectional, collaborative approach, these problems can be avoided. 

  • Raise Your Privacy IQ. I get it. When you work in security you’re constantly inundated with information. But a little learning can go a long way to enable a more informed position when working with your privacy partners.

As more and more privacy regulations come into being, the potential fines and penalties for non-compliance will drive at least some of the security spend a CISO must make.

So how can you intelligently spend without a general understanding of laws like GDPR or CCPA? A CISO need not understand these laws in fine detail - that’s the privacy officer’s job - but some knowledge is essential to knowing how best to protect the data these laws govern.  

Likewise, privacy teams need to understand tech solutions like MFA and zero trust well enough to speak to technical teams in a language they can understand. Going further, CISOs can help CPOs understand what is involved in protecting data beyond legal requirements. In cobbled-together orgs, where privacy responsibilities are distributed among security personnel and non-privacy lawyers, teams may need more support (and spend) on privacy services or, increasingly, may lean on skillfully built SaaS platforms like ours that can fill in the gaps.

  • Plan for a privacy-first future.  By now you’re acutely aware that people are serious about protecting their private information - and they have a strong partner in state and federal governments and regulators who take enforcement actions. And that, while the social push to protect an individual’s privacy has accelerated, the process to manage it from the enterprise has remained almost frozen in time - making your job even harder. 

However, CISOs have a secret weapon: significant expertise in acquiring and deploying technical solutions. Privacy teams, for the most part, do not. As you gain more insight into the privacy world and collaborate to secure personal data, you’ll likely see how privacy teams have tried to track their policies, programs and metrics using a variety of processes or tools. Perhaps you’ve been on the receiving end of these efforts - for instance, when product requirements from the privacy team were either not fully documented or an update to a policy wasn’t communicated in a way that the security team could use.

As the owner of an already unwieldy security tech stack, CISOs know that a point solution to manage privacy isn’t likely the answer to these inefficiencies. And that checklists and outdated modes of communicating privacy requirements have resulted in gaps and increased risk.

What CISOs Need To Do Now

CISOs are incentivized to see that, going forward, privacy is managed in a more efficient way, so that their team can understand requirements, communicate progress and, over time, analyze trends. And to do so in an environment where both the privacy experts and the security experts can work together to see how enterprise projects are moving through each critical stage, in a fully transparent, fully tracked way.

We built PrivacyCode as a SaaS platform so that privacy and security teams can seamlessly work together on privacy implementations in the language that both teams understand, while gaining valuable reporting and insights. Our solution empowers security teams - who are not privacy experts but who often take on privacy work because of the adjacencies of the two fields - to meaningfully track, measure and prove the effectiveness of their privacy work. 

If you’re a CISO or security professional who is interested in helping to solve the privacy management paradox for your organization, reach out to us. We’d love to hear from you. Meanwhile, as you continue to work hard to protect your enterprise, remember that even superheroes need a day off. 


Media inquiries

Media@PrivacyCode.ai