Ever since the White House’s Executive Order on Improving the Nation’s Cybersecurity, federal agencies have made a concerted effort to implement zero trust in order to achieve the specific goals laid out in the Federal Zero Trust Strategy by the end of fiscal year 2024.
This trend was the foundation for a lively discussion at the Billington Cybersecurity Summit in Washington, D.C. The session focused on how agencies are protecting data within the context of zero trust. Dynatrace public sector chief technologist, Willie Hicks, moderated the panel comprising the following notable experts from within the federal government and the industry:
- Shane Barney, chief information security officer for the Department of Homeland Security;
- Gerald J. Caron III, chief information officer for the Department of Commerce International Trade Administration;
- JR Williamson, chief information security officer at Leidos; and
- Travis Rosiek, public sector CTO at Rubrik.
Reflecting on their discussion, I identified five emerging “truths” about zero trust and data. These truths illuminate the nuances that impact whether zero trust implementations will be successful in the long term.
The five truths about zero trust in the federal government
1. Identity may not be the most important zero trust pillar.
At its core, zero trust is about following this principle: “Never trust, always verify.” While, for many, zero trust typically begins with identity, the panelists argued that it is not more important than the other four pillars that the Cybersecurity and Infrastructure Security Agency (CISA) has outlined.
“Identity is not necessarily as important as data because at the end of the day, we steal identities to get to the data,” said Williamson. “I think the idea is earned trust. It’s about managed trust, it’s about building trust, and through strong identity and access management, we can earn the trust that we need to use data effectively and for the intended purposes.”
2. Zero trust is not a standalone control.
We cannot consider zero trust a be-all, end-all silver bullet that will solve every cybersecurity problem. It is simply one approach. Defense-in-depth and cyber resiliency are also important methods of protecting data.
“Don’t be overly reliant on one system to protect you,” said Rosiek. “Make sure you have various failsafe mechanisms in place. That can be increased visibility – endpoint, network, and data – so you know what you have for protection. Then, make sure you have other systems in place to identify when your failsafe mechanisms stop.”
The rise of quantum computing also makes it increasingly important for agencies to plan for the future, ensure the readiness of their cybersecurity infrastructure, apply defense-in-depth and cyber resilience best practices, and keep teams agile and responsive.
3. Zero trust must focus on the flow of data.
For security teams to effectively protect their data, they must establish a baseline understanding of “normal” data flows. “Understanding your data, what you have, what it’s doing, where it’s going, where it’s flowing, how people are accessing it, and when they’re accessing it are things you want to be able to understand so you can properly protect and segment it,” said Caron.
Establishing this baseline enables teams to recognize and mitigate anomalous – and potentially malicious – behavior around their data. It also allows them to take steps to reduce their attack surface, such as using microsegmentation.
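To make the idea of a data-flow baseline concrete, here is a minimal sketch in Python. The flow records, route names, and the three-sigma threshold are illustrative assumptions, not anything the panelists prescribed; real baselines would be built from network or observability telemetry at far larger scale.

```python
from collections import defaultdict
from statistics import mean, pstdev

# Illustrative flow records: (source, destination, bytes transferred).
# In practice these would come from network or observability telemetry.
historical_flows = [
    ("hr-app", "hr-db", 1_200), ("hr-app", "hr-db", 1_150),
    ("hr-app", "hr-db", 1_300), ("web-app", "analytics", 50_000),
    ("web-app", "analytics", 48_000), ("web-app", "analytics", 52_000),
]

# Build a per-route baseline of "normal" transfer sizes.
baseline = defaultdict(list)
for src, dst, size in historical_flows:
    baseline[(src, dst)].append(size)

def is_anomalous(src, dst, size, sigma=3.0):
    """Flag a flow if the route is unknown or the transfer size deviates
    more than `sigma` standard deviations from that route's baseline."""
    history = baseline.get((src, dst))
    if not history:
        return True  # a route never seen before is itself worth investigating
    mu, sd = mean(history), pstdev(history)
    return abs(size - mu) > sigma * max(sd, 1.0)

# A sudden bulk transfer on a normally small route stands out immediately.
print(is_anomalous("hr-app", "hr-db", 500_000))  # True
print(is_anomalous("hr-app", "hr-db", 1_250))    # False
```

The same per-route view also informs microsegmentation: routes that never appear in the baseline are candidates to be blocked outright rather than merely monitored.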
4. Use risk assessments to prioritize data.
While “data is the new oil,” not all data is created equal. Some data is classified; some is unclassified. Some information is sensitive and critical, while other information supports routine tasks and a breach of it would carry few consequences. It’s extremely costly to elevate the security of an entire digital ecosystem to an optimal zero trust level. So, it’s best to assign value to data and protect it accordingly.
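One way to turn “assign value to data and protect it accordingly” into something actionable is a simple tiering exercise. The sketch below is a hypothetical Python example; the classification labels, scoring weights, and control mappings are assumptions for illustration, not an official scheme.

```python
# Hypothetical scoring of data assets by classification and breach impact,
# then mapping each asset to a protection tier.
ASSETS = [
    {"name": "citizen-pii-db", "classification": "sensitive",  "breach_impact": 9},
    {"name": "press-releases", "classification": "public",     "breach_impact": 1},
    {"name": "case-files",     "classification": "classified", "breach_impact": 10},
]

CLASSIFICATION_WEIGHT = {"public": 1, "internal": 3, "sensitive": 7, "classified": 10}

def protection_tier(asset):
    """Combine classification weight and breach impact into a rough risk
    score, then map the score to a protection tier (thresholds are illustrative)."""
    score = CLASSIFICATION_WEIGHT[asset["classification"]] * asset["breach_impact"]
    if score >= 70:
        return "tier 1: encryption everywhere, microsegmentation, continuous monitoring"
    if score >= 20:
        return "tier 2: encryption and access logging"
    return "tier 3: baseline controls"

for asset in ASSETS:
    print(asset["name"], "->", protection_tier(asset))
```

The point is not the specific numbers but the discipline: spend the strongest (and most expensive) zero trust controls on the data whose loss would hurt the most.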
5. Sharing is caring.
While agencies are putting safeguards in place to protect data, they still must share information with fellow agencies, industry partners, and other external parties. Rather than moving data around and risking its security, integrity, and authority, agencies are exploring ways to enable controlled access to information.
“Yes, [sharing] does increase risk,” said Williamson. “But that’s what zero trust principles are about. That’s why this conversation around identifying and safeguarding our data is so important.”
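As a rough illustration of controlled access, rather than copying data between parties, the sketch below shows an attribute-based check that decides per request whether a caller may read a dataset in place. The attributes, agencies, and policy rules are hypothetical and far simpler than any real policy engine.

```python
from dataclasses import dataclass

@dataclass
class Request:
    agency: str
    clearance: str            # e.g. "public-trust", "secret"
    purpose: str              # declared purpose of access

@dataclass
class Dataset:
    owner_agency: str
    sensitivity: str          # "public", "sensitive", "classified"
    allowed_purposes: set

def may_access(req: Request, ds: Dataset) -> bool:
    """Grant read access only when the declared purpose and clearance both
    satisfy the dataset's policy; the data stays with the owning agency either way."""
    if req.purpose not in ds.allowed_purposes:
        return False
    if ds.sensitivity == "classified" and req.clearance != "secret":
        return False
    return True

trade_stats = Dataset("commerce", "sensitive", {"analysis", "reporting"})
print(may_access(Request("dhs", "public-trust", "analysis"), trade_stats))   # True
print(may_access(Request("dhs", "public-trust", "marketing"), trade_stats))  # False
```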
Protecting data with unified observability and security
Many federal agencies trust the Dynatrace platform to help them oversee and protect their data. With automatic and intelligent observability, agencies can continuously monitor and capture all data from logs, metrics, and end-to-end transactions. Additionally, the Dynatrace unified platform uses artificial intelligence to set baselines and automatically identify potentially threatening activity, helping agencies enforce least privilege.