Episodes
-
AFCEA’s TechNet Cyber conference, held in Baltimore, Maryland, was the perfect opportunity to sit down with Greg Carl, Principal Technologist at Pure Storage.
Pure Storage is used by 175 federal agencies. It was time to sit down with a subject matter expert and have him explain the company’s value proposition.
Today’s federal government is attempting to accomplish digital modernization through a move to the cloud and, at the same time, reduce staff. To multiply the risk associated with this endeavor, we see an increase in cyber attacks on data at rest, in transit, and while in use.
Greg Carl drills down on how Pure Storage can help federal leaders in several areas; he begins with Retrieval-Augmented Generation (RAG).
Many people have jumped into AI without knowing how to structure a large language model, the popular LLM. RAG focuses on text generation and tries to make sure the data retrieved is accurate, relevant, and contextually aware.
Pure Storage asks: if RAG protects the results of a query, what protects the “retrieval” part of RAG? We know LLMs are attacked every day. Malicious code could be placed in an LLM, and the RAG system might not know.
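To make the “retrieval” step concrete, here is a minimal sketch of the RAG pattern, assuming a toy document store, a naive keyword-overlap retriever, and a placeholder in place of a real LLM call. It illustrates the general pattern, not Pure Storage’s implementation.

```python
# Minimal RAG sketch (illustrative only; not Pure Storage's implementation).
# A real system would use a vector database and an actual LLM call.

def retrieve(query: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query."""
    query_terms = set(query.lower().split())
    return sorted(
        documents,
        key=lambda doc: len(query_terms & set(doc.lower().split())),
        reverse=True,
    )[:top_k]

def generate(prompt: str) -> str:
    """Placeholder for an LLM call; a real system would query a model here."""
    return f"[model answer grounded in a prompt of {len(prompt)} characters]"

documents = [
    "Agency backup policy requires immutable snapshots every 24 hours.",
    "Storage arrays replicate data across two regions.",
    "Cafeteria hours are 7 a.m. to 2 p.m.",
]

query = "How often are immutable backups taken?"
context = "\n".join(retrieve(query, documents))                 # the "R" in RAG
answer = generate(f"Context:\n{context}\n\nQuestion: {query}")  # the "G" in RAG
print(answer)
```

If the document store itself has been poisoned, the retrieved context, and therefore the answer, is compromised. That is exactly the retrieval-side risk described above.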
A decade ago, backups were child’s play: a server down the hall, a backup appliance. Today, one needs an agile cloud solution to perform continuous backups in a hybrid world. One way to gain resilience is to use immutable backups, so an attacked system can be restored without losing valuable time.
Handling important data activities with speed and security can reduce costs for federal leaders by improving the accuracy of LLMs and shortening the time to recover after an attack.
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
-
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
When most of us hear the word “lifecycle,” we normally assume it refers to the Software Development Lifecycle. Today, we are going to vary that concept and discuss the contract lifecycle and its management. It has been recognized as a part of systems management, so it has developed its own abbreviation: Contract Lifecycle Management, or CLM.
Our guest is Ryan Donley from Icertis. He highlights the shift from traditional methods like Excel spreadsheets to modern digital platforms.
Much like software, the CLM can be divided into pre-award, post-award, compliance, and closeout areas. Every agency oversees this sequence in a unique manner.
Ryan Donley points out that some organizations still use Excel spreadsheets for this task. He cautions that such antiquated processes can limit an agency’s ability to have accurate information and can delay reporting.
Further, when a system is automated, coordination between departments is accelerated, and issues like compliance can be acted upon quickly.
Icertis operates on a single-tenant Microsoft GCC High cloud, ensuring security and compliance.
-
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
Brian Funk from Metaphase summarizes his company in one sentence, “It’s all about meeting – meeting the mission with the technology.” What makes his company unique is the focus on drawing down costs in an efficient way.
That is a great concept and has worked for Metaphase since its founding in 2013, but today we are living in an uncertain world of policy. The question to ask is: how does Metaphase operate in a world where the next six months are almost impossible to predict?
Brian Funk’s response is that Metaphase supports over twenty agencies, which has given it a range of experience and a wide range of solutions to choose from. One example he gives is a rapid response to a DHS RFI.
Instead of sketching a possible solution, Metaphase delivered a fully functional application. That, in and of itself, is a demonstration of being able to adapt rapidly to unpredictable situations.
Funk also discusses the need for guardrails in AI usage and the potential for AI to enhance both efficiency and security in federal IT.
-
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
The federal government is releasing so many RFIs and RFQs that it is increasingly challenging to select which ones to respond to.
For example, let us say you get 60 requests. You do not have an equal chance to win all sixty. Do you assign the same amount of time to each one? Do you review each and rank the chances of success? What about the time you spend in the ranking process?
Remember, you could jeopardize your chances of winning if you do not respond promptly.
Deep Water Point & Associates offers one solution to this dilemma. During the interview, Brian Seagraves describes a system called “North Star” that leverages AI to look at an opportunity and give it a grade for your specific company. A score from 0 to 100 means you will not waste time or effort on a proposal that will go nowhere.
As a “proof of concept,” John Milward from Axxa painted a picture of a solution. In 2023, he was drowning in responding to opportunities. He started using the North Star system and has experienced drastic improvement.
Brian Seagraves reminds the audience that the federal government still awards contracts and sends out RFPs. During stressful times, it is always best to keep a cheerful outlook and increase the number of opportunities for your company.
-
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
Artificial Intelligence can be applied to code generation, predictive analytics, and what is called “generative” AI. Generative means the AI can look at a library of information (a large language model) and create text or images that provide some value.
Because the results can be so dazzling, many forget to be concerned about some of the ways the starting point, the LLM, can be compromised.
Just because LLMs are relatively new does not mean they are not being attacked. Generative AI expands the federal government's attack surface. Malicious actors are trying to poison data, leak data, and even exfiltrate secure information.
Today, we sit down with Elad Schulman from Lasso Security to examine ways to ensure the origin of your AI is secure. He begins the interview by outlining the challenges federal agencies face in locking down LLMs.
For example, a Generative AI system can produce results, but you may not know their origin. It's like a black box that produces a list, but you have no idea where the list came from.
Elad Schulman suggests that observability should be a key element when using Generative AI. In more detail, he contrasts observability based on week-old data with observability in real time.
What good is a security alert if a federal leader cannot react promptly?
Understanding the provenance of data, and how Generative AI will be infused into future federal systems, means you should understand LLM security practices.
-
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
The federal government needs to share information on a wide variety of platforms and must provide methods to ensure this transmission is secure. Of course, the hard part is the “how” of this data transfer.
Tim Fuhl from Owl Cyber Defense gives the listener an overview of how Owl Cyber Defense can help federal agencies share information securely.
To accomplish this task, he discusses two fundamental concepts: diodes and Cross Domain Solutions.
Diodes. This is a mysterious word borrowed from electrical engineers. When designing a circuit, one may need to create a one-way path to prevent a signal from returning. The solution in electronic design is a “diode.”
Owl Cyber Defense took this electrical concept and combined it with a data path to create the “data diode”: a device that restricts data transfer to one direction, essentially a one-way street that protects the system from any reverse movement.
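A data diode is a hardware device, but a rough software analogy is a transmit-only channel: the sender pushes data out and has no code path for receiving anything back. The sketch below, using a plain UDP socket and a placeholder destination address, is only a conceptual illustration, not how Owl Cyber Defense builds its products.

```python
# Conceptual one-way transfer: send-only, with no receive path.
# A real data diode enforces this physically in hardware.
import socket

DESTINATION = ("192.0.2.10", 5005)  # placeholder address (TEST-NET-1), an assumption

def send_one_way(payload: bytes) -> None:
    """Push data toward the receiving side; never read from the socket."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, DESTINATION)
        # Deliberately no sock.recv(): nothing can flow back on this path.

send_one_way(b"sensor reading: 42")
```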
Cross Domain Solutions. One of the newest abbreviations in the world of security is Cross Domain Solution (CDS). The federal technical world comprises multiple levels of protection, so what is needed is a way to communicate between varying security levels.
During the interview, Tim Fuhl defines both terms and gives examples of where this innovation can be applied to federal systems.
-
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
In traditional commercial selling, a company seeks to understand a customer’s business problems and then presents a solution that would save time and money. Understanding federal requirements involves a few more barriers than scheduling a meeting with the CIO.
The federal government has security requirements and considerations few commercial companies can even consider. There are no effortless ways to understand system requirements for a company trying to break into the federal marketplace.
This has been understood for decades. In fact, Ronald Reagan decided to help small businesses understand federal needs and provide some assistance.
The Small Business Innovation Research (SBIR) program was established in 1982. The concept was simple: an agency would post requirements and look for a small company to respond. If the proposal was favorable, subsequent phases allowed further development and funding.
During today’s interview, Tom Ruff updates us on the three phases of SBIR and provides specific examples of companies that have successfully navigated the process.
-
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
Most people are so overwhelmed with the dazzle of Artificial Intelligence that they dismiss the power of quantum computing.
The reality is that the optimal solution for some federal problems involves artificial intelligence; other problems need to consider quantum.
Today, Murray Thom puts the ability of quantum computing in a better perspective. For example, when it comes to aerospace maintenance, there are so many variables that classical computing is challenged to provide an answer. We all know that a traditional computer would use bits (0s and 1s). Quantum allows an approach that is not as linear and can provide faster answers to many questions.
The crux of the interview was not a debate on the origins of quantum and Einstein’s remark about God not throwing dice. That debate is over: quantum works. Quantum computing can help the federal government find solutions to public sector challenges like optimizing public services, transportation networks, and defense.
The core of this interview is this: if your federal agency has a problem that is too expensive or too time-consuming to solve using classical computing, it may be possible to use quantum innovation to solve it more economically.
Look at some success stories from D-Wave; they may provide an economic option for you.
Download the D-Wave e-book “Transforming the Public Sector: Quantum-Powered Optimization” on the Carahsoft website.
-
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
It is a challenge for most technology companies to give a three-word summary of what they do, especially with the complexity implied by the phrase “entity resolution.” The CEO of Senzing, Jeff Jonas, gives a three-word summary of the complex issues they manage: “bad guy hunting.”
OK, what does this mean to federal tech leaders?
Today, we sit down with Will Layton to learn how a topic like “entity resolution” can improve federal cybersecurity.
During the interview, he gives an overview of how federal systems have evolved over the years and the need to understand the implications of automation.
We know federal systems are, in general, moving to the cloud. This may be a private cloud, a public cloud, or even a hybrid cloud. Second, data ingestion has overwhelmed most agencies.
As a result, many large-scale organizations are implementing automated tools, which some call “agents,” to become more efficient.
Will Layton describes how, just as humans need to be identified, automated tools, or entities, need to establish credentials as well.
When a malicious actor tries to pose as an entity in a complex automated system, Senzing can identify it and save federal leaders from unwanted action.
-
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
For many, IDEMIA is a relatively unknown company that was recently awarded a 10-year Blanket Purchase Agreement from GSA. The focus is on next-generation identity proofing for login.gov.
At first glance, you might say “IDEMIA” is an overnight success. Upon further examination, you will discover that IDEMIA has served federal agencies for over 60 years.
During the interview, Donnie Scott gives listeners a complete rundown on the variations of identity: identity proofing, identity management, and identity and access management.
He reinforces that rigorous identity-proofing can reduce waste, fraud, and abuse of federal systems.
This is becoming a more complex problem. For example, technology enthusiasts are experimenting with so-called “agents” to access data, assemble it, and then attempt to draw conclusions.
At each step along the way, there are gateways to verify the validity of the person (or non-human entity) requesting data.
This interview offers a great perspective from a well-respected company that provides identity proofing to the federal government.
-
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
Walking around the Salesforce World Tour DC can make you dizzy with use cases. Let’s step back and look at underlying principles.
To boil down the massive information overflow, we sat down with Nasi Jazayeri from Salesforce to focus on improving efficiency by taking advantage of Salesforce agents.
Automation. Federal employees will obviously be asked to do more with less. One way to accomplish this is to structure a system where tedious decisions do not have to be made by humans; tasks can be designed to run without human oversight up to a specified level.
Workflows. Salesforce is increasingly becoming a hub for data amalgamation. Integrating APIs into workflows can improve how systems manage various dependencies.
Compliance. This is one of Salesforce's superpowers. Everyone is trying to figure out where the best application of agents would be. Inevitably, mistakes will be made. Compliance is built into a system like Salesforce. You can evaluate several options without reinventing the wheel for each instance.
Salesforce has many use cases for agentic applications, such as citizen service automation, healthcare administration, and interagency collaboration. Sometimes, general value principles can reinforce decisions made regarding agents and Salesforce.
-
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
Federal leaders are walking a tightrope. They want to leverage the promise of AI; however, they are responsible for keeping federal data secure. Beyond that, these AI “experiments” should not negatively impact the larger systems and must stay grounded in practical applications.
During today’s conversation, Paul Tatum gives his view on accomplishing this balance.
He illustrates the idea of experimenting with AI through, of all things, avocados. For example, he acts as if he must document the process behind importing avocados. He shows how an AI agent can be used safely and provides practical information.
The key here is “safely.” People working on federal systems are jumping into AI agents without concern for compliance or security. They run into the phrase “unintended consequences” when they access data sloppily, which can lead to sensitive information leaks.
Rather than detailing potential abuse, Paul Tatum outlines the Salesforce approach, which allows experimentation within specific guidelines and provides compliance and controls for autonomous agents.
This way, the data to be accessed will be cleaned and not subject to misinformation and duplication problems. Further, because you are acting in the functional equivalent of a “sandbox,” you can be assured that information assembled from AI experiments will be placed in areas where it is safe and secure.
Learn how to leverage AI, but learn in an environment where mistakes will not come back to haunt you.
-
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
Most people know Cloudflare from federal or commercial experience. They have been around since 2009, and some will estimate that around 20% of all websites use Cloudflare for web security services.
The listener's question is simple: can one apply this commercial success to improving federal network security?
During today’s interview, Anish Patel from Cloudflare answered that question by directing his comments to Zero Trust, user experience, and automation.
Zero Trust is a federal initiative that cuts across civilian and military agencies. Cloudflare can assist by verifying every user and device before granting access to applications and data.
Because of their commercial success, Cloudflare realizes that an end-user experience can impact security at many levels. Simplifying the remote user experience will bolster security for everyone.
With today’s massive data increase and constant attacks, users can get alert fatigue and not be as responsive to threats as in an earlier age. During the interview, Anish Patel details how automation from Cloudflare can reduce the amount of vigilance needed by end users to accomplish network security goals.
-
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
Once the transition to the cloud became dominant, identity took precedence over simple permission to access apps and data.
When data centers were down the hall, one could have physical access to a room and sign-on permission. The hybrid cloud, private clouds, and an interest in “alternative clouds” make identity the keystone of modern computing.
Companies like Okta, Ping, and SailPoint work with identity and access management but rely on services that can provide a federated identity service.
Today, we sit down with Dr. John Pritchard, the CEO of Radiant Logic, and learn that Radiant does not compete with these well-known vendors but provides the backbone for their service.
Dr. Pritchard uses an interesting phrase: “continuous identity hygiene.” Although a person’s biology will not change, essential elements of their identity can still be compromised, so identity verification must be a continuous process.
This has been recognized by CISA and in DoD’s 2027 Zero Trust goals, and it is often referred to as Identity Security Posture Management.
In this thorough discussion, Dr. Pritchard presents a 30-year framework for network identity and includes comments on a unified data layer, data staging, and how to select a reference architecture for using a federated identity service.
-
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
Many people deceive themselves when moving systems to the cloud, thinking the same precautions used for an on-premises system can be used in the cloud.
Neil Carpenter from Orca Security dispels that notion right out of the box. He details that when a system is moved to the cloud, it operates under a shared responsibility model: while the cloud service provider may deliver a solid infrastructure, that does not mean the applications and data are protected as well.
Further, the popularity of virtual systems means that workloads can spin up and down rapidly. This means a one-time scan is just that: a photograph of a moment; only continuous monitoring can provide the reassurance that federal systems managers demand.
While we know that cloud systems can scale rapidly, many do not understand that scaling also widens the attack surface. Michael Hylton from Orca Security recommends investing in a system that can provide continuous scanning in a dynamic environment.
How is this accomplished? During the interview, Neil Carpenter contrasts agent-based and agentless systems. Orca Security’s agentless approach lets it scan workloads without installing software on each one, speeding deployment and reducing the risk of coverage gaps.
-
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
Today, we sit down with Karthik Natarajan, Solutions Engineering Manager, U.S. Public Sector, for SNYK.
SNYK has garnered a formidable reputation in the commercial sector by helping to identify and fix vulnerabilities in code, open-source dependencies, and container images.
Karthik Natarajan acknowledges that no code can be 100% secure; however, one way to improve security by an order of magnitude is to incorporate the “Shift Left” approach. This phrase has been around for twenty years but has recently gained momentum.
The concept of shift left moves testing and performance evaluation to an earlier part of the software development lifecycle. But SNYK goes further by applying AI to look at open-source dependencies.
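As a toy illustration of shift-left dependency checking, and not a depiction of SNYK’s engine, the sketch below compares a project’s declared dependencies against a small, made-up list of vulnerable versions before the code ever ships.

```python
# Toy shift-left check: flag declared dependencies with known vulnerabilities.
# The vulnerability list below is a made-up example, not a real advisory feed.

KNOWN_VULNERABLE = {
    ("examplelib", "1.2.0"): "hypothetical advisory EX-0001",
    ("oldparser", "0.9.1"): "hypothetical advisory EX-0002",
}

def check_dependencies(requirements: list[str]) -> list[str]:
    """Return a warning for each pinned dependency found in the vulnerable list."""
    warnings = []
    for line in requirements:
        name, _, version = line.partition("==")
        advisory = KNOWN_VULNERABLE.get((name.strip(), version.strip()))
        if advisory:
            warnings.append(f"{line}: {advisory}")
    return warnings

requirements = ["examplelib==1.2.0", "requests==2.32.0"]
for warning in check_dependencies(requirements):
    print("WARNING:", warning)
```

The point of running a check like this early is that a vulnerable dependency is caught before it reaches a build or a production environment.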
When infrastructure transitions to “infrastructure as code,” vulnerabilities may be included. SNYK also looks for vulnerabilities in infrastructure code.
The interview ends with Karthik explaining that SNYK’s success is due to its being written for cloud applications: it is cloud native. Also, they judiciously use AI and rigorously check code corrections that might introduce trouble.
-
The federal government is transforming from on-premises and private cloud systems to a hybrid cloud.
What most listeners do not realize is that the linchpin to this transition is the Application Programming Interface (API). It has been flying under the radar for so many years that malicious actors take advantage of that obscurity to attack APIs.
Info Security Magazine reports that 99% of organizations struggle with API security. Where to start? First, get an inventory of how many APIs you are dealing with.
Stephen Ringo emphasizes the need for discovery tools to identify rogue and shadow APIs, noting that passive discovery methods are preferred to avoid network disruptions.
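To make passive discovery concrete, here is a minimal, hypothetical sketch: it builds an API inventory from access logs that already exist, so nothing on the network is probed, and it flags endpoints missing from the documented inventory as possible shadow APIs. The log format and inventory are assumptions for illustration.

```python
# Passive API discovery sketch: mine access logs already on hand,
# so no traffic is generated and nothing on the network is disturbed.
import re
from collections import Counter

DOCUMENTED = {"/api/v1/users", "/api/v1/claims"}  # assumed known inventory

LOG_LINES = [
    '10.1.2.3 "GET /api/v1/users HTTP/1.1" 200',
    '10.1.2.4 "POST /api/v1/claims HTTP/1.1" 201',
    '10.1.2.5 "GET /api/v2/export HTTP/1.1" 200',  # absent from the inventory
]

def discover(lines: list[str]) -> Counter:
    """Count distinct API paths observed in the logs."""
    pattern = re.compile(r'"(?:GET|POST|PUT|DELETE|PATCH) (/api/\S+) HTTP')
    return Counter(m.group(1) for line in lines if (m := pattern.search(line)))

for path, hits in discover(LOG_LINES).items():
    label = "documented" if path in DOCUMENTED else "possible shadow API"
    print(f"{path}: {hits} request(s), {label}")
```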
He also points out that API security is often overlooked, even in cloud-native solutions, and that misuse, rather than malformation, is the primary threat. Ringo advocates proactive measures to secure APIs and prevent data breaches.
Three main ways to protect APIs:
1. Educate and raise awareness about API security risks among federal CIOs and IT leaders.
2. Discover and inventory all APIs, including rogue or shadow APIs, within the organization.
3. Evaluate the API security capabilities of cloud providers and ensure proper security controls are in place.
-
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
We all know the quote from Peter Drucker, "If you can measure it, you can manage it." It's pretty easy to apply when throwing a javelin but difficult when measuring success in complex software development projects.
Today, we sat down with Jeff Gallimore, Chief Technology and Innovation Officer and founder of Excella. He brings with him decades of experience collaborating with teams on successful federal projects.
We start by noting the fallacy of using one metric to measure success. While completing an initiative on time might make an agency administrator happy, that will change rapidly if compliance is not achieved or if scaling breaks the system into pieces.
Jeff has seen breakthroughs using a framework called DORA (DevOps Research and Assessment). The key metrics are deployment frequency, lead time for changes, change failure rate, and failed deployment recovery time.
These metrics, developed by DORA (now part of Google), are research-based and predictive of IT and organizational outcomes.
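As a rough, hypothetical illustration, the four DORA metrics can be computed from simple deployment records; the field names and sample data below are assumptions, not Excella’s or Google’s tooling.

```python
# Toy DORA metrics computed from a list of deployment records (illustrative data).
deployments = [
    {"lead_time_hours": 20, "failed": False, "recovery_hours": 0},
    {"lead_time_hours": 48, "failed": True,  "recovery_hours": 3},
    {"lead_time_hours": 12, "failed": False, "recovery_hours": 0},
    {"lead_time_hours": 30, "failed": False, "recovery_hours": 0},
]
weeks_observed = 2

deploy_frequency = len(deployments) / weeks_observed
lead_time = sum(d["lead_time_hours"] for d in deployments) / len(deployments)
failures = [d for d in deployments if d["failed"]]
change_failure_rate = len(failures) / len(deployments)
recovery_time = sum(d["recovery_hours"] for d in failures) / max(len(failures), 1)

print(f"Deployment frequency:   {deploy_frequency:.1f} per week")
print(f"Lead time for changes:  {lead_time:.1f} hours")
print(f"Change failure rate:    {change_failure_rate:.0%}")
print(f"Failed deploy recovery: {recovery_time:.1f} hours")
```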
They emphasize the importance of a holistic approach, avoiding a single-metric focus, and the role of leadership and culture in fostering high-performing teams.
-
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
The Partnership for Public Service recently conducted a poll and found that just 23% of Americans believe federal services are easy to navigate.
Today, we will examine the importance of User Experience and how to overcome some of the challenges federal agencies face when attempting to improve.
Lisa Hoover is the Head of Experience and Design at Karsun Solutions. In that role, she has experienced all aspects of federal design. She begins by observing that customer challenges may be recognized but not remediated.
She argues that there are several reasons for this standstill. Many federal agencies are dealing with legacy systems, and attempts to improve the CX can have unintended consequences.
Further, qualitative improvement is difficult to determine in a world of bits and bytes. Sometimes, the ease of scaling data can make a system so complex that one does not know where to begin.
Lisa Hoover recommends looking at Karsun Solutions’ ReDuxAI offering. It leverages AI to establish a “blueprint” to see how everything connects, making digital transformation possible.
Hoover also addresses the need for efficiency in federal IT, aiming to streamline processes and improve customer satisfaction. The conversation underscores the potential of AI to enhance federal service delivery.
https://karsun-llc.com/innovation-center/innovation-center-projects/go-redux-ai/
-
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com
Here we are at the beginning of 2025, and Bill Church, F5's CTO, discusses the company's role in helping federal agencies navigate the complexities of multi-cloud environments and cybersecurity threats.
F5's strength spans the application portfolio of enterprise organizations. This includes application security, performance enhancement, quick access, and improved availability. It doesn't end there; they also help with encryption and authentication.
Church emphasizes the importance of flexibility and consistency in managing diverse cloud environments.
He highlights the challenges of API discovery, noting that many organizations are unaware of the number of APIs in their systems. F5's tools, like the App Study Tool, help identify and manage these APIs.
Church also discusses using AI and machine learning in F5's solutions for enhanced security and data protection, including an AI gateway for large language models.
-