What the ‘Signalgate’ Snafu Overlooks
The real problem isn’t the Signal app, but the outdated and inefficient communications methods the U.S. government uses

Last month, National Security Adviser Mike Waltz blundered by adding journalist Jeffrey Goldberg to a Signal group chat convened for a Principals Committee meeting, a gathering of the most senior national security policymakers, on plans to attack Yemen’s Houthi outlaws. He also erred in not making the participants identify themselves in the chat, which would have outed Goldberg and prevented him from reporting on the meeting. Classified information was discussed in the chat, particularly specifics about when our military would strike the Houthis and about the postattack damage assessment.
About the attack: Well done. It was about time. The Houthis have been firing at U.S. warships for over a year and still are. We needed to strike at their leadership.
About the meeting: Not good. Don’t include adversarial journalists in your secret national security discussions. And even though Signal is an end-to-end encrypted app—which no one claimed was compromised—this high-level discussion of national security should not have occurred on an unauthorized civilian messaging system. If some participants were using the app on their internet-connected smartphones, that raises even more security concerns. Some experts have suggested that administration officials could be prosecuted under the 1917 Espionage Act, a law that, before this dust-up, many commentators regarded as woefully in need of reform.
This “scandal” has become a media feeding frenzy. In the fallout, Democrats have called for everyone involved to resign, President Trump said his administration probably won’t use Signal anymore, Waltz is left skating on thin ice, and the Senate Armed Services Committee asked the Defense Department’s inspector general to investigate Signal’s use in the military.
This last point is telling. It is a tacit acknowledgement that Signal’s use within the federal government is widespread, and probably for more than unclassified business, despite official government edicts to the contrary. Intelligence officials report that Signal is on their phones and computers and that using the encrypted app is routine. Signal is used by the military, by diplomats and probably by operators in the field. We have already learned that the Biden administration often used it. That senior government officials continue to use it is a tribute to its secure features.
Ironically, this secure app now used widely by government officials was created in response to the NSA’s collection of confidential data. Edward Snowden’s 2013 theft of NSA secrets inspired Moxie Marlinspike to build Signal, an end-to-end encrypted messaging app that collects no metadata on its users. It filled a vital need for secure communications with no “backdoor” for a spy agency to enter.
The real issue here is not the use of the Signal app but the ponderous and antiquated nature of government communication, especially when it involves sensitive information. The rest of the world has figured out how to communicate securely and efficiently, while elements of the U.S. government lag woefully behind. National security expert Jeffrey Rogg rightly comments that the government’s communication systems are often inconvenient to use and haven’t always been secure. But if we turn everything over to security, we lose flexibility and agility. There must be a reasonable tradeoff.
Senior government officials conducting sensitive business from remote locations using an encrypted app looks like the future. It’s a solution that balances security with flexibility. Main players in the Principals Committee meeting—such as Waltz, Vice President Vance, Director of National Intelligence Tulsi Gabbard and Defense Secretary Pete Hegseth—have media and military backgrounds and use Signal as a matter of course, so there would likely be no obstacle to widespread adoption of that or a similar app.
Among senior policymakers, an oral culture often emerges that depends less on formal communication channels and more on quick, informal conversations. This episode amply illustrates that. Instead of denying that such circumvention of official protocols occurs, we might as well figure out how to make those interactions more efficient and secure.
And the government can do it if it wants to. This is not the first time senior leadership has demanded more efficient communications. President Obama, for example, insisted on having a secure BlackBerry, and he got one.
The alternative to harnessing technology’s power for efficiency and adaptability is the “brick and mortar” approach to secure communications that the government currently takes. Rather than using an encrypted app, schlep to the White House from Langley or Liberty Crossing in D.C. traffic, or to the secure video facility in your building. Maybe this is more secure, but it is also more time-consuming and inefficient.
Intelligence bureaucracies are path-dependent on inefficient systems. Several years ago, when I worked in the intelligence community, I had to make a business trip within the U.S. I scheduled this with my counterparts within my organization via email, but this wasn’t good enough. I had to send my counterparts a classified “cable” with the appropriate headers and routing codes, or they wouldn’t acknowledge the plans we had already worked out by email. And the recipients worked down the hall from me.
Some observers of the intelligence community thought it missed an opportunity during COVID-19 (2020–2021) by not having its employees work remotely. With today’s technology, there is no reason everyone has to be in one building. Remote work would have greatly reduced costs and made the intelligence community a more attractive place to work. And it would have put a premium on more secure communications, including secure apps on internet-connected smartphones.
Encrypted mobile phones are commercially available. They should be distributed widely to our national security officials and intelligence officials. With proper security vetting, there seems little reason why a secure encrypted app on an encrypted phone cannot be in common use.
Critics object that the phones are insecure; they might be lost or fall into the wrong hands. In the real world, though, we use our smartphones for everything, including as integral parts of our companies’ computer systems. They serve as our two-factor authentication devices. If I don’t have mine, I cannot access my university’s computer system. So, the idea that you can’t use a secure encrypted app because you might lose the phone makes little sense. It is like saying you can’t use your car because you might lose your key fob.
Communications security is really about the humans who use it. Any communications system can be circumvented. Even the National Security Agency, our intelligence holy of holies, has been compromised by insiders like Snowden. Intelligence from SIPRNet, an information-sharing network that no one considered especially secure but that everyone used anyway, was easily downloaded in 2010 by Private Chelsea Manning. This WikiLeaks intelligence leak didn’t turn out to be that impactful after all.
There’s no such thing as a perfectly secure communications system. But the U.S. government needs to carry out classified business anyway. We have to balance security with function. Having our senior national security personnel conducting sensitive business on a secure encrypted app, if it makes them more efficient, seems like an easy fix if our government has the will to do it. The true lesson of “Signalgate” is that the app itself is not the problem; rather, it’s a possible solution to the problem of secure but efficient government communications.