Theory can help disable terrorists’ messages

An electrical engineer at Washington University in St. Louis has devised a theory that sets the limits on the amount of data that can be hidden in a system and provides guidelines for how to store and decode that data. Conversely, the theory also provides guidelines for how an adversary would best disrupt the hidden information.

The theory is a fundamental and broad-reaching advance in information and communication systems that eventually could be applied in commerce and in numerous homeland security applications, from detecting forgery to intercepting and interpreting messages sent between terrorists.

Jody O’Sullivan, Ph.D.

Using elements of game, communication and optimization theories, Jody O’Sullivan, Ph.D., professor of electrical engineering at Washington University in St. Louis, and his former graduate student, Pierre Moulin, Ph.D., now at the University of Illinois, have determined the fundamental limits on the amount of information that can be reliably hidden in a broad class of data or information-hiding problems, whether they are in visual, audio or print media.

“This is the fundamental theorem of data hiding,” O’Sullivan said. “One hundred years from now, if someone’s trying to embed information in something else, they’ll never be able to hide more than is determined by our theory. This is a constant. You basically plug in the parameters of the problem you are working on, and the theory predicts the limits.”
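The limit O’Sullivan describes takes the form of a game between the information hider and the attacker. In a simplified, distortion-constrained form (the notation here is illustrative, not quoted verbatim from the paper), with host signal $S$, hidden message variable $U$, marked signal $\tilde{X}$ and attacked signal $Y$, the hiding capacity can be sketched as

```latex
C \;=\; \max_{Q(u,\tilde{x}\mid s)\,\in\,\mathcal{Q}}\;
        \min_{A(y\mid \tilde{x})\,\in\,\mathcal{A}}\;
        \bigl[\, I(U;Y) \;-\; I(U;S) \,\bigr]
```

where $\mathcal{Q}$ is the set of embedding strategies that respect the hider’s distortion budget and $\mathcal{A}$ is the set of attack channels that respect the attacker’s. The outer maximization and inner minimization correspond directly to O’Sullivan’s point that the same theory yields both the optimal hiding strategy and the optimal attack.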

Data, or information, hiding is an emerging area that encompasses such applications as copyright protection for digital media, watermarking, fingerprinting, steganography and data embedding. Watermarking is a means of authenticating intellectual property, such as a photographer’s picture or a Disney movie, by making imperceptible digital notations in the media identifying the owner. Steganography is the embedding of hidden messages in other messages. While data hiding has engaged the minds of the nation’s top academics over the past seven years, it also has caught the fancy of the truly evil. In February 2001, nine months before 9/11, USA Today reported that Osama bin Laden and his operatives were using steganography to send messages back and forth.
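To make the idea of steganography concrete, here is a toy sketch in Python that hides ASCII text in the least significant bits of a “cover” byte sequence, such as raw image pixels. This is an illustration of the general concept only; it is not the scheme analyzed by O’Sullivan and Moulin, and the function names are invented for this example.

```python
def embed(cover: bytearray, message: bytes) -> bytearray:
    """Write each bit of `message` into the LSB of successive cover bytes."""
    # Expand the message into bits, most significant bit first.
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for message")
    stego = bytearray(cover)
    for pos, bit in enumerate(bits):
        stego[pos] = (stego[pos] & 0xFE) | bit  # overwrite only the LSB
    return stego

def extract(stego: bytearray, n_chars: int) -> bytes:
    """Read `n_chars` hidden bytes back out of the LSBs."""
    out = bytearray()
    for c in range(n_chars):
        byte = 0
        for i in range(8):
            byte = (byte << 1) | (stego[c * 8 + i] & 1)
        out.append(byte)
    return bytes(out)

cover = bytearray(range(200, 256)) * 4   # stand-in for pixel data
stego = embed(cover, b"attack at dawn")
assert extract(stego, 14) == b"attack at dawn"
```

Each cover byte changes by at most one, which is why such embeddings are imperceptible to the eye; the capacity results described in this article bound how much information any scheme of this general kind can reliably hide against an active attacker.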

“The limit to how much data can be hidden in a system is key because it’s important to know that you can’t hide any more, and, if you are attacking (trying to disable the message), that you can’t block any more than this,” O’Sullivan said. “It’s also important because knowing this theory you can derive the properties of the optimal strategy to hide information, and the properties of the optimal attack.”

O’Sullivan and Moulin published “Information-Theoretic Analysis of Information Hiding” in the March 2003 issue of IEEE Transactions on Information Theory.

O’Sullivan is associate director of Washington University’s Center for Security Technologies, a research center devoted to developing technologies that safeguard the United States against terrorist attack.

Ronald Indeck, Ph.D., Das Family Distinguished Professor of Electrical Engineering, is director of the center, which features nearly 40 interdisciplinary collaborators. The center was founded in early 2002 to address the fundamental scientific and engineering questions that arise in the design of advanced security systems. The primary motivation for the technical direction of its projects is to provide the basis for new industries and to assist the government and existing corporations through collaboration. The goal is technology development to defend against an array of threats, including cyberattacks, attacks on critical infrastructure, environmental catastrophes and natural disasters, among others.

While the intellectual pursuit of data hiding is relatively new, with the first international conference on the topic held in 1996, the practice goes back to the ancient Greeks. Herodotus recounted how Histiaeus sent a slave with a message tattooed on his scalp to Miletus; the slave grew his hair out to hide the message, which was an encouragement to revolt against the Persian king. In World War II, the Germans used microdots as periods at the ends of sentences. Magnified, the microdots carried substantial amounts of information. The German usage is a classic instance of steganography.

Much work lies ahead before O’Sullivan’s theory is fully implemented.

“This is an example of one kind of work we do at the Center that has a big impact in the theory community, but it’s a couple of layers away from implementation,” O’Sullivan said. “But the theory answers the questions: What is the optimal attack, and what is the optimal strategy for information hiding? Our paper was referenced many times before formal publication.”