
Information Ops Kill Chain

Way back in 2011, Lockheed Martin released a white paper defining the concept of the “Cyber Kill Chain.”


The cyber kill chain breaks a hacking attack into discrete steps and shows how a defender can use each step to “kill” the attack. It is a very effective tool, and here is a link to that original paper.

Given today’s information society, now might be the right time to create an information operations version of the cyber kill chain. This kill chain is based on the way Russia ran influence operations back in the 1980s, and they are still doing it that way.

Step 1 – Find the cracks in the fabric of society: the social, demographic, economic, and ethnic divisions.

Step 2 – Seed distortion by creating alternative narratives. In the 1980s, this was a single “big lie,” but today it is more about many contradictory alternative truths, a “firehose of falsehood” that distorts the political debate.

Step 3 – Wrap those narratives around kernels of truth. A core of fact helps the falsities spread.

Step 4 – (This step is new.) Build audiences, either by directly controlling a platform (like RT) or by cultivating relationships with people who will be receptive to those narratives.

Step 5 – Conceal your hand; make it seem as if the stories came from somewhere else.

Step 6 – Cultivate “useful idiots” who believe and amplify the narratives. Encourage them to take positions even more extreme than they would otherwise.

Step 7 – Deny involvement, even if the truth is obvious.

Step 8 – Play the long game. Strive for long-term impact over immediate impact.

This is the playbook the Russians used in the 2016 election and continue to use today. It was a new game to most Americans, so they didn’t know how it worked.

Here is Bruce Schneier’s version of the Information Operations Kill Chain, circa 2019. Note this is taken directly from one of Bruce’s blog posts.

Step 1: Find the cracks. There will always be open disagreements in a democratic society, but one defense is to shore up the institutions that make that society possible. Elsewhere I have written about the “common political knowledge” necessary for democracies to function. We need to strengthen that shared knowledge, thereby making it harder to exploit the inevitable cracks. We need to make it unacceptable, or at least costly, for domestic actors to use these same disinformation techniques in their own rhetoric and political maneuvering, and to highlight and encourage cooperation when politicians honestly work across party lines. We need to become reflexively suspicious of information that makes us angry at our fellow citizens. We cannot entirely fix the cracks, as they emerge from the diversity that makes democracies strong; but we can make them harder to exploit.

Step 2: Seed distortion. We need to teach better digital literacy. This alone cannot solve the problem, as much sharing of fake news is about social signaling, and those who share it care more about how it demonstrates their core beliefs than whether or not it is true. Still, it is part of the solution.

Step 3: Wrap the narratives around kernels of truth. Defenses involve exposing the untruths and distortions, but this is also complicated to put into practice. Psychologists have demonstrated that an inadvertent effect of debunking a piece of fake news is to amplify the message of that debunked story. Hence, it is essential to replace the fake news with accurate narratives that counter the propaganda. That kernel of truth is part of a larger true narrative. We need to ensure that the true narrative is legitimized and promoted.

Step 4: Build audiences. This is where social media companies have made all the difference. By allowing groups of like-minded people to find and talk to each other, these companies have given propagandists the ability to find audiences who are receptive to their messages. Here, the defenses center around making disinformation efforts less effective. Social media companies need to detect and delete accounts belonging to propagandists and bots and groups run by those propagandists.

Step 5: Conceal your hand. Here the answer is attribution, attribution, attribution. The quicker we can publicly attribute information operations, the more effectively we can defend against them. This will require efforts by both the social media platforms and the intelligence community, not just to detect information operations and expose them but also to be able to attribute attacks. Social media companies need to be more transparent about how their algorithms work and make source publications more obvious for online articles. Even small measures like the Honest Ads Act, requiring transparency in online political ads, will help. Where companies lack business incentives to do this, regulation will be the only answer.

Step 6: Cultivate useful idiots. We can mitigate the influence of people who disseminate harmful information, even if they are unaware they are amplifying deliberate propaganda. This does not mean that the government needs to regulate speech; corporate platforms already employ a variety of systems to amplify and diminish particular speakers and messages. Additionally, the antidote to the ignorant people who repeat and amplify propaganda messages is other influencers who respond with the truth; in the words of one report, we must “make the truth louder.” Of course, there will always be true believers whom no amount of fact-checking or counterspeech will convince; this is not intended for them. Focus instead on persuading the persuadable.

Step 7: Deny everything. When attack attribution relies on secret evidence, it is easy for the attacker to deny involvement. Public attribution of information attacks must be accompanied by convincing evidence. This will be difficult when attribution involves classified intelligence information, but there is no alternative. Trusting the government without evidence, as the NSA’s Rob Joyce recommended in a 2016 talk, is not enough. Governments will have to disclose.

Step 8: Play the long game. Counterattacks can disrupt the attacker’s ability to maintain information operations, as U.S. Cyber Command did during the 2018 midterm elections. The NSA’s new policy of “persistent engagement” (see the article by, and interview with, U.S. Cyber Command Commander Gen. Paul Nakasone) is a strategy to achieve this. Defenders can play the long game, too. We need to better encourage people to think for the long term: beyond the next election cycle or quarterly earnings report.
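To make the structure of the framework concrete, the eight attack steps and their matching defenses can be sketched as a simple lookup table. This is only an illustrative sketch: the class name, field names, and one-line summaries below are our own paraphrases, not wording from Schneier’s essay.

```python
# Illustrative sketch: the Information Operations Kill Chain as a simple
# data structure pairing each attack step with its proposed defense.
# Step summaries are paraphrased; names here are hypothetical.

from dataclasses import dataclass


@dataclass(frozen=True)
class KillChainStep:
    number: int
    attack: str   # what the attacker does at this step
    defense: str  # the countermeasure proposed for this step


INFO_OPS_KILL_CHAIN = [
    KillChainStep(1, "Find the cracks in society",
                  "Strengthen shared institutions and common political knowledge"),
    KillChainStep(2, "Seed distortion with alternative narratives",
                  "Teach better digital literacy"),
    KillChainStep(3, "Wrap narratives around kernels of truth",
                  "Replace fake news with accurate counter-narratives"),
    KillChainStep(4, "Build receptive audiences",
                  "Detect and delete propagandist and bot accounts"),
    KillChainStep(5, "Conceal your hand",
                  "Rapid public attribution and platform transparency"),
    KillChainStep(6, "Cultivate useful idiots",
                  "Counter-messaging: make the truth louder"),
    KillChainStep(7, "Deny involvement",
                  "Back attribution with convincing, disclosed evidence"),
    KillChainStep(8, "Play the long game",
                  "Persistent engagement and long-term thinking"),
]


def defense_for(step_number: int) -> str:
    """Look up the defensive measure for a given attack step."""
    return next(s.defense for s in INFO_OPS_KILL_CHAIN if s.number == step_number)
```

The point of the pairing is the kill-chain idea itself: a defender does not have to stop every step, only break the chain at one of them.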

This is not a silver bullet, as Bruce explains in his essay, but it is a framework for starting to address information operations attacks.

Information operations attacks are not going away, and they are not limited to the Russians. American politicians are using them too.

The Information Operations Kill Chain is part of an essay by Bruce Schneier.

