What is DisinfoDB?

DisinfoDB (“Disinformation DataBase”, or DDB for short) provides accessible summaries of the tactics used to spread disinformation. DDB collates and creates reporting on factors contributing to Information Disorder, then categorises those factors and adds them to an ever-growing archive of disinformation data.

This data enables us to spot patterns in how different personas distribute disinformation, and helps us predict how they might behave in the future. In turn, people can use DisinfoDB to learn about the methods used to manipulate them, and reduce their effectiveness.

That’s a big claim! Let’s see why I think it’s true:

The Disinformation Cycle

A disinformation campaign follows these steps, usually one after the other:

The Disinformation Cycle, adapted from The Malign Foreign Influence Campaign Cycle

You can read more about it here, but the most important thing to keep in mind right now is what happens during the calibrate stage: at the end of a campaign, disinformers review and update their methods depending on the resistance they met. Account banned on social media? They’ll need to make a fresh one. Fact-checkers debunked their claims? They’ll need to spin new narrative threads.

When we do things that make spreading disinformation harder, we increase the time disinformers must spend calibrating. Knowing this, we can optimise our efforts so that disinformers spend as much time as possible reworking their methods, leaving them less time to spend disinforming. But how do we know which things are hardest for them to recalibrate?

The Pyramid of Pain

We can learn from our friends in the Cybersecurity field! Focusing mitigation efforts on the things that are hardest for attackers to recalibrate is already a well-established practice there. To judge how difficult something is to recalibrate, they refer to the Pyramid of Pain:

Don’t worry, we’ll leave the techy waters of Cybersecurity and be back in the mindscrewy land of Disinformation soon.

Basically, at the top of the pyramid we have TTPs (Tactics, Techniques and Procedures), which are, as you can see, Tough! to recalibrate. The lower you get, the more Trivial recalibration becomes. For example, hackers sometimes try to trick people into downloading malware via phishing emails (this is a TTP). We could tell our computer not to open files we know are malware (by blocking their Hash Values), but it’s Trivial for hackers to change those and run the same attack again. Or we could help users spot phishing attacks so that they don’t download malicious files in the first place; the hacker would then need to figure out a whole new way to get onto victims’ computers. And that’s Tough!
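To make the Trivial end concrete, here’s a minimal Python sketch (the payload and blocklist are made up) showing why blocking Hash Values barely slows an attacker down: changing a single byte of a file produces a brand-new hash.

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Identify a file by its SHA-256 hash, as a hash blocklist would."""
    return hashlib.sha256(data).hexdigest()

malware = b"...malicious payload..."     # stand-in for a known-bad file
blocklist = {file_hash(malware)}         # defender blocks this exact hash

variant = malware + b"\x00"              # attacker appends a single byte

print(file_hash(malware) in blocklist)   # True  -- original is blocked
print(file_hash(variant) in blocklist)   # False -- trivially evaded
```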

Mastering the Phishing technique opens a lot of metaphorical doors for Hackers

I’ll be honest, at this point I don’t know if there is a direct equivalent of the Pyramid in Disinformation. Pyramids might not even be the right structure; maybe we need a Catacomb or a Mausoleum. But the logic of TTPs being harder to change than other things makes sense to me (we’ll see if we can support that claim as time goes on). In the meantime, here’s a rough draft of how The Disinformation Pyramid of Pain might look:

For simplicity I’m changing TTPs to just “Tactics” and putting “Narratives” at the bottom (although I suspect we’ll excavate even trivial-er parts of the Pyramid as we keep digging). In the real Pyramid there would probably be things in the middle of the structure too. As DisinfoDB continues to expand we can come back to this concept and see if we can make out the finer details.

To support this Pyramid’s existence, we can look at a Disinformation version of the scenario above, where Tactics are Tough to recalibrate but Narratives are Trivial. Let’s say someone was disinforming you about the severity of Coronavirus; they may deliberately use complicated numerical data to make their statements appear authentic, knowing that the average reader won’t deconstruct the numbers behind the argument. A fact-checker may later disprove the individual narrative, but this doesn’t stop the disinformer from creating new stories that manipulate different data; the strategy of misrepresenting statistics to support an inauthentic argument can be re-used across multiple disinformation attempts.
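As a hedged, made-up illustration of that reusability: the “quote a scary relative increase, hide the tiny absolute numbers” strategy is effectively one function that can stamp out endless narratives from different datasets.

```python
# One reusable Tactic generating many Narratives (all numbers invented).
def scary_headline(topic: str, before: int, after: int) -> str:
    rise = (after - before) / before * 100
    return f"{topic} up {rise:.0f}%!"  # technically true, wildly misleading

# Debunking one headline does nothing to the tactic behind it:
print(scary_headline("Variant X cases", before=2, after=5))        # up 150%!
print(scary_headline("Reported side effects", before=1, after=3))  # up 200%!
```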

If we improved the data literacy of targeted demographics, we would reduce the probability that narratives employing data manipulation succeed in the future. The disinformer would need to come up with new ways to mislead, which is a lot harder than making new stories with the same old methods. More generally, if we mitigated the strategies used in attacks instead of individual narratives, we may be better able to stop the spread of information disorder, as disinformers would need to spend more time calibrating and less time producing or amplifying.

Prebunking and Inoculation Theory

One way we can improve resistance to Narratives is by exposing people to weaker versions of specific inauthentic arguments – that way they will already have reasoned out why dangerous narratives are incorrect, even when those narratives are presented by skilled manipulators in the wild. This is called Inoculation Theory (or, more pithily, Prebunking). Recognising that inauthentic narratives regularly change, the free online game Bad News uses the same prebunking method to expose players to the tactics used in the spread of disinformation, improving resilience when those tactics are encountered in the wild. The Bad News team have developed several different games with the same core design philosophy of prebunking against disinformation tactics.

In this example Bad News educates players about the intentionally emotive topics used in disinformation to encourage the spread of inauthentic narratives.

I think this is a fantastic strategy, but making a full game is pretty resource-intensive, and I don’t think it’s optimal to wait for a full development cycle to complete before distributing much-needed inoculations. You might have noticed that society is running RAMPANT with nonsense right now, and we should probably be fairly quick about sorting it out.

A Taxonomy of Tactics

We’ve put forward the idea that Tactics are more difficult to recalibrate than other aspects of a disinformation campaign. Some tactic prophylactics have already been developed, showing it is possible to invent successful interventions. But I think there’s a gap for those of us who don’t want to wait for the good news that Bad News has finished another game to fight “fake news” – a gap in which we can again learn from our friends in Cybersecurity. Organisations’ security teams don’t have to sit around waiting for one company to create protections against tactics; they can refer to MITRE ATT&CK to learn about the methods used by hackers and improve defences themselves. MITRE’s tactic profiles provide summaries of techniques, examples of their use in the real world, and recommendations on how to detect and protect against them.

Let’s go back to the example of Phishing as a tactic used by hackers. Anybody can learn from its dedicated MITRE ATT&CK profile that one way to reduce the probability of hackers hooking a victim is to block all emails with suspicious filetypes (so malware can’t make its way into the system). This basic advice benefits teams by enabling them to improve defences against hard-to-recalibrate tactics without needing to invest the money and time required for full-scale mitigations.
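Here’s a minimal sketch of what that advice amounts to in practice (the extension list is my illustrative sample, not ATT&CK’s official guidance):

```python
# Quarantine inbound attachments whose extensions are commonly used
# to deliver malware. The set below is an illustrative sample.
SUSPICIOUS_EXTENSIONS = {".exe", ".scr", ".js", ".vbs", ".bat"}

def should_quarantine(filename: str) -> bool:
    """Flag attachments with extensions often used to deliver malware."""
    return any(filename.lower().endswith(ext) for ext in SUSPICIOUS_EXTENSIONS)

print(should_quarantine("invoice.exe"))  # True  -- hold for review
print(should_quarantine("invoice.pdf"))  # False -- let through
```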

Profiles of Tactics used in disinformation campaigns could provide similar value for those of us looking to protect ourselves and each other from malicious outside influence; this is the gap I want DisinfoDB to fill. In the niche of “people who want to defend against disinformation tactics over individual narratives”, Bad News looks to create a full ‘vaccine’ for inoculation against techniques, while DisinfoDB offers rough-and-ready advice on how to avoid infection while those vaccines are still in development. Remember the part of Covid where nobody was vaccinated, but we still knew to stand six feet apart and wear a mask? That’s the kind of information DDB can provide.

So, what is DisinfoDB?

Tactic Profiles

DisinfoDB reviews reporting on Information Disorder and distills information on Tactics into profiles. These profiles provide accessible summaries of the Tactics, link to further reading, and provide other supporting content. One example looks at how online harassment can be used to minimise the spread of unwanted information.

While I’d like to match MITRE ATT&CK’s mitigation and detection recommendations, that isn’t a feasible goal for the one-person team (me, hello!) working on DDB today, especially in a field far less developed than Cybersecurity. For now I handle this by acknowledging that Tactic Profiles are at an early stage of development: each contains a summary and a further-reading section, laying the groundwork for users’ own self-education.
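If it helps to picture the shape of a profile, here’s a rough sketch of the data one currently carries. The field names and example values are my illustration, not DisinfoDB’s actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class TacticProfile:
    """Illustrative shape of an early-stage DDB Tactic Profile."""
    name: str
    summary: str  # accessible plain-language overview
    further_reading: list[str] = field(default_factory=list)  # links out

profile = TacticProfile(
    name="Online Harassment",
    summary="Coordinated abuse used to minimise the spread of unwanted information.",
    further_reading=["https://example.org/harassment-research"],  # placeholder URL
)
```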

Article Archives

DisinfoDB started as a place where I could collate articles I’d read about disinformation, tagging them based on the narrative and the type of content shared. While DDB’s mission has evolved, these article archives still provide value by collating real-world examples of tactics. There are lots of different organisations and sources reporting on disinformation, so it’s useful to have them gathered in one centralised location.

Users can browse archived articles outside of profiles, with freetext filters allowing them to find reporting on topics matching their interests. For example, this table is pre-filtered to show all articles in the Database that referenced information disorder amplified by Public Figures / Celebrities (a complete unfiltered collection of articles can be found here):
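Under the hood, that kind of freetext filtering amounts to something like this sketch (the article records and tags are invented for illustration):

```python
# Match archived articles whose tags contain the query string.
articles = [
    {"title": "Celebrity shares miracle cure", "tags": ["Public Figures / Celebrities"]},
    {"title": "Bot network pushes hashtag",    "tags": ["Automation"]},
]

def filter_by_tag(records: list[dict], query: str) -> list[dict]:
    q = query.lower()
    return [a for a in records if any(q in t.lower() for t in a["tags"])]

print(filter_by_tag(articles, "celebrit"))  # matches the first article only
```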

One way I’d love to improve the article archives would be to have them automatically populated with reporting from key sources, so they aren’t bottlenecked by waiting for me (hello again!) to add articles manually. It’s also worth noting that the article archives existed before Tactic Profiles, so tags may need to be updated as the database develops.
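One plausible starting point, assuming the key sources publish RSS feeds, would be a small ingestion script like this (the feed URL is a placeholder, and real ingestion would still need human tagging and review):

```python
import feedparser  # third-party: pip install feedparser

FEED_URL = "https://example.org/disinfo-reporting.rss"  # placeholder source

def fetch_new_articles(url: str) -> list[dict]:
    """Pull title/link pairs from a source's RSS feed for triage."""
    feed = feedparser.parse(url)
    return [{"title": entry.title, "link": entry.link} for entry in feed.entries]

for article in fetch_new_articles(FEED_URL):
    print(article["title"], "->", article["link"])
```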

DisinfoDB Going Forward

DisinfoDB is an evolving project that is continuously being worked on. Things may change but that’s OK – part of what we’re doing here is not waiting for a perfect finished product before making information accessible. Along with continuing to create more tactic profiles and adding to the archives, some loftier goals I have are:

Mapping Tactics to Stages of the Disinformation Cycle

MITRE ATT&CK has created matrices which place tactics within the stages of a cyber attack. For example, Phishing is a tactic used to gain Initial Access to a computer, after which hackers will Execute malware using one of the many tactics within that category. Prior to any of this they will have done Reconnaissance against the target and Developed the Resources required to perform an attack. Knowing this is beneficial to investigations: we can track the paths hackers are likely to have taken through the stages of an attack, and eliminate tactics based on the presence of others in the same category.

I’d love to make a DDB matrix collating Tactics under stages of the Disinformation Cycle, but this is something that would require more Tactic profiles to be completed before starting.
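For a feel of what such a matrix could look like as data, here’s a hypothetical sketch; the stage names and tactic placements below are invented for illustration, not DDB’s finished taxonomy:

```python
# Hypothetical DDB matrix: Disinformation Cycle stages -> observed Tactics.
DDB_MATRIX: dict[str, list[str]] = {
    "Produce":   ["Imposter Content", "Data Manipulation"],
    "Amplify":   ["Tactical Infodemics"],
    "Calibrate": ["Narrative Switching"],
}

def stages_using(tactic: str) -> list[str]:
    """Which stages of the cycle has a given Tactic appeared in?"""
    return [stage for stage, tactics in DDB_MATRIX.items() if tactic in tactics]

print(stages_using("Tactical Infodemics"))  # ['Amplify']
```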

Mapping Tactics to Personas

DisinfoDB also tracks reporting which directly attributes the groups behind influence operations, and generates Personas from it. Different Personas use different Tactics, so if you’re concerned about a particular group influencing you, you can refer to its Persona to see how it is likely to do so. In this example, the Mind-Hacktivist Persona is shown to have used Tactical Infodemics and Imposter Content in the past, so may do so again in the future.
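In data terms this is just a Persona-to-Tactics lookup. A minimal sketch using the Mind-Hacktivist example above (the mapping itself is illustrative):

```python
# Past Tactics per Persona, as a rough predictor of future behaviour.
PERSONA_TACTICS: dict[str, list[str]] = {
    "Mind-Hacktivist": ["Tactical Infodemics", "Imposter Content"],
}

def likely_tactics(persona: str) -> list[str]:
    """Tactics a Persona has used before, and so may use again."""
    return PERSONA_TACTICS.get(persona, [])

print(likely_tactics("Mind-Hacktivist"))
```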

At the time of writing I have seen less reporting on Personas than on Tactics, so tracking Personas is more exploratory than other content in DDB (which is built upon the great research and reporting of many giants). Valuable and accurate Persona profiles would therefore take more time to develop than Tactic Profiles, and I believe expanding the tracked Tactics is where effort delivers the most value at the moment.

Summary

It can be a little meta, and a little mind-boggling, to think about how our thoughts can be influenced by other people. It’s nice to consider one’s mind a logical, infallible fortress, and to believe that if we hold an opinion it’s because we rationally reasoned it out and precisely puzzled it together. It’s scary to think the pieces may have been intentionally laid out to draw us towards a conclusion we wouldn’t otherwise have agreed with, by people who may not have our best interests at heart. It’s probably impossible to make ourselves immune to manipulation, but knowing how psychological traps are deployed may reduce the probability that we fall into them. This conclusion was arrived at based on the following:

  1. When we improve our resilience to parts of a disinformation campaign, disinformers need to calibrate before they can try manipulating us again.
    • We know this because of research done by the US Military into influence operations run against the 2016 US Presidential election.
  2. There are lots of different things we can target to make it harder for them to succeed, but the aspect that’s hardest to recalibrate is the Tactics used.
    • We used the Pyramid of Pain from Cybersecurity as an analogy to support this assumption.
  3. High effort inoculations are being developed to hinder disinformation Tactics, but there’s a gap for people who’d like to be informed about Tactics while more robust interventions are still in production.
    • We looked at the MITRE ATT&CK framework to see how this gap has been filled in the Cybersecurity field.
  4. DisinfoDB provides accessible summaries of Tactics used in disinformation campaigns, and collates real-world reporting to provide practical examples.

That’s it! As an aside, do you think we should have a different name for the Pyramid of Pain when it’s talking about Disinformation? Here are some ideas I had for other things we could call it:

  • Catacomb of Coercion
  • Triangle of Trickery
  • Edifice of Exploitation
  • Isosceles of Influence
  • Mausoleum of Manipulation
  • Sierpiński of Subterfuge
  • Reliquary of Ruses

Any sound good to you? Provide your own suggestions or just lampoon me on Twitter at @DisinfoDB.