How to tell open-source intelligence tales for the news

Leading open-source intelligence (OSINT) stories can be broken down into a few elementary steps. This post introduces a methodology for structuring OSINT stories for news journalism.

Techjournalist
Mar 8, 2021
Open-source intelligence (OSINT) can arm modern investigative journalists to publish scoops. The methodologies in this guide may also help to structure the research behind the journalism.

Open-source journalism can be messy. It’s often packed with facts, cross-references to tools and databases, jargon and data points. Structure can guide journalists through the investigation and show the way to an engaging narrative. Neither is straightforward in this business, so any additional help in introducing structure is welcome.

Investigators are often not storytellers. Here, we’ll lay out a few methodologies for covering complex OSINT stories in a handful of steps. Our journalistic goal isn’t to sound smart and nifty. It’s to tell an appealing story that readers get: they should understand what we found and stay curious enough to make it to the end. That often turns out to be a challenge.

7 steps

Most award-winning OSINT investigations, in my view, can be broken down into a few steps. Readers struggle to take in too much; they might disengage and stop reading or viewing. To keep them curious, mastering a few powerful and clear-cut connections is essential in the storytelling process.

The seven-step rule first started helping me when telling narratives on Twitter. I noticed it concentrated the research effort, made it simple to pitch stories to senior editors, and allowed me to structure the open-source intelligence material better.

An OSINT Twitter thread can go a long way. Despite often presenting very complex processes, threads appear digestible. Each step is like a little pitch, a small chapter of a short story. Each step tries to pull in the reader. When John Scott-Railton, Senior Researcher at @citizenlab, told his OSINT story about the January 6 events at the US Capitol, he did so first on Twitter. Only later did the New Yorker story by Ronan Farrow follow.

Why seven? It’s arbitrary, but it often just worked out. As you will see, it’s less a rule than a guiding principle, but seven steps leave enough space for verifying your own claims and linking them back to the original question.

As a side note, this process isn't so much about how to investigate. There are plenty of structures and methodologies out there to do OSINT investigations. The process for journalism is often indistinguishable from how investigators at law enforcement agencies, national security agencies, or cybersecurity outfits approach OSINT.

However, for journalism, there are some major distinctions in how to tell the story once all the cards are on the table. If journos sprinkle OSINT into a story, they need to make sure it flows.

If it doesn’t, we lose the reader and the story is in vain. I consider the best OSINT investigative stories to be those that are barely recognisable as OSINT or data stories. Instead, they are simply news-breaking scoops, elevated by open-source intelligence.

At the time of writing, I count four elementary structures: ‘the chain’, ‘the pivot’, ‘the attestation’ and ‘the pattern’. They can mix, and reporters often combine them. That offers near-infinite options and enough flexibility. I’ll try to explain each with a simple example from recent weeks (I might add more in the future).

The examples are pulled together quickly and may not be 100% perfect or conclusive. However, I do believe they are indicative of how OSINT narratives can be broken down into small parts. These smaller, more digestible chunks can boost understanding and engagement by readers and editors.

‘The chain’

The ‘chain’ or the ‘trail’ is a narrative that tells how two seemingly unrelated entities, such as people or companies, are linked together.

‘The chain’ typically shines in open-source corruption stories. When explaining a ‘follow-the-money’ trail, the author connects entities and individuals. Many of the tax-evasion data stories, where one firm connects to another and then perhaps to another holding company and its board of directors, often employ such a structure.

[1] Introduction of person or entity of interest

Why should we care about this entity?

[2] 1st link in the ‘chain’:

Explain the datapoint presented here and be transparent about how it was found

[3] 2nd link in the chain:

Verify the link between 1 and 2. The narrative continues by connecting N further links in the chain

[4] Final endpoint:

Why is this link the ‘endpoint’ of the chain? Why stop here and what did it achieve?

[5] How the chain links back to the individual/entity

Why do we care where we ended up?

[6] Potential caveats/weaknesses of the chain

Disclose the assumptions the chain rests on and point out potential weaknesses.

[7] Final conclusion: What findings can we rely on and what do they mean for the starting point?

Sum up the links made

‘The chain’ is a frequent narrative device that connects an individual or entity of interest with an endpoint, another point of interest, via one or several links. The links are our OSINT findings. We circle back to the main hypothesis and explain the assumptions we made and why the connection is valid for the greater theme.
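To keep the chain honest during the research phase, it can help to note each link as structured data before writing the narrative. Below is a minimal Python sketch of that idea; the entities, evidence types and the ‘verified’ flag are hypothetical placeholders, not taken from any real investigation.

```python
# Minimal sketch: keeping the 'chain' as structured notes so each link
# carries its evidence and caveats. All names and sources are hypothetical.
chain = [
    {"from": "Person of interest", "to": "Shell Co A",
     "evidence": "director listing", "source": "corporate registry extract",
     "verified": True},
    {"from": "Shell Co A", "to": "Holding Co B",
     "evidence": "shared registered address", "source": "leaked filing",
     "verified": False},  # weak link: an address match alone proves little
    {"from": "Holding Co B", "to": "Endpoint entity",
     "evidence": "beneficial-ownership record", "source": "public register",
     "verified": True},
]

# Walk the chain in story order and surface the caveats for step [6].
for i, link in enumerate(chain, start=1):
    status = "verified" if link["verified"] else "NEEDS CAVEAT"
    print(f"[{i}] {link['from']} -> {link['to']} "
          f"({link['evidence']}, {link['source']}) - {status}")
```

Keeping the notes in this form makes steps [6] and [7] almost mechanical: the unverified links are the caveats, and the conclusion can only lean on the verified ones.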

The Pivot

OSINT research usually starts with a hypothesis. We may hit a ‘cul-de-sac’. It happens to the best. We might include this in our story and explain how we pivoted in an alternative direction, presenting findings that may be indicative of the verification we have been seeking. The findings may only be a ‘proxy’ for validating the initial hypothesis but can make for exciting investigative storytelling.

Typical examples here include a recent investigation into illegal, unreported and unregulated (IUU) fishing and transshipment operations in West Africa. Open-source intelligence data was limited. It failed to deliver the evidence we needed to pin down that cargo vessels had acted illegally in their transshipment behaviour. Instead (that’s the pivot), our team concentrated on the illegal practice of switching off their AIS signal. This was not what we were after in the beginning, but it turned out to be a valuable finding, serving as a proxy for what makes regulating and monitoring IUU fishing and transshipments so difficult in the region (and why operators often ‘slip through the net’). A rough sketch of how such AIS gaps can be flagged follows below.
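For the AIS side of such a pivot, the underlying check can be as simple as measuring the time between consecutive position reports. The following is a minimal Python sketch of that idea, not the actual analysis from the investigation; the column names, coordinates and the six-hour threshold are assumptions for illustration.

```python
# Minimal sketch: flag suspiciously long gaps in a vessel's AIS reports.
import pandas as pd

# Hypothetical AIS extract: one row per position report for a single vessel.
ais = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2021-02-01 00:10", "2021-02-01 03:55",
        "2021-02-01 22:40", "2021-02-02 01:05",
    ]),
    "lat": [4.91, 4.87, 4.02, 3.98],
    "lon": [-1.75, -1.80, -2.40, -2.44],
})

ais = ais.sort_values("timestamp")
ais["gap_hours"] = ais["timestamp"].diff().dt.total_seconds() / 3600

# Gaps longer than a chosen threshold are candidates for 'going dark'.
suspicious = ais[ais["gap_hours"] > 6]
print(suspicious[["timestamp", "gap_hours", "lat", "lon"]])
```

A long gap is only a candidate, of course: signal loss and legitimate technical faults also happen, which is exactly the kind of caveat step [7] should disclose.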

OSINT investigation on IUU fishing and transshipments in West Africa

[1] Initial hypothesis

e.g. a suspicion that someone has committed wrongdoing

[2] Validation attempt fails: cul-de-sac

When open data fails to confirm or dismiss the hypothesis

[3] Pivot

We may find a new direction that is indicative of the first hypothesis

[4] Lay out findings

We lay out our new findings and may adapt the hypothesis to what we can validate.

[5] Proxy analysis

Explain how the new findings serve as a proxy for the original hypothesis

[6] Connect back to the overall theme and confirm/verify proxy thesis

Link back to the first hypothesis via other scientific or intelligence research

[7] Caveats:

Recognise the weaknesses in the connection. Are there any weak points in our assumptions?

The Pivot: a way to tell a story that seems (initially) in vain but takes a new direction where our OSINT findings matter

The attestation

This is the more conventional open-source intelligence story, involving verification. Journalists may present the raw material first (a video or an image). Then they compare the original claims with their findings. This allows them to poke holes in statements that may be falsehoods, offering a basis for challenging them. Additional details along the way make the story interesting, fresh and lively.

Typical examples include, as mentioned in the video, stories like Markus Sulzbacher’s reporting for Der Standard, many of the disinformation and verification stories, or even data stories such as this one on Chinese ethnic minorities.

Gottfried Küssel spotted at last week’s demonstration in Vienna

[1] State the raw material

e.g. videos, data, images

[2] State the claims:

Location, time, the data it contains, such as the individuals an image features

[3] State the obvious: facts about the material

What do we see? What do we know? Assumptions?

[4] Present open-source findings

The verified location, plus confirmation of other aspects such as time, location, person or other data points (a rough metadata check is sketched after this list)

[5] Caveats: any holes in our assumptions

Thorough self-criticism of the assumptions on which the verification relies

[6] Verify our assumptions separately

Explain why some assumptions are fixed and carry more weight than those we question

[7] Implication of verification/dismissal

We verified something. Congrats! Now, for the journalism to work, we need to compare our findings with the original claims. Has someone made false statements? Assess the impact of your verification.
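One small, concrete piece of step [4] can be checking what metadata a locally saved copy of the material still carries. The sketch below is a hedged illustration using Pillow, not the workflow from the reporting cited here; the file name is hypothetical, and social platforms usually strip EXIF data, so an empty result proves nothing on its own.

```python
# Minimal sketch: read whatever EXIF tags survive in a local image copy.
from PIL import Image, ExifTags

def read_exif(path):
    """Return a dict of human-readable EXIF tags, or {} if none survive."""
    img = Image.open(path)
    exif = img.getexif()
    return {ExifTags.TAGS.get(tag_id, tag_id): value
            for tag_id, value in exif.items()}

# Hypothetical local file name used for illustration.
tags = read_exif("demonstration_photo.jpg")
print(tags.get("DateTime"), tags.get("Model"))
```

Whatever the metadata says, it is one data point among several; the visual geolocation and cross-references still carry the verification.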

Attestation examples

The OSINT stories that circulated online about the January 6 Capitol Hill incident are a good example. The news outlets involved verified how and where certain groups and individuals were involved at the scene of the crime.

The pattern

Open-source intelligence in stories can signal the endemic nature of a problem. But you need a critical mass of examples to be convincing. These examples need to have the same or similar profile.

Typical examples in my view include Bellingcat’s recent article on unethical animal trade benefiting celebrities or BR’s data analysis of hate speech and right-wing tendencies on Facebook.

[1] State the first example

Explain the encounter. What do we see?

[2] Connect to the reader: Why is this the best example you can bring forward?

Data or commentary on the severity of the example can be helpful. It needs to draw in the reader by evoking emotions or offering a point of reference that the reader can identify with. Typical examples are social media bot investigations, where we’d pick a few examples that hit home.

[3] Switch: other/next example(s) with added OSINT findings

Different examples but with the same endemic pattern. These need to be chosen well. If they don't line up, it could jeopardize how convincing the story is.

[4] Larger scale data analysis

My old editor always said: ‘find at least three examples that point in the direction you want to validate’. With OSINT, we can verify patterns by automating the collection of data online. The analysis is solid if it involves a critical mass. For data collection, we could use machine learning and data scrapers (a rough scraper sketch follows after this list).

[5] Further details that emerged in the analysis

e.g. Impact on the individuals, outliers etc.

[6] Caveats: weaknesses in linking examples together

Is the analysis fair? Do we connect the examples legitimately? Is our data collection unbiased?

[7] Conclusion: connect to greater theme

Is this merely a temporary trend or a lasting development?
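As a hedged illustration of step [4], the sketch below collects post texts from a public page and counts how many match a keyword. The URL, the assumption that posts sit in article tags, and the keyword are all placeholders; any real collection should respect the site’s terms of service and robots.txt.

```python
# Minimal sketch: collect post texts from a public page and count how many
# mention a keyword of interest. URL and page structure are hypothetical.
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/public-posts"  # placeholder

resp = requests.get(URL, timeout=10)
resp.raise_for_status()
soup = BeautifulSoup(resp.text, "html.parser")

# Assume each post lives in an <article> tag; adjust to the real markup.
posts = [article.get_text(strip=True) for article in soup.find_all("article")]

keyword = "example keyword"
matches = [p for p in posts if keyword.lower() in p.lower()]
print(f"{len(matches)} of {len(posts)} collected posts mention '{keyword}'")
```

A simple count like this is only the starting point; the ‘critical mass’ argument comes from running it across enough pages or accounts that the pattern cannot be dismissed as cherry-picking.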

Nullius in verba

As the Latin phrase above suggests (in English: ‘take nobody’s word for it’), OSINT can provide ethical journalists with the tools and a storytelling structure to publish pieces with an edge. In this post, we explored methodologies for breaking down how to tell OSINT stories.

Reach out for comments or a debate
