What if just making data open and shareable is no longer enough?
For Link Digital – a global company that uses open source software tools to build digital platforms that manage and share data – this is not a question asked lightly.
In addition to sharing data, we believe what is needed is ‘verifiable honesty.’ This is a core purpose behind the Objective Observer Initiative (OOI), which attempts to blend philosophy, open data, AI and digital platforms into a new digital ecosystem framework that is more transparent, objective, and accountable.
Why Link Digital is working on this was the subject of a recent keynote by Link Digital Executive Director Steven De Costa, delivered at the Everything Open conference in Canberra, Australia. It was the second public outing of the thinking behind the OOI, the first being a presentation by Steven at the Canadian Open Data Summit late last year.

Photo by: Daniel Black
This article will briefly look at the OOI and some of the points raised in Steven’s recent talk about the thinking behind it.
Being attentive to the signals in data
The OOI draws on over two decades of involvement by Steven in the open data/open source ecosystem. But its direct roots come from more recent events that are putting pressure not just on Link Digital’s business model, but on the notion of openness more generally: a perfect storm of rising authoritarianism, increasing disinformation that makes it hard to achieve any sense of shared understanding or consensus, and the technological disruption of AI.
As Steven puts it, this has led to a realisation that:
“While technology has the potential to provide productivity gains, through automation or other means, there is a very real structural change occurring when we employ technology in knowledge operations. Knowledge is something that is constructed, so it takes shape and can certainly be optimised, compressed and distorted. It can be alluring to suggest that the discovery and access to knowledge is a universally beneficial development, but the technology we are employing will consider shared understanding to be of the same category as shared ignorance.”
To put it another way, feeding data into opaque machine learning tools and passively receiving the results that emerge on the other side is not enough. This situation – largely where we are at now – relies on clicks as a signal for consent and efficiency as a sign of value, and does not consider the fact that data can contain different signals that we need to be attentive to.
“Datasets contain both information and signal potential. Where data might be combined and used as part of a corpus of signals for the outcome of developing a model, machine learning presents us with the opportunity to not only share the data we have, but to actively construct and share the data we need to train models with the optimism of a healthy, cohesive and well informed society.”
Rethinking the role of data in creating trust
As Steven noted in his Everything Open keynote, the advent of the Trump presidency in early 2025 injected added urgency into his thinking.
That event was one of several – including Russia’s invasion of Ukraine and lingering dissatisfaction with the response to the Covid-19 pandemic – that made trust, and why trust in institutions such as government is declining, a central concern of social discourse. These developments also posed more sharply the question of how we reverse this trend and build greater social cohesion, and what role open data might play in doing so.
Of course, trust is a human thing. It is also causally part of a system dynamic. In our current digital setup, we have access to data but lack causal continuity: the ability to agree on how the present came to be. The OOI posits that this can only emerge when our digital systems remain observable, communicable, externally accountable, and truly open.
But the difficulties of constantly verifying honesty are huge, which led to the idea of trying to replicate trust within the framework of the open data ecosystem. And central to this was the realisation that people who act in a technological setting need to have a statement of intent.
The OOI emphasises explicit, accountable declarations of intent in relation to digital platforms, rather than implicit trust in opaque, black-box systems. Instead of simply agreeing to terms we almost never read, the individuals who create and interact with our digital systems would publish clear statements about how they intend to act or use data, and those statements could then be objectively verified for compliance.
As Steven put it in his Everything Open keynote:
“Without a statement of intent, you have no baseline by which to understand, are they moving through the substrate, are they doing things right?”
With these statements of intent in place, we can assess, if we choose to, whether actors are behaving in line with their declared intentions.
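As a purely illustrative sketch of what such a declaration might look like in practice – the field names and structure below are our own assumptions, not a published OOI specification – a statement of intent could be captured as a small, machine-readable record that an actor publishes alongside a platform or dataset:

```python
from dataclasses import dataclass, field, asdict
from typing import List
import json


@dataclass
class StatementOfIntent:
    """A hypothetical, machine-readable declaration of intent.

    The field names here are illustrative assumptions only; the OOI has
    not defined a formal schema.
    """
    actor: str                                            # who is making the declaration
    purpose: str                                           # what the actor intends to do
    data_used: List[str] = field(default_factory=list)     # datasets the actor draws on
    commitments: List[str] = field(default_factory=list)   # promises that can be checked later

    def to_json(self) -> str:
        """Serialise the declaration so it can be published and later audited."""
        return json.dumps(asdict(self), indent=2)


# Example: a data portal operator declaring how usage analytics will be handled.
declaration = StatementOfIntent(
    actor="example-portal-operator",
    purpose="Publish anonymised usage statistics for the open data portal",
    data_used=["portal-access-logs"],
    commitments=[
        "No individual users are identifiable in published statistics",
        "Statistics are refreshed and republished monthly",
    ],
)
print(declaration.to_json())
```

The point of a record like this is simply that it exists in the open: once the declaration is published, anyone can compare it against what the actor subsequently does.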
The OOI as a signal referee
The OOI aims to act as a signal referee of sorts. It not only asks what patterns we can extract from data; it anchors signals to explicit commitments from those who create the systems and use the data, and observes whether those actors operate honestly and meet their commitments. This pushes back on the assumption that such signals are a form of truth that simply emerges when we run enough data or statistics through machine learning tools. Rather, it posits that these signals are relationships between data, intent, and observation.
In this regard, the OOI doesn’t just ask what patterns we can extract from data; it asks whether the observed outcomes match the stated intent. This differs markedly from our current digital setup, in which clicks serve as the main signal of consent and efficiency as the main measure of value in our engagement with digital platforms.
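To make that contrast concrete, here is a minimal, hypothetical sketch of the kind of check a “signal referee” might run: each declared commitment is expressed as a measurable bound and compared against observed outcomes. The metric names and thresholds are invented for illustration and are not part of any published OOI design.

```python
from typing import Dict, List, Tuple


def referee_check(declared_bounds: Dict[str, float],
                  observed_metrics: Dict[str, float]) -> List[Tuple[str, bool]]:
    """Compare observed outcomes against declared upper bounds.

    Each entry in `declared_bounds` represents a commitment of the form
    "metric X will stay at or below value Y" – a simplifying assumption
    made purely for this sketch.
    """
    results = []
    for metric, bound in declared_bounds.items():
        observed = observed_metrics.get(metric)
        # A missing observation counts as unverifiable, not as compliant.
        compliant = observed is not None and observed <= bound
        results.append((metric, compliant))
    return results


# Illustrative declaration: the operator committed to publishing zero
# identifying fields and republishing within 31 days of the last refresh.
declared = {"identifying_fields_published": 0.0, "days_since_last_refresh": 31.0}
observed = {"identifying_fields_published": 0.0, "days_since_last_refresh": 45.0}

for metric, ok in referee_check(declared, observed):
    print(f"{metric}: {'meets stated intent' if ok else 'does not meet stated intent'}")
```

The referee here is deliberately simple: it does not judge whether the intent itself is good, only whether behaviour matches what was declared.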
In his keynote, Steven provided an example of how the OOI might act to combat disinformation. Rather than just attempting to passively inoculate citizens against propaganda, the OOI aims to attune the population towards constructive, optimistic outcomes, creating a clear ‘signal’ for them to invest in.

Moving toward co-construction
The OOI is already influencing Link Digital’s work with data portals. As Steven puts it:
“There is a natural crossover between data portals and websites, and the need for organisations to have a mature understanding of what ‘signals’ they are publishing along with their data, documents and web pages. Importantly, there is an opportunity to co-construct datasets that provide optimistic signals. Such datasets would naturally be made discoverable on portals such as those we develop and maintain for our clients using CKAN, Drupal and other open technologies.”
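As a rough illustration of how intent signals might travel with a dataset on a CKAN portal, the sketch below uses CKAN’s standard Action API (`package_create`) to publish a dataset whose metadata carries a statement of intent in the `extras` field. The portal URL, API token, organisation name and the `statement_of_intent` key are assumptions made for this example, not an agreed OOI convention.

```python
import json
import urllib.request

# Hypothetical portal details; replace with a real CKAN instance and API token.
CKAN_URL = "https://demo.ckan.org"
API_TOKEN = "YOUR_API_TOKEN"

dataset = {
    "name": "co-constructed-signals-example",
    "title": "Co-constructed signals (example)",
    "notes": "An illustrative dataset published alongside a statement of intent.",
    "owner_org": "example-org",  # assumed organisation name on the portal
    "extras": [
        {
            "key": "statement_of_intent",  # assumed metadata key, not a CKAN standard
            "value": "Published to provide a constructive, verifiable signal; "
                     "refreshed monthly; no personal data included.",
        }
    ],
}

request = urllib.request.Request(
    f"{CKAN_URL}/api/3/action/package_create",
    data=json.dumps(dataset).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": API_TOKEN,  # CKAN reads API tokens/keys from this header
    },
    method="POST",
)

# Create the dataset and report the name assigned by the portal.
with urllib.request.urlopen(request) as response:
    result = json.loads(response.read())
    print("Created dataset:", result["result"]["name"])
```

Because CKAN exposes dataset metadata openly, an intent statement published this way is discoverable and auditable by anyone browsing or harvesting the portal.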
Link Digital is now entering the experimentation phase of the OOI, in which we want to explore how to develop a model of open technology based on transparent intent and shared cadence, and how open communities can move beyond just accessing data to using that data to fashion a shared coherence.
In his Everything Open keynote – which you can watch in its entirety here – Steven discussed the first of these experiments, which involves attempting to observe and capture a minute. We will make the outcomes of this experiment an interoperable public dataset and see what products can be engineered from it. You can find the full details of what is involved around the 28:39 mark of Steven’s talk.
An invitation to collaborate
While the OOI is a Link Digital initiative, it is vital to stress that we profess no ownership of it. The OOI is an idea and an opportunity, and we extend an invitation to anybody who is interested in building systems and societies that are verifiably honest by design.
Work related to the OOI is open by default, and you can find a full history of the ideation phase online here.
You can register your interest in joining us in co-constructing the OOI here.
All code generated by the OOI is public infrastructure and will be available here.