This is the third and final article in a series on the Open Government Partnership Summit 2025, held in Vitoria-Gasteiz, Spain, in October. Technical Writer/Analyst Andrew Nette was one of the Link Digital members present at the event.

For nearly two decades Link Digital has been working in the open data and digital space. In addition to becoming a trusted participant in the open source community, Link Digital is firmly established in the evolving data ecosystem, with a focus on supporting open data, digital service design and citizen-centred design practices.
And while open data continues to be the bread and butter of the company’s business, the direction it is taking is changing. This was reinforced for me at the recent Open Government Partnership (OGP) Summit 2025. It was no surprise that open data was mainly raised in the context of its centrality to developing transparent, public-focused uses of AI. This is not just about how open data can be made more AI-ready, but also about the ways in which AI is transforming how open data is created.
But the Summit also tackled another emerging issue: how AI large language models will be incorporated into, and help to transform, digital public infrastructure (DPI) and, linked to this, AI’s potential to enable greater public access to the data stored on platforms such as open data portals.
Open data – still vital
Underlying some of the discussion at the Summit is the reality that the impetus behind open data has faltered, part of a larger rethinking of the concept of openness triggered by several factors: the erosion of the social contract between people and governments; the activities of malicious anti-democratic forces; and the relentless scraping and appropriation of data – without compensation or consent – by AI companies, which has prompted open data producers to start shutting their doors or adopting paywalls and access restrictions.
But open data is still seen as vitally important, at least by attendees at the Summit. The Open Government Partnership’s 2024-25 report shows that 40% of commitments from its member countries and local governments focus on public participation, digital transformation, and open data.
That said, there was discussion that open data advocates have taken acceptance of open data for granted and that a more aggressive effort is needed to publicise the ongoing value of making data open and shareable. Key to this, open data needs to exist not just as a static resource on public portals but be seen as helping to meet challenges such as climate change, innovation and closing the gap between democracy and service provision or, as some at the Summit referred to it, improving the democratic feedback loop.
There was agreement around the many benefits of using quality open data to train large language models. But there was also consensus not just around open data as a core component of government digital public infrastructure (DPI), but also around how AI can be used to reshape the creation, management and sharing of, and access to, open data in DPI such as open data portals.
How artificial intelligence (AI) could change digital public infrastructure
I have written before on this site about the ways AI can benefit from open data and vice versa. Large language models need to be trained on significant volumes of data. Data that is of higher quality – including being checked and standardised – will produce better, more equitable and more ethical AI outcomes.
Open data can also benefit enormously from AI in terms of cleaning and error detection, bias identification, metadata enrichment, language translation, and duplicate detection. Open data’s centrality to AI is a powerful argument not only for governments to focus on creating better open data infrastructure and policies that prioritise data quality, discoverability and usability, but also to make more government data, particularly high-value datasets, open.
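To make one of these tasks concrete, here is a minimal sketch of near-duplicate detection in catalogue metadata, using only Python’s standard library. The dataset titles are invented for illustration, and a production system would likely combine this kind of string matching with semantic (embedding-based) comparison.

```python
from difflib import SequenceMatcher

def find_near_duplicates(titles, threshold=0.85):
    """Flag pairs of dataset titles whose character-level similarity
    meets or exceeds the threshold. Deliberately simple: real portal
    deduplication would also compare descriptions, licences and fields."""
    pairs = []
    for i, a in enumerate(titles):
        for b in titles[i + 1:]:
            if SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold:
                pairs.append((a, b))
    return pairs

# Hypothetical catalogue entries, for illustration only
catalogue = [
    "Air Quality Monitoring Stations 2024",
    "Air quality monitoring stations 2024 (v2)",
    "Public Transport Timetables",
]
print(find_near_duplicates(catalogue))
```

Running this flags the first two titles as likely duplicates while leaving the unrelated third entry alone, which is the kind of low-level housekeeping AI-assisted tooling can take off the hands of portal maintainers.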
But AI is also reshaping the design of government DPI such as open data portals and catalogues. AI can automatically clean, link and enrich datasets and detect gaps and errors in public datasets, enhancing data comprehension and use. AI can also make open data more interoperable and interlinked, turning datasets from static information on isolated portals into more accessible and interactive data in connected ecosystems.
Several government representatives at the OGP Summit discussed how they are in the process of introducing AI-powered chatbots or digital assistants into open data platforms and other DPI. Sometimes referred to as ‘conversational interfaces’, these aim to change how citizens engage with data. Natural language search will help users easily visualise datasets. It can translate datasets and metadata into multiple languages and, in much the same way as a film or music streaming service, suggest related high-value datasets. All of this could enable better-quality engagement by citizens as well as broaden the reach of data beyond data specialists.
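As a rough sketch of the first step such an interface has to perform, the snippet below turns a plain-language question into a query against a CKAN-style portal’s `package_search` endpoint (part of CKAN’s standard Action API). The stop-word keyword extraction is a deliberately naive stand-in for what a real assistant would delegate to an LLM, and the portal URL is a made-up placeholder.

```python
from urllib.parse import urlencode

# Words to drop from a user's question; a real conversational interface
# would use an LLM or NLP pipeline rather than a fixed stop-word list.
STOP_WORDS = {"show", "me", "find", "the", "a", "an", "data", "on",
              "about", "for", "in", "of", "please", "datasets"}

def question_to_search_url(portal_base, question):
    """Turn a plain-language question into a CKAN package_search URL."""
    keywords = [w for w in question.lower().rstrip("?").split()
                if w not in STOP_WORDS]
    query = urlencode({"q": " ".join(keywords)})
    return f"{portal_base}/api/3/action/package_search?{query}"

# Hypothetical portal URL, for illustration only
url = question_to_search_url(
    "https://data.example.gov",
    "Show me datasets about air quality in schools?",
)
print(url)
```

The resulting URL could then be fetched and the matching datasets summarised back to the user in plain language, which is where the ‘conversational’ half of the interface does its work.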
Major challenges
While AI certainly brings benefits to DPI, it also throws up challenges for future DPI implementation. To cite just one: instead of designing one digital interface for everyone, AI will increasingly raise citizens’ expectations that they will be able to access variations tailored to their specific circumstances.
As I noted in my previous report on the OGP Summit, governments need to adopt full algorithmic transparency in relation to the AI technology applied to public services and digital infrastructure. This includes publicly detailing why the AI algorithm in question was implemented in the first place, and clearly defining roles and responsibilities. It also includes human supervision, such as regular audits and the traceability of decisions related to the technology.
And, as Renato Berrino Malaccorto, Research Director at Open Data Charter, stressed, the capacity to undertake such regulation varies greatly between countries, especially between developed countries and those in the global south. Even among richer countries, regulatory frameworks for AI use in government vary, from actual legislation to mere recommendations, and while few explicitly mention open data, many focus on data governance.
There are also vital questions relating to the tech stack used in these initiatives. DPI should be designed for transparency, resilience and long-term sustainability. Going forward, this underlines the importance of sustainable, transparent, community-driven, open source technologies such as CKAN – the Comprehensive Knowledge Archive Network – and a continuing, vital role for the open data community and organisations like Link Digital that are part of it.