It is no newsflash that governments at all levels are making greater use of data to improve public policy and to meet both the increasingly complex challenges they face and the expectations of a more data-literate business sector and citizenry. Much of this data comes from what might be termed traditional sources – official statistics, surveys, censuses, and the like. But, as a recent Organisation for Economic Co-operation and Development (OECD) report on government innovation identified, governments are increasingly drawing on non-traditional data sources to help design and implement better services. They are also using experimental tools to help navigate difficult and unpredictable environments. One popular tool utilising non-traditional data sources that the OECD highlights is the class of simulations known as digital twins.
We have grown used to reading and hearing about the enormous potential of artificial intelligence (AI). The rapidly evolving nature of the technology makes it difficult to pinpoint specific shifts in the relationship between AI and open data. But one emerging discussion that Link Digital is keen to highlight concerns the proposition that the intersection between the two involves far more than open data's role as raw material for training and operating large machine learning models, which has been the focus of most commentary to date.
Finding what you need in the landscape of data can sometimes feel like a daunting task. But what if there was a way to make all that data more easily discoverable and usable? That's where the Data Catalogue Vocabulary (DCAT) comes in. This standard, recommended by the World Wide Web Consortium (W3C), aims to bring order to the chaos by providing a common language for describing datasets.
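To make the idea concrete, here is a minimal sketch of what a DCAT dataset description can look like as JSON-LD. The property names (`dcat:Dataset`, `dct:title`, `dcat:distribution`, and so on) come from the W3C DCAT and Dublin Core vocabularies; the dataset itself and its URL are hypothetical examples, not real published data.

```python
import json

# A minimal DCAT dataset record in JSON-LD form.
# The @context maps the short prefixes to the standard namespace URIs.
record = {
    "@context": {
        "dcat": "http://www.w3.org/ns/dcat#",
        "dct": "http://purl.org/dc/terms/",
    },
    "@type": "dcat:Dataset",
    "dct:title": "City Air Quality Readings",          # human-readable name
    "dct:description": "Hourly readings from city air quality sensors.",
    "dcat:keyword": ["air quality", "environment"],    # aids discovery
    "dcat:distribution": {                             # one downloadable form
        "@type": "dcat:Distribution",
        "dcat:downloadURL": "https://example.org/air-quality.csv",
        "dcat:mediaType": "text/csv",
    },
}

print(json.dumps(record, indent=2))
```

Because every catalogue that adopts DCAT uses these same property names, a portal or aggregator can harvest and index records like this one without any bespoke mapping per publisher.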
Open data has major benefits for AI. But it is not enough for that data to be open and shareable; it also needs to be high-quality data.