
Breaking boundaries: How Australia’s data & digital strategy challenges traditional data management


Summary: In this article, we focus on why traditional data management practices cannot adequately support government agencies in implementing the recent Data and Digital Government Strategy. We will then delve into how other governments have used a “data fabric” to deliver capabilities similar to those outlined in the strategy.

In an era where digital technologies and artificial intelligence (AI) are reshaping our daily lives, the Australian Government late last year released the Data and Digital Government Strategy (the Strategy), setting a clear vision for delivering simple, secure, and connected public services for all people and businesses through world-class data and digital capabilities.

At the core of this strategy lies the recognition that data is not just a resource; it’s the lifeblood of effective governance. Data, and the interconnectedness of data, will play a key role in improving service design and delivery, as well as supporting better-informed policy decisions.

Overcoming traditional data challenges 

However, several traditional data challenges must be addressed to successfully execute this strategy:

Privacy and security concerns:

  • Balancing data openness with privacy protection is challenging.
  • Striking the right balance ensures trust while leveraging data effectively.     
  • Consent and audit controls must be applied to govern how data may be used and restricted.

Legacy systems and silos:

  • Many government agencies operate on outdated legacy systems that hinder data sharing and integration.
  • Breaking down silos is essential for seamless data flow.

Data quality and consistency:

  • Inconsistent data formats, inaccuracies, and duplication plague government datasets.
  • Ensuring high-quality, standardised data is crucial for effective decision-making.

Why traditional data management practices will not deliver the outcomes required      

Traditional data management architectures are based on physically copying and centralising all of the departments’ and agencies’ data into a single location, such as a data warehouse. More modern versions are cloud-based and often referred to as a “data lake”, “lakehouse”, or “cloud storage provider”. Whether the data is physically copied, replicated, or ingested into an on-premise data warehouse or a cloud data lake, the architecture is the same in principle and carries the same restrictions, making this approach neither viable nor practical for delivering the strategy’s objectives in either the short or the long term. Here’s why:

Diverse data sources:

  • In today’s digital landscape, data originates from a multitude of sources—government agencies, IoT devices, social media, and more.
  • Centralising all this diverse data into a single repository becomes unwieldy and inefficient. 

Data velocity and volume:

  • The speed at which data is generated (data velocity) and the sheer volume of data challenge centralisation.
  • Real-time data streams and massive datasets require distributed approaches.

Data silos and bottlenecks:

  • Centralisation can inadvertently create data silos (for example, moving some data to the cloud with the intent of consolidating several datasets can create even more data silos).
  • Agencies waiting for centralised data face bottlenecks, hindering decision-making.


Data fabric approach – Lessons learnt from other governments

After partnering with several innovative governments across the globe, we found that highly successful global data programmes take an architectural approach known as a data fabric. 

This innovative approach focuses on managing and integrating data across an organisation without the need to copy, replicate, or rehost data. It weaves together various data sources, systems, and technologies into a cohesive fabric, enabling seamless data access, sharing, and analysis. Here are the key aspects of the data fabric, followed by a brief illustrative sketch after the list:

Decentralised access:

  • Unlike traditional centralised data architectures, the data fabric emphasises decentralised access.
  • Data remains distributed across different sources (databases, cloud services, APIs, etc.), but the fabric provides a unified view.

Interoperability:

  • The data fabric ensures that disparate data systems can communicate effectively.
  • It supports data integration, transformation, and movement between different components.

Flexibility and agility:

  • Organisations can adapt to changing data needs without major disruptions.
  • The fabric accommodates new data sources, technologies, and business requirements.

Data governance and security:

  • Data governance policies are embedded within the fabric.
  • Security, privacy, and compliance measures are applied consistently.

Holistic view:

  • Users interact with the data fabric as if it were a single entity.
  • They can query, analyse, and visualise data seamlessly, regardless of its origin.
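To make these points concrete, below is a minimal, hypothetical Python sketch of the pattern they describe; it is not the Denodo Platform or any particular product, and every class, table, and field name is an illustrative assumption. Data stays in its source systems (a relational database and a stand-in for an API), while a thin virtual layer federates the query at read time and applies a simple governance rule in one place, rather than copying records into a central store.

```python
import sqlite3

# --- Source systems (hypothetical): data stays where it lives --------------

class AgencyDatabase:
    """A relational source, standing in for a legacy agency system."""

    def __init__(self):
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute(
            "CREATE TABLE citizens (id INTEGER, name TEXT, postcode TEXT)"
        )
        self.conn.executemany(
            "INSERT INTO citizens VALUES (?, ?, ?)",
            [(1, "A. Citizen", "2600"), (2, "B. Citizen", "3000")],
        )

    def fetch(self):
        # Rows are read on demand; nothing is replicated elsewhere.
        for row in self.conn.execute("SELECT id, name, postcode FROM citizens"):
            yield {"id": row[0], "name": row[1], "postcode": row[2]}


class ServiceAPI:
    """A second source, standing in for a REST API or cloud service."""

    _records = [
        {"citizen_id": 1, "service": "Medicare", "status": "active"},
        {"citizen_id": 2, "service": "Centrelink", "status": "pending"},
    ]

    def fetch(self):
        yield from self._records


# --- Virtual layer: one unified view, no central copy ----------------------

class VirtualView:
    """Joins the sources at query time and applies governance consistently."""

    def __init__(self, db, api, mask_names=True):
        self.db, self.api, self.mask_names = db, api, mask_names

    def citizen_services(self):
        services = {r["citizen_id"]: r for r in self.api.fetch()}
        for person in self.db.fetch():
            record = {**person, **services.get(person["id"], {})}
            if self.mask_names:  # a governance rule enforced once, at the fabric
                record["name"] = record["name"][0] + "***"
            yield record


if __name__ == "__main__":
    view = VirtualView(AgencyDatabase(), ServiceAPI())
    for row in view.citizen_services():
        print(row)
```

The design point mirrors the bullets above: consumers see one logical dataset, each source keeps its own data and owning system, and rules such as masking are enforced once at the virtual layer rather than re-implemented per source.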


For example, the Italian National Institute of Statistics (Istat) is the main producer of official statistics in Italy. Istat conducts the population and economic censuses and is also responsible for a variety of social, economic, and environmental surveys and analyses.

Istat took a data fabric approach powered by data virtualization through the Denodo Platform (without building physical connections, replicating data, or needing to centralise all the datasets) to aggregate data from heterogeneous data sources and create a single point of data access. This enabled Istat to increase the quality of the statistics produced and to quickly undertake new projects, such as a consumer price survey.
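As a hedged illustration of why a single point of data access shortens new projects (and not a depiction of Istat’s or Denodo’s actual implementation), a new analysis in the hypothetical sketch above becomes one more query against the existing virtual view rather than a new ingestion pipeline:

```python
from collections import Counter

def services_by_postcode(view):
    """A hypothetical new analysis run entirely through the unified view,
    without touching or copying the underlying source systems."""
    return Counter(row["postcode"] for row in view.citizen_services())

# Usage, assuming the VirtualView instance from the earlier sketch:
# print(services_by_postcode(view))   # e.g. Counter({'2600': 1, '3000': 1})
```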


Another example is how the National Nuclear Security Administration (NNSA) enabled its Integrated Digital Environment (IDE) programme to provide unified access to weapon lifecycle data across different facilities and to facilitate the timely and secure sharing of data.

The NNSA faced major hurdles in delivering its data programmes: data had to be shared within and across highly sensitive sites, spanning different legacy systems and multiple data formats, which made information sharing difficult. Overcoming these challenges would not have been possible without a data fabric approach.

“Accessing & aggregating data from different NNSA sites would not be possible without Denodo. It expedites projects by enabling fast and secure movement of data through different facilities” – Enterprise Architect NNSA

Realising the value of the strategy 

The Australian Government’s Data and Digital Government Strategy aims to revolutionise public services by providing simple, secure, and connected experiences. However, traditional centralised data management faces real challenges in this context, while the strategy emphasises agility, privacy, interoperability, and user-centric design. If the government does not consider innovative ways to overcome these traditional data challenges, it risks delivering another strategy that provides very little value to the Australian public.

By embracing decentralisation and focusing on delivering data securely to the right people at the right time, the government can better serve citizens and businesses in the digital age, fostering innovation and responsiveness.

Minh Nguyen
Regional Director, Federal Government at Denodo

Minh Nguyen is the Regional Director at Denodo in ANZ, looking after the Australian Federal Government. Minh has been involved in several state-wide data transformation initiatives and has published research papers on data sharing in the public sector. Minh is currently in the final year of completing his MBA in Digital Transformation at Harvard Business School.
