Data is at the heart of today's government services. This is reflected in the federal government's Data and Digital Government Strategy (the Strategy), which sets out the goal of using data, analytics, and technology to deliver simple, accessible services for people and businesses by 2030. As the Strategy notes, Australians expect personalised, integrated, and easy-to-use services from the government entities they engage with. Such personalisation, especially across digital channels, depends heavily on data, and delivering these services becomes more effective when that data is accurate and up to date. This is where real-time data comes into play: because it is always current, it improves the customer experience by enabling services to be more dynamic and interactive. However, because batch processing still accounts for the majority of data processing in government, even the most recent data may be outdated by the time it is used to deliver services.

**Engage with data in motion**

Batch processing means that processing and analysis happen on a set of data that has already been stored for a period of time, sometimes days, weeks, or even months, which just doesn't cut it for delivering dynamic and interactive citizen services. In recent years, data streaming has emerged as the technology that lets organisations tap into their data in real time to improve citizen engagement and experience. Data streaming, also known as event streaming, describes the continuous flow of data as it occurs, enabling true real-time processing and analysis for immediate insights. Streaming distinguishes itself from batch processing by delivering the most up-to-date information the moment it is required. Apache Kafka, one of the most successful open-source projects, is used by over 70% of Fortune 500 companies today and is widely recognised as the de facto standard for data streaming.

The open-source nature of Kafka lowered the barrier to working with streaming data, allowing companies to build use cases and solutions easily. However, as with all open-source software, there are limitations: companies often end up spending more to efficiently manage, scale, secure, and evolve their streaming infrastructure. So why are we still using batch processing if data streaming is the future? Batch processing remains simpler to implement, and successfully moving from batch to streaming requires a significant change to a team's habits and processes, as well as a meaningful upfront investment. That is why Confluent has rearchitected Kafka into a complete platform: a fully managed, cloud-native data streaming solution that turns data events into outcomes, enables real-time apps, and empowers teams and systems to act on data instantly.

**Personalised for the people**

Confluent's ability to treat data as a continually updating stream of events rather than discrete snapshots means that public sector organisations can use data streaming to improve citizen engagement through personalised, data-driven services and insights. The platform also enables real-time monitoring and analysis of government services and infrastructure, allowing public sector entities to respond quickly to critical events such as natural disasters or public health emergencies. At a more mundane level, Confluent supports data sharing and collaboration among government agencies, facilitating the seamless exchange of information to serve the public better and optimise resource allocation. And, importantly for government organisations, Confluent's data streaming capabilities can enhance cyber security by detecting and mitigating threats in real time and safeguarding sensitive government data, a critical element in maintaining national security.

Indeed, 53% of Australian businesses surveyed in a recent Confluent study cited security and compliance awareness as the most applicable use cases for data streaming. It should come as little surprise, then, that industry analyst firm Forrester views Confluent as "an excellent fit for organisations that need to support a high-performance, scalable, multi-cloud data pipeline with extreme resilience."

**Streamlining service improvement**

Data streaming is driving greater efficiency in more than three of four companies across Asia Pacific, according to Confluent research, while 65% of IT leaders polled see significant or emerging product and service benefits from data streaming. With this in mind, the potential for government to do more with its data is clear, and personalisation is top of mind. Personalising citizen service experiences requires knowing who a customer is at any given moment, which is made possible by accessing data in motion, especially across multiple touchpoints. At the very least, this can help citizens avoid providing the same information over and over as they interact with government agencies. And now, with Confluent assessed under the Australian Information Security Registered Assessors Programme (IRAP), government agencies with an Information Security Manual PROTECTED-level requirement can use Confluent Cloud across Amazon Web Services (AWS), Google Cloud, and Microsoft Azure. Australian government agencies will be able to gather and share data across departments, offices, and agencies securely and at scale. This means even more government agencies will be able to tap data in motion to integrate information across their applications and systems in real time and reinvent employee and citizen experiences for the better.
The demand for generative AI has surged in recent years, highlighting its potential to transform a wide range of industries. This widespread adoption, however, presents considerable challenges, including ethical concerns, security vulnerabilities, and the need for substantial computational resources. As businesses aim to harness the potential of generative AI, they must navigate these complexities to maximise the benefits while minimising the drawbacks.
Public Spectrum caught up with Kieran Hagan, Data and AI Segment Leader at IBM Software Group, who has extensive experience addressing these issues. Organisations benefit from his wealth of knowledge on effectively implementing generative AI technologies, and his viewpoints are especially pertinent as companies strive to balance innovation with accountability.
Hagan has led multiple AI initiatives at IBM, making significant contributions to the progress of machine learning and data analytics. His role entails assisting businesses in navigating the complexities of AI implementation, helping them adopt state-of-the-art solutions while tackling important ethical and operational issues.
Kieran Hagan discusses what it takes to navigate generative AI's complex challenges.
This is the new workload demand for industry. Generative AI is widely considered the next information technology revolution, fuelling digital transformation projects and cloud adoption. It can play a crucial role in cloud and digital transformation strategies by helping businesses automate and streamline their operations, enabling greater agility, scalability, and cost savings. You only have to look at the "AI bump" in the US stock market and the growing discussion of AI regulation to see this in effect. Over 80% of C-suite executives are considering or actively implementing generative AI technology today.
IBM addresses the evolving demand for generative AI by investing in research and development, forming strategic partnerships, and continuously updating its product offerings, as the past 12 months of watsonx platform announcements show.
Governments are among the most data-rich entities on the planet, and their focus is to deliver optimal citizen services. Generative AI is an excellent platform for reducing bureaucracy and increasing insight-driven decision making in support of those services. Generative AI tools such as watsonx can analyse vast amounts of unstructured data, including government records, public statements, and news feeds, to extract valuable insights and inform decision-making processes.
By processing large datasets quickly and accurately, these tools can help governments identify trends, patterns, and correlations that might otherwise go unnoticed. Additionally, generative AI can generate tailored reports and visualisations to communicate complex ideas effectively, aiding in transparency and accountability. However, it is important to consider ethical concerns and ensure that the use of generative AI aligns with relevant laws and regulations governing data privacy and security.
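As a toy illustration of the trend-surfacing described above (not watsonx itself, which applies large language models to far richer corpora), the sketch below counts mentions of a term across a small set of hypothetical, synthetic records by year:

```python
from collections import Counter

# Hypothetical sample of unstructured records (synthetic, for illustration only).
records = [
    {"year": 2023, "text": "flood relief application lodged"},
    {"year": 2023, "text": "road maintenance request"},
    {"year": 2024, "text": "flood insurance claim and flood relief enquiry"},
    {"year": 2024, "text": "flood damage report"},
]

def term_trend(records, term):
    """Count mentions of a term per year to surface an emerging trend."""
    counts = Counter()
    for record in records:
        counts[record["year"]] += record["text"].lower().split().count(term)
    return dict(sorted(counts.items()))

print(term_trend(records, "flood"))  # {2023: 1, 2024: 3}
```

Even this naive frequency count makes a rising pattern visible; generative AI extends the same idea to free text at scale, where the "terms" are topics, entities, and relationships rather than exact words.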
IBM ensures the ethical use of generative AI in its solutions by focusing on four main pillars: Culture, Technology, Governance, and Privacy. Within these pillars, IBM promotes a strong emphasis on data privacy and security. This includes implementing robust data governance practices, ensuring continuous model tuning and improvement, addressing legal and ethical considerations, incorporating human oversight, maintaining model transparency, and complying with relevant data protection regulations. By addressing these aspects, IBM strives to build trust in AI while protecting user data and upholding data privacy rights.
Businesses need AI that is accurate, scalable, and adaptable.
IBM’s approach to AI is based on four core beliefs:
Looking ahead, I foresee several emerging trends in generative AI, including increased adoption in various industries, further development of explainable AI, and the integration of generative AI with other cutting-edge technologies such as quantum computing and edge computing. Additionally, I anticipate the emergence of new ethical guidelines and regulations surrounding the use of generative AI.
The latest advancements in generative AI have organisations waking up to the full potential of AI.
But businesses are overwhelmed, underprepared, and unsure how to profit from AI.
Businesses want to build AI based on models that can easily be adapted to new scenarios and use cases.
Justin Lavadia is a content producer and editor at Public Spectrum with a diverse writing background spanning various niches and formats. With a wealth of experience, he brings clarity and concise communication to digital content. His expertise lies in crafting engaging content and delivering impactful narratives that resonate with readers.