I never said “thank you” to my calculator: reflections from the outgoing NSW Chief Data Scientist

Ian Oppermann

As AI becomes invisibly embedded in everyday life, we must remain laser-focused on controlling and regulating how it uses data and the data it generates.

19 December 2023

The conversation around AI is, hopefully, getting close to its peak. It seems to be all we talk about at government forums and leadership events. The world changed when generative AI and large language models were released just over a year ago; now they almost completely monopolise our conversations and thoughts about the future.

Partly it is because AI is being embedded everywhere. We took delivery of a very smart “Smart TV” on the weekend. I duly clicked and consented to many of my online services being connected through the TV and its back end. Terms and conditions were accepted, and software was updated. Very long “end user licence agreements” were skimmed, and personal data flowed through a myriad of systems and processes. And then … I could not find the free-to-air channels on the TV. I could get all kinds of commercial, app-based content, but could not access the good old TV signals floating through space. Frustrated, I resorted to asking my TV for help. “Hey (fill in the name of the embedded digital assistant), get me the ABC,” and the TV duly responded, and the screen changed to ABC in high definition. Hoorah! I subsequently thanked the TV, as would be natural when receiving any help.

That small experience on the weekend highlights a few important realities of our current world: digital technology is everywhere, it can be very complex for end users to harness, it relies on the use of significant amounts of personal information, and it purports to seek consent – yet if we want to access the benefits of the technology, we have little choice but to consent.

The AI interaction was also interesting. In this case, the AI is a tool for joining up the processes and processing the data. The “data product” it created was the search for, identification of, and prioritisation to the screen of one service among the many available through the TV. The interaction happened via my frustrated voice, recognised by the algorithm, and a fairly natural-sounding generated voice acknowledging the request. My spontaneous thanking of the algorithm after the channel was found speaks to the fact that, increasingly, we are ignoring the technical complexity under the bonnet and engaging with the algorithm through intuitive shortcuts. Natural, easy-to-use interfaces such as voice recognition and natural language processing will likely accelerate that use of intuitive shortcuts.

Within government, we are also likely to find data-driven tools getting smarter over time as AI becomes increasingly embedded in each software update. Increasingly, AI will simply come built in and be switched on with the acceptance of some terms and conditions. On the occasions when we deliberately build smart tools or smart services, there will still be AI components within them that we do not fully control or fully understand. It could be a neural network model for facial matching, or a smart process mining module used in process automation. We may have an operational understanding of these components, but little real ability to scrutinise and explain how their results (the data products) are produced.

So, as the algorithms disappear into opaque complexity beyond our ability to explain, it becomes increasingly important to focus on the data, and on the guardrails around the use of the data products created with AI.

Data remains the biggest game in town. It is important that we understand the provenance, chain of custody and quality of the data, and whether it is fit for the purpose we intend to use it for. It is then important to understand the guidance, restrictions or prohibitions we need to put in place on the use of the data products created. A great many potential harms or misuses of data could be avoided if we used this “data use” framework.

It sounds simple, but in practice it is not. This is because, most of the time, we will invent a bespoke process around the collection and use of data rather than building out general frameworks for data sharing and use. We will try to limit use, apportion responsibility and minimise unknown future harms, all wrapped up in a very wordy, byzantine data sharing agreement. In practice, most data sharing agreements achieve none of these things and effectively put a handbrake on sharing.

Increasingly, general data sharing frameworks and tools are being developed. I have been involved in examples such as the Data Sharing Framework whitepaper series published through the ACS, the NSW AI Assurance Framework and the ISO/IEC SC32 Data Usage standard, which is nearing final approval. If we take the time to pick up these tools, we will go a long way towards understanding how to use data appropriately, and how to identify and apply the necessary safeguards around the use of the data products.

Getting back to the title of this small piece, I have never thanked my calculator for doing me a good service. I have never uttered words to the effect of “that was a very fine long division just now” to my trusty HP calculator. Perhaps I should have said thanks, given all the years I used it early in my career. Perhaps if I could have interacted with it via voice, I might well have thanked it – but then that would have been AI.

As I hand back the Chief Data Scientist badge, I commend the renewed NSW AI Assurance Framework to you. If you close one eye and squint, you will see that it is a thinly disguised data sharing and use framework. If you also lean a little to one side, you will see years of experimentation from the early days of the DAC (NSW Data Analytics Centre) and the lessons learned from COVID.

It has been a privilege to serve as the first ever NSW Chief Data Scientist. I wish you well in your adventures with data.

With thanks, “Oppermann Out”.

Dr Ian Oppermann is the NSW Government’s Chief Data Scientist and an Industry Professor at the University of Technology Sydney (UTS). He is a Fellow of the Institute of Engineers Australia, the IEEE, the Australian Academy of Technological Sciences and Engineering, the Royal Society of NSW, and the Australian Computer Society, of which he is also Immediate Past President. Ian is Chair of Australia’s IEC National Committee and JTC1, the NSW AI Review Committee and the SmartNSW Advisory Council.

