24 October 2023
Despite what you might be thinking from the title of this article, these are not words used to describe myself, the public service or engagement with government. They are the characteristics of tasks which robotic automation has historically targeted. Think autonomous mining trucks, autonomous diggers or hot metal carriers.
In the world of information management or knowledge work, the equivalent might be repetitive, fragmented and low complexity. These general descriptions can be applied to many tasks in the world of government.
No one likes to do repetitive tasks. Human beings have a natural desire for variety, so being forced to do the same thing again and again seems like a punishment. It is hard to find “joy” in your tasks if you feel like you are on a production line for widgets or are a rubber-stamping machine. Think of the people who do basic data entry or manual data cleaning.
Similarly, if you are the link that joins up a series of fragmented partial processes, then finding job satisfaction may require some imagination. If you have ever had to copy-and-paste details from one form or system to other forms or systems, you will know what this means.
Low complexity can apply both to “production line-like” repetitive tasks and to being the cog that joins disparate processes. But low complexity can also apply to situations with formulaic decision making of the sort “if ‘A’ and ‘B’, then ‘C’”. Anything which you could map to a decision tree is arguably a low complexity decision, provided that: (1) there is little ambiguity as to the criteria which drive selection of one branch or the other; and (2) there are no other “real-world” considerations (such as hardship) which enter into the decision-making process and would trigger an exception.
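To make that concrete, here is a minimal sketch of such a formulaic decision. The criteria names and the hardship check are invented purely for illustration and do not represent any real government process or system.

```python
# Illustrative sketch only: the criteria and the hardship exception
# below are hypothetical, not drawn from any real decision process.

def assess_application(meets_criterion_a: bool,
                       meets_criterion_b: bool,
                       hardship_flagged: bool) -> str:
    """A formulaic 'if A and B, then C' decision with an exception path."""
    # A "real-world" consideration such as hardship breaks the simple
    # decision tree, so it is escalated to a human decision-maker.
    if hardship_flagged:
        return "refer to a human case officer"

    # The unambiguous, low complexity branch: if 'A' and 'B', then 'C'.
    if meets_criterion_a and meets_criterion_b:
        return "outcome C (approve)"
    return "decline"


print(assess_application(True, True, False))  # -> outcome C (approve)
print(assess_application(True, True, True))   # -> refer to a human case officer
```

The point of the sketch is the exception path: the moment a consideration like hardship enters, the task stops being low complexity and the decision falls back to a person.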
Interacting with people, however, is not a low complexity task. Context, social cues, interaction rates, external real-world knowledge, and even small talk make dealing with people a much more complex task than a decision tree. That is true even if the nature of the interaction is formalised and may literally be underpinned by a decision tree. The reason is simply that people can be ambiguous, uncertain or unclear. They may misunderstand, may be frustrated and may even be mischievous. Many a chatbot interaction has ended with the modern-day equivalent of slamming down the phone.
These general applications of artificial intelligence have a counterpart in the specialised world where one very narrow task is done at a higher level of precision or with greater consistency than a human could achieve. In NSW, the Ministry of Health has been undertaking trials of a sepsis prediction tool which constantly monitors patients in the emergency department for sepsis, a potentially life-threatening condition. This constant monitoring is far more than a human nurse could do given the many other tasks required in the emergency room. Similarly, pilots of data- and AI-driven wound management tools were shown to have contributed to enhanced access to quality care and improved quality of life for some patients, carers, and families.
And of course, all of these AI applications are passed through the NSW AI Assurance Framework. This framework was endorsed by NSW Cabinet in December 2021 and became mandatory to apply in NSW from March 2022. It is becoming the basis for a nationally consistent approach to AI assurance and will evolve during 2023 to deal more explicitly with newer AI, including generative AI and large language models.
So, back to general applications of AI. What areas can AI be used for? Here is what ChatGPT says:
“Generally, tasks that are repetitive, rule-based, time-consuming, and require precision are ideal candidates for automation. Some of the common tasks that are well-suited to automation include: …”
Yes, this list was really generated by the AI itself.
As technology evolves and more government services for our communities incorporate digital elements, there is a huge opportunity for AI to make these services even simpler, more personalised and more secure.
But a note of caution about customer support and chatbots: they can only be used for low complexity, unambiguous tasks.
Thanks AI, for helping me write this short article.
😊 You’re welcome!
Dr Ian Oppermann is the NSW Government’s Chief Data Scientist and an Industry Professor at the University of Technology Sydney (UTS). He is a Fellow of the Institute of Engineers Australia, the IEEE, the Australian Academy of Technological Sciences and Engineering, the Royal Society of NSW, and the Australian Computer Society, of which he is also Immediate Past President. Ian is Chair of Australia’s IEC National Committee and JTC1, the NSW AI Review Committee and the SmartNSW Advisory Council.
Image credit: Image Generated Using AI/Canva Magic Media