Explainable AI – building trust through understanding

The AI Forum of New Zealand launched its Explainable AI (XAI) white paper on 8 November 2023 to an online audience of around 200 people. The white paper discusses approaches to making AI models, systems, and their decision-making processes more understandable, interpretable, and transparent to humans. This work has a clear focus on bringing the XAI discussion into the Aotearoa New Zealand context, with relevant use cases and discussions.

The white paper provides a comprehensive overview of the opportunities, benefits, challenges, and governance considerations associated with XAI in New Zealand, and it will help organisations and individuals make informed decisions about how to use XAI responsibly and ethically.

At the event, Matt Lythe, Chair of the AI Forum’s XAI Working Group, was joined in discussion by Maria Mingallon (Mott MacDonald), Dr Andrew Lensen (Victoria University of Wellington), Sarah Auva’a (Spark), David Knox (Lynker Analytics), and Dr CK Jin (Te Whatu Ora).

A highlight of the report was a case study exploring the factors contributing to landslips on the east coast of the North Island after Cyclone Gabrielle. In this study, Silver Fern Farms, in partnership with environmental data science specialist Lynker Analytics, turned to InterpretML, an open-source machine learning library that provides interpretable glass-box models.

Image of Emerald Hill, taken by Lynker staff.

The project team assembled data such as slope, rainfall, elevation, geology, and land cover alongside mapped landslips from Manaaki Whenua Landcare Research. The interpretable model produced a probability map of landslips which revealed a strong correspondence between the predicted slips and actual slips, as well as a clear relationship with slope, aspect, and vegetative cover.

XAI model data inputs, clockwise from left: rainfall, landslides, aspect, elevation, LCDB v5 land cover.

The free white paper, which includes this and other case studies along with an extensive list of additional reading, is available here.