Auditing AI Systems: A Metadata Approach

Authors: M'manga, A., Adams, C., Eslamnejad, M., Khadka, A., Shaw, H. and Zhao, Y.

Conference: SGAI International Conference on Artificial Intelligence

Dates: 12-14 December 2023


The EU AI regulatory framework and the corresponding AI Act call for a stronger ‘product safety regime’ for AI development, setting out requirements for greater testing, transparency, and impact evaluation in AI-based systems, along with significant penalties for corporations that fail to comply. Similar rhetoric is emerging from the UK and US governments. An immediate theme emerging within AI is how to test for, and audit, compliance with these evolving requirements. This paper presents a metadata model to support auditing of compliance and to capture key attributes bounding the applicability of AI elements, enabling compliant reuse within AI systems development. The metadata model builds on the IEEE Learning Object Metadata (LOM) standard to develop the AI-LOM, which provides a base for compliance with the ISTQB ‘AI Development Framework’ covering the testing of AI.
