Carsten Jung and Philippe Bracke

Whether it’s a breakup (Backstreet Boys), wondering why a relationship isn’t working (Mary J. Blige) or bad weather (Travis) – humans really care about explanations. The same holds in the world of finance, where firms increasingly deploy artificial intelligence (AI) software. But AI is often so complex that it becomes hard to explain exactly why it reached a particular decision. This issue isn’t purely hypothetical. Our recent survey found that AI already affects customers – whether it’s calculating the price of an insurance policy or assessing a borrower’s creditworthiness. In our new paper, we argue that so-called ‘explainability methods’ can help address this problem. But we also caution that, perhaps as with humans, gaining a deeper understanding of such models remains very hard.
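To give a flavour of what an 'explainability method' looks like in practice, here is a minimal sketch of permutation feature importance – one common, model-agnostic technique, not necessarily the specific methods studied in the paper. The toy credit-scoring model, feature names and data below are entirely illustrative: the idea is simply that shuffling one input column and measuring the drop in accuracy reveals how much the model leans on that feature.

```python
import random

# Hypothetical toy credit-scoring model: a fixed linear rule over two
# illustrative features (income, existing debt). Not from the paper.
def credit_model(income, debt):
    # Higher income raises the score; higher debt lowers it.
    return 1 if 0.8 * income - 1.2 * debt > 0 else 0

# Small synthetic dataset: (income, debt, actual_repayment_label)
data = [(3.0, 1.0, 1), (1.0, 2.0, 0), (2.5, 0.5, 1),
        (0.5, 1.5, 0), (2.0, 2.5, 0), (3.5, 0.2, 1)]

def accuracy(rows):
    return sum(credit_model(inc, debt) == y for inc, debt, y in rows) / len(rows)

def permutation_importance(rows, column, seed=0):
    """Drop in accuracy when one feature column is shuffled:
    a larger drop means the model relies more on that feature."""
    rng = random.Random(seed)
    shuffled = [row[column] for row in rows]
    rng.shuffle(shuffled)
    permuted = []
    for row, val in zip(rows, shuffled):
        r = list(row)
        r[column] = val
        permuted.append(tuple(r))
    return accuracy(rows) - accuracy(permuted)

print(f"baseline accuracy: {accuracy(data):.2f}")
print(f"income importance: {permutation_importance(data, 0):.2f}")
print(f"debt importance:   {permutation_importance(data, 1):.2f}")
```

The appeal of this approach is that it treats the model as a black box – the same probe works whether the underlying model is a simple rule, as here, or a deep neural network.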
Continue reading “Tell me why! Looking under the bonnet of machine learning models”


