
Managing the Machines: A Q&A With KPMG’s Brad Fisher


Uncertainty about who is accountable when analytics go wrong raises the question of what proactive governance should be in place to safeguard the use of analytics.


A recent KPMG survey revealed a worrisome reality: uncertainty around who's responsible when data and analytics (D&A) go wrong. Sixty-two percent of respondents said responsibility lies with technology functions, not the C-suite, while 25 percent said it should fall on the shoulders of the core business and 13 percent pointed to regulatory and control functions.

Despite the reputational and financial risks of analytics errors or misuse, respondents were not clear about who should be accountable if a poor business decision results in financial loss or the loss of customers. FEI spoke with Brad Fisher, partner and U.S. leader of Data & Analytics at KPMG, about why that responsibility should sit with the C-suite.

 

FEI Daily: What is problematic about the lack of clarity around who's responsible for D&A, especially when things go wrong? 

Brad Fisher: There's this belief in all of us that if a machine does something, it's right 100 percent of the time: it's programmed, it knows what it's doing.

And you might expect that a human is going to make some mistakes. You have a bad day or you miss something. But you never expect the machine to make mistakes. I think that belies the nature of advanced analytics and AI. The output is only as good as the models, and machine learning models learn over time and change. The notion that you would have to supervise analytics the way you supervise an employee cuts against the grain of what we naturally believe.

In fact, there are a lot of reasons that the numbers coming out of the models might not be right. And there are a lot of things you need to do to oversee and govern them.
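To make that idea of supervising a model concrete, here is a minimal sketch of one such governance check: a drift test on model outputs, the kind of routine inspection you might apply to an employee's work. It is illustrative only; the function, thresholds, and data are assumptions for this article, not anything KPMG prescribes.

    # Minimal sketch (illustrative assumptions throughout): "supervising"
    # a model by routinely checking its output against a trusted baseline.
    from statistics import mean, stdev

    def output_has_drifted(baseline_scores, recent_scores, z_threshold=3.0):
        """Flag recent model outputs whose average has drifted far from a
        trusted baseline, so a human can review before decisions are made."""
        base_mean = mean(baseline_scores)
        base_std = stdev(baseline_scores)
        if base_std == 0:
            return True, float("inf")  # a flat baseline makes any change suspect
        # How many baseline standard deviations has the average moved?
        z = abs(mean(recent_scores) - base_mean) / base_std
        return z > z_threshold, z

    # Example: scores the model produced last quarter vs. this week.
    baseline = [0.52, 0.48, 0.51, 0.49, 0.50, 0.53, 0.47]
    recent = [0.71, 0.69, 0.74, 0.70]
    drifted, z = output_has_drifted(baseline, recent)
    print(f"escalate to a human reviewer: {drifted} (z = {z:.1f})")

In practice a check like this would sit alongside data-quality tests, bias audits, and human review of exceptions; the principle is the same in each case: the machine's work product gets inspected, not assumed correct.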

KPMG professionals believe that, in today's analytical enterprise, the governance of machines is as important as the governance of people. The governance of machines and the governance of humans must be integrated across the organization to ensure that risks are properly managed and that the 'hand-off' between human and machine is seamless.

For example, systems at risk of high-impact ‘misbehavior’ require constant attention, which will be especially critical in contexts such as driving on busy roads or interacting with customers and the public. In addition to technical and industry expertise, these kinds of scenarios require participation from ethicists, anthropologists, psychologists, lawyers and ‘ordinary’ members of the public. This oversight will help build trust in AI and optimize performance at the boundary between human and machine.

FEI Daily: The report offers recommendations for building trust within an organization, one of which is to improve and adapt regulations to build confidence in D&A. What does that look like?

Fisher: When we say regulations, we mean the internal regulation of building D&A. 

People think that machines are infallible. They think that when advanced analytics are created, they're naturally correct. They see it as black and white. The fact of the matter is, although we call it data science, there's as much art as there is science, and there are a lot of interpretations and choices that a data scientist will make. And having clear guidelines and regulations about how they build the models and how they use the models is really important.

If you build something and it's experimental, or it's there to corroborate a decision, maybe you can take liberties. But now, clients want to use the insights on a regular and sustainable basis. I can tell you, five years ago you would have a side team doing advanced analytics, and they might hand that to somebody for decisions. What you see today is that they want to build the infrastructure, build the analytics, and bake it into production. So if you think about it, it's almost like a supply chain system or a financial system: it just runs. And this is a big step for advanced analytics.

Juxtapose this increasing reliance with the fact that the C-suite says, "We don't fully trust it; it's IT's responsibility." Let's say a company misstates earnings, and the CEO has to go and explain it to the Street. Well, they're not saying it's all IT's fault, right? Usually accounting and finance is responsible, right? Just because a machine is involved doesn't mean you can naturally say it's IT's responsibility.

FEI Daily: The report calls for stronger accountability at the C-level. How should financial executives be looking at D&A and AI when it comes to responsibility?

Fisher: Think about machines doing the work of people, and following processes. You have internal controls and, if you think about the governance of machines paralleling the governance of people, you need to think about the internal controls over the machines paralleling the internal controls over people. 

And, in my mind, there's a broad responsibility for controls, but the finance and accounting function is largely associated with the ownership of accounting controls. The controls are going to look a little bit different and feel a little bit different, and you're not putting it all on them. And in fact, if you look at the recommendations, a lot of them have a control orientation: standards, policies, procedures, regulations, transparency, codes for data scientists, and internal and external assessment mechanisms.

If you took out the machine part of this, if it was just people, you'd say "That's just internal controls." And I'd say that the financial organization certainly has to have a role in that. 
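As a hypothetical illustration of that parallel, here is a short sketch of what an internal control applied to a machine's output might look like: a pre-posting gate over machine-generated entries, analogous to the review a person's work would get. The field names, approval limit, and rules are assumptions made up for this example, not a KPMG framework.

    # Hypothetical sketch: subjecting a machine's work product to the same
    # kinds of internal controls you would apply to a person's. All field
    # names, limits, and rules below are illustrative assumptions.

    APPROVAL_LIMIT = 50_000  # entries above this require human sign-off

    def control_exceptions(entry):
        """Return the control exceptions raised by one machine-generated entry."""
        exceptions = []
        if entry["amount"] <= 0:
            exceptions.append("non-positive amount")
        if entry["amount"] > APPROVAL_LIMIT and not entry.get("approved_by"):
            exceptions.append("exceeds approval limit without human sign-off")
        if not entry.get("model_version"):
            exceptions.append("no model version recorded (breaks the audit trail)")
        return exceptions

    entries = [
        {"id": 1, "amount": 12_000, "model_version": "v2.3"},
        {"id": 2, "amount": 80_000, "model_version": "v2.3"},  # needs sign-off
    ]
    for e in entries:
        for ex in control_exceptions(e):
            print(f"entry {e['id']}: {ex}")

Standards, policies, transparency requirements, and assessment mechanisms of the kind the report recommends are, in effect, the policy layer that sits above simple checks like these.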

FEI Daily: How can a company empower its C-suite to feel that this is their responsibility?

Fisher: The best way to empower them is to help them understand it in their own terms. One of the challenges is that it's been shrouded in mystery. It’s about transparency and understanding in a business context.