
KPMG: The C-suite rather than the tech team is responsible for analytics errors

The trust gap around data and analytics (D&A) and artificial intelligence (AI) persists: only 35% of executives say they have a high level of trust in the way their organization uses D&A, according to KPMG.

The figure comes from KPMG's recently published report, Guardians of trust, which is based on a survey of 2,190 senior executives from Australia, Brazil, China, France, Germany, India, South Africa, the UK and the US, and takes the pulse of the enduring trust gap between businesses and their data and analytics.

According to the report, concerns over the risks of D&A and AI are high: more than 65% of executives have some reservations about or active mistrust of their data and analytics, and 92% are concerned about the negative impact of D&A on corporate reputation.

Stop pointing the finger at IT

Despite this concern over the reputational and financial risks of analytics errors or misuse, respondents were unclear about who should be held accountable if a poor business decision results in financial loss or the loss of customers, KPMG said.

While 62% said primary responsibility should lie with technology functions within their organizations, 25% placed it on the shoulders of the core business, and 13% felt it should rest with regulatory and control functions.

A closer look at which C-suite roles should take the blame when analytics go wrong reveals a broad distribution of responses that suggests a lack of clarity: only 19% named the CIO, 13% the Chief Data Officer, and just 7% C-level executive decision makers such as the CEO.

“Our survey of senior executives is telling us that there is a tendency to absolve the core business for decisions made with machines,” said Brad Fisher, US Data & Analytics Leader and a partner with KPMG in the US.

“While we understand technology’s legacy as a support service and as the so-called ‘experts’ in all technical matters, many IT professionals do not have the domain knowledge or the overall capacity required to ensure trust in D&A. We believe the responsibility lies with the C-suite,” he noted.

What should good governance look like?

As organizations begin to think about the behavior of machines as parallel to the behavior of people, they should also consider new models of governance to support the leaps in trust that the human-machine workforce requires, said Thomas Erwin, Global Head of KPMG Lighthouse.

“At a fundamental level, accountability of machines must be held firmly by the CEO and functional leaders,” he pointed out. “Based on recommendations from respondents, there is a strong indication that any governance framework should include standards and controls beyond the technical, and cover strategic, cultural and ethical areas, which are the domains of the C-suite.”

According to survey respondents, the five recommendations for building trust within an organization are:

  • Develop standards to create effective policies and procedures for all organizations
  • Improve and adapt regulations to build confidence in D&A
  • Increase transparency of algorithms and methodologies
  • Create professional codes for data scientists
  • Strengthen internal and external assurance mechanisms that validate and identify areas of weakness

“Building and ensuring trust across the analytics/AI lifecycle requires organized, scalable and distributed approaches,” Erwin said. “We are seeing a lot of businesses experimenting in this area, which will likely drive future standards and new governance frameworks.”
