UPDATED 14:30 EDT / MAY 28 2020


Q&A: Standard Bank Group shares how it built a mature data operations framework

Data is an asset. But do enterprises treat it as such?

Knowing something and doing it are two different things. Even after a business has made the transition to digital operations, data managers still face the challenge of shifting the cultural mindset from treating data as a sidebar overseen by the IT department to one where data is valued throughout the organization.

“Data is a business capability, a business function,” said Itumeleng Monale (pictured), head of enterprise information management and Personal and Business Banking Data office at Standard Bank Group. “It resides in business next to product management, next to marketing, next to everything else that the business needs. [It] has to be core to every role and every function.”

Monale spoke with Dave Vellante, host of theCUBE, SiliconANGLE Media’s livestreaming studio, during the IBM DataOps in Action event. They discussed how the Standard Bank Group successfully developed a framework for mature data operations and proved to the workforce how responsible data management benefits all aspects of business. (* Disclosure below.)

[Editor’s note: The following content has been condensed for clarity.]

Tell us about your role at the Standard Bank Group and the changes you’ve seen during your time with the company.

Monale: I head up a data operations function and a data management function, which really form the foundational part of the data value chain that allows other parts of the organization to monetize data and deliver it as the use cases apply. We’re an enterprise-wide organization that ensures that data quality is managed, data’s governed, effective practices are applied across the entire lineage of the data, and ownership and curation are in place, so that everything else, from a regulatory as well as an opportunity perspective, can then be leveraged.

My previous role in the early 2000s was head of digital banking, and at the time we thought digital was the panacea. Lo and behold, we realized that once you’ve gotten all of your digital platforms ready, they are just the plate or the pipe. Nothing is flowing through it, and there’s no food on the plate if data’s not the main focus. So, really, data has always been an asset. I think organizations just never consciously knew that.

What were some of the challenges that you faced in transforming Standard Bank’s data operations, and how did you solve them?

Monale: Convincing my colleagues that data was their problem, and not something they could simply leave to us, was the first step in getting the data operations journey going. They didn’t embrace it in the beginning. It wasn’t an, “Oh, yeah, that makes sense. Let’s do that,” type of conversation. So, we developed a framework for what a fully mature data operations capability in the organization would look like in a target-state scenario. And then we waited for a good crisis.

When a challenge occurred, in that our local regulator found us a little bit wanting in terms of our data quality, it brought the case for data quality management to the forefront. Now there’s a burning platform; people say, “Okay, we need this to comply, so help us out.”

When they saw data ops in action, they bought into the concept. Sometimes you need to just wait for a good crisis and leverage it, and only do that which the organization will appreciate at that time.

When that crisis hit, you probably had to deal with it in terms of people, process and technology. Can you talk about that?

Monale: From a technology perspective, that was when we partnered with IBM Corp. to implement InfoSphere Information Analyzer. It was important for us to make strides in terms of showing the organization progress, but also to give employees access to self-service tools that would give them insight into their data. People-wise, we began a data stewardship journey. I had soldiers planted in each department who were data managers. They worked to continue building the culture, maturing the data practices as applicable to each business unit’s use cases.

If money’s important to you, you have somebody helping you take accountability and execute on your responsibilities in managing that money. If data is equally important as an asset, you will have a leader, a manager helping you execute on your data ownership accountability.

In terms of process, it’s about threading through the entire ecosystem. Data management, as a practice, can be quite lonely in the sense that unless the core business of the organization is managing data, they’re worried about doing what they do to make money. So, for us it was important to have a community of practice, a process with all the data managers across the business, as well as the technology parts and the specialists who are data-management professionals coming together and making sure that we work together on specific use cases.

Could you describe some specific DataOps use cases within the Standard Bank Group?

Monale: Our very first use case of DataOps was probably when we implemented IBM Information Analyzer in our business unit, simply because it was the first time that IT, business and data professionals came together. To spec the use case, we would literally, in an agile fashion with a multidisciplinary team, come together to make sure that we got the outcomes we required. We moved from 6% quality on our client data to where we’re now sitting at 99%; and that remaining 1% is literally just a timing issue.

To get from 6% to 99%, you have to make sure that the entire value chain is engaged. So, you’d have up-front determination of the outcome with business. Then the team would go into an agile cycle of maybe two-week sprints where we’d develop certain things, and the output would be dashboarded in a prototype kind of fashion where business then gets to double-check the outcome. That was the first iteration.
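The loop described above — business defines the rules, the team measures pass rates each sprint, and the result is dashboarded — can be sketched in a few lines. This is a hypothetical illustration, not Standard Bank’s actual rules or schema: the rule names, record fields, and sample data are all assumptions.

```python
# Hypothetical rule-based data quality check, loosely mirroring the
# "define rules with business, measure, dashboard" cycle described
# above. Field names and rules are illustrative assumptions.
import re

# Business-defined quality rules: each returns True if the record passes.
RULES = {
    "has_national_id": lambda r: bool(r.get("national_id")),
    "valid_email": lambda r: bool(re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", r.get("email", ""))),
    "has_postal_code": lambda r: bool(r.get("postal_code")),
}

def quality_score(records):
    """Percentage of records passing every rule -- the single number
    a team would track sprint over sprint on a dashboard."""
    if not records:
        return 0.0
    passing = sum(1 for r in records if all(rule(r) for rule in RULES.values()))
    return round(100.0 * passing / len(records), 1)

clients = [
    {"national_id": "8001015009087", "email": "thabo@example.com", "postal_code": "2196"},
    {"national_id": "", "email": "not-an-email", "postal_code": "8001"},
]
print(quality_score(clients))  # 50.0
```

Raising the score then becomes a concrete backlog: each failing rule points to the records, and the owning business unit, that need remediation.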

The most recent one, which was in late 2019 coming into early this year, was pertinent to business execution and business productivity. We’re 158 years old as an organization, so this bank was born before technology. It was also born in the days of no integration because every branch was a standalone entity. So, our architecture has had a huge legacy burden on it, and going into a place where you can be agile with data is something that we’re constantly working toward.

In some places, based on how our ledgers roll up and how reconciliation between various systems and accounts works, it used to take six weeks to verify whether sales tactics were effective, because it might take that long for the revenue to actually hit our general ledger and our balance sheet. That is an ineffective way to operate in such a competitive environment.

So, what we did is sit down and define all the requirements from a reporting perspective, with the objective of moving from six weeks of latency to 24 hours. We literally had the frontline teams defining what they want to see in a dashboard, the business teams defining the business rules behind the quality and the definitions, and then an entire analytics team and the data-management team working on sourcing the data, optimizing it, curating it, and making sure the latency was cut down.
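The core of such a daily roll-up — aggregating raw transactions into a small “cube” the frontline can query without waiting for the general ledger cycle — can be sketched as follows. The branch, product, and amount fields are illustrative assumptions, not the bank’s actual data model.

```python
# Minimal, hypothetical sketch of a daily sales roll-up behind a
# 24-hour dashboard: aggregate raw transactions into totals keyed by
# (branch, product). Field names are illustrative assumptions.
from collections import defaultdict

def build_sales_cube(transactions):
    """Roll raw transactions up into totals by (branch, product)."""
    cube = defaultdict(float)
    for t in transactions:
        cube[(t["branch"], t["product"])] += t["amount"]
    return dict(cube)

txns = [
    {"branch": "Rosebank", "product": "home_loan", "amount": 1200.0},
    {"branch": "Rosebank", "product": "home_loan", "amount": 800.0},
    {"branch": "Sandton", "product": "card", "amount": 300.0},
]
print(build_sales_cube(txns))
# {('Rosebank', 'home_loan'): 2000.0, ('Sandton', 'card'): 300.0}
```

Run on a daily batch (or a streaming window), a roll-up like this is what lets salespeople see yesterday’s figures instead of waiting weeks for ledger reconciliation.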

Now we’re in a place where people can look at a dashboard, it’s a cube, it’s self-service, they can log in at any time and see the sales they’ve made. That’s our latest use case for DataOps.

We were talking earlier about how you took advantage of a crisis to kickstart this cultural shift. How have you responded as a result of COVID-19? 

Monale: Because of the fine quality of data that we have now, we were able to respond very quickly to COVID-19. Within the first weekend of lockdown in South Africa, we were the first bank to proactively and automatically offer small businesses and students with loans on our books an instant three-month payment holiday, assuming they were in good standing.

We did that upfront, so it was actually an opt-out process rather than one where you had to call in and arrange for it to happen. And I don’t believe we would have been able to do that if our data quality was not where it was supposed to be.
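An opt-out eligibility selection like the one described above depends on being able to trust the loan book data as-is. The sketch below is purely illustrative: the segment names, the “good standing” rule, and the account fields are assumptions, not the bank’s actual criteria.

```python
# Illustrative sketch of an opt-out payment-holiday selection: flag
# eligible loan accounts up front, no call-in required. The "good
# standing" definition and field names here are assumptions.
def eligible_for_holiday(account):
    """An account is 'in good standing' here if it has no missed
    payments and is still open -- an assumed definition."""
    return (
        account["segment"] in {"small_business", "student"}
        and account["missed_payments"] == 0
        and account["status"] == "open"
    )

def apply_payment_holiday(accounts):
    """Return the IDs of accounts granted a three-month holiday."""
    return [a["id"] for a in accounts if eligible_for_holiday(a)]

book = [
    {"id": "A1", "segment": "student", "missed_payments": 0, "status": "open"},
    {"id": "A2", "segment": "small_business", "missed_payments": 2, "status": "open"},
    {"id": "A3", "segment": "corporate", "missed_payments": 0, "status": "open"},
]
print(apply_payment_holiday(book))  # ['A1']
```

The point of the anecdote is that a selection this simple is only safe when fields like payment status are accurate across the whole book — which is exactly what the preceding data quality work bought.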

We have since launched many more initiatives to try to keep the economy going and to keep our clients in a state of liquidity. At that juncture, data quality is critical to knowing who you’re talking to, who needs what, and which solution would best fit the various segments.

Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s coverage of the IBM DataOps in Action event. (* Disclosure: TheCUBE is a paid media partner for the IBM DataOps in Action event. Neither IBM, the sponsor for theCUBE’s event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE
