Itumeleng Monale, Standard Bank | IBM DataOps 2020


 

Announcer: From theCUBE studios in Palo Alto and Boston, connecting with thought leaders all around the world, this is a CUBE Conversation.

Dave Vellante: Hi everybody, welcome back to theCUBE. This is Dave Vellante, and you're watching a special presentation, DataOps in Action, made possible by IBM. What's happening is that the innovation engine in the IT economy has really shifted. It used to be Moore's Law; today it's applying machine intelligence and AI to data, really scaling that and operationalizing that new knowledge. The challenge is that it's not so easy to operationalize AI and infuse it into the data pipeline. What we're doing in this program is bringing in practitioners who have actually had a great deal of success in doing just that, and I'm really excited to have Itumeleng Monale here. She's the executive head of data management, personal and business banking, at Standard Bank of South Africa. Itumeleng, thanks so much for coming on theCUBE.

Itumeleng Monale: Thank you for having me, Dave.

Dave Vellante: You're very welcome. First of all, how are you holding up with this COVID situation? How are things in Johannesburg?

Itumeleng Monale: Things in Johannesburg are fine. We've been on lockdown now, I think it's day 33 if I'm not mistaken, I've lost count. But we're really grateful for the swift action of government. We have fewer than 4,000 cases in the country, and the infection rate is really slow, so I think we've been able to flatten the curve, and we're grateful for being protected in this way. We're all working from home, learning the new normal.

Dave Vellante: And we're all in this together. That's great to hear. Why don't you tell us a little bit about your role? You're a data person, and we're really going to get into it, but how do you spend your time?

Itumeleng Monale: I head up a data operations function and a data management function, which really is the foundational part of the data value chain that then allows other parts of the organization to monetize data and liberate it as the use cases apply. We monetize it ourselves as well, but really we're an enterprise-wide organization that ensures that data quality is managed, data is governed, that effective practices are applied to the entire lineage of the data, that ownership and curation are in place, and that everything else, from a regulatory as well as an opportunity perspective, can then be leveraged.

Dave Vellante: Historically, data has been viewed as sort of an expense: it's big, it's growing, it needs to be managed and deleted after a certain amount of time. Then, ten years ago, with the big data movement, data became an asset. You had a lot of shadow IT, people going off and doing things that maybe didn't comply with the corporate edicts and probably drove your part of the organization crazy. Talk about what has changed in the last five years or so, just in terms of how people approach data.

Itumeleng Monale: The story I tell my colleagues, who are all bankers obviously, is that the banker of 1989 mainly just had to know debits and credits and be able to look someone in the eye and know whether or not they'd be a credit risk: if we lend you money, will you pay it back? The banker of the late '90s then had to contend with the emergence of technologies that made their lives easier and allowed for automation, so processes could run much more smoothly. In the early 2000s I would say digitization was the big focus, and in fact my previous role was head of digital banking. At the time we thought digital was the panacea, the be-all and end-all, the thing that was going to make organizations succeed. Lo and behold, we realized that once you've got all your digital platforms ready, they are just the plate, or the pipe, and nothing is flowing through; there's no food on the plate if data is not the main focus. Data has always been an asset; I think organizations just never consciously knew it.

Dave Vellante: So it sounds like once you've made that initial digital transformation, you really have to work it. What we're hearing from a lot of practitioners like yourself is that the challenges involve different parts of the organization, different skill sets, and getting everybody to work together on the same page. Maybe you could take us back to when you started on this initiative around DataOps. What was that like, what were some of the challenges you faced, and how did you get through them?

Itumeleng Monale: First and foremost, Dave, organizations used to believe that data was IT's problem, and that's probably why you then saw the emergence of things like shadow IT. But when you really acknowledge that data is an asset, just like money is an asset, then you have to take accountability for it the same way you would any other asset in the organization, and you will not abdicate its management to a separate function that's not close to the business. Oftentimes IT is seen as a support or enabling function, not quite the main show, in most organizations. So what we then did was first emphasize that data is a business capability. The function resides in business, next to product management, next to marketing, next to everything else the business needs. Data management also has to be allocated to every role in every function, to different degrees and varying extents, and when you take accountability as an owner of a business unit, you also take accountability for the data in the systems that support that business unit. For us, that was the first step. Convincing my colleagues that data was their problem, and not something they could just leave to us, was also a journey, but that was the first step in getting the data operations journey going: you have to first acknowledge that it's something you must take accountability for as a banker, not something you delegate to a different part of the organization.

Dave Vellante: That's a real cultural mindset shift. In the game of rock-paper-scissors, culture kind of beats everything, doesn't it? It's almost like a trump card. So the business embraced that, but what did you do to support it? There has to be trust in the data, there has to be timeliness. Maybe you could take us through how you achieved those objectives, and maybe some other objectives the business demanded.

Itumeleng Monale: The one thing I didn't mention, Dave, is that obviously they didn't embrace it in the beginning; it wasn't a "yes, that makes sense" type of conversation. What we had was a few very strategic people with the right mindset that I could partner with, who understood the case for data management. While we had that as an in, we developed a framework for a fully matured data operations capability in the organization, and what that would look like in a target-state scenario. And then what you do is wait for a good crisis. We had a bit of a challenge in that our local regulator found us a little bit wanting in terms of our data quality, and from that perspective it brought the case for data quality management. Now there's a burning platform: people have an appetite to partner with you and say, okay, we need this to comply, help us out. And when they start seeing DataOps in action, they then buy into the concept. So sometimes you need to just wait for a good crisis and leverage it, and only do that which the organization will appreciate at the time; you don't have to go big bang. Data quality management was the use case at the time, five years ago, so we focused all our energy on that, and afterwards it gave us the leeway and license to really bring to maturity all the other capabilities that the business might not understand as well.

Dave Vellante: So when that crisis hit, thinking about people, process, and technology, you probably had to turn some knobs in each of those areas. Can you talk about that?

Itumeleng Monale: From a technology perspective, that's when we partnered with IBM to implement Information Analyzer, so that we could profile the data effectively. What was important for us was to make strides in terms of showing the organization progress, but also to give them access to self-service tools that would give them insight into their data. That was the genesis of us implementing the IBM suite in earnest for data management. People-wise, we also began a data stewardship journey in which we implemented business unit stewards of data. I don't like using the word "steward," because in my organization it's taken lightly, almost like a part-time occupation, so we renamed them data managers. The analogy I would give is that any department with a P&L, any department worth its salt, has an FD, a financial director: if money is important to you, you have somebody helping you take accountability and execute on your responsibilities in managing that money. So if data is equally important as an asset, you will have a leader, a manager, helping you execute on your data ownership accountability. That was the people journey. First I had soldiers planted in each department, data managers who would continue building the culture and maturing the data practices applicable to each business unit's use cases. What was important is that every data manager in every business unit focused their energy on making that business unit happy, by ensuring that their data was at the right compliance level and the right quality, with the right best practices from a process and management perspective, and was governed. And then in terms of process, it's really about spreading data management as a practice through the entire ecosystem. It can be quite lonely, in the sense that unless the whole business is managing data, people are worried about doing what they do to make money, and in most business units you'll be the only unicorn relative to everybody else. So for us it was important to have a community of practice, a process where all the data managers across the business, as well as the technology partners and the specialists who were data management professionals, come together and make sure we work together on specific use cases.

Dave Vellante: I wonder if I can ask you: the industry likes to market this notion of DevOps applied to data, DataOps. Have you applied that type of mindset and approach, agile, continuous improvement? I'm trying to understand how much is marketing and how much is actually applicable in the real world. Can you share?

Itumeleng Monale: When I was reflecting on this before the interview, I realized that our very first use case of DataOps was probably when we implemented Information Analyzer in our business unit, simply because it was the first time that IT and business, as well as data professionals, came together to spec the use case, and then we would literally, in an agile fashion, with a multidisciplinary team, come together to make sure we got the outcomes we required. For you to get a data quality management paradigm where we moved from 6% quality at some point on our client data to now sitting at 99%, and that 1% literally is just a timing issue, to get from 6 to 99 you have to make sure the entire value chain is engaged. So our business partners were the fundamental determinants of the business rules that apply, in terms of what quality means and what the criteria of quality are. Then what we do is translate that into what we put in the catalog, and ensure that the profiling rules we run are against those business rules that were defined up front.
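The pattern described here, business partners defining what "quality" means and the data team translating those definitions into automated profiling checks, can be sketched in a few lines. This is a hypothetical illustration, not Standard Bank's actual rules or IBM Information Analyzer's API; the field names and rules are invented for the example, and a real tool would manage rule catalogs, scheduling, and history for you.

```python
# Hypothetical sketch: business-defined quality rules applied as
# profiling checks over client records. Fields and rules are invented.
import re

# Business partners define "quality" as named, testable rules.
BUSINESS_RULES = {
    "id_number_present": lambda r: bool(r.get("id_number")),
    "id_number_13_digits": lambda r: bool(re.fullmatch(r"\d{13}", r.get("id_number", ""))),
    "email_well_formed": lambda r: "@" in r.get("email", ""),
}

def profile(records):
    """Run every rule against every record; return the pass rate per rule."""
    results = {}
    for name, rule in BUSINESS_RULES.items():
        passed = sum(1 for r in records if rule(r))
        results[name] = passed / len(records) if records else 0.0
    return results

clients = [
    {"id_number": "8001015009087", "email": "thabo@example.com"},
    {"id_number": "123", "email": "no-at-sign"},
]
print(profile(clients))
```

Tracking each rule's pass rate per run is what makes a journey like "6% toward 99%" measurable: the same rules are re-profiled after every remediation cycle.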
Itumeleng Monale: So you'd have upfront determination of the outcome with business, and then the team would go into an agile cycle of maybe two-week sprints, where we'd develop certain things, have stand-ups, come together, and then the output would be dashboarded in a prototype fashion, where business gets to double-check it. That was the first iteration, and I would say we've become much more mature at it, and we've got many more use cases now. There's actually one that's quite exciting that we recently achieved, over the end of 2019 into the beginning of this year. Sorry, I'm worried about the sunlight coming through the window.

Dave Vellante: You look great to me, like sunset in South Africa. We've been on theCUBE where sometimes it's so bright we have to put on sunglasses.

Itumeleng Monale: So the most recent one, which ran from late 2019 into early this year: we had long since dealt with the compliance and regulatory burning-platform issues, and now we're in a place of, I think, opportunity and luxury, where we can find use cases that are pertinent to business execution and business productivity. The one that comes to mind: we're 158 years old as an organization, so this bank was born before technology. It was also born in the days of no integration, because every branch was a standalone entity. You'd have these big ledgers that transactions were documented in, and I think once every six months or so these ledgers would be taken by horse-drawn carriage to a central place to get reconciled between branches, on paper. The point is, if that is your legacy, the initial ERP implementations would have been focused on process efficiency based on old ways of accounting for transactions and allocating information, so they were not optimized for the 21st century. Our architecture has carried a huge legacy burden, and getting to a place where you can be agile with data is something we're constantly working toward. We have hundreds of branches across the country, all of them obviously servicing clients as usual, and for a long time sales teams and executional teams were not able, in a short space of time, to see the impact of a tactic from a data and reporting perspective. In some cases, based on how our ledgers roll up and how the reconciliation between various systems and accounts works, it would take six weeks to verify whether your tactics were effective, because actually seeing the revenue hit our general ledger and our balance sheet might take that long. That is an ineffective way to operate in such a competitive environment. So what you had were frontline sales agents literally manually documenting the sales they had made, but unable to verify whether those sales were bringing in revenue until six weeks later. What we did was sit down and define all the requirements from a reporting perspective, and the objective was to move from six weeks of latency to 24 hours. Even 24 hours is not perfect; our ideal would be that by close of day you're able to see what you've done for that day, but that's the next epoch we'll go through. We literally had the frontline teams defining what they'd want to see in a dashboard, the business teams defining the business rules behind the quality and the definitions, and then an entire analytics team and the data management team working on sourcing the data, optimizing and curating it, and getting the latency down. That's our latest use case for DataOps, and now we're in a place where people can look at a self-service dashboard at any time and see the sales they've made, which is very important right now, at the time of COVID-19, from a productivity and executional-competitiveness standpoint.

Dave Vellante: Those are two great use cases, Itumeleng. The first one, going from 6% data quality to 99%: at 6%, all you do is spend time arguing about the data's validity, and at 99% you're there, and you said it's basically just a timing issue, latency. And the second one: instead of paving the cow path with an outdated, ledger-era data process, you've compressed six weeks down to 24 hours, and you want to get to end of day, so you've built agility into your data pipeline. Let me ask you, then: when GDPR hit, were you able to very quickly leverage this capability, and maybe apply other compliance edicts as well?

Itumeleng Monale: Actually, what we just described was post-GDPR. We got GDPR right about three years ago, but literally all we got right was reporting for risk and compliance purposes; the use cases we have now are really around business opportunity, less around risk. We prioritized compliance reporting a long time ago, and we're able to do real-time reporting from a single-transaction perspective, suspicious transactions and so on, to our central bank and our regulator. That was what was prioritized in the beginning, during the initial crisis. So what you found was an entire engine geared towards making sure data quality was correct for reporting and regulatory purposes. But that is really not the be-all and end-all, and if that's all we did, I believe we would not have succeeded; it would have stayed there. We succeeded because data monetization is actually the real test: leveraging data for business opportunity is what tells you whether you've got the right culture or not. If you're just doing it to comply, it means the hearts and minds of the rest of the business still aren't in the data game.

Dave Vellante: I love this story, because it's nirvana. For so many years we've been pouring money into mitigating risk, and you have no choice but to do it: the general counsel signs off on it, the CFO grudgingly signs off on it, but it's got to be done. For years, decades, we've been waiting to use these risk initiatives to actually drive business value. It kind of happened with the enterprise data warehouse, but it was too slow and too complicated, and it certainly didn't happen with email archiving; that was just a tech checkbox. It sounds like we're at that point today. We were talking earlier about the crisis that precipitated this cultural shift, and you took advantage of it. Well, now Mother Nature has dealt a crisis like we've never seen before. How do you see your data infrastructure, your data pipeline, your DataOps? What kind of opportunities do you see in front of you today as a result of COVID-19?

Itumeleng Monale: Because of the quality of client data we had, we were able to respond very quickly to COVID-19. In our context, the government put us on lockdown relatively early in the curve, in the cycle of infection, and it brought a bit of a shock to the economy, because small businesses all of a sudden had no source of revenue for potentially three to six weeks. Based on the data quality work we had done before, it was actually relatively easy to be agile enough to do the things we did. Within the first weekend of lockdown in South Africa, we were the first bank to proactively and automatically offer small businesses, and students with loans on our books, an instant three-month payment holiday, assuming they were in good standing. We did it up front; it was an opt-out process, rather than you having to phone in and arrange for it to happen. I don't believe we would have been able to do that if our data quality was not right. We have since launched many more initiatives to try and keep the economy going and to keep our clients in a state of liquidity.
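A proactive, opt-out relief program like the one described depends on trusting the customer data enough to select eligible accounts automatically. Here is a minimal hypothetical sketch of that selection logic; the segments, fields, and "good standing" criterion are invented for illustration and are not Standard Bank's actual rules.

```python
# Hypothetical sketch of opt-out eligibility selection for an
# automatic payment holiday. All criteria here are illustrative.
from dataclasses import dataclass

@dataclass
class LoanAccount:
    account_id: str
    segment: str          # e.g. "small_business", "student" (invented labels)
    days_in_arrears: int
    opted_out: bool = False

def eligible_for_payment_holiday(acct: LoanAccount) -> bool:
    """Opt-out model: relief is granted automatically to in-scope
    accounts in good standing; clients act only to decline."""
    in_scope = acct.segment in {"small_business", "student"}
    good_standing = acct.days_in_arrears == 0   # invented criterion
    return in_scope and good_standing and not acct.opted_out

book = [
    LoanAccount("A1", "small_business", 0),
    LoanAccount("A2", "student", 0, opted_out=True),
    LoanAccount("A3", "small_business", 45),
]
granted = [a.account_id for a in book if eligible_for_payment_holiday(a)]
print(granted)  # only A1 qualifies
```

The point of the interview stands out in the sketch: a query this simple is only safe to run bank-wide if segment labels and arrears figures are trustworthy, which is exactly what the earlier data quality work buys.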
Itumeleng Monale: So data quality at that point is critical to knowing who you're talking to, who needs what, and which solutions would best fit various segments. I think the second component is that working from home now brings an entirely different normal. If we had not been able to provide productivity dashboards and sales dashboards to management and all the users that require them, we would not be able to validate what our productivity levels are now that people are working from home. We still have essential-services workers who physically go into work, but a lot of our relationship bankers are operating from home, and that baseline and foundation we set, productivity tracking for various metrics, reportable in a short space of time, has been really beneficial. The next opportunity for us: we've been really good at doing this for the normal operational and frontline types of workers, but knowledge workers have not necessarily been big productivity reporters historically. They hand over an output, and the output might be six weeks down the line. In a place where teams are not co-located and work needs to flow in an agile fashion, we need to start using the same foundation and data pipeline we've laid down for the reporting of knowledge work and agile-team types of metrics. In terms of developing new functionality and solutions, there's a flow in a multidisciplinary team, and how do those solutions get architected in a way where data assists in the flow of information, so solutions can be optimally developed?

Dave Vellante: It sounds like you're able to map the metrics that business lines care about into these dashboards, sort of a data-mapping approach, if you will, which makes it much more relevant for the business. As you said before, they own the data. That's got to be a huge business benefit; again, we talked about culture, we talked about speed, but the business impact of being able to do that has to be pretty substantial.

Itumeleng Monale: It really, really is, and the use cases really are endless, because every department finds its own opportunity to utilize data. I think the accountability factor has also significantly increased, because as the owner of a specific domain of data, you know that you're not only accountable to yourself and your own operation; people downstream of you depend on you, as a product and an outcome, to ensure that the quality of the data you produce is high. So curation of data is a very important thing, and business is really starting to understand that. The cards department knows that they are the owners of card data, and the vehicle asset department knows that they are the owners of vehicle data, linked to a client profile, and all of that creates an ecosystem around the client. When you come to a bank, you don't want to be known as a number, and you don't want to be known just for one product; you want to be known across everything you do with that organization. But most banks are not structured that way: they are still product houses, with product systems on which your data resides, and if those don't act in concert, then we come across as extremely schizophrenic, as if we don't know our clients. So that's very, very important.

Dave Vellante: Itumeleng, I could go on for an hour talking about this topic, but unfortunately we're out of time. Thank you so much for sharing your deep knowledge and your story; it's really an inspiring one, and congratulations on all your success. I'll leave it with: what's next? You gave us a glimpse of some of the things you want to do, compressing some of the elapsed times and cycle times, but where do you see this going in the midterm and longer term?
Itumeleng Monale: Obviously AI is a big opportunity for all organizations, and you don't get automation of anything right if the foundations are not in place, so I believe this is a great foundation for anything AI to be applied to, in terms of the use cases we can find. The second one is really providing an API economy, where certain data products can be shared with third parties; that's probably where we want to take things as well. We already utilize external third-party data sources in our data quality management suite, to ensure the validity of client identity and residence and things of that nature. But going forward, because fintechs and banks and other organizations are probably going to partner to be more competitive, we need to be able to provide data products that can then be leveraged by external parties, and vice versa.

Dave Vellante: Itumeleng, thanks again. It was great having you.

Itumeleng Monale: Thank you very much, Dave. I appreciate the opportunity.

Dave Vellante: And thank you for watching, everybody. We're digging into DataOps: we've got practitioners, we've got influencers, we've got experts, and we're going into the CrowdChat at crowdchat.net/dataops. Keep it right there; we'll be back with more coverage. This is Dave Vellante for theCUBE.

[Music]
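Monale's closing point about an API economy implies packaging curated data as a "data product": a payload bundled with the contract metadata a third-party consumer needs, rather than a raw table extract. A minimal hypothetical sketch of what such a product envelope might look like follows; the product name, fields, and metadata are invented for illustration, not Standard Bank's actual API.

```python
# Hypothetical sketch of a "data product" envelope for an API economy:
# payload plus schema version, owner, and a quality score from the
# profiling pipeline. All names and fields here are illustrative.
import json
from datetime import date

def build_data_product(records, quality_score):
    return {
        "product": "client-demographics",              # invented product name
        "schema_version": "1.2.0",                     # contract for consumers
        "owner": "personal-banking-data-management",   # accountable domain owner
        "as_of": date(2020, 5, 28).isoformat(),
        "quality_score": quality_score,                # e.g. from profiling runs
        "records": records,
    }

def serialize(product):
    """What an API endpoint would return as its response body."""
    return json.dumps(product, sort_keys=True)

payload = build_data_product(
    records=[{"client_id": "C-001", "segment": "small_business"}],
    quality_score=0.99,
)
body = serialize(payload)
print(payload["schema_version"])
```

Publishing the quality score and owner alongside the data mirrors the accountability model described in the interview: a partner consuming the product can see who curates it and how trustworthy it currently is.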

Published Date: May 28, 2020

