Paula D'Amico, Webster Bank | Io Tahoe | Enterprise Data Automation
>>from around the globe. It's the Cube with digital coverage of enterprise data automation, an event Siri's brought to you by Iot. Tahoe, >>my buddy, We're back. And this is Dave Volante, and we're covering the whole notion of automating data in the Enterprise. And I'm really excited to have Paul Damico here. She's a senior vice president of enterprise data Architecture at Webster Bank. Good to see you. Thanks for coming on. >>Hi. Nice to see you, too. Yes. >>So let's let's start with Let's start with Webster Bank. You guys are kind of a regional. I think New York, New England, uh, leave headquartered out of Connecticut, but tell us a little bit about the bank. >>Yeah, Um, Webster Bank >>is regional Boston And that again, and New York, Um, very focused on in Westchester and Fairfield County. Um, they're a really highly rated saying regional bank for this area. They, um, hold, um, quite a few awards for the area for being supportive for the community and, um, are really moving forward. Technology lives. They really want to be a data driven bank, and they want to move into a more robust Bruce. >>Well, we got a lot to talk about. So data driven that is an interesting topic. And your role as data architect. The architecture is really senior vice president data architecture. So you got a big responsibility as it relates to It's kind of transitioning to this digital data driven bank. But tell us a little bit about your role in your organization, >>right? Um, currently, >>today we have, ah, a small group that is just working toward moving into a more futuristic, more data driven data warehouse. That's our first item. And then the other item is to drive new revenue by anticipating what customers do when they go to the bank or when they log into there to be able to give them the best offer. The only way to do that is you >>have uh huh. >>Timely, accurate, complete data on the customer and what's really a great value on off something to offer that or a new product or to help them continue to grow their savings or do and grow their investment. >>Okay. And I really want to get into that. But before we do and I know you're sort of part way through your journey, you got a lot of what they do. But I want to ask you about Cove. It how you guys you're handling that? I mean, you had the government coming down and small business loans and P p p. And huge volume of business and sort of data was at the heart of that. How did you manage through that? >>But we were extremely successful because we have a big, dedicated team that understands where their data is and was able to switch much faster than a larger bank to be able to offer. The TPP longs at to our customers within lightning speeds. And part of that was is we adapted to Salesforce very, for we've had salesforce in house for over 15 years. Um, you know, pretty much, uh, that was the driving vehicle to get our CPP is loans in on and then developing logic quickly. But it was a 24 7 development role in get the data moving, helping our customers fill out the forms. And a lot of that was manual. But it was a It was a large community effort. >>Well, think about that. Think about that too. Is the volume was probably much, much higher the volume of loans to small businesses that you're used to granting. But and then also, the initial guidelines were very opaque. You really didn't know what the rules were, but you were expected to enforce them. And then finally, you got more clarity. So you had to essentially code that logic into the system in real time, right? >>I wasn't >>directly involved, but part of my data movement Team Waas, and we had to change the logic overnight. So it was on a Friday night was released. We've pushed our first set of loans through and then the logic change, Um, from, you know, coming from the government and changed. And we had to re develop our our data movement piece is again and we design them and send them back. So it was It was definitely kind of scary, but we were completely successful. We hit a very high peak and I don't know the exact number, but it was in the thousands of loans from, you know, little loans to very large loans, and not one customer who buy it's not yet what they needed for. Um, you know, that was the right process and filled out the rate and pace. >>That's an amazing story and really great support for the region. New York, Connecticut, the Boston area. So that's that's fantastic. I want to get into the rest of your story. Now let's start with some of the business drivers in banking. I mean, obviously online. I mean, a lot of people have sort of joked that many of the older people who kind of shunned online banking would love to go into the branch and see their friendly teller had no choice, You know, during this pandemic to go to online. So that's obviously a big trend you mentioned. So you know the data driven data warehouse? I wanna understand that. But well, at the top level, what were some of what are some of the key business drivers there catalyzing your desire for change? >>Um, the ability to give the customer what they need at the time when they need it. And what I mean by that is that we have, um, customer interactions in multiple ways, right? >>And I want >>to be able for the customer, too. Walk into a bank, um, or online and see the same the same format and being able to have the same feel, the same look, and also to be able to offer them the next best offer for them. But they're you know, if they want looking for a new a mortgage or looking to refinance or look, you know, whatever it iss, um, that they have that data, we have the data and that they feel comfortable using it. And that's a untethered banker. Um, attitude is, you know, whatever my banker is holding and whatever the person is holding in their phone, that that is the same. And it's comfortable, so they don't feel that they've, you know, walked into the bank and they have to do a lot of different paperwork comparative filling out paperwork on, you know, just doing it on their phone. >>So you actually want the experience to be better. I mean, and it is in many cases now, you weren't able to do this with your existing against mainframe based Enterprise data warehouse. Is is that right? Maybe talk about that a little bit. >>Yeah, we were >>definitely able to do it with what we have today. The technology we're using, but one of the issues is that it's not timely, Um, and and you need a timely process to be able to get the customers to understand what's happening. Um, you want you need a timely process so we can enhance our risk management. We can apply for fraud issues and things like that. >>Yeah, so you're trying to get more real time in the traditional e g W. It's it's sort of a science project. There's a few experts that know how to get it. You consider line up. The demand is tremendous, and often times by the time you get the answer, you know it's outdated. So you're trying to address that problem. So So part of it is really the cycle time, the end end cycle, time that you're pressing. And then there's if I understand it, residual benefits that are pretty substantial from a revenue opportunity. Other other offers that you can you can make to the right customer, Um, that that you, you maybe know through your data. Is that right? >>Exactly. It's drive new customers, Teoh new opportunities. It's enhanced the risk, and it's to optimize the banking process and then obviously, to create new business. Um, and the only way we're going to be able to do that is that we have the ability to look at the data right when the customer walks in the door or right when they open up their app. And, um, by doing, creating more to New York time near real time data for the data warehouse team that's giving the lines of business the ability to to work on the next best offer for that customer. >>Paulo, we're inundated with data sources these days. Are there their data sources that you maybe maybe had access to before? But perhaps the backlog of ingesting and cleaning and cataloging and you know of analyzing. Maybe the backlog was so great that you couldn't perhaps tap some of those data sources. You see the potential to increase the data sources and hence the quality of the data, Or is that sort of premature? >>Oh, no. Um, >>exactly. Right. So right now we ingest a lot of flat files and from our mainframe type of Brennan system that we've had for quite a few years. But now that we're moving to the cloud and off Prem and on France, you know, moving off Prem into like an s three bucket. Where That data king, We can process that data and get that data faster by using real time tools to move that data into a place where, like, snowflake could utilize that data or we can give it out to our market. >>Okay, so we're >>about the way we do. We're in batch mode. Still, so we're doing 24 hours. >>Okay, So when I think about the data pipeline and the people involved, I mean, maybe you could talk a little bit about the organization. I mean, you've got I know you have data. Scientists or statisticians? I'm sure you do. Ah, you got data architects, data engineers, quality engineers, you know, developers, etcetera, etcetera. And oftentimes, practitioners like yourself will will stress about pay. The data's in silos of the data quality is not where we want it to be. We have to manually categorize the data. These are all sort of common data pipeline problems, if you will. Sometimes we use the term data ops, which is kind of a play on Dev Ops applied to the data pipeline. I did. You just sort of described your situation in that context. >>Yeah. Yes. So we have a very large data ops team and everyone that who is working on the data part of Webster's Bay has been there 13 14 years. So they get the data, they understand that they understand the lines of business. Um, so it's right now, um, we could we have data quality issues, just like everybody else does. We have. We have places in him where that gets clans, Um, and we're moving toward. And there was very much silo data. The data scientists are out in the lines of business right now, which is great, cause I think that's where data science belongs. We should give them on. And that's what we're working towards now is giving them more self service, giving them the ability to access the data, um, in a more robust way. And it's a single source of truth. So they're not pulling the data down into their own like tableau dashboards and then pushing the data back out. Um, so they're going to more not, I don't want to say a central repository, but a more of a robust repository that's controlled across multiple avenues where multiple lines of business can access. That said, how >>got it? Yes, and I think that one of the key things that I'm taking away from your last comment is the cultural aspects of this bite having the data. Scientists in the line of business, the line of lines of business, will feel ownership of that data as opposed to pointing fingers, criticizing the data quality they really own that that problem, as opposed to saying, Well, it's it's It's Paulus problem, >>right? Well, I have. My problem >>is, I have a date. Engineers, data architects, they database administrators, right, Um, and then data traditional data forwarding people. Um, and because some customers that I have that our business customers lines of business, they want to just subscribe to a report. They don't want to go out and do any data science work. Um, and we still have to provide that. So we still want to provide them some kind of regimen that they wake up in the morning and they open up their email. And there's the report that they just drive, um, which is great. And it works out really well. And one of the things is why we purchase I o waas. I would have the ability to give the lines of business the ability to do search within the data. And we read the data flows and data redundancy and things like that help me cleanup the data and also, um, to give it to the data. Analysts who say All right, they just asked me. They want this certain report, and it used to take Okay, well, we're gonna four weeks, we're going to go. We're gonna look at the data, and then we'll come back and tell you what we dio. But now with Iot Tahoe, they're able to look at the data and then, in one or two days of being able to go back and say, yes, we have data. This is where it is. This is where we found that this is the data flows that we've found also, which is that what I call it is the birth of a column. It's where the calm was created and where it went live as a teenager. And then it went to, you know, die very archive. Yeah, it's this, you know, cycle of life for a column. And Iot Tahoe helps us do that, and we do. Data lineage has done all the time. Um, and it's just takes a very long time. And that's why we're using something that has AI and machine learning. Um, it's it's accurate. It does it the same way over and over again. If an analyst leads, you're able to utilize talked something like, Oh, to be able to do that work for you. I get that. >>Yes. Oh, got it. So So a couple things there is in in, In researching Iot Tahoe, it seems like one of the strengths of their platform is the ability to visualize data the data structure and actually dig into it. But also see it, um, and that speeds things up and gives everybody additional confidence. And then the other pieces essentially infusing AI or machine intelligence into the data pipeline is really how you're attacking automation, right? And you're saying it's repeatable and and then that helps the data quality, and you have this virtuous cycle. Is there a firm that and add some color? Perhaps >>Exactly. Um, so you're able to let's say that I have I have seven cause lines of business that are asking me questions and one of the questions I'll ask me is. We want to know if this customer is okay to contact, right? And you know, there's different avenues, so you can go online to go. Do not contact me. You can go to the bank and you can say I don't want, um, email, but I'll take tests and I want, you know, phone calls. Um, all that information. So seven different lines of business asked me that question in different ways once said okay to contact the other one says, you know, customer one to pray All these, You know, um, and each project before I got there used to be siloed. So one customer would be 100 hours for them to do that and analytical work, and then another cut. Another analysts would do another 100 hours on the other project. Well, now I can do that all at once, and I can do those type of searches and say, Yes, we already have that documentation. Here it is. And this is where you can find where the customer has said, you know, you don't want I don't want to get access from you by email, or I've subscribed to get emails from you. >>Got it. Okay? Yeah. Okay. And then I want to come back to the cloud a little bit. So you you mentioned those three buckets? So you're moving to the Amazon cloud. At least I'm sure you're gonna get a hybrid situation there. You mentioned Snowflake. Um, you know what was sort of the decision to move to the cloud? Obviously, snowflake is cloud only. There's not an on Prem version there. So what precipitated that? >>Alright, So, from, um, I've been in >>the data I t Information field for the last 35 years. I started in the US Air Force and have moved on from since then. And, um, my experience with off brand waas with Snowflake was working with G McGee capital. And that's where I met up with the team from Iot to house as well. And so it's a proven. So there's a couple of things one is symptomatic of is worldwide. Now to move there, right, Two products, they have the on frame in the offering. Um, I've used the on Prem and off Prem. They're both great and it's very stable and I'm comfortable with other people are very comfortable with this. So we picked. That is our batch data movement. Um, we're moving to her, probably HBR. It's not a decision yet, but we're moving to HP are for real time data which has changed capture data, you know, moves it into the cloud. And then So you're envisioning this right now in Petrit, you're in the S three and you have all the data that you could possibly want. And that's Jason. All that everything is sitting in the S three to be able to move it through into snowflake and snowflake has proven cto have a stability. Um, you only need to learn in train your team with one thing. Um, aws has is completely stable at this 10.2. So all these avenues, if you think about it going through from, um, you know, this is your your data lake, which is I would consider your s three. And even though it's not a traditional data leg like you can touch it like a like a progressive or a dupe and into snowflake and then from snowflake into sandboxes. So your lines of business and your data scientists and just dive right in, Um, that makes a big, big win. and then using Iot. Ta ho! With the data automation and also their search engine, um, I have the ability to give the data scientists and eight analysts the the way of they don't need to talk to i t to get, um, accurate information or completely accurate information from the structure. And we'll be right there. >>Yes, so talking about, you know, snowflake and getting up to speed quickly. I know from talking to customers you get from zero to snowflake, you know, very fast. And then it sounds like the i o Ta ho is sort of the automation cloud for your data pipeline within the cloud. This is is that the right way to think about it? >>I think so. Um, right now I have I o ta >>ho attached to my >>on Prem. And, um, I >>want to attach it to my offering and eventually. So I'm using Iot Tahoe's data automation right now to bring in the data and to start analyzing the data close to make sure that I'm not missing anything and that I'm not bringing over redundant data. Um, the data warehouse that I'm working off is not a It's an on Prem. It's an Oracle database and its 15 years old. So it has extra data in it. It has, um, things that we don't need anymore. And Iot. Tahoe's helping me shake out that, um, extra data that does not need to be moved into my S three. So it's saving me money when I'm moving from offering on Prem. >>And so that was a challenge prior because you couldn't get the lines of business to agree what to delete or what was the issue there. >>Oh, it was more than that. Um, each line of business had their own structure within the warehouse, and then they were copying data between each other and duplicating the data and using that, uh so there might be that could be possibly three tables that have the same data in it. But it's used for different lines of business. And so I had we have identified using Iot Tahoe. I've identified over seven terabytes in the last, um, two months on data that is just been repetitive. Um, it just it's the same exact data just sitting in a different scheme. >>And and that's not >>easy to find. If you only understand one schema that's reporting for that line of business so that >>yeah, more bad news for the storage companies out there. Okay to follow. >>It's HCI. That's what that's what we were telling people you >>don't know and it's true, but you still would rather not waste it. You apply it to, you know, drive more revenue. And and so I guess Let's close on where you see this thing going again. I know you're sort of part way through the journey. May be you could sort of describe, you know, where you see the phase is going and really what you want to get out of this thing, You know, down the road Midterm. Longer term. What's your vision or your your data driven organization? >>Um, I want >>for the bankers to be able to walk around with on iPad in their hands and be able to access data for that customer really fast and be able to give them the best deal that they can get. I want Webster to be right there on top, with being able to add new customers and to be able to serve our existing customers who had bank accounts. Since you were 12 years old there and now our, you know, multi. Whatever. Um, I want them to be able to have the best experience with our our bankers, and >>that's awesome. I mean, that's really what I want is a banking customer. I want my bank to know who I am, anticipate my needs and create a great experience for me. And then let me go on with my life. And so that is a great story. Love your experience, your background and your knowledge. Can't thank you enough for coming on the Cube. >>No, thank you very much. And you guys have a great day. >>Alright, Take care. And thank you for watching everybody keep it right there. We'll take a short break and be right back. >>Yeah, yeah, yeah, yeah.
SUMMARY :
of enterprise data automation, an event Siri's brought to you by Iot. And I'm really excited to have Paul Damico here. Hi. Nice to see you, too. So let's let's start with Let's start with Webster Bank. awards for the area for being supportive for the community So you got a big responsibility as it relates to It's kind of transitioning to And then the other item is to drive new revenue Timely, accurate, complete data on the customer and what's really But I want to ask you about Cove. And part of that was is we adapted to Salesforce very, And then finally, you got more clarity. Um, from, you know, coming from the government and changed. I mean, a lot of people have sort of joked that many of the older people Um, the ability to give the customer what they a new a mortgage or looking to refinance or look, you know, whatever it iss, So you actually want the experience to be better. Um, you want you need a timely process so we can enhance Other other offers that you can you can make to the right customer, Um, and the only way we're going to be You see the potential to Prem and on France, you know, moving off Prem into like an s three bucket. about the way we do. quality engineers, you know, developers, etcetera, etcetera. Um, so they're going to more not, I don't want to say a central criticizing the data quality they really own that that problem, Well, I have. We're gonna look at the data, and then we'll come back and tell you what we dio. it seems like one of the strengths of their platform is the ability to visualize data the data structure and to contact the other one says, you know, customer one to pray All these, You know, So you you mentioned those three buckets? All that everything is sitting in the S three to be able to move it through I know from talking to customers you get from zero to snowflake, Um, right now I have I o ta Um, the data warehouse that I'm working off is And so that was a challenge prior because you couldn't get the lines Um, it just it's the same exact data just sitting If you only understand one schema that's reporting Okay to That's what that's what we were telling people you You apply it to, you know, drive more revenue. for the bankers to be able to walk around with on iPad And so that is a great story. And you guys have a great day. And thank you for watching everybody keep it right there.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Paul Damico | PERSON | 0.99+ |
Dave Volante | PERSON | 0.99+ |
Webster Bank | ORGANIZATION | 0.99+ |
Westchester | LOCATION | 0.99+ |
Paula D'Amico | PERSON | 0.99+ |
iPad | COMMERCIAL_ITEM | 0.99+ |
New York | LOCATION | 0.99+ |
one | QUANTITY | 0.99+ |
Connecticut | LOCATION | 0.99+ |
100 hours | QUANTITY | 0.99+ |
S three | COMMERCIAL_ITEM | 0.99+ |
15 years | QUANTITY | 0.99+ |
Jason | PERSON | 0.99+ |
France | LOCATION | 0.99+ |
Siri | TITLE | 0.99+ |
first item | QUANTITY | 0.99+ |
three tables | QUANTITY | 0.99+ |
24 hours | QUANTITY | 0.99+ |
thousands | QUANTITY | 0.99+ |
two months | QUANTITY | 0.99+ |
each line | QUANTITY | 0.99+ |
Fairfield County | LOCATION | 0.99+ |
HP | ORGANIZATION | 0.99+ |
Friday night | DATE | 0.99+ |
Oracle | ORGANIZATION | 0.99+ |
Two products | QUANTITY | 0.99+ |
Boston | LOCATION | 0.99+ |
four weeks | QUANTITY | 0.99+ |
US Air Force | ORGANIZATION | 0.98+ |
over 15 years | QUANTITY | 0.98+ |
two days | QUANTITY | 0.98+ |
New England | LOCATION | 0.98+ |
each project | QUANTITY | 0.98+ |
today | DATE | 0.98+ |
Iot Tahoe | PERSON | 0.98+ |
Paulo | PERSON | 0.98+ |
Iot Tahoe | ORGANIZATION | 0.98+ |
both | QUANTITY | 0.97+ |
one thing | QUANTITY | 0.97+ |
first set | QUANTITY | 0.97+ |
TPP | TITLE | 0.97+ |
Paulus | PERSON | 0.97+ |
seven cause | QUANTITY | 0.97+ |
one schema | QUANTITY | 0.97+ |
one customer | QUANTITY | 0.96+ |
13 14 years | QUANTITY | 0.96+ |
over seven terabytes | QUANTITY | 0.96+ |
three | QUANTITY | 0.96+ |
single source | QUANTITY | 0.95+ |
Webster's Bay | ORGANIZATION | 0.95+ |
Webster | ORGANIZATION | 0.94+ |
seven different lines | QUANTITY | 0.94+ |
Cove | ORGANIZATION | 0.94+ |
Prem | ORGANIZATION | 0.93+ |
Enterprise Data Automation | ORGANIZATION | 0.92+ |
eight analysts | QUANTITY | 0.92+ |
10.2 | QUANTITY | 0.89+ |
12 years old | QUANTITY | 0.89+ |
Iot | ORGANIZATION | 0.88+ |
three buckets | QUANTITY | 0.88+ |
Snowflake | EVENT | 0.86+ |
last 35 years | DATE | 0.84+ |
Team Waas | ORGANIZATION | 0.8+ |
Io Tahoe | PERSON | 0.79+ |
24 7 development | QUANTITY | 0.72+ |
Salesforce | ORGANIZATION | 0.68+ |
each | QUANTITY | 0.68+ |
Amazon cloud | ORGANIZATION | 0.66+ |
Tahoe | PERSON | 0.66+ |
zero | QUANTITY | 0.64+ |
snowflake | EVENT | 0.61+ |
things | QUANTITY | 0.57+ |
Paula D'Amico, Webster Bank | Io Tahoe | Enterprise Data Automation
>> Narrator: From around the Globe, it's theCube with digital coverage of Enterprise Data Automation, and event series brought to you by Io-Tahoe. >> Everybody, we're back. And this is Dave Vellante, and we're covering the whole notion of Automated Data in the Enterprise. And I'm really excited to have Paula D'Amico here. Senior Vice President of Enterprise Data Architecture at Webster Bank. Paula, good to see you. Thanks for coming on. >> Hi, nice to see you, too. >> Let's start with Webster bank. You guys are kind of a regional I think New York, New England, believe it's headquartered out of Connecticut. But tell us a little bit about the bank. >> Webster bank is regional Boston, Connecticut, and New York. Very focused on in Westchester and Fairfield County. They are a really highly rated regional bank for this area. They hold quite a few awards for the area for being supportive for the community, and are really moving forward technology wise, they really want to be a data driven bank, and they want to move into a more robust group. >> We got a lot to talk about. So data driven is an interesting topic and your role as Data Architecture, is really Senior Vice President Data Architecture. So you got a big responsibility as it relates to kind of transitioning to this digital data driven bank but tell us a little bit about your role in your Organization. >> Currently, today, we have a small group that is just working toward moving into a more futuristic, more data driven data warehousing. That's our first item. And then the other item is to drive new revenue by anticipating what customers do, when they go to the bank or when they log in to their account, to be able to give them the best offer. And the only way to do that is you have timely, accurate, complete data on the customer and what's really a great value on offer something to offer that, or a new product, or to help them continue to grow their savings, or do and grow their investments. >> Okay, and I really want to get into that. But before we do, and I know you're, sort of partway through your journey, you got a lot to do. But I want to ask you about Covid, how you guys handling that? You had the government coming down and small business loans and PPP, and huge volume of business and sort of data was at the heart of that. How did you manage through that? >> We were extremely successful, because we have a big, dedicated team that understands where their data is and was able to switch much faster than a larger bank, to be able to offer the PPP Long's out to our customers within lightning speed. And part of that was is we adapted to Salesforce very for we've had Salesforce in house for over 15 years. Pretty much that was the driving vehicle to get our PPP loans in, and then developing logic quickly, but it was a 24 seven development role and get the data moving on helping our customers fill out the forms. And a lot of that was manual, but it was a large community effort. >> Think about that too. The volume was probably much higher than the volume of loans to small businesses that you're used to granting and then also the initial guidelines were very opaque. You really didn't know what the rules were, but you were expected to enforce them. And then finally, you got more clarity. So you had to essentially code that logic into the system in real time. >> I wasn't directly involved, but part of my data movement team was, and we had to change the logic overnight. So it was on a Friday night it was released, we pushed our first set of loans through, and then the logic changed from coming from the government, it changed and we had to redevelop our data movement pieces again, and we design them and send them back through. So it was definitely kind of scary, but we were completely successful. We hit a very high peak. Again, I don't know the exact number but it was in the thousands of loans, from little loans to very large loans and not one customer who applied did not get what they needed for, that was the right process and filled out the right amount. >> Well, that is an amazing story and really great support for the region, your Connecticut, the Boston area. So that's fantastic. I want to get into the rest of your story now. Let's start with some of the business drivers in banking. I mean, obviously online. A lot of people have sort of joked that many of the older people, who kind of shunned online banking would love to go into the branch and see their friendly teller had no choice, during this pandemic, to go to online. So that's obviously a big trend you mentioned, the data driven data warehouse, I want to understand that, but what at the top level, what are some of the key business drivers that are catalyzing your desire for change? >> The ability to give a customer, what they need at the time when they need it. And what I mean by that is that we have customer interactions in multiple ways. And I want to be able for the customer to walk into a bank or online and see the same format, and being able to have the same feel the same love, and also to be able to offer them the next best offer for them. But they're if they want looking for a new mortgage or looking to refinance, or whatever it is that they have that data, we have the data and that they feel comfortable using it. And that's an untethered banker. Attitude is, whatever my banker is holding and whatever the person is holding in their phone, that is the same and it's comfortable. So they don't feel that they've walked into the bank and they have to do fill out different paperwork compared to filling out paperwork on just doing it on their phone. >> You actually do want the experience to be better. And it is in many cases. Now you weren't able to do this with your existing I guess mainframe based Enterprise Data Warehouses. Is that right? Maybe talk about that a little bit? >> Yeah, we were definitely able to do it with what we have today the technology we're using. But one of the issues is that it's not timely. And you need a timely process to be able to get the customers to understand what's happening. You need a timely process so we can enhance our risk management. We can apply for fraud issues and things like that. >> Yeah, so you're trying to get more real time. The traditional EDW. It's sort of a science project. There's a few experts that know how to get it. You can so line up, the demand is tremendous. And then oftentimes by the time you get the answer, it's outdated. So you're trying to address that problem. So part of it is really the cycle time the end to end cycle time that you're progressing. And then there's, if I understand it residual benefits that are pretty substantial from a revenue opportunity, other offers that you can make to the right customer, that you maybe know, through your data, is that right? >> Exactly. It's drive new customers to new opportunities. It's enhanced the risk, and it's to optimize the banking process, and then obviously, to create new business. And the only way we're going to be able to do that is if we have the ability to look at the data right when the customer walks in the door or right when they open up their app. And by doing creating more to New York times near real time data, or the data warehouse team that's giving the lines of business the ability to work on the next best offer for that customer as well. >> But Paula, we're inundated with data sources these days. Are there other data sources that maybe had access to before, but perhaps the backlog of ingesting and cleaning in cataloging and analyzing maybe the backlog was so great that you couldn't perhaps tap some of those data sources. Do you see the potential to increase the data sources and hence the quality of the data or is that sort of premature? >> Oh, no. Exactly. Right. So right now, we ingest a lot of flat files and from our mainframe type of front end system, that we've had for quite a few years. But now that we're moving to the cloud and off-prem and on-prem, moving off-prem, into like an S3 Bucket, where that data we can process that data and get that data faster by using real time tools to move that data into a place where, like snowflake could utilize that data, or we can give it out to our market. Right now we're about we do work in batch mode still. So we're doing 24 hours. >> Okay. So when I think about the data pipeline, and the people involved, maybe you could talk a little bit about the organization. You've got, I don't know, if you have data scientists or statisticians, I'm sure you do. You got data architects, data engineers, quality engineers, developers, etc. And oftentimes, practitioners like yourself, will stress about, hey, the data is in silos. The data quality is not where we want it to be. We have to manually categorize the data. These are all sort of common data pipeline problems, if you will. Sometimes we use the term data Ops, which is sort of a play on DevOps applied to the data pipeline. Can you just sort of describe your situation in that context? >> Yeah, so we have a very large data ops team. And everyone that who is working on the data part of Webster's Bank, has been there 13 to 14 years. So they get the data, they understand it, they understand the lines of business. So it's right now. We could the we have data quality issues, just like everybody else does. But we have places in them where that gets cleansed. And we're moving toward and there was very much siloed data. The data scientists are out in the lines of business right now, which is great, because I think that's where data science belongs, we should give them and that's what we're working towards now is giving them more self service, giving them the ability to access the data in a more robust way. And it's a single source of truth. So they're not pulling the data down into their own, like Tableau dashboards, and then pushing the data back out. So they're going to more not, I don't want to say, a central repository, but a more of a robust repository, that's controlled across multiple avenues, where multiple lines of business can access that data. Is that help? >> Got it, Yes. And I think that one of the key things that I'm taking away from your last comment, is the cultural aspects of this by having the data scientists in the line of business, the lines of business will feel ownership of that data as opposed to pointing fingers criticizing the data quality. They really own that that problem, as opposed to saying, well, it's Paula's problem. >> Well, I have my problem is I have data engineers, data architects, database administrators, traditional data reporting people. And because some customers that I have that are business customers lines of business, they want to just subscribe to a report, they don't want to go out and do any data science work. And we still have to provide that. So we still want to provide them some kind of regiment that they wake up in the morning, and they open up their email, and there's the report that they subscribe to, which is great, and it works out really well. And one of the things is why we purchased Io-Tahoe was, I would have the ability to give the lines of business, the ability to do search within the data. And we'll read the data flows and data redundancy and things like that, and help me clean up the data. And also, to give it to the data analysts who say, all right, they just asked me they want this certain report. And it used to take okay, four weeks we're going to go and we're going to look at the data and then we'll come back and tell you what we can do. But now with Io-Tahoe, they're able to look at the data, and then in one or two days, they'll be able to go back and say, Yes, we have the data, this is where it is. This is where we found it. This is the data flows that we found also, which is what I call it, is the break of a column. It's where the column was created, and where it went to live as a teenager. (laughs) And then it went to die, where we archive it. And, yeah, it's this cycle of life for a column. And Io-Tahoe helps us do that. And we do data lineage is done all the time. And it's just takes a very long time and that's why we're using something that has AI in it and machine running. It's accurate, it does it the same way over and over again. If an analyst leaves, you're able to utilize something like Io-Tahoe to be able to do that work for you. Is that help? >> Yeah, so got it. So a couple things there, in researching Io-Tahoe, it seems like one of the strengths of their platform is the ability to visualize data, the data structure and actually dig into it, but also see it. And that speeds things up and gives everybody additional confidence. And then the other piece is essentially infusing AI or machine intelligence into the data pipeline, is really how you're attacking automation. And you're saying it repeatable, and then that helps the data quality and you have this virtual cycle. Maybe you could sort of affirm that and add some color, perhaps. >> Exactly. So you're able to let's say that I have seven cars, lines of business that are asking me questions, and one of the questions they'll ask me is, we want to know, if this customer is okay to contact, and there's different avenues so you can go online, do not contact me, you can go to the bank and you can say, I don't want email, but I'll take texts. And I want no phone calls. All that information. So, seven different lines of business asked me that question in different ways. One said, "No okay to contact" the other one says, "Customer 123." All these. In each project before I got there used to be siloed. So one customer would be 100 hours for them to do that analytical work, and then another analyst would do another 100 hours on the other project. Well, now I can do that all at once. And I can do those types of searches and say, Yes, we already have that documentation. Here it is, and this is where you can find where the customer has said, "No, I don't want to get access from you by email or I've subscribed to get emails from you." >> Got it. Okay. Yeah Okay. And then I want to go back to the cloud a little bit. So you mentioned S3 Buckets. So you're moving to the Amazon cloud, at least, I'm sure you're going to get a hybrid situation there. You mentioned snowflake. What was sort of the decision to move to the cloud? Obviously, snowflake is cloud only. There's not an on-prem, version there. So what precipitated that? >> Alright, so from I've been in the data IT information field for the last 35 years. I started in the US Air Force, and have moved on from since then. And my experience with Bob Graham, was with snowflake with working with GE Capital. And that's where I met up with the team from Io-Tahoe as well. And so it's a proven so there's a couple of things one is Informatica, is worldwide known to move data. They have two products, they have the on-prem and the off-prem. I've used the on-prem and off-prem, they're both great. And it's very stable, and I'm comfortable with it. Other people are very comfortable with it. So we picked that as our batch data movement. We're moving toward probably HVR. It's not a total decision yet. But we're moving to HVR for real time data, which is changed capture data, moves it into the cloud. And then, so you're envisioning this right now. In which is you're in the S3, and you have all the data that you could possibly want. And that's JSON, all that everything is sitting in the S3 to be able to move it through into snowflake. And snowflake has proven to have a stability. You only need to learn and train your team with one thing. AWS as is completely stable at this point too. So all these avenues if you think about it, is going through from, this is your data lake, which is I would consider your S3. And even though it's not a traditional data lake like, you can touch it like a Progressive or Hadoop. And then into snowflake and then from snowflake into sandbox and so your lines of business and your data scientists just dive right in. That makes a big win. And then using Io-Tahoe with the data automation, and also their search engine. I have the ability to give the data scientists and data analysts the way of they don't need to talk to IT to get accurate information or completely accurate information from the structure. And we'll be right back. >> Yeah, so talking about snowflake and getting up to speed quickly. I know from talking to customers you can get from zero to snowflake very fast and then it sounds like the Io-Tahoe is sort of the automation cloud for your data pipeline within the cloud. Is that the right way to think about it? >> I think so. Right now I have Io-Tahoe attached to my on-prem. And I want to attach it to my off-prem eventually. So I'm using Io-Tahoe data automation right now, to bring in the data, and to start analyzing the data flows to make sure that I'm not missing anything, and that I'm not bringing over redundant data. The data warehouse that I'm working of, it's an on-prem. It's an Oracle Database, and it's 15 years old. So it has extra data in it. It has things that we don't need anymore, and Io-Tahoe's helping me shake out that extra data that does not need to be moved into my S3. So it's saving me money, when I'm moving from off-prem to on-prem. >> And so that was a challenge prior, because you couldn't get the lines of business to agree what to delete, or what was the issue there? >> Oh, it was more than that. Each line of business had their own structure within the warehouse. And then they were copying data between each other, and duplicating the data and using that. So there could be possibly three tables that have the same data in it, but it's used for different lines of business. We have identified using Io-Tahoe identified over seven terabytes in the last two months on data that has just been repetitive. It's the same exact data just sitting in a different schema. And that's not easy to find, if you only understand one schema, that's reporting for that line of business. >> More bad news for the storage companies out there. (both laughs) So far. >> It's cheap. That's what we were telling people. >> And it's true, but you still would rather not waste it, you'd like to apply it to drive more revenue. And so, I guess, let's close on where you see this thing going. Again, I know you're sort of partway through the journey, maybe you could sort of describe, where you see the phase is going and really what you want to get out of this thing, down the road, mid-term, longer term, what's your vision or your data driven organization. >> I want for the bankers to be able to walk around with an iPad in their hand, and be able to access data for that customer, really fast and be able to give them the best deal that they can get. I want Webster to be right there on top with being able to add new customers, and to be able to serve our existing customers who had bank accounts since they were 12 years old there and now our multi whatever. I want them to be able to have the best experience with our bankers. >> That's awesome. That's really what I want as a banking customer. I want my bank to know who I am, anticipate my needs, and create a great experience for me. And then let me go on with my life. And so that follow. Great story. Love your experience, your background and your knowledge. I can't thank you enough for coming on theCube. >> Now, thank you very much. And you guys have a great day. >> All right, take care. And thank you for watching everybody. Keep right there. We'll take a short break and be right back. (gentle music)
SUMMARY :
to you by Io-Tahoe. And I'm really excited to of a regional I think and they want to move it relates to kind of transitioning And the only way to do But I want to ask you about Covid, and get the data moving And then finally, you got more clarity. and filled out the right amount. and really great support for the region, and being able to have the experience to be better. to be able to get the customers that know how to get it. and it's to optimize the banking process, and analyzing maybe the backlog was and get that data faster and the people involved, And everyone that who is working is the cultural aspects of this the ability to do search within the data. and you have this virtual cycle. and one of the questions And then I want to go back in the S3 to be able to move it Is that the right way to think about it? and to start analyzing the data flows and duplicating the data and using that. More bad news for the That's what we were telling people. and really what you want and to be able to serve And so that follow. And you guys have a great day. And thank you for watching everybody.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Dave Vellante | PERSON | 0.99+ |
Paula D'Amico | PERSON | 0.99+ |
Paula | PERSON | 0.99+ |
Connecticut | LOCATION | 0.99+ |
Westchester | LOCATION | 0.99+ |
Informatica | ORGANIZATION | 0.99+ |
24 hours | QUANTITY | 0.99+ |
one | QUANTITY | 0.99+ |
13 | QUANTITY | 0.99+ |
thousands | QUANTITY | 0.99+ |
100 hours | QUANTITY | 0.99+ |
Bob Graham | PERSON | 0.99+ |
iPad | COMMERCIAL_ITEM | 0.99+ |
Webster Bank | ORGANIZATION | 0.99+ |
GE Capital | ORGANIZATION | 0.99+ |
first item | QUANTITY | 0.99+ |
AWS | ORGANIZATION | 0.99+ |
two products | QUANTITY | 0.99+ |
seven | QUANTITY | 0.99+ |
New York | LOCATION | 0.99+ |
Boston | LOCATION | 0.99+ |
three tables | QUANTITY | 0.99+ |
Each line | QUANTITY | 0.99+ |
first set | QUANTITY | 0.99+ |
two days | QUANTITY | 0.99+ |
DevOps | TITLE | 0.99+ |
Webster bank | ORGANIZATION | 0.99+ |
14 years | QUANTITY | 0.99+ |
over 15 years | QUANTITY | 0.99+ |
seven cars | QUANTITY | 0.98+ |
each project | QUANTITY | 0.98+ |
Friday night | DATE | 0.98+ |
Enterprise Data Automation | ORGANIZATION | 0.98+ |
New England | LOCATION | 0.98+ |
Io-Tahoe | ORGANIZATION | 0.98+ |
today | DATE | 0.98+ |
Webster's Bank | ORGANIZATION | 0.98+ |
one schema | QUANTITY | 0.97+ |
Fairfield County | LOCATION | 0.97+ |
One | QUANTITY | 0.97+ |
one customer | QUANTITY | 0.97+ |
over seven terabytes | QUANTITY | 0.97+ |
Salesforce | ORGANIZATION | 0.96+ |
both | QUANTITY | 0.95+ |
single source | QUANTITY | 0.93+ |
one thing | QUANTITY | 0.93+ |
US Air Force | ORGANIZATION | 0.93+ |
Webster | ORGANIZATION | 0.92+ |
S3 | COMMERCIAL_ITEM | 0.92+ |
Enterprise Data Architecture | ORGANIZATION | 0.91+ |
Io Tahoe | PERSON | 0.91+ |
Oracle | ORGANIZATION | 0.9+ |
15 years old | QUANTITY | 0.9+ |
Io-Tahoe | PERSON | 0.89+ |
12 years old | QUANTITY | 0.88+ |
Tableau | TITLE | 0.87+ |
four weeks | QUANTITY | 0.86+ |
S3 Buckets | COMMERCIAL_ITEM | 0.84+ |
Covid | PERSON | 0.81+ |
Data Architecture | ORGANIZATION | 0.79+ |
JSON | TITLE | 0.79+ |
Senior Vice President | PERSON | 0.78+ |
24 seven development role | QUANTITY | 0.77+ |
last 35 years | DATE | 0.77+ |
both laughs | QUANTITY | 0.75+ |
Io-Tahoe | TITLE | 0.73+ |
each | QUANTITY | 0.72+ |
loans | QUANTITY | 0.71+ |
zero | QUANTITY | 0.71+ |
Paula D'Amico, Webster Bank
>> Narrator: From around the Globe, it's theCube with digital coverage of Enterprise Data Automation, and event series brought to you by Io-Tahoe. >> Everybody, we're back. And this is Dave Vellante, and we're covering the whole notion of Automated Data in the Enterprise. And I'm really excited to have Paula D'Amico here. Senior Vice President of Enterprise Data Architecture at Webster Bank. Paula, good to see you. Thanks for coming on. >> Hi, nice to see you, too. >> Let's start with Webster bank. You guys are kind of a regional I think New York, New England, believe it's headquartered out of Connecticut. But tell us a little bit about the bank. >> Webster bank is regional Boston, Connecticut, and New York. Very focused on in Westchester and Fairfield County. They are a really highly rated regional bank for this area. They hold quite a few awards for the area for being supportive for the community, and are really moving forward technology wise, they really want to be a data driven bank, and they want to move into a more robust group. >> We got a lot to talk about. So data driven is an interesting topic and your role as Data Architecture, is really Senior Vice President Data Architecture. So you got a big responsibility as it relates to kind of transitioning to this digital data driven bank but tell us a little bit about your role in your Organization. >> Currently, today, we have a small group that is just working toward moving into a more futuristic, more data driven data warehousing. That's our first item. And then the other item is to drive new revenue by anticipating what customers do, when they go to the bank or when they log in to their account, to be able to give them the best offer. And the only way to do that is you have timely, accurate, complete data on the customer and what's really a great value on offer something to offer that, or a new product, or to help them continue to grow their savings, or do and grow their investments. >> Okay, and I really want to get into that. But before we do, and I know you're, sort of partway through your journey, you got a lot to do. But I want to ask you about Covid, how you guys handling that? You had the government coming down and small business loans and PPP, and huge volume of business and sort of data was at the heart of that. How did you manage through that? >> We were extremely successful, because we have a big, dedicated team that understands where their data is and was able to switch much faster than a larger bank, to be able to offer the PPP Long's out to our customers within lightning speed. And part of that was is we adapted to Salesforce very for we've had Salesforce in house for over 15 years. Pretty much that was the driving vehicle to get our PPP loans in, and then developing logic quickly, but it was a 24 seven development role and get the data moving on helping our customers fill out the forms. And a lot of that was manual, but it was a large community effort. >> Think about that too. The volume was probably much higher than the volume of loans to small businesses that you're used to granting and then also the initial guidelines were very opaque. You really didn't know what the rules were, but you were expected to enforce them. And then finally, you got more clarity. So you had to essentially code that logic into the system in real time. >> I wasn't directly involved, but part of my data movement team was, and we had to change the logic overnight. So it was on a Friday night it was released, we pushed our first set of loans through, and then the logic changed from coming from the government, it changed and we had to redevelop our data movement pieces again, and we design them and send them back through. So it was definitely kind of scary, but we were completely successful. We hit a very high peak. Again, I don't know the exact number but it was in the thousands of loans, from little loans to very large loans and not one customer who applied did not get what they needed for, that was the right process and filled out the right amount. >> Well, that is an amazing story and really great support for the region, your Connecticut, the Boston area. So that's fantastic. I want to get into the rest of your story now. Let's start with some of the business drivers in banking. I mean, obviously online. A lot of people have sort of joked that many of the older people, who kind of shunned online banking would love to go into the branch and see their friendly teller had no choice, during this pandemic, to go to online. So that's obviously a big trend you mentioned, the data driven data warehouse, I want to understand that, but what at the top level, what are some of the key business drivers that are catalyzing your desire for change? >> The ability to give a customer, what they need at the time when they need it. And what I mean by that is that we have customer interactions in multiple ways. And I want to be able for the customer to walk into a bank or online and see the same format, and being able to have the same feel the same love, and also to be able to offer them the next best offer for them. But they're if they want looking for a new mortgage or looking to refinance, or whatever it is that they have that data, we have the data and that they feel comfortable using it. And that's an untethered banker. Attitude is, whatever my banker is holding and whatever the person is holding in their phone, that is the same and it's comfortable. So they don't feel that they've walked into the bank and they have to do fill out different paperwork compared to filling out paperwork on just doing it on their phone. >> You actually do want the experience to be better. And it is in many cases. Now you weren't able to do this with your existing I guess mainframe based Enterprise Data Warehouses. Is that right? Maybe talk about that a little bit? >> Yeah, we were definitely able to do it with what we have today the technology we're using. But one of the issues is that it's not timely. And you need a timely process to be able to get the customers to understand what's happening. You need a timely process so we can enhance our risk management. We can apply for fraud issues and things like that. >> Yeah, so you're trying to get more real time. The traditional EDW. It's sort of a science project. There's a few experts that know how to get it. You can so line up, the demand is tremendous. And then oftentimes by the time you get the answer, it's outdated. So you're trying to address that problem. So part of it is really the cycle time the end to end cycle time that you're progressing. And then there's, if I understand it residual benefits that are pretty substantial from a revenue opportunity, other offers that you can make to the right customer, that you maybe know, through your data, is that right? >> Exactly. It's drive new customers to new opportunities. It's enhanced the risk, and it's to optimize the banking process, and then obviously, to create new business. And the only way we're going to be able to do that is if we have the ability to look at the data right when the customer walks in the door or right when they open up their app. And by doing creating more to New York times near real time data, or the data warehouse team that's giving the lines of business the ability to work on the next best offer for that customer as well. >> But Paula, we're inundated with data sources these days. Are there other data sources that maybe had access to before, but perhaps the backlog of ingesting and cleaning in cataloging and analyzing maybe the backlog was so great that you couldn't perhaps tap some of those data sources. Do you see the potential to increase the data sources and hence the quality of the data or is that sort of premature? >> Oh, no. Exactly. Right. So right now, we ingest a lot of flat files and from our mainframe type of front end system, that we've had for quite a few years. But now that we're moving to the cloud and off-prem and on-prem, moving off-prem, into like an S3 Bucket, where that data we can process that data and get that data faster by using real time tools to move that data into a place where, like snowflake could utilize that data, or we can give it out to our market. Right now we're about we do work in batch mode still. So we're doing 24 hours. >> Okay. So when I think about the data pipeline, and the people involved, maybe you could talk a little bit about the organization. You've got, I don't know, if you have data scientists or statisticians, I'm sure you do. You got data architects, data engineers, quality engineers, developers, etc. And oftentimes, practitioners like yourself, will stress about, hey, the data is in silos. The data quality is not where we want it to be. We have to manually categorize the data. These are all sort of common data pipeline problems, if you will. Sometimes we use the term data Ops, which is sort of a play on DevOps applied to the data pipeline. Can you just sort of describe your situation in that context? >> Yeah, so we have a very large data ops team. And everyone that who is working on the data part of Webster's Bank, has been there 13 to 14 years. So they get the data, they understand it, they understand the lines of business. So it's right now. We could the we have data quality issues, just like everybody else does. But we have places in them where that gets cleansed. And we're moving toward and there was very much siloed data. The data scientists are out in the lines of business right now, which is great, because I think that's where data science belongs, we should give them and that's what we're working towards now is giving them more self service, giving them the ability to access the data in a more robust way. And it's a single source of truth. So they're not pulling the data down into their own, like Tableau dashboards, and then pushing the data back out. So they're going to more not, I don't want to say, a central repository, but a more of a robust repository, that's controlled across multiple avenues, where multiple lines of business can access that data. Is that help? >> Got it, Yes. And I think that one of the key things that I'm taking away from your last comment, is the cultural aspects of this by having the data scientists in the line of business, the lines of business will feel ownership of that data as opposed to pointing fingers criticizing the data quality. They really own that that problem, as opposed to saying, well, it's Paula's problem. >> Well, I have my problem is I have data engineers, data architects, database administrators, traditional data reporting people. And because some customers that I have that are business customers lines of business, they want to just subscribe to a report, they don't want to go out and do any data science work. And we still have to provide that. So we still want to provide them some kind of regiment that they wake up in the morning, and they open up their email, and there's the report that they subscribe to, which is great, and it works out really well. And one of the things is why we purchased Io-Tahoe was, I would have the ability to give the lines of business, the ability to do search within the data. And we'll read the data flows and data redundancy and things like that, and help me clean up the data. And also, to give it to the data analysts who say, all right, they just asked me they want this certain report. And it used to take okay, four weeks we're going to go and we're going to look at the data and then we'll come back and tell you what we can do. But now with Io-Tahoe, they're able to look at the data, and then in one or two days, they'll be able to go back and say, Yes, we have the data, this is where it is. This is where we found it. This is the data flows that we found also, which is what I call it, is the break of a column. It's where the column was created, and where it went to live as a teenager. (laughs) And then it went to die, where we archive it. And, yeah, it's this cycle of life for a column. And Io-Tahoe helps us do that. And we do data lineage is done all the time. And it's just takes a very long time and that's why we're using something that has AI in it and machine running. It's accurate, it does it the same way over and over again. If an analyst leaves, you're able to utilize something like Io-Tahoe to be able to do that work for you. Is that help? >> Yeah, so got it. So a couple things there, in researching Io-Tahoe, it seems like one of the strengths of their platform is the ability to visualize data, the data structure and actually dig into it, but also see it. And that speeds things up and gives everybody additional confidence. And then the other piece is essentially infusing AI or machine intelligence into the data pipeline, is really how you're attacking automation. And you're saying it repeatable, and then that helps the data quality and you have this virtual cycle. Maybe you could sort of affirm that and add some color, perhaps. >> Exactly. So you're able to let's say that I have seven cars, lines of business that are asking me questions, and one of the questions they'll ask me is, we want to know, if this customer is okay to contact, and there's different avenues so you can go online, do not contact me, you can go to the bank and you can say, I don't want email, but I'll take texts. And I want no phone calls. All that information. So, seven different lines of business asked me that question in different ways. One said, "No okay to contact" the other one says, "Customer 123." All these. In each project before I got there used to be siloed. So one customer would be 100 hours for them to do that analytical work, and then another analyst would do another 100 hours on the other project. Well, now I can do that all at once. And I can do those types of searches and say, Yes, we already have that documentation. Here it is, and this is where you can find where the customer has said, "No, I don't want to get access from you by email or I've subscribed to get emails from you." >> Got it. Okay. Yeah Okay. And then I want to go back to the cloud a little bit. So you mentioned S3 Buckets. So you're moving to the Amazon cloud, at least, I'm sure you're going to get a hybrid situation there. You mentioned snowflake. What was sort of the decision to move to the cloud? Obviously, snowflake is cloud only. There's not an on-prem, version there. So what precipitated that? >> Alright, so from I've been in the data IT information field for the last 35 years. I started in the US Air Force, and have moved on from since then. And my experience with Bob Graham, was with snowflake with working with GE Capital. And that's where I met up with the team from Io-Tahoe as well. And so it's a proven so there's a couple of things one is Informatica, is worldwide known to move data. They have two products, they have the on-prem and the off-prem. I've used the on-prem and off-prem, they're both great. And it's very stable, and I'm comfortable with it. Other people are very comfortable with it. So we picked that as our batch data movement. We're moving toward probably HVR. It's not a total decision yet. But we're moving to HVR for real time data, which is changed capture data, moves it into the cloud. And then, so you're envisioning this right now. In which is you're in the S3, and you have all the data that you could possibly want. And that's JSON, all that everything is sitting in the S3 to be able to move it through into snowflake. And snowflake has proven to have a stability. You only need to learn and train your team with one thing. AWS as is completely stable at this point too. So all these avenues if you think about it, is going through from, this is your data lake, which is I would consider your S3. And even though it's not a traditional data lake like, you can touch it like a Progressive or Hadoop. And then into snowflake and then from snowflake into sandbox and so your lines of business and your data scientists just dive right in. That makes a big win. And then using Io-Tahoe with the data automation, and also their search engine. I have the ability to give the data scientists and data analysts the way of they don't need to talk to IT to get accurate information or completely accurate information from the structure. And we'll be right back. >> Yeah, so talking about snowflake and getting up to speed quickly. I know from talking to customers you can get from zero to snowflake very fast and then it sounds like the Io-Tahoe is sort of the automation cloud for your data pipeline within the cloud. Is that the right way to think about it? >> I think so. Right now I have Io-Tahoe attached to my on-prem. And I want to attach it to my off-prem eventually. So I'm using Io-Tahoe data automation right now, to bring in the data, and to start analyzing the data flows to make sure that I'm not missing anything, and that I'm not bringing over redundant data. The data warehouse that I'm working of, it's an on-prem. It's an Oracle Database, and it's 15 years old. So it has extra data in it. It has things that we don't need anymore, and Io-Tahoe's helping me shake out that extra data that does not need to be moved into my S3. So it's saving me money, when I'm moving from off-prem to on-prem. >> And so that was a challenge prior, because you couldn't get the lines of business to agree what to delete, or what was the issue there? >> Oh, it was more than that. Each line of business had their own structure within the warehouse. And then they were copying data between each other, and duplicating the data and using that. So there could be possibly three tables that have the same data in it, but it's used for different lines of business. We have identified using Io-Tahoe identified over seven terabytes in the last two months on data that has just been repetitive. It's the same exact data just sitting in a different schema. And that's not easy to find, if you only understand one schema, that's reporting for that line of business. >> More bad news for the storage companies out there. (both laughs) So far. >> It's cheap. That's what we were telling people. >> And it's true, but you still would rather not waste it, you'd like to apply it to drive more revenue. And so, I guess, let's close on where you see this thing going. Again, I know you're sort of partway through the journey, maybe you could sort of describe, where you see the phase is going and really what you want to get out of this thing, down the road, mid-term, longer term, what's your vision or your data driven organization. >> I want for the bankers to be able to walk around with an iPad in their hand, and be able to access data for that customer, really fast and be able to give them the best deal that they can get. I want Webster to be right there on top with being able to add new customers, and to be able to serve our existing customers who had bank accounts since they were 12 years old there and now our multi whatever. I want them to be able to have the best experience with our bankers. >> That's awesome. That's really what I want as a banking customer. I want my bank to know who I am, anticipate my needs, and create a great experience for me. And then let me go on with my life. And so that follow. Great story. Love your experience, your background and your knowledge. I can't thank you enough for coming on theCube. >> Now, thank you very much. And you guys have a great day. >> All right, take care. And thank you for watching everybody. Keep right there. We'll take a short break and be right back. (gentle music)
SUMMARY :
to you by Io-Tahoe. And I'm really excited to of a regional I think and they want to move it relates to kind of transitioning And the only way to do But I want to ask you about Covid, and get the data moving And then finally, you got more clarity. and filled out the right amount. and really great support for the region, and being able to have the experience to be better. to be able to get the customers that know how to get it. and it's to optimize the banking process, and analyzing maybe the backlog was and get that data faster and the people involved, And everyone that who is working is the cultural aspects of this the ability to do search within the data. and you have this virtual cycle. and one of the questions And then I want to go back in the S3 to be able to move it Is that the right way to think about it? and to start analyzing the data flows and duplicating the data and using that. More bad news for the That's what we were telling people. and really what you want and to be able to serve And so that follow. And you guys have a great day. And thank you for watching everybody.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Dave Vellante | PERSON | 0.99+ |
Paula D'Amico | PERSON | 0.99+ |
Paula | PERSON | 0.99+ |
Connecticut | LOCATION | 0.99+ |
Westchester | LOCATION | 0.99+ |
Informatica | ORGANIZATION | 0.99+ |
24 hours | QUANTITY | 0.99+ |
one | QUANTITY | 0.99+ |
13 | QUANTITY | 0.99+ |
thousands | QUANTITY | 0.99+ |
100 hours | QUANTITY | 0.99+ |
Bob Graham | PERSON | 0.99+ |
iPad | COMMERCIAL_ITEM | 0.99+ |
Webster Bank | ORGANIZATION | 0.99+ |
GE Capital | ORGANIZATION | 0.99+ |
first item | QUANTITY | 0.99+ |
AWS | ORGANIZATION | 0.99+ |
two products | QUANTITY | 0.99+ |
seven | QUANTITY | 0.99+ |
New York | LOCATION | 0.99+ |
Boston | LOCATION | 0.99+ |
three tables | QUANTITY | 0.99+ |
Each line | QUANTITY | 0.99+ |
first set | QUANTITY | 0.99+ |
two days | QUANTITY | 0.99+ |
DevOps | TITLE | 0.99+ |
Webster bank | ORGANIZATION | 0.99+ |
14 years | QUANTITY | 0.99+ |
over 15 years | QUANTITY | 0.99+ |
seven cars | QUANTITY | 0.98+ |
each project | QUANTITY | 0.98+ |
Friday night | DATE | 0.98+ |
New England | LOCATION | 0.98+ |
Io-Tahoe | ORGANIZATION | 0.98+ |
today | DATE | 0.98+ |
Webster's Bank | ORGANIZATION | 0.98+ |
one schema | QUANTITY | 0.97+ |
Fairfield County | LOCATION | 0.97+ |
One | QUANTITY | 0.97+ |
one customer | QUANTITY | 0.97+ |
over seven terabytes | QUANTITY | 0.97+ |
Salesforce | ORGANIZATION | 0.96+ |
both | QUANTITY | 0.95+ |
single source | QUANTITY | 0.93+ |
one thing | QUANTITY | 0.93+ |
US Air Force | ORGANIZATION | 0.93+ |
Webster | ORGANIZATION | 0.92+ |
S3 | COMMERCIAL_ITEM | 0.92+ |
Enterprise Data Architecture | ORGANIZATION | 0.91+ |
Oracle | ORGANIZATION | 0.9+ |
15 years old | QUANTITY | 0.9+ |
Io-Tahoe | PERSON | 0.89+ |
12 years old | QUANTITY | 0.88+ |
Tableau | TITLE | 0.87+ |
four weeks | QUANTITY | 0.86+ |
S3 Buckets | COMMERCIAL_ITEM | 0.84+ |
Covid | PERSON | 0.81+ |
Data Architecture | ORGANIZATION | 0.79+ |
JSON | TITLE | 0.79+ |
Senior Vice President | PERSON | 0.78+ |
24 seven development role | QUANTITY | 0.77+ |
last 35 years | DATE | 0.77+ |
both laughs | QUANTITY | 0.75+ |
Io-Tahoe | TITLE | 0.73+ |
each | QUANTITY | 0.72+ |
loans | QUANTITY | 0.71+ |
zero | QUANTITY | 0.71+ |
Amazon cloud | ORGANIZATION | 0.65+ |
last two months | DATE | 0.65+ |
Steven Webster, asensei | Sports Data {Silicon Valley} 2018
(spirited music) >> Hey, welcome back everybody. Jeff Frick here with theCUBE. We are in the Palo Alto Studios for a CUBE Conversation. Part of our Western Digital Data Makes Possible Series, really looking at a lot of cool applications. At the end of the day, data's underneath everything. There's infrastructure and storage that's holding that, but it's much more exciting to talk about the applications. We're excited to have somebody who's kind of on the cutting edge of a next chapter of something you're probably familiar with. He's Steven Webster, and he is the founder and CEO of Asensei. Steven, great to see you. >> Likewise, likewise. >> So, you guys are taking, I think everyone's familiar with Fitbits, as probably one of the earliest iterations of a biometric feedback, for getting more steps. At the end of the day, get more steps. And you guys are really taking it to the next level, which is, I think you call it connected coaching, so I wondered if you could give everyone a quick overview, and then we'll dig into it a little bit. >> Yeah, I think we're all very familiar now with connected fitness in hindsight, as a category that appeared and emerged, as, like you say, first it was activity trackers. We saw those trackers primarily move into smartwatches, and the category's got life in it, life in it left. I see companies like Flywheel and Peloton, we all know Peloton now. >> [Jeff] Right. >> We're starting to make the fitness equipment itself, the treadmill, the bike, connected. So, there's plenty of growth in that category. But our view is that tracking isn't teaching, and counting and cheering isn't coaching. And so we see this opportunity for this new category that's emerging alongside connected fitness, and that's what we call connected coaching. >> Connected coaching. So the biggest word, obviously, instead of fitness tracker, to the connected coaching, is coaching. >> Yeah. >> So, you guys really think that the coaching piece of it is core. And are you targeting high-end athletes, or is this for the person that just wants to take a step up from their fitness tracker? Where in the coaching spectrum are you guys targeting? >> I saw your shoe dog, Phil Knight, founder of Nike, a book on the shelf behind you there, and his co-founder, Bill Bowerman, has a great quote that's immortalized in Nike offices and stores around the world: "If you have a body, you're an athlete." So, that's how we think about our audience. Our customer base is anyone that wants to unlock their athletic potential. I think if you look at elite sports, and elite athletes, and Olympic athletes, they've had access to this kind of technology going back to the Sydney Olympics, so we're really trying to consumerize that technology and make it available to the people that want to be those athletes, but aren't those athletes yet. You might call it the weekend warrior, or just the committed athlete, that would identify, identify themselves according to a sport that they play. >> So, there's different parts of coaching, right? One, is kind of knowing the techniques, so that you've got the best practices by which to try to practice. >> [Steven] Yep. >> And then there's actually coaching to those techniques, so people practice, right? Practice doesn't make perfect. It's perfect practice that makes perfect. >> [Steven] You stole our line, which we stole from someone else. >> So, what are you doing? How do you observe the athlete? How do you communicate with the athlete? How do you make course corrections to the athlete to move it from simply tracking to coaching? >> [Steven] I mean, it starts with, you have to see everything and miss nothing. So, you need to have eyes on the athlete, and there's really two ways we think you can do that. One is, you're using cameras and computer vision. I think most of us are familiar with technologies like Microsoft Connect, where an external camera can allow you to see the skeleton and the biomechanics of the athlete. And that's a big thing for us. We talk about the from to being from just measuring biometrics: how's your heart rate, how much exertion are you making, how much power are you laying down. We need to move from biometrics to biomechanics, and that means looking at technique, and posture, and movement, and timing. So, we're all familiar with cameras, but we think the more important innovation is the emergence of smart clothing, or smart apparel, and the ability to take sensors that would have been discrete, hard components, and infuse those sensors into smart apparel. We've actually created a reference design for a motion capture sensor, and a network of those sensors infused in your apparel allows us to recover your skeleton, but as easily as pulling on a shirt or shorts. >> [Jeff] So you've actually come up with a reference design. So, obviously, begs a question: you're not working with any one particular apparel manufacturer. You really want to come up with a standard and publish the standard by which anyone could really define, capture, and record body movements, and to convert those movements from the clothing into a model. >> No, that's exactly it. We have no desire to be in the apparel industry. We have no desire to unseat Nike, Adidas, or Under Armour. We're actually licensing our technology royalty-free. We just want to accelerate the adoption of smart apparel. And I think the thing about smart apparel is, no one's going to walk into Niketown and say, "Where's the smart apparel department? "I don't want dumb apparel anymore." There needs to be a compelling reason to buy digitally enhanced apparel, and we think one of the most compelling reasons to buy that is so that we can be coached in the sport of our choice. >> [Jeff] So, then you're starting out with rowing, I believe, is your first sport, right? >> [Steven] That's correct, yeah. >> And so the other really important piece of it, is if people don't have smart apparel, or the smart apparel's not there yet, or maybe when they have smart apparel, there's a lot of opportunities to bring in other data sources beyond just that single set. >> [Steven] And that's absolutely key. When I think about biomechanics, that's what goes in, but there's also what comes out. Good form isn't just aesthetic. Good form is in any given sport. Good form and good technique is about organizing yourself so that you perform most efficiently and perform most effectively. Yeah, so you corrected a point in that we've chosen rowing as one of the sports. Rowing is all about technique. It's all about posture. It's all about form. If you've got two rowers who, essentially, have the same strength, the same cardiovascular capability, the one with the best technique will make the boat move faster. But for the sport of rowing, we also get a tremendous amount of telemetry coming off the rowing machine itself. A force curve weakened on every single pull of that handle. We can see how you're laying down that force, and we can read those force curves. We can look at them and tell things like, are you using your legs enough? Are you opening your back too late or too early? Are you dominant on your arms, where you shouldn't be? Is your technique breaking down at higher stroke rates, but is good at lower stroke rates? So it's a good place for us to start. We can take all of that knowledge and information and coach the athlete. And then when we get down to more marginal gains, we can start to look at their posture and form through that technology like smart apparel. >> There's the understanding what they're doing, and understanding the effort relative to best practices, but there's also, within their journey. Maybe today, they're working on cardio, and tomorrow, they're working on form. The next day, they're working on sprints. So the actual best practices in coaching a sport or particular activity, how are you addressing that? How are you bringing in that expertise beyond just the biometric information? >> [Steven] So yeah, we don't think technology is replacing coaches. We just think that coaches that use technology will replace coaches that don't. It's not an algorithm that's trying to coach you. We're taking the knowledge and the expertise of world-class coaches in the sport, that athletes want to follow, and we're taking that coaching, and essentially, think of it as putting it into a learning management system. And then for any given athlete, Just think of it the way a coach coaches. If you walked into a rowing club, I don't know if you've ever rowed before or not, but a coach will look at you, they'll sit you on a rowing machine or sit you on a boat, and just look at you and decide, what's the one next thing that I'm going to teach you that's going to make you better? And really, that's the art of coaching right there. It's looking for that next improvement, that next marginal gain. It's not just about being able to look at the athlete, but then decide where's the improvement that we want to coach the athlete? And then the whole sports psychology of, how do you coach his improvements? >> Because there's the whole hammer versus carrot. That's another thing. You need to learn how the individual athlete responds, what types of things do they respond better to? Do they like to get yelled? Do they like to be encouraged? Did they like it at the beginning? Did they like it at the end? So, do you guys incorporate some of these softer coaching techniques into the application? >> Our team have all coached sport at university-level typically. We care a lot and we think a lot about the role of the coach. The coach's job is to attach technique to the athlete's body. It's to take what's in your head and what you've seen done before, and give that to the athlete, so absolutely, we're thinking about how do you establish the correct coaching cues. How do you positively reinforce, not just negatively reinforce? Is that person a kinesthetic learner, where they need to feel how to do it correctly? Are they a more visual learner, where they respond better to metaphor? Now, one of the really interesting things with a digital coach is the more people we teach, the better we can get at teaching, because we can start to use some of the techniques of enlarged datasets, and looking at what's working and what's not working. In fact, it's the same technology we would use in marketing or advertising, to segment an audience, and target content. >> Right. >> [Steven] We can take that same technology and apply it how we think about coaching sports. >> So is your initial target to help active coaches that are looking for an edge? Or are you trying to go for the weakend warrior, if you will? Where's your initial market? >> For rowing, we've actually zeroed in on three athletes, where we have a point of view that Asensei can be of help. I'll tell you who the three are. First, is the high school athlete who wants to go to college and get recruited. So, we're selling to the parent as much as we're selling to the student. >> [Jeff] That's an easy one. Just show up and be tall. >> Well, show up, be tall, but also what's your 2k time? How fast can you row 2,000 meters? That's a pretty important benchmark. So for that high school athlete, that's a very specific audience where we're bringing very specific coaches. In fact, the coach that we're launching with to that market, his story is one of, high school to college to national team, and he just came back from the Olympics in Rio. The second athlete that we're looking at is the person who never wants to go on the water, but likes that indoor rowing machine, so it's that CrossFit athlete or it's an indoor rower. And again, we have a very specific coach who coaches indoor rowing. And then the third target customer is-- >> What's that person's motivation, just to get a better time? >> Interesting, in that community, there's a lot of competitiveness, so yeah, it's about I want to get good at this, I want to get better at this. Maybe enter local competitions, either inside your gym or your box. This weekend, in Boston, we have just had one of the largest indoor world, it was the World Indoor Rowing Championships, the C.R.A.S.H B's. There's these huge indoor rowing competitions, so that's a very competitive athlete. And then finally we have, what would be the master's rower or the person for whom rowing is. There's lots of people who don't identify themselves as a rower, but they'll get on a rowing machine two or three times a week, whether it's in their gym or whether it's at home. Your focus is strength, conditioning, working out, but staying injury-free, and just fun and fitness. I think Palaton validated the existence of that market, and we see a lot of people wanting to do that with a rowing machine, and not with a bike. >> I think most of these people will or will not have access to a primary coach, and this augments it, or does this become their primary coach based on where they are in their athletic life? >> [Steven] I think it's both, and certainly, and certainly, we're able to support both. I think when you're that high school rower that wants to make college, you're probably a member of either your school rowing crew or you're a member of a club, but you spend a tremendous amount of time on an erg, the indoor rowing machine, and your practice is unsupervised. Even though you know what you should be doing, there's nobody there in that moment watching you log those 10,000 meters. One of our advisors is, actually, a two-times Olympic world medalist from team Great Britain, Helen Glover. And Helen, I have a great quote from Helen, where she calculated for the Rio Olympics, in the final of the Rio Olympics, every stroke she took in the final, she'd taken 16,000 strokes in practice, which talks to the importance of the quality of that practice, and making sure it's supervised. >> The bigger take on the old 10,000 reps, right? 16,000 per stroke. >> Right? >> Kind of looking forward, right, what were some of the biggest challenges you had to overcome? And then, as you looked forward, right, since the beginning, were ubiquitous, and there's 3D goggles, and there'll be outside-in centers for that whole world. How do you see this world evolving in the immediate short-term for you guys to have success, and then, just down the road a year or two? >> That's a really good question. I think in the short-term, I think it's incumbent on us to just stay really focused in a single community, and get that product right for them. It's more about introducing people to the idea. This is a category creation exercise, so we need to go through that adoption curve of find the early adopters, find the early majority, and before we take that technology anywhere towards our mass market, we need to nail the experience for that early majority. And we think that it's largely going to be in the sport of rowing or with rowers. The cross participation studies in rowing are pretty strong for other sports. Typically, somewhere between 60-80% of rowers weight lift, bike, run, and take part in yoga, whether yoga for mobility and flexibility. There's immediately adjacent markets available to us where the rowers are already in those markets. We're going to stick there for awhile, and really just nail the experience down. >> And is it a big reach to go from tracking to coaching? I mean, these people are all super data focused, right? The beauty of rowing, as you mentioned, it's all about your 2k period. It's one single metric. And they're running, and they're biking, and they're doing all kinds of data-based things, but you're trying to get them to think really more on terms of the coaching versus just the tracking. Has that been hard for them to accept? Do you have any kind of feel for the adoption or the other thing, I would imagine, I spent all this money for these expensive clothing. Is this a killer app that I can now justify having? >> Right, right, right. >> Maybe fancier connected clothes, rather than just simply tracking my time? >> I mean, I think, talking about pricing in the first instance. What we're finding with consumers that we've been testing with, is if you can compare the price of a shirt to the price of shirt without sensors, it's really the wrong value proposition. The question we ask is, How much money are you spending on your CrossFit box membership or your Equinox gym membership? The cost of a personal trainer is easily upwards of $75-100 for an hour. Now, we can give you 24/7 access to that personal coaching. You'll pay the same in a year as you would pay in an hour for coaching. I think for price, it's someone who's already thinking about paying for personal coaching and personal training, that's really where the pricing market is. >> That's interesting, we see that time and time again. We did an interview with Knightscope, and they have security robots, and basically, it's the same thing. They're priced comparisons was the hourly rate for a human counterpart, or we can give it to you for a much less hourly rate. And now, you don't just get it for an hour, you get it for as long as you want to use it. Well, it's exciting times. You guys in the market in terms of when you're going G80? Have a feel for-- >> Any minute now. >> Any minute now? >> We have people using the product, giving us feedback. My phone's switched off. That's the quietest it's been for awhile. But we have people using the product right now, giving us feedback on the product. We're really excited. One in three people, when we ask, the metric that matters for us is net promoter score. How likely would someone recommend asensei to someone else? One in three athletes are giving us a 10 out of 10, so we feel really good about the experience. Now, we're just focused on making sure we have enough content in place from our coaches. General availability is anytime soon. >> [Jeff] Good. Very exciting. >> Yeah, we're excited. >> Thanks for taking a few minutes of your day, and I actually know some rowers, so we'll have to look into the application. >> Right, introduce us. Good stuff. >> He's Steven Webster, I'm Jeff Frick. You're watching theCUBE. We're having a CUBE Conversation in our Palo Alto Studios. Thanks for watching. (bright music)
SUMMARY :
and he is the founder and CEO of Asensei. And you guys are really taking it to the next level, and the category's got life in it, life in it left. And so we see this opportunity for this new category So the biggest word, obviously, instead of fitness tracker, Where in the coaching spectrum are you guys targeting? a book on the shelf behind you there, One, is kind of knowing the techniques, to those techniques, so people practice, right? [Steven] You stole our line, and the ability to take sensors that would have been and publish the standard by which is so that we can be coached in the sport of our choice. And so the other really important piece of it, But for the sport of rowing, we also get a tremendous amount There's the understanding what they're doing, that's going to make you better? So, do you guys incorporate some of these softer coaching and give that to the athlete, and apply it how we think about coaching sports. First, is the high school athlete [Jeff] That's an easy one. In fact, the coach that we're launching with to that market, or the person for whom rowing is. in the final of the Rio Olympics, The bigger take on the old 10,000 reps, right? in the immediate short-term for you guys to have success, and really just nail the experience down. And is it a big reach to go from tracking to coaching? Now, we can give you 24/7 access to that personal coaching. for a human counterpart, or we can give it to you the metric that matters for us is net promoter score. [Jeff] Good. and I actually know some rowers, Good stuff. We're having a CUBE Conversation in our Palo Alto Studios.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Nike | ORGANIZATION | 0.99+ |
Phil Knight | PERSON | 0.99+ |
Steven Webster | PERSON | 0.99+ |
Helen | PERSON | 0.99+ |
Bill Bowerman | PERSON | 0.99+ |
Jeff Frick | PERSON | 0.99+ |
Flywheel | ORGANIZATION | 0.99+ |
One | QUANTITY | 0.99+ |
Adidas | ORGANIZATION | 0.99+ |
Helen Glover | PERSON | 0.99+ |
Boston | LOCATION | 0.99+ |
Steven | PERSON | 0.99+ |
Jeff | PERSON | 0.99+ |
16,000 strokes | QUANTITY | 0.99+ |
Asensei | PERSON | 0.99+ |
Olympics | EVENT | 0.99+ |
two | QUANTITY | 0.99+ |
2,000 meters | QUANTITY | 0.99+ |
tomorrow | DATE | 0.99+ |
10 | QUANTITY | 0.99+ |
three | QUANTITY | 0.99+ |
$75-100 | QUANTITY | 0.99+ |
10,000 meters | QUANTITY | 0.99+ |
First | QUANTITY | 0.99+ |
two rowers | QUANTITY | 0.99+ |
asensei | PERSON | 0.99+ |
both | QUANTITY | 0.99+ |
today | DATE | 0.99+ |
2k | QUANTITY | 0.99+ |
10,000 reps | QUANTITY | 0.99+ |
two-times | QUANTITY | 0.99+ |
first instance | QUANTITY | 0.99+ |
Under Armour | ORGANIZATION | 0.99+ |
Knightscope | ORGANIZATION | 0.99+ |
three athletes | QUANTITY | 0.99+ |
Peloton | ORGANIZATION | 0.99+ |
2018 | DATE | 0.99+ |
World Indoor Rowing Championships | EVENT | 0.98+ |
Asensei | ORGANIZATION | 0.98+ |
first sport | QUANTITY | 0.98+ |
a year | QUANTITY | 0.98+ |
Rio Olympics | EVENT | 0.98+ |
one | QUANTITY | 0.98+ |
an hour | QUANTITY | 0.98+ |
Equinox | ORGANIZATION | 0.98+ |
Olympic | EVENT | 0.98+ |
Silicon Valley | LOCATION | 0.98+ |
three people | QUANTITY | 0.97+ |
two ways | QUANTITY | 0.97+ |
16,000 per stroke | QUANTITY | 0.97+ |
next day | DATE | 0.96+ |
second athlete | QUANTITY | 0.96+ |
third target | QUANTITY | 0.96+ |
60-80% | QUANTITY | 0.96+ |
Microsoft | ORGANIZATION | 0.95+ |
three times a week | QUANTITY | 0.95+ |
This weekend | DATE | 0.95+ |
Sydney Olympics | EVENT | 0.94+ |
one single metric | QUANTITY | 0.93+ |
single set | QUANTITY | 0.92+ |
Great Britain | ORGANIZATION | 0.92+ |
Palo Alto Studios | LOCATION | 0.92+ |
CrossFit | ORGANIZATION | 0.89+ |
first | QUANTITY | 0.87+ |
single community | QUANTITY | 0.86+ |
Ajay Vohora, Io Tahoe | Enterprise Data Automation
>>from around the globe. It's the Cube with digital coverage of enterprise data automation an event Siri's brought to you by Iot. Tahoe. >>Okay, we're back. Welcome back to data Automated. A J ahora is CEO of I o Ta ho, JJ. Good to see you. How have things in London? >>Big thing. Well, thinking well, where we're making progress, I could see you hope you're doing well and pleasure being back here on the Cube. >>Yeah, it's always great to talk to. You were talking enterprise data automation. As you know, with within our community, we've been pounding the whole data ops conversation. Little different, though. We're gonna We're gonna dig into that a little bit. But let's start with a J how you've seen the response to Covert and I'm especially interested in the role that data has played in this pandemic. >>Yeah, absolutely. I think everyone's adapting both essentially, um, and and in business, the customers that I speak to on day in, day out that we partner with, um they're busy adapting their businesses to serve their customers. It's very much a game of and showing the week and serve our customers to help their customers um, you know, the adaptation that's happening here is, um, trying to be more agile, kind of the most flexible. Um, a lot of pressure on data. A lot of demand on data and to deliver more value to the business, too. Serve that customer. >>Yeah. I mean, data machine intelligence and cloud, or really three huge factors that have helped organizations in this pandemic. And, you know, the machine intelligence or AI piece? That's what automation is all about. How do you see automation helping organizations evolve maybe faster than they thought they might have to >>Sure. I think the necessity of these times, um, there's there's a says a lot of demand doing something with data data. Uh huh. A lot of a lot of businesses talk about being data driven. Um, so interesting. I sort of look behind that when we work with our customers, and it's all about the customer. You know, the mic is cios invested shareholders. The common theme here is the customer. That customer experience starts and ends with data being able to move from a point that is reacting. So what the customer is expecting and taking it to that step forward where you can be proactive to serve what that customer's expectation to and that's definitely come alive now with they, um, the current time. >>Yes. So, as I said, we've been talking about data ops a lot. The idea being Dev Ops applied to the data pipeline. But talk about enterprise data automation. What is it to you and how is it different from data off? >>Yeah, Great question. Thank you. I am. I think we're all familiar with felt more more awareness around. So as it's applied, Teoh, uh, processes methodologies that have become more mature of the past five years around devil that managing change, managing an application, life cycles, managing software development data about, you know, has been great. But breaking down those silos between different roles functions and bringing people together to collaborate. Andi, you know, we definitely see that those tools, those methodologies, those processes, that kind of thinking, um, landing itself to data with data is exciting. We're excited about that, Andi shifting the focus from being I t versus business users to you know who are the data producers. And here the data consumers in a lot of cases, it concert in many different lines of business. So in data role, those methods those tools and processes well we look to do is build on top of that with data automation. It's the is the nuts and bolts of the the algorithms, the models behind machine learning that the functions. That's where we investors our R and D and bringing that in to build on top of the the methods, the ways of thinking that break down those silos on injecting that automation into the business processes that are going to drive a business to serve its customers. It's, um, a layer beyond Dev ops data ops. They can get to that point where well, I think about it is, Is the automation behind the automation we can take? I'll give you an example. Okay, a bank where we did a lot of work to do make move them into accelerating that digital transformation. And what we're finding is that as we're able to automate the jobs related to data a managing that data and serving that data that's going into them as a business automating their processes for their customer. Um, so it's it's definitely having a compound effect. >>Yeah, I mean I think that you did. Data ops for a lot of people is somewhat new to the whole Dev Ops. The data ops thing is is good and it's a nice framework. Good methodology. There is obviously a level of automation in there and collaboration across different roles. But it sounds like you're talking about so supercharging it, if you will, the automation behind the automation. You know, I think organizations talk about being data driven. You hear that? They have thrown around a lot of times. People sit back and say, We don't make decisions without data. Okay? But really, being data driven is there's a lot of aspects there. There's cultural, but it's also putting data at the core of your organization, understanding how it effects monetization. And, as you know, well, silos have been built up, whether it's through M and a, you know, data sprawl outside data sources. So I'm interested in your thoughts on what data driven means and specifically Hi, how Iot Tahoe plays >>there. Yeah, I'm sure we'll be happy. That look that three David, we've We've come a long way in the last four years. We started out with automating some of those simple, um, to codify. Um, I have a high impact on organization across the data, a data warehouse. There's data related tasks that classify data on and a lot of our original pattern. Senai people value that were built up is is very much around. They're automating, classifying data across different sources and then going out to so that for some purpose originally, you know, some of those simpler I'm challenges that we have. Ah, custom itself, um, around data privacy. You know, I've got a huge data lake here. I'm a telecoms business. I've got millions of six subscribers. Um, quite often the chief data office challenges. How do I cover the operational risk? Where, um, I got so much data I need to simplify my approach to automating, classifying that data. Recent is you can't do that manually. We can for people at it. And the the scale of that is is prohibitive, right? Often, if you had to do it manually by the time you got a good picture of it, it's already out of date. Then, starting with those those simple challenges that we've been able to address, we're then going on and build on that to say, What else do we serve? What else do we serve? The chief data officer, Chief marketing officer on the CFO. Within these times, um, where those decision makers are looking for having a lot of choices in the platform options that they say that the tooling they're very much looking for We're that Swiss army. Not being able to do one thing really well is is great, but more more. Where that cost pressure challenge is coming in is about how do we, um, offer more across the organization, bring in those business lines of business activities that depend on data to not just with a T. Okay, >>so we like the cube. Sometimes we like to talk about Okay, what is it? And then how does it work? And what's the business impact? We kind of covered what it is but love to get into the tech a little bit in terms of how it works. And I think we have a graphic here that gets into that a little bit. So, guys, if you bring that up, I wonder if you could tell us and what is the secret sauce behind Iot Tahoe? And if you could take us through this slot. >>Sure. I mean, right there in the middle that the heart of what we do It is the intellectual property. Yeah, that was built up over time. That takes from Petra genius data sources Your Oracle relational database, your your mainframe. If they lay in increasingly AP eyes and devices that produce data and that creates the ability to automatically discover that data, classify that data after it's classified them have the ability to form relationships across those different, uh, source systems, silos, different lines of business. And once we've automated that that we can start to do some cool things that just puts a contact and meaning around that data. So it's moving it now from bringing data driven on increasingly well. We have really smile, right people in our customer organizations you want do some of those advanced knowledge tasks, data scientists and, uh, quants in some of the banks that we work with. The the onus is on, then, putting everything we've done there with automation, pacifying it, relationship, understanding that equality policies that you apply to that data. I'm putting it in context once you've got the ability to power. A a professional is using data, um, to be able to put that data and contacts and search across the entire enterprise estate. Then then they can start to do some exciting things and piece together the tapestry that fabric across that different systems could be crm air P system such as s AP on some of the newer cloud databases that we work with. Snowflake is a great Well, >>yes. So this is you're describing sort of one of the one of the reasons why there's so many stove pipes and organizations because data is gonna locked in the silos of applications. I also want to point out, you know, previously to do discovery to do that classification that you talked about form those relationship to glean context from data. A lot of that, if not most of that in some cases all that would have been manual. And of course, it's out of date so quickly. Nobody wants to do it because it's so hard. So this again is where automation comes into the the the to the idea of really becoming data driven. >>Sure. I mean the the efforts. If we if I look back, maybe five years ago, we had a prevalence of daily technologies at the cutting edge. Those have said converging me to some of these cloud platforms. So we work with Google and AWS, and I think very much is, as you said it, those manual attempts to try and grasp. But it is such a complex challenge at scale. I quickly runs out of steam because once, um, once you've got your hat, once you've got your fingers on the details Oh, um, what's what's in your data estate? It's changed, you know, you've onboard a new customer. You signed up a new partner, Um, customer has no adopted a new product that you just Lawrence and there that that slew of data it's keeps coming. So it's keeping pace with that. The only answer really is is some form of automation. And what we found is if we can tie automation with what I said before the expertise the, um, the subject matter expertise that sometimes goes back many years within an organization's people that augmentation between machine learning ai on and on that knowledge that sits within inside the organization really tends to involve a lot of value in data? >>Yes, So you know Well, a J you can't be is a smaller company, all things to all people. So your ecosystem is critical. You working with AWS? You're working with Google. You got red hat. IBM is as partners. What is attracting those folks to your ecosystem and give us your thoughts on the importance of ecosystem? >>Yeah, that's that's fundamental. So I mean, when I caimans, we tell her here is the CEO of one of the, um, trends that I wanted us to to be part of was being open, having an open architecture that allowed one thing that was nice to my heart, which is as a CEO, um, a C I O where you've got a budget vision and you've already made investments into your organization, and some of those are pretty long term bets. They should be going out 5 10 years, sometimes with CRM system training up your people, getting everybody working together around a common business platform. What I wanted to ensure is that we could openly like it using ap eyes that were available, the love that some investment on the cost that has already gone into managing in organizations I t. But business users to before So part of the reason why we've been able to be successful with, um, the partners like Google AWS and increasingly, a number of technology players. That red hat mongo DB is another one where we're doing a lot of good work with, um, and snowflake here is, um it's those investments have been made by the organizations that are our customers, and we want to make sure we're adding to that, and they're leveraging the value that they've already committed to. >>Okay, so we've talked about kind of what it is and how it works, and I want to get into the business impact. I would say what I would be looking for from from this would be Can you help me lower my operational risk? I've got I've got tasks that I do many year sequential, some who are in parallel. But can you reduce my time to task? And can you help me reduce the labor intensity and ultimately, my labor costs? And I put those resources elsewhere, and ultimately, I want to reduce the end and cycle time because that is going to drive Telephone number R. A. Y So, um, I missing anything? Can you do those things? And maybe you could give us some examples of the tiara y and the business impact. >>Yeah. I mean, the r a y David is is built upon on three things that I mentioned is a combination off leveraging the existing investment with the existing state, whether that's home, Microsoft, Azure or AWS or Google IBM. And I'm putting that to work because, yeah, the customers that we work with have had made those choices. On top of that, it's, um, is ensuring that we have you got the automation that is working right down to the level off data, a column level or the file level so we don't do with meta data. It is being very specific to be at the most granular level. So as we've grown our processes and on the automation, gasification tagging, applying policies from across different compliance and regulatory needs, that an organization has to the data, everything that then happens downstream from that is ready to serve a business outcome. It could be a customer who wants that experience on a mobile device. A tablet oh, face to face within, within the store. I mean game. Would you provision the right data and enable our customers do that? But their customers, with the right data that they can trust at the right time, just in that real time moment where decision or an action is being expected? That's, um, that's driving the r a y two b in some cases, 20 x but and that's that's really satisfying to see that that kind of impact it is taking years down to months and in many cases, months of work down to days. In some cases, our is the time to value. I'm I'm impressed with how quickly out of the box with very little training a customer and think about, too. And you speak just such a search. They discovery knowledge graph on DM. I don't find duplicates. Onda Redundant data right off the bat within hours. >>Well, it's why investors are interested in this space. I mean, they're looking for a big, total available market. They're looking for a significant return. 10 X is you gotta have 10 x 20 x is better. So so that's exciting and obviously strong management and a strong team. I want to ask you about people and culture. So you got people process technology we've seen with this pandemic that processes you know are really unpredictable. And the technology has to be able to adapt to any process, not the reverse. You can't force your process into some static software, so that's very, very important. But the end of the day you got to get people on board. So I wonder if you could talk about this notion of culture and a data driven culture. >>Yeah, that's that's so important. I mean, current times is forcing the necessity of the moment to adapt. But as we start to work their way through these changes on adapt ah, what with our customers, But that is changing economic times. What? What we're saying here is the ability >>to I >>have, um, the technology Cartman, in a really smart way, what those business uses an I T knowledge workers are looking to achieve together. So I'll give you an example. We have quite often with the data operations teams in the companies that we, um, partnering with, um, I have a lot of inbound enquiries on the day to day level. I really need this set of data they think it can help my data scientists run a particular model? Or that what would happen if we combine these two different silence of data and gets the Richmond going now, those requests you can, sometimes weeks to to realize what we've been able to do with the power is to get those answers being addressed by the business users themselves. And now, without without customers, they're coming to the data. And I t folks saying, Hey, I've now built something in the development environment. Why don't we see how that can scale up with these sets of data? I don't need terabytes of it. I know exactly the columns and the feet in the data that I'm going to use on that gets seller wasted in time, um, angle to innovate. >>Well, that's huge. I mean, the whole notion of self service and the lines of business actually feeling like they have ownership of the data as opposed to, you know, I t or some technology group owning the data because then you've got data quality issues or if it doesn't line up there their agenda, you're gonna get a lot of finger pointing. So so that is a really important. You know a piece of it. I'll give you last word A J. Your final thoughts, if you would. >>Yeah, we're excited to be the only path. And I think we've built great customer examples here where we're having a real impact in in a really fast pace, whether it helping them migrate to the cloud, helping the bean up their legacy, Data lake on and write off there. Now the conversation is around data quality as more of the applications that we enable to a more efficiently could be data are be a very robotic process automation along the AP, eyes that are now available in the cloud platforms. A lot of those they're dependent on data quality on and being able to automate. So business users, um, to take accountability off being able to so look at the trend of their data quality over time and get the signals is is really driving trust. And that trust in data is helping in time. Um, the I T teams, the data operations team, with do more and more quickly that comes back to culture being out, supply this technology in such a way that it's visual insensitive. Andi. How being? Just like Dev Ops tests with with a tty Dave drops putting intelligence in at the data level to drive that collaboration. We're excited, >>you know? You remind me of something. I lied. I don't want to go yet. It's OK, so I know we're tight on time, but you mentioned migration to the cloud. And I'm thinking about conversation with Paula from Webster Webster. Bank migrations. Migrations are, you know, they're they're a nasty word for for organizations. So our and we saw this with Webster. How are you able to help minimize the migration pain and and why is that something that you guys are good at? >>Yeah. I mean, there were many large, successful companies that we've worked with. What's There's a great example where, you know, I'd like to give you the analogy where, um, you've got a lot of people in your teams if you're running a business as a CEO on this bit like a living living grade. But imagine if those different parts of your brain we're not connected, that with, um, so diminish how you're able to perform. So what we're seeing, particularly with migration, is where banks retailers. Manufacturers have grown over the last 10 years through acquisition on through different initiatives, too. Um, drive customer value that sprawl in their data estate hasn't been fully dealt with. It sometimes been a good thing, too. Leave whatever you're fired off the agent incent you a side by side with that legacy mainframe on your oracle, happy and what we're able to do very quickly with that migration challenges shine a light on all the different parts. Oh, data application at the column level or higher level if it's a day late and show an enterprise architect a CDO how everything's connected, where they may not be any documentation. The bright people that created some of those systems long since moved on or retired or been promoted into so in the rose on within days, being out to automatically generate Anke refreshed the states of that data across that man's game on and put it into context, then allows you to look at a migration from a confidence that you did it with the back rather than what we've often seen in the past is teams of consultant and business analysts. Data around this spend months getting an approximation and and a good idea of what it could be in the current state and try their very best to map that to the future Target state. Now, without all hoping out, run those processes within hours of getting started on, um well, that picture visualize that picture and bring it to life. You know, the Yarra. Why, that's off the bat with finding data that should have been deleted data that was copies off on and being able to allow the architect whether it's we're working on gcb or migration to any other clouds such as AWS or a multi cloud landscape right now with yeah, >>that visibility is key. Teoh sort of reducing operational risks, giving people confidence that they can move forward and being able to do that and update that on an ongoing basis, that means you can scale a J. Thanks so much for coming on the Cube and sharing your insights and your experience is great to have >>you. Thank you, David. Look towards smoking in. >>Alright, keep it right there, everybody. We're here with data automated on the Cube. This is Dave Volante and we'll be right back. Short break. >>Yeah, yeah, yeah, yeah
SUMMARY :
enterprise data automation an event Siri's brought to you by Iot. Good to see you. Well, thinking well, where we're making progress, I could see you hope As you know, with within A lot of demand on data and to deliver more value And, you know, the machine intelligence I sort of look behind that What is it to you that automation into the business processes that are going to drive at the core of your organization, understanding how it effects monetization. that for some purpose originally, you know, some of those simpler I'm challenges And if you could take us through this slot. produce data and that creates the ability to that you talked about form those relationship to glean context from data. customer has no adopted a new product that you just Lawrence those folks to your ecosystem and give us your thoughts on the importance of ecosystem? that are our customers, and we want to make sure we're adding to that, that is going to drive Telephone number R. A. Y So, um, And I'm putting that to work because, yeah, the customers that we work But the end of the day you got to get people on board. necessity of the moment to adapt. I have a lot of inbound enquiries on the day to day level. of the data as opposed to, you know, I t or some technology group owning the data intelligence in at the data level to drive that collaboration. is that something that you guys are good at? I'd like to give you the analogy where, um, you've got a lot of people giving people confidence that they can move forward and being able to do that and update We're here with data automated on the Cube.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
David | PERSON | 0.99+ |
Paula | PERSON | 0.99+ |
Ajay Vohora | PERSON | 0.99+ |
ORGANIZATION | 0.99+ | |
IBM | ORGANIZATION | 0.99+ |
AWS | ORGANIZATION | 0.99+ |
Microsoft | ORGANIZATION | 0.99+ |
Dave Volante | PERSON | 0.99+ |
millions | QUANTITY | 0.99+ |
Siri | TITLE | 0.99+ |
Webster | ORGANIZATION | 0.99+ |
London | LOCATION | 0.99+ |
Iot Tahoe | ORGANIZATION | 0.99+ |
Oracle | ORGANIZATION | 0.99+ |
Io Tahoe | PERSON | 0.99+ |
10 | QUANTITY | 0.99+ |
five years ago | DATE | 0.98+ |
Onda | ORGANIZATION | 0.98+ |
Webster Webster | ORGANIZATION | 0.98+ |
Covert | PERSON | 0.97+ |
two | QUANTITY | 0.97+ |
both | QUANTITY | 0.97+ |
5 10 years | QUANTITY | 0.97+ |
three | QUANTITY | 0.96+ |
20 x | QUANTITY | 0.94+ |
10 X | QUANTITY | 0.94+ |
Cube | COMMERCIAL_ITEM | 0.93+ |
Andi | PERSON | 0.93+ |
one | QUANTITY | 0.93+ |
Azure | ORGANIZATION | 0.92+ |
six subscribers | QUANTITY | 0.91+ |
three things | QUANTITY | 0.91+ |
I o Ta ho | ORGANIZATION | 0.91+ |
Google AWS | ORGANIZATION | 0.91+ |
Yarra | ORGANIZATION | 0.89+ |
J ahora | PERSON | 0.89+ |
Anke | ORGANIZATION | 0.89+ |
Dave | PERSON | 0.85+ |
Iot Tahoe | PERSON | 0.84+ |
a day | QUANTITY | 0.82+ |
Lawrence | PERSON | 0.82+ |
one thing | QUANTITY | 0.81+ |
Petra | PERSON | 0.78+ |
pandemic | EVENT | 0.78+ |
Iot. Tahoe | PERSON | 0.78+ |
last four years | DATE | 0.78+ |
past five years | DATE | 0.77+ |
Swiss | ORGANIZATION | 0.76+ |
JJ | PERSON | 0.75+ |
Enterprise Data Automation | ORGANIZATION | 0.73+ |
last 10 years | DATE | 0.62+ |
Dev Ops | ORGANIZATION | 0.59+ |
Richmond | ORGANIZATION | 0.55+ |
Cartman | ORGANIZATION | 0.55+ |
Snowflake | EVENT | 0.51+ |
terabytes | QUANTITY | 0.5+ |
factors | QUANTITY | 0.46+ |
data | TITLE | 0.45+ |