Tiji Mathew, Patrick Zimet and Senthil Karuppaiah | Io-Tahoe Data Quality: Active DQ
(upbeat music), (logo pop up) >> Narrator: From around the globe, it's theCUBE, presenting Active DQ, intelligent automation for data quality, brought to you by IO-Tahoe. >> Are you ready to see Active DQ on Snowflake in action? Let's get into the show and tell and do the demo. With me is Tiji Mathew, Data Solutions Engineer at IO-Tahoe. Also joining us is Patrick Zimet, Data Solutions Engineer at IO-Tahoe, and Senthilnathan Karuppaiah, who's the Head of Production Engineering at IO-Tahoe. Patrick, over to you, let's see it. >> Hey Dave, thank you so much. Yeah, we've seen a huge increase in the number of organizations interested in a Snowflake implementation, who are looking for an innovative, precise and timely method to ingest their data into Snowflake. And where we are seeing a lot of success is a ground-up method utilizing both IO-Tahoe and Snowflake. To start, you define your as-is model by leveraging IO-Tahoe to profile your various data sources and push the metadata to Snowflake. Meaning we create a data catalog within Snowflake as a centralized location to document items such as source system owners, allowing you to have those key conversations and understand the data's lineage, potential blockers and what data is readily available for ingestion. Once the data catalog is built, you have a much more dynamic strategy surrounding your Snowflake ingestion. And what's great is that while you're working through those key conversations, IO-Tahoe will maintain that metadata push, and paired with Snowflake's ability to version the data, you can easily incorporate potential schema changes along the way, making sure that the information that you're working on stays as current as the systems that you're hoping to integrate with Snowflake. >> Nice. Patrick, I wonder if you could address how the IO-Tahoe platform scales and maybe in what way it provides a competitive advantage for customers. >> Great question. Where IO-Tahoe shines is through its Active DQ, or the ability to monitor your data's quality in real time, marking which rows need remediation according to the customized business rules that you can set, ensuring that the data quality standards meet the requirements of your organization. What's great is, through our use of RPA, we can scale with an organization. So as you ingest more data sources, we can allocate more robotic workers, meaning the results will continue to be delivered in the same timely fashion you've grown used to. What's more is, IO-Tahoe is doing the heavy lifting on monitoring data quality. That frees up your data experts to focus on the more strategic tasks such as remediation, data augmentation and analytics development. >> Okay, maybe Tiji, you could address this. I mean, how does all this automation change the operating model that we were talking to Aj and Dunkin about before? I mean, if it involves fewer people and more automation, what else can I do in parallel? >> I'm sure the participants today will also be asking the same question. Let me start with the strategic tasks Patrick mentioned. IO-Tahoe does the heavy lifting, freeing up data experts to act upon the data events generated by IO-Tahoe. Companies that have teams focused on manually building their inventory of the data landscape see longer turnaround times in producing actionable insights from their own data assets, thus diminishing the value realized by traditional methods.
However, our operating model involves profiling and remediating at the same time, creating a cataloged data estate that can be used by business or IT accordingly. With increased automation and fewer people, our machine learning algorithms augment the data pipeline to tag and capture the data elements into a comprehensive data catalog. As IO-Tahoe automatically catalogs the data estate in a centralized view, the data experts can focus on remediating the data events generated from validating against business rules. We envision that data events, coupled with this drillable and searchable view, provide a comprehensive way to assess the impact of bad quality data. Let's briefly look at the image on screen. For example, the view indicates that bad quality zip code data impacts the contact data, which in turn impacts other related entities and systems. Now contrast that with a manually maintained spreadsheet that drowns out the main focus of your analysis. >> Tiji, how do you tag and capture bad quality data and, you've mentioned these dependencies, how do you stop that from flowing downstream into the processes within the applications or reports? >> As IO-Tahoe builds the data catalog across source systems, we tag the elements that meet the business rule criteria while segregating the failed data examples associated with the elements that fall below a certain threshold. The elements that meet the business rule criteria are tagged to be searchable, thus providing an easy way to identify data elements that may flow through the system. The segregated data examples, on the other hand, are used by data experts to triage for the root cause. Based on the root cause, potential outcomes could be: one, changes in the source system to prevent that data from entering the system in the first place; two, add data pipeline logic to sanitize bad data from being consumed by downstream applications and reports; or just accept the risk of storing bad data and address it when it meets a certain threshold. However, Dave, as for your question about preventing bad quality data from flowing into the system, IO-Tahoe will not prevent it, because the controls of data flowing between systems are managed outside of IO-Tahoe. Although, IO-Tahoe will alert and notify the data experts to events that indicate bad data has entered the monitored assets. Also, we have redesigned our product to be modular and extensible. This allows data events generated by IO-Tahoe to be consumed by any system that wants to protect the targets from bad data. Thus IO-Tahoe empowers the data experts to control the bad data flowing into their systems. >> Thank you for that. So, one of the things that we've noticed and we've written about is that you've got these hyper-specialized roles within the data, the centralized data organization. And I wonder how the data folks get involved here, if at all, and how frequently they get involved. Maybe Senthilnathan you could take that. >> Thank you, Dave, for having me here. Well, based on whether the data element in question is in the data cataloging or monitoring phase, different data folks get involved. When it is in the data cataloging stage, the data governance team, along with enterprise architecture or IT, is involved in setting up the data catalog. Which includes identifying the critical data elements, business term identification, definition, documentation, data quality rules and data event setup, data domain and business line mapping, lineage tracking, source of truth.
So on and so forth. It's typically a one-time setup: review, certify, then govern and monitor. But when it is in the monitoring phase, during any data incident or data issue, IO-Tahoe broadcasts data signals to the relevant data folks to act and remedy it as quickly as possible. And it alerts the consumption teams, it could be data science, analytics or business ops, of a potential issue so that they are aware and take the necessary preventative measures. Let me show you an example of a critical data element, from the data quality dashboard view to the lineage view to the data 360 degree view, for a zip code conformity check. So in this case the zip code did not meet the pass threshold during the technical data quality check, was identified as a non-compliant item and a notification was sent to the IT folks. So clicking on the zip code will take us to the lineage view to visualize the dependent systems, showing who are the producers and who are the consumers. And further drilling down will take us to the detailed view, where a lot of other information is presented to facilitate a root cause analysis and take it to a final closure. >> Thank you for that. So Tiji, Patrick was talking about the as-is and the to-be. So I'm interested in how it's done now versus before. Do you need a data governance operating model, for example? >> Typically a company that decides to make an inventory of the data assets would start out by manually building a spreadsheet managed by data experts of the company. What started as a draft now gets baked into the model of the company. This leads to loss of collaboration as each department makes a copy of their catalog for their specific needs. This decentralized approach leads to loss of uniformity, with each department having different definitions, which ironically needs a governance model for the data catalog itself. And as the spreadsheet grows in complexity, the skill level needed to maintain it also increases, thus leading to fewer and fewer people knowing how to maintain it. Above all, the content that took so much time and effort to build is not searchable outside of that spreadsheet document. >> Yeah, I think you really hit the nail on the head, Tiji. Now companies want to move away from the spreadsheet approach. IO-Tahoe addresses the shortcomings of the traditional approach, enabling companies to achieve more with less. >> Yeah, and what has the customer reaction been? We had Webster Bank on one of the early episodes, for example. I mean, could they have achieved what they did without something like active data quality and automation? Maybe Senthilnathan, you could address that. >> Sure. It is impossible to achieve full data quality monitoring and remediation without automation or digital workers in place. The reality is that data experts don't have the time to do the remediation manually, because they have to analyze, conform and fix any data quality issues as fast as possible before they get bigger, and Webster is no exception. That's why Webster implemented IO-Tahoe's Active DQ to set up the business metadata management and data quality monitoring and remediation in the Snowflake cloud data lake. We helped in building the center of excellence in data governance, which is managing the data catalog, and scheduled, on-demand and in-flight data quality checks, where Snowflake Snowpipe and Streams are super beneficial to achieving in-flight quality checks.
Then there's the data consumption monitoring and reporting. Last but not least, the time saver is persisting the non-compliant records for every data quality run within the Snowflake cloud, along with a remediation script. So that during any exceptions, the respective team members are not only alerted but also supplied with the necessary scripts and tools to perform remediation right from IO-Tahoe's Active DQ. >> Very nice. Okay guys, thanks for the demo. Great stuff. Now, if you want to learn more about the IO-Tahoe platform and how you can accelerate your adoption of Snowflake, book some time with a data RPA expert. All you've got to do is click on the demo icon on the right of your screen and set a meeting. We appreciate you attending this latest episode of the IO-Tahoe data automation series. Look, if you missed any of the content, it's all available on demand. This is Dave Vellante for theCUBE. Thanks for watching. (upbeat music)
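As a side note for readers, the zip code conformity check walked through in the demo above can be illustrated with a minimal sketch. This is not Io-Tahoe's implementation; the rule, threshold and sample records below are hypothetical, and the sketch only shows the general pattern described in the conversation: validate records against a business rule, compare the pass rate to a threshold, and segregate the non-compliant records for remediation.

```python
import re

# Hypothetical business rule: US zip codes must be 5 digits or ZIP+4.
ZIP_RULE = re.compile(r"^\d{5}(-\d{4})?$")
PASS_THRESHOLD = 0.95  # assumed threshold; tune per organization

def check_zip_conformity(records):
    """Split records into compliant and non-compliant sets and report the pass rate."""
    compliant, non_compliant = [], []
    for rec in records:
        zip_code = (rec.get("zip") or "").strip()
        (compliant if ZIP_RULE.match(zip_code) else non_compliant).append(rec)
    pass_rate = len(compliant) / len(records) if records else 1.0
    return pass_rate, compliant, non_compliant

contacts = [
    {"id": 1, "zip": "02903"},       # compliant
    {"id": 2, "zip": "2903"},        # too short - non-compliant
    {"id": 3, "zip": "02903-1234"},  # ZIP+4 - compliant
]

rate, ok, failed = check_zip_conformity(contacts)
if rate < PASS_THRESHOLD:
    # In a real pipeline the failed rows would be persisted per run
    # (e.g. to a quarantine table in Snowflake) alongside a remediation
    # script, and an alert raised to the relevant data experts.
    print(f"Zip conformity {rate:.0%} below threshold; {len(failed)} rows flagged")
```

Persisting the `failed` rows for every run, together with a remediation script, mirrors the "non-compliant records plus remediation" pattern Senthilnathan describes for the Snowflake cloud.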
SUMMARY :
the globe it's theCUBE. and tell him, do the demo. and push the metadata to Snowflake. if you could address or the ability to monitor the operating model on remediating the data events generated into the processes within the data experts to events that indicate So, one of the things that So clicking on the zip code. Thank you for that. the skill level needed to maintain. of the traditional approach one of the early episodes So that during any exceptions the respect of the IO-Tahoe data automation series.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Patrick | PERSON | 0.99+ |
Dave | PERSON | 0.99+ |
Dave Vellante | PERSON | 0.99+ |
Tiji Matthew | PERSON | 0.99+ |
Tiji Mathew | PERSON | 0.99+ |
Senthil Karuppaiah | PERSON | 0.99+ |
Patrick Zimet | PERSON | 0.99+ |
IO-Tahoe | ORGANIZATION | 0.99+ |
Io-Tahoe | ORGANIZATION | 0.99+ |
Tiji | PERSON | 0.99+ |
360 degree | QUANTITY | 0.99+ |
Senthilnathan Karuppaiah | PERSON | 0.99+ |
each department | QUANTITY | 0.99+ |
Snowflake | TITLE | 0.99+ |
today | DATE | 0.99+ |
Webster | ORGANIZATION | 0.99+ |
Aj | PERSON | 0.99+ |
Dunkin | PERSON | 0.98+ |
Two | QUANTITY | 0.98+ |
IO | ORGANIZATION | 0.97+ |
Patrick Zeimet | PERSON | 0.97+ |
Webster Bank | ORGANIZATION | 0.97+ |
one | QUANTITY | 0.97+ |
one time | QUANTITY | 0.97+ |
both | QUANTITY | 0.96+ |
Senthilnathan | PERSON | 0.96+ |
IO-Tahoe | TITLE | 0.93+ |
first place | QUANTITY | 0.89+ |
IO | TITLE | 0.72+ |
Snowflake | EVENT | 0.71+ |
Tahoe | ORGANIZATION | 0.69+ |
Data Solutions | ORGANIZATION | 0.69+ |
-Tahoe | TITLE | 0.64+ |
Tahoe | TITLE | 0.63+ |
Snowflake | ORGANIZATION | 0.6+ |
Morrisons | ORGANIZATION | 0.6+ |
Eron Kelly, AWS | AWS re:Invent 2020
>> From around the globe, it's theCUBE, with digital coverage of AWS re:Invent 2020, sponsored by Intel and AWS. >> Welcome to theCUBE's live coverage of AWS re:Invent 2020. I'm Lisa Martin and I have a Cube alumni joining me next, Eron Kelly, the GM of product marketing at AWS. Eron, welcome back to the program. >> Thanks, Lisa. It's great to be here. >> Likewise, even though we don't all get to be crammed into Las Vegas together. Uh, excited to talk to you about Amazon Connect. Talk to our audience about what that is, and then let's talk about it in terms of how it's been a big facilitator during this interesting year that is 2020. >> Great, yes, for sure. So Amazon Connect is a cloud contact center where we're really looking to reinvent how contact centers work by bringing it into the cloud. It's an omnichannel, easy-to-use contact center that allows customers to spin up contact centers in minutes instead of months. It's very scalable, so it can scale to tens of thousands of agents, but it also scales down when it's not in use, and because it's got a pay-as-you-go business model, you only pay when you're engaging with callers or customers. You're not paying high upfront per-agent fees every month. So it's really been a great service during this pandemic, as there's been a lot of unpredictable spikes in demand, uh, that customers have had to deal with across many sectors. >> And we've been talking for months now about the acceleration that COVID has delivered with respect to digital transformation. And, of course, patience has been wearing thin globally. I think with everybody, when we're calling a contact center, we want a resolution quickly. And of course, as we all know, as we all in any industry are working from home, so are they. So I can imagine during this time that being able to have a cloud contact center has been transformative, I guess, to help some businesses keep the lights on, but now to really be successful moving forward, knowing that they can operate and scale up or down as things change. >> Yeah, that's exactly right. And so one of the key benefits of Connect is its ability to very quickly onboard and get started. You know, we have some very interesting examples like Morrisons, which is a retailer in the UK. They wanted to create a new service, as you highlighted, which was a, you know, doorstep delivery service. And so they needed to spin up a quick new contact center in order to handle those orders. They were able to do it and move all their agents remote in about a day and be able to immediately start to take those orders, which is really powerful, you know. Another interesting example is the Rhode Island Department of Labor and Training. Part of their responsibility is to deliver unemployment benefits for their citizens. Obviously a huge surge of demand there. They were able to build an entirely new contact center in about nine days to support their citizens. They went from an average of about 74 calls, sort of capacity per minute, to 1,000-call capacity per minute. And in the first day of standing up this new contact center, they were able to serve 75,000 Rhode Island citizens with their unemployment benefits. So really, ah, great example of having that cloud scalability, that ability to bring agents remote and then helping citizens in need during a very, very difficult time. >> Right. So a lot of uses, private sector, public sector. What are some of the new capabilities of Amazon Connect you're announcing at re:Invent?
>> Yeah, so we announced five big capabilities during re:Invent yesterday that really span the entire experience, and our goal is to make it better for agents so they're more efficient. That actually helps customers reduce their costs but also creates a better caller experience, so that CSAT can go up and the callers can get what they need quickly and move on. And so the first capability is Amazon Connect Voice ID, which makes it easier to validate that the person calling is who, in fact, they say they are. So in this case, Lisa, let's say you're calling in. You can opt in to have a voiceprint made of you. The next time you call in, we're able to use machine learning to match that voiceprint to know, yes, it is Lisa. I don't need to ask Lisa questions about her mother's maiden name and Social Security number. We can validate you quickly. As an agent, I'm confident it's you, so I'm less concerned about things like fraud, and we can move on. That's the first great new feature. The second is Amazon Connect Customer Profiles. So now, once you join the call, rather than me as an agent having to click around in different systems and find out your order history, etcetera, I can get that all surfaced to me directly. So I have that context, I can create a more personalized experience and move faster through the call. The third one is called Wisdom, it's Amazon Connect Wisdom, which now, based on either what you're asking me or a search that I might make, gets answers to your questions pushed to me using machine learning. So if you may be asking about a refund policy or the next time a new product may launch, I may not know; rather than clicking around and sort of finding that in the different systems, it's pushed right to me. Um, now the fourth feature is the real-time capability of Contact Lens for Amazon Connect, and what this does is, while you and I are having our conversation, it measures the sentiment based on what you're saying or any keywords. So let's say you called in and said, I want a refund, or, I want to cancel. That keyword will trigger an alert to my supervisor, who can see that this call may be going in the wrong direction. Let me go help Eron with Lisa. Maybe there's a special offer I can provide or extra assistance, so I can help turn that call around and create a great customer experience, which right now feels like it's not going in that direction. And then the last one is, um, Amazon Connect Tasks, where about half of an agent's time is spent on tasks other than the call, follow-up items. So you're looking for a refund, or you want me to ship you a new version of the product or something. Well, today I might write that on a sticky note or send myself a reminder in email. It's not tracked very well. With Amazon Connect Tasks, I can create that task; as a supervisor, I can then assign those tasks, and I can make sure that the follow-up items are prioritized. And then when I look at my work queue as an agent, I can see both calls, my chats and my tasks, which allows me to be more efficient. That allows me to follow up faster with you, my customer. And overall, it's going to help lower the cost and improve the efficiency of the contact center. So we're really excited about all five of these features and how they improve the entire life cycle of a customer contact. >> And that could be table stakes for any business in terms of customer satisfaction. You talked about that, but I always say, you know, customer satisfaction is inextricably linked to employee satisfaction.
The agents need to be empowered with that information in real time, but also, I want them to know why I'm calling. They should already know what I have. We have that growing expectation, right, as a consumer. So the agent experience, the customer experience, you've also really streamlined. And I could just see this being something that is, like I said, kind of table stakes for an organization to reduce churn, to be able to service more customers in a shorter amount of time, and also employee satisfaction, right? >> Right, that's exactly right. Traeger Grills, which is one of our, you know, beta customers using some of these capabilities, you know, they're seeing 25% faster handle times, so shorter calls, and a 10% increase in customer satisfaction, because now it's personalized. When you call in, I know what grill you purchased. And so I have a sense, based on the grill you purchased, just what your question might be, or what, you know, what special offers I might have available to me, and that's all pushed to me as an agent. So I feel more empowered, I can give you better service. You have, you know, greater loyalty towards my brand, which is a win for everyone. >> Absolutely, that empowerment of the agent, that personalization for the customer. I think again we have that growing demand and expectation that you should know why I'm calling, and you should be able to solve my problem. If you can't, I'm gonna turn and find somebody else who can do that. That's a huge risk that businesses face. Let's talk about some of the trends that you're seeing. This has been a very interesting year to say the least; what are some of the trends in the contact center space that you guys are seeing that you're working to help facilitate? >> Yeah, absolutely. So I think one of the biggest trends that we're seeing is this move towards remote work. So as you can imagine, with the pandemic almost immediately, most customers needed to quickly move their agents to a remote work scenario. And this is where Amazon Connect was a great benefit, for as I mentioned before, we saw about 5,000 new contact centers created in March and April, um, at the very beginning of the pandemic. So that was a very, uh, that's a very big trend we're seeing. And now what we're seeing is customers are saying, hey, when I have something like Amazon Connect that's in the cloud, it scales up, it provides me a great experience, I just really need a headset and an Internet connection for my agents. I'm not dealing with VPNs and, ah, a lot of the complexity that comes with trying to move an on-premises system remote. We're seeing a huge, you know, surge of adoption and usage around that. The ability to very quickly create a new contact center around specific scenarios or use cases has been really, really powerful. So, uh, those are the big trends: moving to remote work and a trend towards, um, spinning up new contact centers quickly and then spinning them back down as that demand moves or those situations move. >> Right. And as we're all experiencing, the one thing that is a given during this time is the uncertainty that remains. Scaling up, scaling down, volume changes. But it looks as if a lot of what's currently going on from home is going to stay for a while longer. I actually now think about it when I'm calling in, whether it's, you know, cable service or whatnot, I think, what if the agent is actually on their couch at home, like I am, working?
And so I think being able to facilitate that is transformative, and I think, I'll step out on a limb here, it will very potentially impact the winners and the losers of tomorrow: making sure that the consumer experience is tailored, it's personalized to your point, and that the agents are empowered in real time to facilitate a seamless and fast resolution of whatever the issue is. >> Well, and I think you hit on it earlier as well. Agents wanna be helpful. They wanna solve a customer problem. They wanna have that information at their fingertips. They wanna be empowered to take action, because at the end of their day, they want to feel like they helped people, right? And so being able to give them that information, say from Wisdom, or being able to see your entire customer profile right when you come on board, or know that you are Lisa, um, and have the confidence that I'm talking to Lisa, that this is not some sort of, you know, phishing exercise. These are all really important scenarios and features that empower the agent, lower costs significantly for the customer and create a much better customer experience for you, the caller. >> Absolutely. And we all know how important that is these days, to get some sort of satisfying experience. Last question. Eron, talk to us about, you know, as we all look forward to 2021, for many reasons, what can we expect with Amazon Connect? >> Well, we're going to continue to listen to our customers and hear their feedback and what they need. What we certainly anticipate is continued focus on that agent efficiency, giving agents more of the information they need to be successful and answer customers' questions quickly, continuing to invest in machine learning as a way of doing that. So using ML to identify that you are who you say you are, finding that right information, getting data that I can use as an agent to handle those tasks, and then automating the things that, you know, I really shouldn't have to take steps as a human to go do. So if we need to send you a follow-up email when your product ships or when your refund is issued, let me just put that in the system once and have it happen when it executes. So that level of automation, continuing to bring machine learning in to make the agent experience better and more efficient, which ultimately leads to lower costs and better CSAT. These are all the investments you'll see us continue to make next year. >> Excellent stuff, Eron, thank you so much for joining me on the program today, sharing what's next and the potential impact that Amazon Connect is making. >> Thanks, Lisa. It's great to be here. >> For Eron Kelly, I'm Lisa Martin. You're watching theCUBE's live coverage of AWS re:Invent 2020.
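For readers who want a sense of what the Amazon Connect Tasks capability discussed above looks like programmatically, here is a minimal, hedged sketch using the StartTaskContact API in boto3. The instance ID, contact flow ID, reference URL and attribute names are placeholders, and only a subset of fields is shown; consult the Amazon Connect API reference for the authoritative signature.

```python
import boto3

connect = boto3.client("connect", region_name="us-east-1")

# Placeholder identifiers - substitute your own Connect instance and contact flow.
INSTANCE_ID = "11111111-2222-3333-4444-555555555555"
CONTACT_FLOW_ID = "66666666-7777-8888-9999-000000000000"

# Create a follow-up task (e.g. "issue a refund") so it is tracked and routed
# like any other contact instead of living on a sticky note or an email reminder.
response = connect.start_task_contact(
    InstanceId=INSTANCE_ID,
    ContactFlowId=CONTACT_FLOW_ID,
    Name="Refund follow-up for order 1234",
    Description="Caller requested a refund during the contact; verify and process.",
    References={
        "Order": {"Value": "https://example.com/orders/1234", "Type": "URL"}
    },
    Attributes={"priority": "high"},  # hypothetical attribute used for routing
)

print("Task contact created:", response["ContactId"])
```

Because the task is routed through a contact flow like a call or chat, it shows up in the agent's work queue alongside calls and chats, which is the prioritization behavior described in the interview.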
SUMMARY :
It's the Cube with digital uh, excited to talk to you about Amazon Connect, talk to our audience about what that It's an Omni Channel, easy to use contact center that allows customers to spin up So I can imagine during this time that being able to have a cloud contact And so one of the key benefits of connect his ability to very What are some of the new capabilities of and I can make sure that the follow up items air prioritized. And I could just see this being something that is like I said, kind of table stakes for an organization to And so I have a sense based on the grill, you purchase just what your question might be or what you the least, what are some of the trends in the context center space that you guys were seeing that you're working So as you can imagine, with the pandemic almost immediately, most customers needed to that the agents are empowered in real time to facilitate a seamless These are all really important scenarios and features that empower the agent, Erin, talk to us about, you know, as we all look forward, Thio 2021. a human to go do so if we need to send you a follow up email when when your product ships or Excellent stuff, Erin, thank you so much for joining me on the program today, ensuring what's next and the potential the impact Live coverage of AWS reinvent
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Steve | PERSON | 0.99+ |
Dave Vellante | PERSON | 0.99+ |
Steve Manly | PERSON | 0.99+ |
Sanjay | PERSON | 0.99+ |
Rick | PERSON | 0.99+ |
Lisa Martin | PERSON | 0.99+ |
Verizon | ORGANIZATION | 0.99+ |
David | PERSON | 0.99+ |
AWS | ORGANIZATION | 0.99+ |
Amazon | ORGANIZATION | 0.99+ |
Fernando Castillo | PERSON | 0.99+ |
John | PERSON | 0.99+ |
Dave Balanta | PERSON | 0.99+ |
Erin | PERSON | 0.99+ |
Aaron Kelly | PERSON | 0.99+ |
Jim | PERSON | 0.99+ |
Fernando | PERSON | 0.99+ |
Phil Bollinger | PERSON | 0.99+ |
Doug Young | PERSON | 0.99+ |
1983 | DATE | 0.99+ |
Eric Herzog | PERSON | 0.99+ |
Lisa | PERSON | 0.99+ |
Deloitte | ORGANIZATION | 0.99+ |
Yahoo | ORGANIZATION | 0.99+ |
Spain | LOCATION | 0.99+ |
25 | QUANTITY | 0.99+ |
Pat Gelsing | PERSON | 0.99+ |
Data Torrent | ORGANIZATION | 0.99+ |
EMC | ORGANIZATION | 0.99+ |
Aaron | PERSON | 0.99+ |
Dave | PERSON | 0.99+ |
Pat | PERSON | 0.99+ |
AWS Partner Network | ORGANIZATION | 0.99+ |
Maurizio Carli | PERSON | 0.99+ |
IBM | ORGANIZATION | 0.99+ |
Drew Clark | PERSON | 0.99+ |
March | DATE | 0.99+ |
John Troyer | PERSON | 0.99+ |
Rich Steeves | PERSON | 0.99+ |
Europe | LOCATION | 0.99+ |
BMW | ORGANIZATION | 0.99+ |
VMware | ORGANIZATION | 0.99+ |
three years | QUANTITY | 0.99+ |
85% | QUANTITY | 0.99+ |
Phu Hoang | PERSON | 0.99+ |
Volkswagen | ORGANIZATION | 0.99+ |
1 | QUANTITY | 0.99+ |
Cook Industries | ORGANIZATION | 0.99+ |
100% | QUANTITY | 0.99+ |
Dave Valata | PERSON | 0.99+ |
Red Hat | ORGANIZATION | 0.99+ |
Peter Burris | PERSON | 0.99+ |
Boston | LOCATION | 0.99+ |
Stephen Jones | PERSON | 0.99+ |
UK | LOCATION | 0.99+ |
Barcelona | LOCATION | 0.99+ |
Better Cybercrime Metrics Act | TITLE | 0.99+ |
2007 | DATE | 0.99+ |
John Furrier | PERSON | 0.99+ |