
Sanjay Saxena, Northern Trust Corporation | IBM CDO Strategy Summit 2017


 

>> Announcer: Live from Boston, Massachusetts, it's theCUBE, covering the IBM Chief Data Officer Summit, brought to you by IBM. >> Welcome back to theCUBE's coverage of the IBM Chief Data Officer Strategy Summit. I'm your host, Rebecca Knight, along with my co-host, Dave Vellante. We're joined by Sanjay Saxena. He is the Senior Vice President of Enterprise Data Governance at Northern Trust Corporation. Thanks so much for joining us, Sanjay. >> Thank you. Thank you for having me. >> So, before the cameras were rolling, we were talking about how data governance is really now seen as a business imperative. Can you talk about what's driving that? >> Initially, when we started our data governance program, it was very much a regulatory program, focused on regulations such as GDPR, anti-money laundering, etc. But now, as we have evolved, most of the program in my company is focused on business and business initiatives, and a lot of that is actually driven by our customers, who want clean data. We are custodians of the data. We do asset servicing and asset management, and what the customers are expecting, as table stakes, is really clean data. So, more and more, I'm seeing it as a customer-driven initiative. >> Clean data, can you... >> So, many, many businesses rely on data. Financial services, it's all about data and technology. But when we talk about clean data, you're talking about providing data at a certain threshold, at a certain level of expectation. You're used to quality when it comes to cars and gadgets and things like that, but thinking about data as having a certain threshold that you and your customer can agree on as the right quality of data is really important. >> Well, and that's a lot of the, sort of, governance role, some of the back-office role, but then it evolved. >> Right. >> And began to add value, particularly in the days when, as IBM was saying, the data warehouse was king. You know, master data management and the single version of the truth.
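The idea of agreeing on a data quality threshold with a customer, which Saxena raises above, can be made concrete with a small sketch. This is a hypothetical illustration, not Northern Trust's actual tooling: the record fields and the 99.5% threshold are invented for the example.

```python
# Hypothetical sketch: checking a dataset against an agreed data quality
# threshold. Field names and the threshold value are illustrative assumptions.

def completeness_rate(records, required_fields):
    """Fraction of records in which every required field is present and non-empty."""
    if not records:
        return 1.0
    clean = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    return clean / len(records)

def meets_threshold(records, required_fields, threshold=0.995):
    """True if the dataset meets the quality level agreed with the customer."""
    return completeness_rate(records, required_fields) >= threshold

customers = [
    {"name": "A. Jones", "email": "a.jones@example.com", "address": "1 Main St"},
    {"name": "B. Smith", "email": "", "address": "2 Elm St"},  # missing email
]
rate = completeness_rate(customers, ["name", "email", "address"])
print(rate)  # 0.5 -- well below a hypothetical 99.5% service-level threshold
```

In practice the agreed checks would cover validity and timeliness as well as completeness, but the shape is the same: a measurable rate compared against a threshold both parties signed off on.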
Data quality became a way in which folks in your role could really add business value. >> That's right. >> How has that evolved in terms of the challenge of it, with all the data explosion? You know, big data just increased the volumes of data by massive, massive amounts, and then lines of business started to initiate projects. What did that do for the data quality challenge? >> So the data quality challenge has grown on two dimensions. One is the volume of data. You simply have more data to manage, more data to govern and provide an attestation or a certification for, to say, "Hey, it's clean data. It's good data." The other dimension is really around discoverability of that data. We have so much data lying in data lakes, and we have so much metadata about the data, that even governing that is becoming a challenge. So, I think both those dimensions are important and are making the jobs of a CDO more complex. >> And do you feel, maybe not specific to you but just as an industry, let's take financial services, that the industry is keeping pace? Because for years very few organizations, if any, have tamed the data. Is it just a matter of keeping up? >> Has that changed, or is it sort of still that treadmill? >> It's still evolving, from my perspective. Industries, again, are starting to manage the models that they have to deliver to the regulators as essential, right? Now, more and more, they're looking at customer data. They're saying, "Look, my email IDs have to be correct. My customer addresses have to be correct." It's really important to have an effective customer relationship, right? So, more and more, we are seeing the front office driving data quality and data quality initiatives. But have we attained a state of perfection? No. We are getting there, in terms of more optimization, more emphasis, more money and financials being put on data quality.
But it is still evolving. >> You talked a little bit about the importance of the customer relationship, and this conference is really all about sharing best practices, what you've learned along the way, even from the mistakes. Can you share a little bit with our viewers about what you think are sort of the pillars of a strong customer relationship, particularly for a financial services company? >> Right. So, in the industry that we are in, we do a lot of wealth management. We have institutional customers, but let's take the example of wealth management. These are wealthy, wealthy individuals, who have assets all around the world, right? It's a high-touch customer relationship kind of a game. So, we need to not only understand them, we need to understand their other relationships: their accountants, who their doctors are, etc. So, in that kind of a business, not only is it about high touch and really understanding what the customer needs are, right, and going more towards analytics and understanding what customers want, but really having correct data about them, right? Where they live, who their kids are, etc. So, it's really data and CRM coming together in that kind of environment, and data plays a pivotal role when it comes to really effective CRM. >> Sanjay, last time we talked a little bit about GDPR. Can you give us an update on where you're at? I mean, like it or not, it's coming. How does it affect your organization, and where are you in being ready for it? I mean, GDPR has taken effect. People don't realize that, but the penalties go into effect next May. So, where are you guys at? >> So, we are progressing well on our GDPR program and, as we talked before this interview, we are treating GDPR as a foundation for our data governance program, and that's how I would like other companies to treat their GDPR programs as well.
Because what we are doing in GDPR is mapping out sensitive data across hundreds of applications and creating that baseline for the whole company, so that anytime a regulator comes in and wants to know where a particular person's information is, we should be able to tell them in no uncertain terms. So we are using that to build a foundation for our data governance program. We are progressing well in terms of all aspects of the program. The other interesting aspect, which is really important to highlight, and which I didn't last time, is that there's a huge amount of synergy between GDPR and information security, which is a much older discipline, and data protection. All companies have to protect their data anyway, right? Think about it. So, now a regulation comes along and we are, in a systematic fashion, trying to figure out where all our sensitive data is and whether it is controlled, protected, etc. It is helping our data protection program as well. So all these things come together very nicely from a GDPR perspective. >> I wonder, you remember the Federal Rules of Civil Procedure. That was a big deal back in 2006, and the courts, you know, maybe weren't as advanced in understanding technology, and technology wasn't as advanced. What happened back then, and I wonder if we could compare it to what you think will happen, or is happening, with GDPR: it was impossible to solve the problem, so people just said, "Alright, we're going to fix email archiving and plug a hole." And then it became a case where, if a company could show that it had processes and procedures in place, they were covered, and that gave them a defense in litigation. Do you expect the same will happen here, or is the bar much, much higher with GDPR? >> I believe the bar is much, much higher. Because when you look at the different provisions of the regulation, right, customer consent is a big, big deal, right?
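The exercise Saxena describes above, being able to tell a regulator where a particular person's information lives across hundreds of applications, boils down to maintaining an inventory of which systems hold which personal-data fields. A toy sketch of such an inventory follows; the application and field names are invented for illustration and do not reflect Northern Trust's systems.

```python
# Toy sketch of a sensitive-data inventory: which applications hold which
# personal-data fields. Application and field names are invented examples.

SENSITIVE_DATA_MAP = {
    "crm": {"name", "email", "address", "date_of_birth"},
    "fund_admin": {"name", "account_number"},
    "marketing": {"email"},
}

def applications_holding(field):
    """All applications that store a given personal-data field."""
    return sorted(app for app, fields in SENSITIVE_DATA_MAP.items() if field in fields)

print(applications_holding("email"))  # ['crm', 'marketing']
```

A real mapping effort populates this inventory through data discovery and lineage tooling rather than by hand, but the resulting lookup is exactly this shape.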
No longer can you use customer data for purposes other than what the customer has given you consent for. Nor can you collect additional data, right? Historically, companies have gone out and collected not just your basic information, but may have collected other things that are relevant to them but not relevant to you or the relationship that you have with them. So the regulations are becoming more restrictive, and really it's not just a matter of checking a box. It is actually being able to prove that you have your data under control. >> Yeah, so my follow-up there is, can you use technology to prove that? Because you can't manually figure through this stuff. Are things like machine learning and so-called AI coming into play to help with that problem? >> Yes, absolutely. So one aspect that we didn't talk about is that GDPR covers not just structured data but unstructured data, which is huge and growing by tons. So, there are tools available in the marketplace, including IBM's tools, which help you map the data, or what we call the lineage of the data. There are other tools that help you develop a metadata repository, to say, "Hey, if it is date of birth, where does it reside in the repository, in all the repositories, in fact?" So, there are tools around metadata management. There are tools around lineage. There are tools around unstructured data discovery, which is an add-on to the conventional tools and software that we have. So all those are things that you have in your repertoire that you can use to effectively implement GDPR. >> So my next follow-up on that is, does that lead to a situation where somebody in the governance role can actually, you know, going back to the data quality conversation, demonstrate incremental value to the business as a result of becoming expert at using that tooling? >> Absolutely, so as I mentioned earlier on in the conversation, right?
You need governed data not just for your customers and your regulators, but for your analytics. >> Right. >> Right. Now, analytics is yet another dimension, in effect. So you take all this information that you're now collecting for GDPR, right? It's the same information that somebody would need to effectively do a marketing campaign, or to effectively develop insights on the customer, right? Assuming you have the consent, of course, right? We talked about that. So, you can mine the same information. Now, you have that information tagged; it's all nicely calibrated in repositories, etc. Now, you can use that for your analytics. You can use that for your top-line growth, or even to see what your internal processes are that can make you more effective from an operations perspective, and how you can get there. >> So you're talking about these new foundations of your data governance strategy, and yet we're also talking about this at a time when there's a real shortage of people who are data experts and analytics experts. What is Northern Trust doing right now to make sure that you have enough talent to fill the pipeline? >> So, we are doing multiple things. Like most companies, we are trying a lot of different things. It's hard to recruit in these areas, especially in the data science and analytics area. People not only need to have a certain broad understanding of your business, but they also need to have a deep understanding of all of the statistical techniques, etc., right? That combination is very hard to find. So, what we typically do is get interns from the universities who have the technology knowledge, and we pair them up with business experts. And we work in those collaborative kinds of teams, right? Think about agile teams that are working with business experts and technology experts together. So that's one way to solve that problem. >> Great. Well, Sanjay, thank you so much for joining us here on theCUBE. >> Thank you.
Thank you. >> Good to see you again. >> We will have more from the IBM CDO Summit just after this.

Published Date : Oct 25 2017



Sanjay Saxena, Northern Trust - IBM Fast Track Your Data 2017


 

>> Narrator: Live from Munich, Germany, it's theCUBE, covering IBM Fast Track Your Data, brought to you by IBM. >> Welcome back to Munich, Germany, everybody. This is theCUBE, the leader in live tech coverage. We go out to the events, we extract the signal from the noise, and we're here at the IBM signature moment, Fast Track Your Data, in Munich, where enterprise data governance is a huge theme. We're going to talk about that right now. I'm Dave Vellante with my co-host, Jim Kobielus. Sanjay Saxena is here, he's the Senior Vice President at Northern Trust. Sanjay, welcome to theCUBE, thanks for coming on. >> Thank you, thank you, thanks for having me. >> So, enterprise data governance is a huge theme here, and we're going to get into that, but set up Northern Trust, your organization, and your role. >> So, I am the head of enterprise data governance for Northern Trust. It's an essential enterprise role across all the business units. I've been working with Northern Trust for the last three years to set up the program, and prior to this I worked with Bank of Montreal and other institutions doing similar things. >> So how is enterprise governance evolving? I mean, I go back to, sort of, 2006, to the Federal Rules of Civil Procedure, when electronic, you know, records became admissible in courts, and that set off a whole chain reaction, plugging the holes with email archiving, and it was just really scratching the surface. We've kind of evolved from there. Is governance a strategic imperative? Why is it a strategic imperative? And how is it evolving? >> Well, our program has significantly evolved over the past three years, partly because of how the market conditions are and what the regulators expect. We fundamentally started our program focused on regulations, risk, and compliance, like most banks did. But now we are a very broad-based program within the company.
So not just the risk department or the finance department, but also the business units are asking for data governance and for quality. And we are in the asset management and asset servicing business, and for a lot of our customers, we manage their data. So they are expecting this as table stakes at this point in time. So we are realizing a lot of value from data governance in the business units as well, in addition to the risk and compliance usage. >> So how has governance evolved? I mean, I went back ten, eleven years, which is like ancient history these days. How has your, sort of, data governance strategy evolved, where has it come from, where are you now, and where are you going? >> So, about two to four years back, there wasn't anything formal when it comes to governance. It was very specific to certain units of data, certain types of data. For example, most companies are very concerned about pricing data, and that's where they would have governance. But it was never a broad-based program, nor was there an operating model around governance: organization, structure, teams of people. So over the past three or four years we've seen that evolution. So now I have a number of data stewards as part of my team and within the business units, whose sole job is to do governance. We have formally established data governance principles and practices and policies. Years back, even three years back, you could go to most organizations and you wouldn't find any policies and practices for data governance. So those are two distinct ways that governance has evolved in terms of the model. And along with that has been an evolution of tools and technology, and that's where IBM has helped us a lot.
>> Generally, for the line-of-business people, you know, governance, compliance, even security, and it's changing, but generally if I hear those words as a business person, it's, ugh, it's going to slow me down, it's going to cost me time, it's going to cost me money: bureaucracy, overhead. How do you as a governance professional address that? Can you make governance a source of value? >> Right, so governance is a very abstract concept. Most people, most businesses, want to run away from anything close to governance, right? >> Dave: No accountability. >> No accountability, right? They want to be focused on their revenues, etc. So one way to make it tangible, and what we've done, is to show them data quality in terms of metrics, in terms of dashboards, in terms of showing them the cost of poor data quality, right? For example, a simple example: a customer's name and address being wrong may not mean very much to a regulator, but it is really important from a business perspective for a relationship manager in our business. So what we've done is shown that to them, shown positive trending towards the remediation, and tied it to the business outcomes. So I wouldn't say that we are there yet, it's a journey, but there's been a lot of evolution in the process: they are accepting my organization, they are accepting the roles, and they are accepting the work we're doing. And they want to be part of it. So that's how I see it evolve, and I see this as a continuous evolution even beyond that. And ultimately I see us offering governance almost as a product. Right now, we provide a lot of data to our end consumers, to our asset management companies, to fund administrators, and others, right? And data governance is an implicit component of that. We don't charge money for it, right?
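Showing "positive trending towards remediation" on a dashboard, as described above, amounts to comparing a quality metric month over month. A minimal, hypothetical sketch follows; the monthly scores are invented example data, not real figures from any program.

```python
# Hypothetical sketch of data quality trending: month-over-month change
# in a quality score, of the kind a governance dashboard might display.
# The monthly scores below are invented example data.

def month_over_month_deltas(scores):
    """Change in the quality score between consecutive months."""
    return [round(b - a, 4) for a, b in zip(scores, scores[1:])]

def is_improving(scores):
    """True if every month-over-month change is non-negative."""
    return all(d >= 0 for d in month_over_month_deltas(scores))

# Quality scores (fraction of records passing checks) for five months.
monthly_scores = [0.91, 0.93, 0.94, 0.94, 0.97]
print(month_over_month_deltas(monthly_scores))  # [0.02, 0.01, 0.0, 0.03]
print(is_improving(monthly_scores))  # True
```

Tying each delta to an estimated cost of poor quality is what turns a chart like this into the business case Saxena describes.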
But in the world of the future I see that, depending on the tier of the customer, and depending on the kind of data that we're supplying them, we can have different tiers of data quality and governance around that, and we could explicitly charge. So they're excited about that prospect, and they want to work with us on that. >> And do you have a chief data officer? >> Sanjay: Yes. >> Okay, so, is it a relatively new role? Or has it been around? I mean, typically your industry is regulated, and so you tend to have more propensity for CDOs, but has there been one for a while, or a couple of years? >> Sanjay: It's been around for two years. >> Just two years? Okay. >> Sanjay: Yeah, two plus years, yes. >> Okay, so that chief data officer, that emergent role, looks at things like data quality, looks at how to monetize data, tries to form relationships with the lines of business, all those things. Companies generally are just starting to understand, all right, how does data affect my monetization? Not so much how do I sell the data, but how does data help me cut costs or increase revenue? >> Yeah, well, related to that very much is, for example, do you compute a metric such as customer lifetime value, which you would sacrifice if your business doesn't consolidate multiple inconsistent customer data sets down to one canonical, high-quality data set that you can then use to drive targeted marketing and better engagement? Do you report, like, a CLV, customer lifetime value, as part of your overall governance strategy, or have you thought about doing that? >> We've thought about doing that, and those metrics are evolving in our organization, but even more basic are metrics around: is your customer contactable, right? Do you have the right information about them? Or, share of wallet is actually a better example.
Like, we have different investment products, and we have different products that we sell to our wealthy individuals. What is the average number of products that they have from us? And to be able to monitor and measure that over a period of time is a really important thing for businesses to do. >> Okay, I see your button here, your badge here, it says IBM Analytics, Global Elite. I think there was a little reception last night by the lake, and, you know, all the execs took you guys out and wined and dined you, and, you know, that's good. We saw that action going on. But so, what does that mean, a Global Elite? That means you're a top-tier customer. What's your relationship with IBM, and how has that evolved? >> Right, so yeah, Northern Trust buys a lot of stuff from IBM: a lot of technology, tools, consulting. So we are one of the top-tier customers, and that's why we are part of the Global Elite program. And our relationship has really, really evolved over time, especially in the governance space I'm talking about, and IBM has been a significant partner for us in terms of the initial strategy around governance, which we implemented, and we are still on track to get that fully implemented. Equally important are the tools and technologies that they brought into the space. So most of the vendors provide segregated tools for different portions of data governance. You'll find some people good in lineage, good in metadata, glossary, etc. But IBM has an end-to-end suite, and we've been able to integrate that, we've been able to make it a single integrated solution, and that's really benefited us. So that's really been the contribution of IBM. >> And, okay, so can you talk more about the business impact of that single integrated solution? >> So the business impact is that today, unlike ever in the past, we have data quality dashboards.
And this is, we are measuring data quality across thousands of data attributes on a monthly basis. We are publishing trends around data quality. And for developers, for business people who are interested in where the data is coming from, we have lineage, we have an enterprise glossary. So it's a one-stop solution across all of those. The business people are able to look at that, whether it's risk, finance, or the business units, on a monthly basis. We're able to provide implications of quality, we provide trending, so it is really taking us towards being a data-driven organization. >> Have you been a user, at least a beta user, of the governance catalog that IBM has announced today? What are your thoughts about that? >> Yes, so we've been using the Information Governance Catalog for the last three years. We have, as I said, several thousand data elements in the Information Governance Catalog. And what that does is create a single vocabulary within the bank, and you cannot even imagine how difficult that is, because for two business units to agree on the meaning of a term requires a lot of discussion and deliberation. But having one simple repository that has all of the metadata is one aspect of it. The second thing, which has implications in terms of data security and protection, is that we are able to tag the data as sensitive data, for example for GDPR. So we are using the same tool to tag sensitive data elements, and, as I said, the whole lineage: where does it reside, where does the data flow to? All of those things are very, very easy and have been implemented in the IGC. >> Sanjay, what would you say is your biggest challenge as an enterprise data governance professional? >> Change management is still the biggest challenge, it is. As I said, it's a journey.
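The tagging described above, flagging glossary terms as sensitive for GDPR, can be sketched as a simple term catalog. This is an illustrative toy in the spirit of an information governance catalog, not IGC's actual API; the terms, definitions, and tags are invented.

```python
# Toy sketch of a business glossary with sensitive-data tagging, loosely
# in the spirit of an information governance catalog. Terms, definitions,
# and tags are invented examples; this is not IGC's real interface.

class Glossary:
    def __init__(self):
        self._terms = {}

    def define(self, term, definition, sensitive=False):
        """Register a business term, optionally tagging it as sensitive."""
        self._terms[term] = {"definition": definition, "sensitive": sensitive}

    def sensitive_terms(self):
        """All terms tagged as sensitive, e.g. for a GDPR data-protection review."""
        return sorted(t for t, meta in self._terms.items() if meta["sensitive"])

g = Glossary()
g.define("date_of_birth", "Customer's date of birth", sensitive=True)
g.define("account_open_date", "Date the account was opened")
print(g.sensitive_terms())  # ['date_of_birth']
```

Once every sensitive element carries such a tag, answering "where is this person's data, and is it protected?" becomes a query rather than a project.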
And getting every individual in the enterprise, for example, to start using this glossary that I just talked about, or getting people to systematically look at data quality across the board. The other piece is the funding around data initiatives, right? Everyone is used to large transformation programs, but when I come up with a list of, here are the top ten data quality issues that need to be fixed, everybody looks over everybody else's shoulder, I guess, and says, who's going to pay for it, right? And is this really our problem, or is this the problem of somebody else? So we get into a lot of those discussions, but it's a journey, as I said. >> Well, so you need executive support, and to get executive support you have to demonstrate how it drives business value. So there's some carrot and stick involved. The stick is, well, we've got to comply. We've heard a lot about GDPR and how that's going to, you know, cause pain. Okay, so that's the stick. The carrot is the data monetization and the data value piece. Connecting data quality to data value is that, you know, enticement, is it not? >> That's absolutely right, and the more we can show monetization of data, or even the fact that, because of data governance or quality, we were able to acquire a new customer. It doesn't all need to be tangible is what I'm saying. But the more we can show monetization, the better off we'll be in terms of selling the program. >> Excellent. Well, Sanjay, thanks very much for coming to theCUBE and sharing your experience, we really appreciate it. >> Sanjay: Thank you, thank you very much. >> You're welcome. (techno music)

Published Date : Jun 23 2017



Compute Session 04


 

>>Good morning. Good, absolute and good evening to all >>those who are listening to this presentation. >>I am rather to Saxena and I manage the platform >>solutions and the thought body operating systems team in the compute workload and solutions group within HP compute >>today I'm >>going to discuss about containers >>and what containers >>do for you >>as a customer >>and why >>should you consider h PE container solutions >>for transforming your business? >>Let's talk about how some of >>the trends seen >>in the industry are impacting the >>customer's day >>in and >>day out and what is it that >>they really need >>cloud services >>and continue your ization, increase operational flexibility, agility and >>speed. >>But non native >>apps seem >>to be a serious issue. >>These legacy apps >>and architecture slow the >>development team, >>making it much harder to meet competitive demand >>and cost pressures. It administrators are >>looking for a way to quickly deploy and manage the resources there. Developers need. >>They want to release more >>updates more quickly. Digital transformation has really shifted >>its focus >>from operations. Two applications, it's all >>about gaining the agility to deploy code faster >>developers want the >>flexibility to choose from a variety of >>Os or containerized ab stacks and to have fast access >>to the resources >>they need. And Ceos >>and line >>of business owners need visibility >>into cost >>and usage so they can optimize their >>spend and drive >>higher utilization of >>their resources. >>So let's define what >>is container technology. >>Container >>technology is a method used to package >>an application >>and software. >>It is a game changer. >>Let's take a closer look at at a couple of >>examples within each area. In the area of cost savings, we achieve savings by reducing the virtualized footprint and by reducing administrative overhead >>through the introduction >>of CIA >>CD pipelines. 
>>In terms of agility, >>this helps you become more a child by enabling >>your workload portability. It also >>shortens development >>life cycle while increasing the frequency >>of application updates. Within innovation, container platform technologies >>provides >>centralized >>images and source code >>through standard >>repositories, decoupling of application dependencies >>and use of templates >>leading to enhancing >>collaboration. This kick starts your innovation >>container technology would bring >>these benefits to enterprise it and accelerate the transformation of business. >>H. P. E has the proven >>architecture and expertise for the introduction >>of container technology. >>Apps and >>data are no longer centralized in >>the data center. >>They live >>everywhere at the edge, >>in Carlos, >>in the cloud and >>in the data center. This creates >>enormous complexity for application operability >>performance >>and security >>customers are looking >>for a way >>to simplify >>speed and scale their apps and that's driving a rise in container adoption. >>Managing these >>distributed environments requires different skill sets, >>tools and processes >>to manage both >>traditional and cloud environments. >>It is complex >>and time consuming >>all of these workloads are also very >>data dependent Ai >>data analytics and that modernization are the key entry points for >>HB >>Admiral to >>intercept the transformation budget. >>A study from I. T. >>C. Found that >>More than 50 of enterprises are leveraging containers >>to modernize legacy applications >>as is >>without re architect in them. >>These containers are often then deployed >>in on premise cloud environments using kubernetes and Docker. Re implementing legacy applications >>as >>cloud native microservices >>has proven >>more difficult >>than expected, >>held back by the scarcity of the experienced Microsoft >>talent to do that work. 
As a result, only half of the new containers deployed leverage microservices for cloud-native apps. One key element of the HPE approach is to reduce the effort required to containerize these existing applications. One platform for non-cloud-native and cloud-native apps is the HPE Ezmeral Container Platform. HPE GreenLake brings the true cloud experience to your cloud-native and non-cloud-native apps without costly refactoring. With cloud services for containers through HPE GreenLake, containerizing non-cloud-native apps improves efficiency, increases agility, and provides application portability. Simple applications can take about three months, and complex ones up to a year, to refactor; with cloud services for containers through HPE GreenLake, customers can save this time and get the benefits, with 100% open-source Kubernetes, right away. With the HPE Ezmeral Container Platform, non-cloud-native, stateful enterprise apps can be deployed in containers without costly refactoring, enabling customers to bring speed and agility to non-cloud-native apps with ease. HPE GreenLake is a single platform for workloads, and it helps customers avoid the cost of moving data and apps and run workloads securely from the edge, colocations, and data centers while meeting the needs for latency, data sovereignty, and regulatory compliance.

The HPE Ezmeral Container Platform provides a container management control plane with the fully integrated HPE Ezmeral Data Fabric. The HPE Ezmeral Container Platform integrates high-performance distributed file and object storage. These turnkey, pre-configured, cloud-connected solutions are delivered in as little as 14 days and managed for you by HPE and our partners, so customers do not need to skill up on Kubernetes.

The key differentiators for HPE Ezmeral are providing a complete solution that addresses a broad set of applications, and a consistent multi-cloud deployment and management platform. It solves the data integrity and application recovery issues central to business-critical, on-premises applications. It maintains the commitment to open source, to ensure customers can take advantage of future developments in these distributions. It reduces development effort and moves application development to self-service.

Now let us look at some customer success stories with HPE Ezmeral. Here is a customer who modernized their existing legacy applications. There were a lot of blind spots in the system, and utilization was just about 10%. By transitioning to containers, they were able to get 58 times faster ingest performance, reducing a significant portion of the cost of the customer's deployment, with a significant reduction in infrastructure footprint resulting in lower TCO; and with HPE GreenLake, they received cloud agility at a fraction of the cost of the alternatives. This customer is expanding its efforts into machine learning and analytics technologies for decision support in areas of ingesting and processing large data sets. They are enabling data science and search-based applications on large, low-latency data sets using a combination of batch and streaming transformation processes. These data sets support both offline and inline machine learning and deep learning training and model execution. To deploy these environments at scale and move from experimentation to production, they need to connect the dots between their DevOps teams and the data science teams working on machine learning and analytics from an infrastructure standpoint. They're using containers and Kubernetes to drive greater agility and flexibility, as well as cost savings and efficiency, as they operationalize these machine learning, deep learning, and analytics initiatives. This includes automated configuration of software stacks and the deployment of data pipeline builds in containers. The developers selected Kubernetes as the container orchestration engine for the enterprise, and they are using the HPE Ezmeral Container Platform for their machine learning, deep learning, and analytics workloads. This customer had a growing demand for data scientists, and their goals were to gain continuous insights into existing and new customers, develop innovative products, and get them to market faster, among others. The greater infrastructure utilization on premises resulted in significant cost savings, around $6 million over three years, and significantly improved environment provisioning time, from 9 to 18 months down to just about 30 minutes.

Along those lines, there are many more examples of customer success stories across various industries that prove transitioning to the HPE Ezmeral container solutions can be a total game changer. By the way, HPE also provides container solutions with various software vendors. This customer was eager to embrace agile app development techniques that would allow them to become more agile, scalable, and affordable, helping to deliver exceptional customer service and avoid vendor lock-in. HPE partnered with them to deploy Red Hat OpenShift running on HPE hardware, which became a new container-based DevOps platform, effectively running on bare metal for minimal resource overhead and maximum performance. The customer now had a platform that was capable of supporting their virtualization and containerization ambitions.
Now let us see how HPE GreenLake can help you reduce cost, risk, and time. You get speed and time to value with pre-integrated hardware, software, and services: the HPE Ezmeral platform to design and build container-based services, and a self-service catalog and marketplace for rapid provisioning of these services. You get lower risk to the business, with containers fully managed by HPE container experts, proactive resolution of incidents, and active capacity management to scale with demand. You can reduce costs by avoiding upfront capital expense and over-provisioning, with a pay-per-use model and an intuitive dashboard for cluster costs and storage.

HPE also has a huge differentiator when it comes to security. The HPE Silicon Root of Trust secures your data at the microcode level, inside the processor itself, ensuring that your digital assets remain protected and secure. With your containerization strategy built on the world's most secure industry-standard servers, you'll be able to fully concentrate your resources on your modernization efforts. Additionally, you can enjoy benefits such as HPE firmware threat detection, along with other best-in-class innovations from HPE such as malware detection and firmware recovery. Your HPE servers are protected from silicon to software, and at every touch point in between, preventing bad actors from gaining access to containers or infrastructure.

HPE can help accelerate your transformation using three pillars. With HPE GreenLake, you can deploy any workload as a service. With HPE GreenLake services, you can now bring cloud speed, agility, and an as-a-service model to where your apps and data are today, and transform the way you do business with one experience and one operating model across your distributed clouds, for apps and data at the edge, in colocations, and in your data center.
HPE Pointnext Services, with over 11,000 IT projects conducted and 1.4 million customer interactions each year, 15,000-plus experts, and its vast ecosystem of solution partners and channel partners, is uniquely able to help you at every stage of your digital transformation, because we address some of the biggest areas that can slow you down. We bring together technology and expertise to help you drive your business forward.

And last but not least, HPE Financial Services. Flexible investment capacity is a key consideration for businesses driving digital transformation initiatives. In order to forge a path forward, you need access to flexible payment options that allow you to match IT costs to usage. From helping release capital from existing infrastructure, to deferred payments, to providing pre-owned tech to relieve capacity strain, HPE Financial Services unlocks the value of the customer's entire estate, from edge to cloud to end user, with multi-vendor solutions, consistently and sustainably, around the world. HPE FS makes IT investment a force multiplier, not a stumbling block.

HPE Ezmeral and HPE Compute are the ideal choice for your containerization strategy, combining familiar server hardware with a container platform that has been optimized for the environment. This combination is particularly cost effective, allowing you to capitalize on existing hardware skills as you focus on developing innovative containerized solutions. HPE Ezmeral fits your existing infrastructure and provides the potential to scale as required.

And with that, I conclude this session, and I hope you found it valuable. There are many resources available at hpe.com that you can use to your benefit. Thank you once again.

Published Date : Apr 9 2021



Dr. Vikram Saksena, NETSCOUT | CUBEConversation, July 2019


 

>> Announcer: From the SiliconANGLE Media office in Boston, Massachusetts, it's theCUBE. Now, here's your host, Stu Miniman.

>> Stu: Hi, I'm Stu Miniman, and this is a CUBE Conversation from our Boston-area studio. Happy to welcome a first-time guest to the program. He's from NETSCOUT, where we've been digging into the concept of visibility without borders: Dr. Vikram Saksena, who's with the Office of the CTO at the aforementioned NETSCOUT. Thank you so much for joining us.

>> Vikram: Thanks, Stu. Thanks for having me.

>> Stu: All right, Dr. Saksena, before we get into your role, why don't you go back and give us a little bit about your background. You and I have some shared background; we both worked for some of the arms of Ma Bell back in the day. You worked a little bit more senior, and you probably have a lot more patents than my current count, which is still zero.

>> Vikram: Sure, happy to do that. You're right, I started in '82, which was two years before the breakup of Ma Bell, and then everything started happening right around that time. I started in Bell Labs and stayed there close to 20 years. I did a lot of the early pioneering work on packet switching before the days of the internet: frame relay, all of that happened. It was a pretty exciting time. We built up the AT&T business from scratch to a billion dollars in the IP space, in a voice company; that was always challenging. Then I moved on to do startups in the broadband space, two of them, moved to the Boston area, then moved on to play the CTO role in public companies, Sonus Networks and Tellabs, and then came to NETSCOUT about five years ago.

>> Stu: I love talking about some of those incubators of innovation. Historically speaking, they threw off so much technology. We've been seeing so much in the media lately about the 50th anniversary of Apollo 11, and so many things came out of NASA. Bell Labs was one of those places that helped inspire me to study engineering, and it definitely got me on my career. But here we are in 2019, and you're still working with some of these telcos and how they're all dealing with this wave of cloud and the constant change there. So bring us inside: what's your role inside NETSCOUT, in that Office of the CTO?

>> Vikram: NETSCOUT is in the business of mining network data, and what we excel at is extracting what we call actionable intelligence from network traffic; we use the term "smart data." Essentially, my role is to be the bridge between our technology group and the customers: understand the problems and challenges that our customers are facing, and then work with the teams to build the right product to fit into the current environment.

>> Stu: One of our favorite things on theCUBE is talking to customers going through their transformation. When you talk about the enterprise, digital transformation is what we think of; there's more than just the buzzword there. I've talked to financial institutions, manufacturing, you name it. If it's a company that's not necessarily born in the cloud, it is undergoing that digital transformation. Bring us inside your customer base, the telcos, the service providers. Most of them have a heavy tech component to what they're doing, but are they embracing digital transformation? What does it mean for them?

>> Vikram: As you said, it's a big term that catches a lot of things, but in one word, if I described it for the telcos, it's all about agility. If you look at the telco model, historically it has been on a path where services get rolled out every six months, every year, or over multiple years; not exactly what we call an agile environment compared to today. But when the cloud happened, it changed the landscape, because cloud not only created a new way of delivering services but also changed expectations of how fast things can happen. That created high expectations on the customer side, which in turn started putting pressure on the telcos and the service providers to become as agile as cloud providers. And as you know, the network, which is really the main asset of a service provider, was built around platforms that were not really designed to be programmable. They came with hardwired services that would change on a very slow timescale, and built around that is the whole software layer of OSS/BSS, which over time became very monolithic and very slow to change. Coupling the network and the software layer created a very slow-moving environment. This is what's really causing the change to a model where the networks can be programmable, which essentially means moving from a hardware-centric model to a software-centric model, where services can be programmed on demand, created on the fly, and sometimes even put under the control of the customers; and, layering on top of that, changing the OSS infrastructure to make it more predictive and more actionable, driven by advances in machine learning and artificial intelligence, to make this entire environment extremely dynamic and agile. That's what we are seeing in the marketplace.

>> Stu: I totally agree that agility is usually the first thing put forward: I need to be faster. It used to be faster, better, cheaper; now it's faster, faster, faster, which can actually help compensate for some of those other pieces. Of course, service providers are usually very conscious of the cost of things, because if they can lower their cost, they can usually make themselves more competitive and pass that along to their ultimate consumers. Bring us inside that change to software that's going on. There are so many waves of change: everything from IoT and edge computing, which play a massive role, to 5G, which even gets talked about in the general press these days and in government. So where are your customers today, what are some of the critical challenges they have, and where does that monitoring and observability piece fit in?

>> Vikram: Good. So let me give two backdrop points. First of all, you mentioned cost. They are always very cost-conscious, trying to drive it down, and the reason is that the traditional services have been heavily commoditized: voice, texting, video, data. They've been commoditized, so the customers want the same stuff cheaper and cheaper all the time. That puts pressure on margins and on reducing cost. But now the industry is at a point where I think the telcos need to grow the top line. That's a challenge, because you can always reduce cost, but at some point you reach diminishing returns. So now the challenge is how they grow their top line so they can become healthier again, and that leads to the whole notion of what services they need to innovate on. Once you have a programmable network and software that is intelligent and smart, that becomes a platform for delivering new services. This is where, on the enterprise side, you see SD-WAN and enterprise IoT; all these services are coming now, using technologies of software-defined networking and network function virtualization. And 5G, as you mentioned, is the next generation of wireless technology coming on board right now, and it opens up new dimensions for the first time. First, not only the consumer-centric focus, which was always there, but now opening it up to enterprises, businesses, and IoT. Secondly, fixed broadband: the era where telcos used to drive either copper or fiber was slow and cumbersome, it takes a lot of time, and the cable guys have already done that with coaxial cable. So they need to go faster, and faster means wireless. And finally, with 5G you have a technology that can deliver fixed broadband, which means all the high-definition video, voice, data, and other services like AR and VR, into the home. So it's opening up a new possibility: rather than having a separate fixed network and a separate wireless network, for the first time they can collapse those into one common platform and go after both fixed and mobile, and both consumers and enterprises.

>> Stu: One of the big topics of conversation at Cisco Live, in San Diego just a short time ago, was 5G, and then Wi-Fi 6, the next generation of that, because I'm still going to need Wi-Fi inside my building for the companies. But 5G holds the promise of so much faster bandwidth and a much denser environment. I guess some of the concerns I hear out there, and maybe you can tell me where we are and where the telcos fit in, are that with 5G, from a technology standpoint we understand where it is, but the rollout is going to take time. It's great to say you're going to have this dense and highly available thing, but it's going to start in the same place as all the previous generations: the place where we actually don't have bad connectivity today. It's in the urban areas, where we have dense populations. Sometimes it's thrown out there that 5G is going to be great for edge and IoT, but we don't have the balloons and planes and towers everywhere. So where are we with that rollout of 5G? What sort of timeframes is your customer base looking at as to where that goes to play?

>> Vikram: From what I'm seeing in the marketplace, there is less of a focus on building out ubiquitous coverage, because when the focus is on consumers, you need coverage, since they're everywhere. But because they want to create new revenue and new top-line growth, they're focusing more on industry verticals and IoT. That allows you to build out networks in the pockets where your customers are, because enterprises are concentrated in the top cities and top metro areas. So before you make it available for consumers, you get an opportunity to build out, at least in the major metropolitan areas, an infrastructure where you're getting paid as you build it, because you're signing up enterprise customers who are willing to pay for these IoT services. You get paid, you get to build out the infrastructure, and then slowly, as new applications emerge, you can make it widely available for consumers. The challenge on the consumer side is that smartphones have been tapped out; people are not going to get that excited about 5G just to use the next-gen iPhone. There, it has to be about new applications and services. The things that people talk about, always on the horizon, are AR and VR and the like, but they're not there today, because a device has to come on board that becomes mass-consumable and exciting to customers. While the industry is waiting for that to happen, there's a great opportunity right now to turn up services for enterprise verticals in the IoT space, because the devices are ready, and because enterprises are going through their own digital transformation, they want to be in a connected world. They're putting pressure on telcos to connect all their devices into the network, and there is a monetization opportunity there. So what the carriers are going to do is sign up verticals, whether it's transportation or healthcare. If they sign up a bunch of hospitals, they'll deploy infrastructure in that area to serve those hospitals; if they sign up manufacturing, they'll build their infrastructure in the areas where those customers are. By that model, you can build out a 5G network that is concentrated on the customer base, and then get to ubiquitous coverage later, when the consumer applications come.

>> Stu: I like that a lot, because when I think back, it shows we've learned from the sins of the past. It used to be "if we build it, they will come": let's dig trenches along all the highways with as much fiber as we can, and then the dot-com bust happens and we have all of this capacity that we can't give away. What it sounds like you're describing is really a service-centric view: I've got customers and applications, I'm going to build to that, and then I can build off of that. Can you talk a little bit about that focus and where your customers are going?

>> Vikram: Maybe just quickly before that, I want to talk about the distributed nature of the 5G network. You mentioned edge. One of the things that happens when you want to deliver low-latency or high-bandwidth services is that you need to push things closer to the edge. When cloud started, it was more in what we call the core: the large, hyperscale data centers where applications are deployed. But when you demand low latency, say sub-15-millisecond or 10-millisecond latency, that has to be pushed much closer to the customer. This is what's forcing the edge cloud deployment in 5G. And that also forces you to distribute functionality. Everything is no longer centralized in the core but distributed at the edge; the control plane may stay in the core, but the user plane moves to the edge. That changes the entire flow of traffic and services in a 5G network. They are no longer centralized, which means it becomes more challenging to manage and assure these services in a highly distributed telco cloud environment that has this notion of edge and core. On top of that, if you say this is all about top-line growth and customer satisfaction, then your focus on operationalizing these services has to change from a network-centric view to a service-centric view. In the past, as you know, when we were both in Bell Labs and AT&T, we were pretty much focused on the network: on the data from the network elements, the switches and routers, and on making sure the network was healthy. That is good, but it's not sufficient to guarantee that the services, and the service-level agreements for customers, are being met. So you need to focus on the service layer much more than in the past. That changes the paradigm on what data you need, how you want to use it, and how you stitch together this view in a highly distributed environment, in real time and very quickly, so customers don't feel the pain if anything breaks; and in a lot of cases you can be proactive, even predictive, and take corrective actions before they impact services. This is the challenge, and clearly, from a NETSCOUT point of view, we are right in the center of this hurricane, and given our history, we have figured out how to do this.

>> Stu: Networking has a long history of: we've got a lot of data, we've got all of these flows, and things change. But exactly as you said, it's about understanding what happened at the application layer. We've been trying to make sure IT isn't just sitting on the side but driving the business: it's my application, my data flows. So maybe expand a little bit more on NETSCOUT's fit there, and why it's so critical for what customers need today.

>> Vikram: Happy to do that. If you look at the sources of data that you can actually use,
and what you should use, they basically fall into three buckets. The first is what I call infrastructure data, which is all about data you get from hypervisors and vSwitches. It tells you how the infrastructure is behaving and where you need to add more horsepower: CPUs, memory, storage, and so on. That is very infrastructure-centric. The second is from network elements: what the DNS servers, DHCP servers, routers, switches, and firewalls give you. They are, in a way, telling you what the network elements are seeing, so it's a bit of a hybrid between the infrastructure and the service layer. But the problem is that this data is very vendor-dependent and highly fragmented, because there are no real standards for how to create it. There is telemetry data, there are syslogs, and every vendor does what it thinks is best. The challenge on the service provider side then becomes how to stitch it together, because a service is an end-to-end construct: an application starts at a user and goes to a server, and you need that holistic, end-to-end view. So the most appropriate data, NETSCOUT feels, is what we call wire data, or traffic data: looking at the packets themselves, because they give you the most direct knowledge of how the service is behaving and performing. Not only that, you can actually predict problems, as opposed to reacting to them, because you can trend this data and apply machine learning to it, say what might go wrong, and take corrective action. We feel that extracting the right contextual, relevant, and timely information, in a vendor-independent way that is universally available from edge to core: those are the attributes of wire data, and we excel at processing it at the source, in real time, and
>> ...converting all of that into actionable intelligence that is very analytics- and automation-friendly. That is our strength. What that allows us to do is this: as operators go through this transition between 4G and 5G, between physical and virtual, across fixed and mobile networks, you can only get through it if you have a stitched-together, end-to-end view that crosses these boundaries, or borders as we call them: visibility without borders. In this context, your operations people never lose insight into what's going on with their customers' applications and behavior, so they can go through this migration with confidence that they will not negatively impact the user experience by using our technology. >> Yeah, you know, we've thrown out these terms, intelligence and automation, for decades in our industry. But if you look at these hybrid environments and all of these changes coming, an operator that doesn't have tools like this can't keep up. I need to have that machine learning, I have to have those tools that can help me intelligently attack these pieces; otherwise there's no way I can do it. >> Yeah, and one point there: it's garbage in, garbage out. If you don't get the right data, you can have the most sophisticated machine learning, but it's not going to predict the right answer. The quality of the data is just as important as the quality of your analytics and your algorithms. So we feel that the combination of the right data and the right analytics is how you're going to get accurate predictions, and automation around that whole suite. >> Okay: right data, right analytics. I want to give you the final word, final takeaways for your customers today. >> I think we are in a very exciting time in the industry. 5G is probably the first generation of technology coming on board where there is so much focus on things like security and new applications and so on. I think it's an exciting time for service providers to take advantage of this platform, to use it to deliver new services and ultimately see their top lines grow, which we all want in the industry, because if they are successful, then we as suppliers do well. So I think it's a pretty exciting time, and we at NETSCOUT are happy to be in this spot right now, and to see and help our customers go through this transition. >> All right, Dr. Vikram Singh Saxena, thank you so much for joining us and sharing with us everything that's happening in your space. Glad to see the excitement still there with the journey that you've been on. >> Thank you, Stu. Happy to be here. >> All right, and as always, check out theCUBE.net for all of our content. I'm Stu Miniman, and thanks as always for watching theCUBE. [Music]
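Dr. Saxena's "garbage in, garbage out" point can be made concrete with a toy data-quality gate: telemetry is validated before it ever reaches an analytic, so the model only sees "right data." This is a minimal sketch under stated assumptions; the field names and validity rules are invented for illustration, not NETSCOUT's actual product logic.

```python
# Toy data-quality gate: reject obviously bad telemetry before analytics.
# The "latency_ms" field and the 0..10,000 ms validity range are assumptions
# made up for this sketch.

def is_valid(sample):
    """Reject readings that are missing or physically implausible."""
    latency = sample.get("latency_ms")
    return latency is not None and 0 <= latency < 10_000

def average_latency(samples):
    """Average only the readings that pass the quality gate."""
    clean = [s["latency_ms"] for s in samples if is_valid(s)]
    return sum(clean) / len(clean) if clean else None

feed = [
    {"latency_ms": 12},
    {"latency_ms": -3},    # corrupted sensor reading
    {"latency_ms": None},  # dropped measurement
    {"latency_ms": 18},
]
result = average_latency(feed)
```

Without the gate, the corrupted and missing readings would silently skew the statistic that any downstream prediction depends on, which is exactly the "right data before right analytics" ordering he argues for.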

Published Date : Jul 17 2019



Eric Herzog & Mark Godard | IBM Interconnect 2017


 

>> Narrator: Live from Las Vegas, it's theCUBE, covering InterConnect 2017. Brought to you by IBM. >> Hey, welcome back everyone. We're live here in Las Vegas for IBM InterConnect 2017, SiliconANGLE theCUBE's exclusive coverage of IBM InterConnect 2017. I'm John Furrier, my co-host Dave Vellante. Our next two guests: Eric Herzog, Vice President of Marketing for IBM Storage (nice to see you again, you were on yesterday), and Mark Godard, Manager of Customer Success and Partnerships at SparkCognition, a customer. Guys, welcome to theCUBE, good to see you again. And welcome for the first time. >> Thank you. >> Thank you. >> Okay, so we're going to talk about some stories we did yesterday, but you've got the customer here. What's the relationship, why are you guys here? >> We provide the storage platform. They use our flash technology. Spark is a professional software company. It's not a custom house; they are a software company. >> And Spark, not related to the open-source Apache Spark. Just the name Spark: SparkCognition. Make sure to get that out of the way. Go ahead, continue. >> So they're a hot startup. They have a number of different use cases, including cybersecurity, real-time IoT, predictive analytics, and a whole bunch of other things that they do. They deliver either through a service model or on premise. When it's in their service model, they use our flash and our Power servers. When it's on premise, they recommend the hardware you should use to optimize the software if the customer buys an on-premise version. They offer it both ways, but part of the reason we thought it would be interesting is that they're a professional software company. A lot of the people here, as you know, are regular developers, in-house developers. In this case these guys are a well-funded VC startup that delivers software to the end user base. >> Tell us more about SparkCognition. Give us the highlights. >> Yeah, appreciate it.
SparkCognition, we're a cognitive algorithms company. We do data science, machine learning, natural language processing, kind of the whole gamut there. We have three products. SparkPredict is our predictive analytics, our predictive maintenance product. SparkSecure is our network log security product. And DeepArmor is a machine learning endpoint protection product. In that you kind of hear we're in IoT, the industrial IoT, the IIoT of things. In cybersecurity we've done other machine learning use cases as well, but predictive maintenance and cybersecurity are our two most advanced use cases, in industrial areas. We've been around about three years. We have around 100 people. Appreciate Eric talking about how well-financed we are and how our success really is budding thus far. We're happy to be here. >> John: Where are you guys located? >> We're based out of Austin, Texas. >> John: Another Austin. >> Yeah, Austin, Texas. >> Dominant with Austin. >> It's always good to have financing. You can't go out of business if you don't run out of money. Talk about the industrial aspect. One of the things that is hot here, though it's not mainstream yet: blockchain is the big announcement, but IoT is the big one. Industrial IoT is interesting because now you have the digitization of business as a big factor, and that business is going to be throwing off massive amounts of analog-turned-digital data now. So analog to digital, what's going on there? What are you guys doing to help, and where does the storage fit in? >> Yeah, I appreciate that. So IIoT, industrial: there's obviously big clients there. There's value in this information. For us, predictive maintenance is the big play. A study I read the other day by Boston Consulting Group talks about services and applications in the industrial internet of things.
There's an $80 billion market in the next five years, with predictive maintenance leading the way as the most mature application there. So we're happy to be riding the front of that wave, really pushing the state of the art. Predictive maintenance is valuable to clients because the idea is to predict failures and optimize resources: get more energy out of your wind farm, get more gas out of the ground, you name it. Having software that can deliver those solutions to clients efficiently, without a lot of startup cost for each new iteration, is important; the whole goal is to reduce maintenance costs and extend the useful life of assets. That's what SparkPredict, our product, has been laboring to do. We have a successful deployment on 1,100 turbines with Invenergy, which is the largest wind production company in the United States. We're doing work with Duke, NextEra, several other large electrical production companies, and oil and gas companies as well. In Austin we're near Houston, so we have a lot of energy production opportunity there. So predictive maintenance for us is a big play. >> So you guys did a session this week. You hosted a panel, is that right? So, I mean no offense, but what we're talking about now is really even more interesting than storage. But it's a storage panel you were hosting, right? So what was the conversation like? >> The conversation around that was: we had three software companies, SparkCognition and two other software companies, and then we had a federal integrator. All of them are doing cloud delivery. So for example, one of the other software companies, Medicat, delivers medical record keeping as a service to hospitals. They're doing predictive analytics and predictive maintenance, and also some cybersecurity. So there were three professional software companies and an integrator.
And in each case the issues were: A, we need to be up and running all the time, and the user doesn't know what storage we're using, but we can never fail because we're real time. In fact, one of the customers is the IRS. So, the federal integrator: the IRS cloud runs on IBM storage. The entire IRS runs on that cloud, on our storage, but it's their private cloud that the integrator put together. The idea was, when you've got a cloud deployment, there are two key things your storage has to do. A, it needs to be resilient as heck, because these guys and the other two companies on the software side, if they cannot serve it as a service, then no one's going to buy the software, right? Because the software is the service. So for them it's critical that their own infrastructure be resilient. And the second thing, it needs to be fast. You've got to meet the SLAs, right? So when you think about the systems integrator at the IRS, what do you think the SLAs are? And they've got like 14 petabytes of all-flash. >> You forgot dirt cheap. You've got resilient as heck, lightning fast, and it's got to be dirt cheap, too. >> Well, of course. >> They want all three, right? >> You hosted this panel, so: what were Jenny's three? Industrial ready, cloud based, and cognitive to the core. So you guys are... I'm on your website. It's cognitive this, cognitive that. You're cognitive to the core. Presumably you're using industrial-ready infrastructure, and it's all cloud based, right? Talk about that a little bit, then I've got a follow-up. >> To tie into what Eric is saying about the premium hardware and the cloud opportunity: for us to be able to do AI software, to be able to do machine learning models, these are very intensive applications that require massive amounts of CPU, IO, and fast storage. To be able to get the value from that data quickly, so that it's useful and actionable, takes that premium hardware.
So that's why we've done testing with FlashSystem with our cybersecurity product. One of the most innovative things we did in the previous year was to move from a traditional architecture using x86-64, where we had a cluster of eight servers, down to one FlashSystem array, and we were able to get up to 20 times the performance doing things like analyzing, sorting, and ingesting data with our cybersecurity platform. So in that regard we're tied very closely to the FlashSystem product. That was a very successful use case. We published a white paper on that; if anyone wants to read more, it's available on the IBM website. >> Where do you find that, search it? >> Yeah, it's on IBM.com, and it's basically how they used it to deliver software as a service. >> What do I search? >> If you search "Sparkcognition IBM" you'll find it on Google. >> My other question, my follow-up, is about these IoT apps, which are distributed by their very nature. Can we talk about the data flow? What are you seeing in terms of where the data flows? Everybody wants to instrument the windmill. You've got to connect it, then you've got to instrument it. Where's the data going? You're doing analytics locally, you're sending data back. What are you seeing in the client base? >> Yeah, that's a great question. For those in-the-field use cases, the wind turbines for example, most of our clients already have a data storage solution. We're not a data storage provider. Someone asked me yesterday, in a different conversation, why wind turbines are so ripe for the picking. It's because they're relatively modern assets. They were built with the sensors onboard. They have been collecting this data since the invention of the modern wind turbine. Generally it's sent in from the field at 10-minute intervals, and usually stored in some sort of large data center.
For our purposes, though, we collect a feed of the important information off that data, run our models, and store a small data set for a few months, whatever we think we need to train that machine learning model and to retrain and rebalance it. That's an example where we're doing the analysis in a data center or in the cloud, out of the field. The other approach is an edge analytics approach, you might have heard that term. That's usually for smaller devices, where the value of the asset doesn't justify the infrastructure to relay the information and then deploy a large-scale solution. So we actually are developing an edge analytics version of our product as well, working with a company called Flowserve; they're the world's largest pump manufacturing company. The idea is, how can we add some intelligence to these pumps that may operate near a pipeline or out in the oil field, and make those machines smarter even though they don't necessarily justify the robust IT infrastructure of a full wind turbine fleet. >> Is there a best practice that you guys see in terms of the storage? Because you bring up the edge and the network. Great point: a lot of diversity at the edge now, from industrial to people. But the data's got to be stored somewhere. I mean, is there a best practice? Is there a pattern you're seeing in terms of how people are approaching the data problem and applying algorithms to it? Do I move the data? Do I push the compute to the data? Thoughts on what you guys are seeing in terms of best practices. >> One of the other companies that was on the panel also is doing predictive modeling. They take 600 different feeds in real time and then munge it, mostly for industrial markets, mostly around raw goods.
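The turbine data flow Mark describes (readings arriving at 10-minute intervals, a small rolling window kept for training and retraining, new readings scored against that window) can be sketched roughly as follows. This is a toy illustration, not SparkCognition's pipeline; the class name, window sizes, and the 3-sigma threshold are all assumptions made for the sketch.

```python
# Rolling-window anomaly scoring for 10-minute-interval sensor readings.
# Window sizes and the z-score threshold are illustrative assumptions.

from collections import deque
from statistics import mean, stdev

class TurbineMonitor:
    def __init__(self, window_size=144):  # 144 x 10 min = one day of readings
        self.window = deque(maxlen=window_size)

    def ingest(self, reading):
        """Score one reading against recent history, then add it to the window."""
        anomalous = False
        if len(self.window) >= 30:  # need enough history before scoring
            mu, sigma = mean(self.window), stdev(self.window)
            if sigma > 0 and abs(reading - mu) / sigma > 3.0:
                anomalous = True
        self.window.append(reading)
        return anomalous

monitor = TurbineMonitor()
# A steady signal hovering around 50.0..50.4 never trips the detector...
flags = [monitor.ingest(50.0 + (i % 5) * 0.1) for i in range(100)]
# ...but a sudden excursion (a bearing-temperature spike, say) does.
spike = monitor.ingest(95.0)
```

The bounded `deque` mirrors the "store a small data set for a few months" idea: old readings age out automatically, so the model is always retrained and scored against a recent, fixed-size slice of the feed rather than the full archive.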
So, the raw goods that they need to make a machine, or make a table, or make the paper that is used behind us, or make the lights that are used here: they look at all those commodities and then they feed it out to all these consumers, not consumers but the companies that build these products. So for them, they need it in real time, so they need storage that's incredibly fast, because what they're doing is running it on super powerful CPUs loaded with DRAM, but you can only put so much DRAM in a server. They're building these giant clusters to analyze all this data, and everything else is sitting on the flash. Then they push that out to their customers. It's a slightly different model from what SparkCognition does, but similar, except they're taking it from 600 constant data sources in real time, 24 by seven, 365, and then feeding it back out to these manufacturing companies that are looking to buy all these commodities. >> You have "software defined" in your title. That was one of the big buzzwords a few years ago. Everybody wanted to replicate the public cloud on prem. We think of it as programmable infrastructure, right? Set it, and then you can start making API calls and set SLAs and thresholds, etc. Where are we at with software defined? Does it resonate with you, or is it just an industry buzzword? I'll start with Eric. >> For us, we're the largest provider of software defined storage in the world: hundreds and hundreds of millions of dollars every year. We don't sell any infrastructure. We just sell the raw software, and they use commodity infrastructure, whatever they want: hard drives, flash drives, CPUs, anything they buy from their local reseller, and then create basically high-performance arrays using that software. So they create it on their own.
Everything is built around automation, so we can automatically replicate data, snapshot data, and migrate data around from box to box, or move it from on-premise to a cloud through what we call transparent cloud tiering. All of that in the software defined storage is done based on an automation play. So the software defined storage allows them, if you will, to build their own version of our FlashSystem by buying just the raw software and buying flash from someone else, which is okay with us because the real value's in the software, obviously, as you know. That allows them to create infrastructure of their own, but with the right kind of software. They're not home-brewing the software; it's all built around automation. That's what we're seeing in the software defined space across a number of different industries, whether it be cloud providers or banks. We have all kinds of banks that use our software defined storage and don't buy the actual underlying storage from us, just the storage software. >> You may not have visibility into this, but getting kind of geeky on it: do you guys adopt that sort of software defined mentality in your approach? >> Yeah, for us software defined storage is something that we've deployed for our proof-of-concept evaluations. The nature of the work that we do means the solution is innovative to the point where everyone needs some sort of proof point for themselves before the company or the client will invest at large scale. So software defined storage, and embracing that perspective, has allowed us to deploy small-scale implementations without having our own dedicated hardware at different clients. That's enabled us to spin up an instance quickly, provision that small-scale deployment, and prove out results at a low cost to our client. That's where we really leverage that approach.
We also have used a similar approach in the cloud, where we've used multi-tenant cloud-hosted environments to support our cybersecurity product, SparkSecure, which brings down delivery costs as well. It allows us to slice up that data and deliver it at a low cost. As for our large-scale physical deployments for asset monitoring and such, we generally end up with a FlashSystem or flash storage bare-metal deployment, because that speed is critical, whether the client wants instant monitoring of a critical asset or has a financial services use case where we're looking for anomalies or threats in the cybersecurity landscape. Having that real-time model building and model result is very critical, so that bare-metal FlashSystem type of installation is our preferred route. The only other thing I would say, since you asked earlier about our approach: for us, the security of the data is very important. Most of our assets are what are called critical assets, so clients are very sensitive to the security of the data, and some are still uncomfortable with a cloud deployment. That's another reason why we have an affinity for the hardware deployment with IBM. >> Why IBM? >> Our company has really deep roots with IBM. My founder, Amir Husain, was actually on the board of directors of the original IBM Watson project, and Manoj Saxena was the original GM of the IBM Watson program. We just have a long relationship with IBM, and a lot of mutual interest in and respect for the entity. We've also found that the products are superior in many ways. We are hardware agnostic, and we're an independent advisor to our clients when it comes to how to deliver our solutions, but our professional opinion, based on the testing that we've done, is that IBM is a top-tier option. So we continue to recommend that to our clients, and when they feel it's appropriate they make that purchase through IBM.
>> Great testimonial. Eric, excited to hear that nice testimonial for you guys? Congratulations. >> He's done several panels with us, and again, part of the reason for being here was, A, it's all about IoT, which they're all into, and all about cognitive, which they're all into, and to show that you can do a software-as-a-service model even in-house. They happen to be a professional software company, but if you're a giant global enterprise you may actually deliver software as a service to your remote branch offices, which is very similar to what these guys do for other companies. This gives an example, and the other two software companies the same way, to show in-house developers that if you're going to have a private cloud, not go public, you can deliver software as a service internally to your own company through the dev model. Or you can use someone like SparkCognition or Medicat or the other companies that we showed, Z-Power, all of which were using us to deliver their software as a service with IBM flash technology. >> Dave: And you're using Watson or Watson Analytics? >> Yes, we have done integrations with Watson for our cybersecurity product. We've also done integrations with Watson rank and retrieve, using the NLP capabilities to advise the analysts both in the Predict space and in the Secure space: sort of an advisor, so a client user who sees something happening on a turbine can ask, what does this mean?, against a Watson corpus. One thing I'd add, since we were talking about why IBM: IBM really has been a leader in the space of cognitive computing, and they've invested in nurturing small companies and bringing up entrepreneurs in that space to build it out. So we appreciate that; I think it's important to mention it. >> All right, Mark, thanks so much for joining us: the great testimonial, the great insight. Good luck with your business. Congratulations on the successful startup, taking names and kicking butt.
Eric, great to see you again; thanks for the insight, and congratulations on great, happy customers. See you again. Okay, we're watching theCUBE live here at InterConnect 2017. More great coverage, stay with us. There will be more after this short break. (upbeat instrumental music)

Published Date : Mar 22 2017


Scott Howser, Hadapt - MIT Information Quality 2013 - #MIT #CDOIQ #theCUBE


 

>> wait. >> Okay, We're back. We are in Cambridge, Massachusetts. This is Dave Volante. I'm here with Jeff Kelly. Where with Wicked Bond. This is the Cube Silicon Angles production. We're here at the Mighty Information Quality Symposium in the heart of database design and development. We've had some great guests on Scott Hauser is here. He's the head of marketing at Adapt Company that we've introduced to our community. You know, quite some time ago, Um, really bringing multiple channels into the Duke Duke ecosystem and helping make sense out of all this data bringing insights to this data. Scott, welcome back to the Cube. >> Thanks for having me. It's good to be here. >> So this this notion of data quality, the reason why we asked you to be on here today is because first of all, you're a practitioner. Umm, you've been in the data warehousing world for a long, long time. So you've struggled with this issue? Um, people here today, uh, really from the world of Hey, we've been doing big data for a long time. This whole big data theme is nothing new to us. Sure, but there's a lot knew. Um, and so take us back to your days as a zoo. A data practitioner. Uh, data warehousing, business intelligence. What were some of the data quality issues that you faced and how did you deal with him? So >> I think a couple of points to raise in that area are no. One of things that we like to do is try and triangulate on user to engage them. And every channel we wanted to go and bring into the fold, creating unique dimension of how do we validate that this is the same person, right? Because each channel that you engage with has potentially different requirements of, um, user accreditation or, ah, guarantee of, you know, single user fuel. That's why I think the Holy Grail used to be in a lot of ways, like single sign on our way to triangulate across the spirit systems, one common identity or person to make that world simple. 
I don't think that's a reality in the in the sense that when you look at, um, a product provider or solution provider and a customer that's external, write those those two worlds Avery spirit and there was a lot of channels and pitch it potentially even third party means that I might want to engage this individual by. And every time I want to bring another one of those channels online, it further complicates. Validating who? That person eighty. >> Okay, so So when you were doing your data warehouse thing again as an I t practitioner, Um, you have you You try to expand the channels, but every time he did that and complex if I hide the data source So how did you deal with that problem? So just create another database and stole five Everything well, >> unfortunately, absolutely creates us this notion of islands of information throughout the enterprise. Because, as you mentioned, you know, we define a schema effectively a new place, Um, data elements into that schema of how you identified how you engage in and how you rate that person's behaviors or engagement, etcetera. And I think what you'd see is, as you'd bring on new sources that timeto actually emerge those things together wasn't in the order of days or weeks. It's on months and years. And so, with every new channel that became interesting, you further complicate the problem and effectively, What you do is, you know, creating these pools of information on you. Take extracts and you try and do something to munch the data and put in a place where you give access to an analyst to say, Okay, here's it. Another, um, Sample said a day to try and figure out of these things. Align and you try and create effectively a new schema that includes all the additional day that we just added. >> So it's interesting because again, one of the themes that we've been hearing a lot of this conference and hear it a lot in many conferences, not the technology. It's the people in process around the technology. 
That's certainly any person person would agree with that. But at the same time, the technology historically has been problematic, particularly data. Warehouse technology has been challenging you. So you've had toe keep databases relatively small and despair, and you had to build business processes around those that's right a basis. So you've not only got, you know, deficient technology, if you will, no offense, toe data, warehousing friends, but you've got ah, process creep that's actually fair. That's occurred, and >> I think you know what is happening is it's one of the things that's led to sort of the the revolution it's occurring in the market right now about you know, whether it's the new ecosystem or all the tangential technologies around that. Because what what's bound not some technology issues in the past has been the schema right. As important as that is because it gives people a very easy way to interact with the data. It also creates significant challenges when you want to bring on these unique sources of information. Because, you know, as you look at things that have happened over the last decade, the engagement process for either a consumer, a prospect or customer have changed pretty dramatically, and they don't all have the same stringent requirements about providing information to become engaged that way. So I think where the schema has, you know, has value you obviously, in the enterprise, it also has a lot of, um, historical challenges that brings along with >> us. So this jump movement is very disruptive to the traditional market spaces. Many folks say it isn't traditional guy, say, say it isn't but clearly is, particularly as you go Omni Channel. I threw that word out earlier on the channels of discussion that we had a dupe summit myself. John Ferrier, Hobby lobby meta and as your and this is something that you guys are doing that bringing in data to allow your customers to go Omni Channel. As you do that, you start again. 
Increase the complexity of the corpus of data at the same time. A lot of a lot of times into do you hear about scheme alight ski, but less so how do you reconcile the Omni Channel? The scheme of less It's their scheme alight. And the data quality >> problems, Yes, I think for, you know, particular speaking about adapt one of things that we do is we give customers the ability to take and effectively dump all that data into one common repository that is HD if s and do and leverage some of those open source tools and even their own, you know, inventions, if you will, you know, with m R code pig, whatever, and allow them to effectively normalized data through it orations and to do and then push that into tables effectively that now we can give access to the sequel interface. Right? So I think for us the abilities you're absolutely right. The more channels. You, Khun, give access to write. So this concept of anomie channel where Irrespective of what way we engaged with a customer what way? They touch us in some way. Being able to provide those dimensions of data in one common repository gives the marketeer, if you will, an incredible flexibility and insights that were previous, Who'd be discoverable >> assuming that data qualities this scene >> right of all these So so that that was gonna be my question. So what did the data quality implications of using something like HD FSB. You're essentially scheme unless you're just dumping data and essentially have a raw format and and it's raw format. So now you've gotto reconcile all these different types of data from different sources on build out that kind of single view of a customer of a product, Whatever, whatever is yours. You're right. >> So how do you go >> about doing that in that kind of scenario? 
So I think the repository in Hadoop, HDFS itself, gives you that one common ground to work in, because you've got no implications of schema or any other preconceived notions about how you're going to massage the data, if you will. It's about applying logic and looking for those universal IDs. There are a bunch of tools around that are focused on this, but you apply those tools in a way that doesn't handicap them from the start by predisposing them to some structure. You want them to decipher or call out that structure, whether through homegrown scripts or tools that might be out there, and then effectively normalize the data and move it into some structure where you can interact with it in a meaningful way. The old way of trying to bring snippets of the data from different sources into yet another database, where you've got to apply structure, takes time, months and years in some cases. And so Hadoop really allows you to speed up that process significantly by basically eliminating that part of the equation. And there's a bunch of dimensions we could talk about, things like pricing exercises, right, the quality of triangulating on what that pricing should be per product, per geography, per engagement, etcetera. A lot of those types of workloads have transitioned from mainframe-type environments, environments of legacy, to the Hadoop ecosystem. And we've seen cases where people talk about going from eight-month exercises to a week. I think that's where the value of this ecosystem and the commodity scalability really provides you with flexibility that was previously unachievable. >> So could you provide some examples, either from your own career or from customers you're seeing, in terms of the data quality implications of the type of work they're doing?
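The pipeline the guest describes — land raw, multi-channel data first, normalize it through iterations, then push it into tables a SQL interface can query — can be sketched roughly as follows. This is a minimal Python sketch, not Hadapt's actual implementation: SQLite stands in for the SQL layer, a list of JSON blobs stands in for HDFS, and all field and channel names are hypothetical.

```python
import json
import sqlite3

# Raw, schema-less records "dumped" from different engagement channels.
# In the interview this landing zone is HDFS; a list of JSON blobs stands in here.
raw_landing_zone = [
    '{"chan": "web",   "user": "alice", "Email": "a@x.com"}',
    '{"chan": "email", "addr": "a@x.com", "name": "alice"}',
    '{"chan": "store", "customer": "bob", "email": "b@x.com"}',
]

def normalize(record: dict) -> dict:
    """One 'iteration' of normalization: map per-channel field names
    onto a common shape (all field names here are hypothetical)."""
    email = record.get("Email") or record.get("addr") or record.get("email") or ""
    name = record.get("user") or record.get("name") or record.get("customer")
    return {"channel": record["chan"], "name": name, "email": email.lower()}

# Normalize after landing (schema-on-read), then push into a SQL-accessible table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE engagements (channel TEXT, name TEXT, email TEXT)")
db.executemany(
    "INSERT INTO engagements VALUES (:channel, :name, :email)",
    [normalize(json.loads(r)) for r in raw_landing_zone],
)

# The unified, omnichannel view of a customer, via the SQL interface.
rows = db.execute(
    "SELECT email, COUNT(DISTINCT channel) FROM engagements GROUP BY email"
).fetchall()
print(rows)
```

The point of the sketch is the ordering: no schema is imposed at ingest, and the structure emerges only in the normalization step, which can be re-run as the channels change.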
So one of our theses is that the data quality measures required for any given use case vary. In some cases, depending on the type of case, and depending on the speed at which you need the analysis done, the type or level of data quality is going to vary. Are you seeing that? And if so, can you give some examples of the different ways data quality is going to manifest itself in a big data workload? >> Sure. I think that's absolutely fair. Obviously there's going to be some trade-off between accuracy and performance, right? So you have to create some sort of confidence coefficient, if you will, that says: within some degree of probability, this is good enough. And there's got to be some balance between that accuracy and time. One of the things I've seen a lot of customers being interested in, there's a sort of market emerging around providing tools for authenticity of engagement. As an example: I may be a large brand, and I have very open channels that I engage somebody with, maybe email, maybe some web portal, etcetera, and there's a lot of phishing that goes on out there, right? People phishing against brands and misrepresenting themselves, etcetera. So there's a lot of desire to triangulate, on a data quality basis, who is effectively positioning themselves as me and who's really not me, and to take a cybersecurity spin and start to lock those things down and alleviate those sorts of nefarious activities. We've seen a lot of people using our tool to effectively understand and pinpoint those activities based upon behaviors, based upon outliers, and looking at examples of where the engagement is coming from that aren't authentic, if that makes sense. >> It feels somewhat nebulous, but right.
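The "confidence coefficient" idea — trade a little accuracy for a lot of time by checking quality on a sample and accepting it if it clears a threshold with some statistical confidence — can be sketched as below. This is an illustrative sketch only; the 80% target, the normal-approximation interval, and all names are assumptions, not anything from the interview.

```python
import math
import random

def good_enough(records, predicate, target=0.80, sample_size=1000, z=1.96, seed=7):
    """Estimate a quality rate from a sample instead of scanning the full
    corpus, and accept if the lower confidence bound clears the target.
    z=1.96 corresponds to a ~95% normal-approximation interval."""
    random.seed(seed)
    sample = random.sample(records, min(sample_size, len(records)))
    p = sum(bool(predicate(r)) for r in sample) / len(sample)
    margin = z * math.sqrt(p * (1 - p) / len(sample))
    return p - margin >= target, p, margin

# Hypothetical corpus: roughly 95% of records carry a usable email field.
corpus = [{"email": "x@y.com"} if i % 20 else {"email": ""} for i in range(100_000)]
ok, rate, margin = good_enough(corpus, lambda r: r["email"])
print(ok, round(rate, 3), round(margin, 3))
```

Sampling 1,000 records instead of 100,000 is the accuracy-versus-time balance the guest describes: the estimate carries a small margin of error, but the answer arrives orders of magnitude faster.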
>> So you're using analytics essentially to determine the authenticity of a person or of an engagement, using pattern detection rather than just looking at the data itself. >> Right, but it's also taking the bunch of raw data that exists out there and putting it together. Again, back to this notion of a landing zone, if you will, or data lake or whatever you want to call it: putting all of this data into one repository where now I can start to do analytics against it without any sort of predetermined schema, and start to understand, are these people who are purporting to be from firm XYZ really from XYZ? And if they're not, where are these things originating, and how and when can we start to put filters or things in place to alleviate those sorts of activities? >> It sounds like that could apply certainly to private industry, but also something government would be very interested in, in terms of what's in the news about different foreign countries potentially being the source of attacks on U.S. corporations or parts of our infrastructure, and trying to determine where that's coming from and who these people are. >> And of course it gets complicated, because they're trying to cover up their tracks, right? >> Certainly. But I think the most important thing in this context is that it's not necessarily about being able to look at it after the fact; it's being able to look at a set of conditions that occur before these things happen, identify those conditions, and put controls in place to alleviate the action from taking place.
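The behavioral outlier detection sketched in this exchange — flag engagement whose behavior sits far outside the cluster of authentic senders, so controls can be applied pre-emptively — might look like the following. This is a minimal sketch using a simple z-score; the metric, thresholds, and sender names are hypothetical, and real systems would use far richer features.

```python
import statistics

# Hypothetical per-sender engagement rates: authentic senders cluster together,
# while a spoofed sender purporting to be the brand shows outlying behavior.
requests_per_minute = {
    "sender-a": 12, "sender-b": 9, "sender-c": 11,
    "sender-d": 10, "sender-e": 8, "spoof-x": 240,
}

def flag_outliers(observations: dict, threshold: float = 3.0) -> list:
    """Flag behaviors far outside the norm (z-score above threshold),
    so filters can be applied before damage is done, not after the fact."""
    values = list(observations.values())
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:  # all senders behave identically; nothing to flag
        return []
    return [k for k, v in observations.items() if abs(v - mean) / stdev > threshold]

print(flag_outliers(requests_per_minute, threshold=2.0))
```

The design point matches the transcript: the check runs on conditions observed before an incident, so a flagged sender can be filtered or blocked ahead of time rather than investigated afterwards.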
I think that's where, when you look at the acceleration of these models and the acceleration of the quality of the data gathering, being able to put those things into place and put effective controls in place beforehand is changing the loss prevention side of the business, in this one example. But you're absolutely right: from what I see and from what our customers are doing, it's multidimensional. Cybersecurity is one example. Pricing could be another example. Engagement, from a funnel analysis or conversion-ratio perspective, could be yet another example. So I think you're right that it is ubiquitous. >> So when you think about the historical role of the... well, we had Stuart on earlier, and he was saying the first known chief data officer he could find was in 2003. So I guess that gives us a decade of history. But data quality, we've been talking about that for many, many decades. So if you think about the traditional role of an organization trying to achieve data quality, single version of the truth, information quality, information value, and you inject it with this disruption of Hadoop, to me anyway, that whole notion of data quality is changing. Because in certain use cases inference is just fine, and false positives, who cares? Say you're analyzing Twitter data. In others, like healthcare and financial services, it's critical. So how do you see the notion of data quality evolving and adapting to this new world? >> Well, one of the things you mentioned, this single version of the truth, was something that, when I was on the other side of the table... >> They were beating you over the head with it: we can do this! >> We can do this. And it's something that sounds great on paper.
But when you look at the practical implications of trying to do it in a very finite or stringently controlled way, it's not practical for the business. >> Because you're saying that the portion of your data you can give a single version of the truth on is so small, because of the elapsed time? >> That's right, there's that dimension. But there's also this element of time, right, the time that it takes to define something that rigid and that structured. >> Months. >> It's months, and by that time a lot of the innovations the business is trying to accomplish... >> The ideas have changed, the initiatives have changed. Yeah, you lost the sale. Hey, but we got the data! Look here. >> Yeah, I think that's right, and I think that's what's evolving. There's this idea of: let's fail fast and let's do a lot of iterations. The flexibility being provided in that ecosystem today gives people an opportunity to iterate and fail fast, and, as you said, to set some sort of confidence, that for this particular application we're happy with an eighty percent confidence. Go with it. >> You might be off a little bit, but it's good enough. So having said that, what can we learn from the traditional data quality and chief data officer practitioners, those who've been very dogmatic, particularly in certain industries? What can we learn from them and take into this new world? >> From my point of view, and what my experience has always been, those individuals have an unparalleled command of the business and an appreciation for the end goal the business is trying to accomplish. And it's taking that instinct, that knowledge, and applying it to the emergence of what's happening in the technology world, and bringing those two things together.
It's not so much, in a practical sense, okay, here are the technology options we have to do these desired engagements, again, whether it's the pricing engagement, the cybersecurity, or whatever. It's more: how can we accelerate what the business is trying to accomplish by applying this technology that's out there to the business problem? In a lot of ways, in the past it's always been: here's this really neat technology, how can I make it fit somewhere? And now those folks bring a lot of relevance to the technology, to say: here's a problem we're trying to solve; legacy methodologies haven't been effective, haven't been timely, haven't been scalable, whatever. Help me apply what's happening in the market today to these problems. >> Hadapt in particular is, to me anyway, a good signal of the maturity model, and with the maturity of Hadoop, it's starting to grow up pretty rapidly, with Hadoop 2.0 and so on. So where are we at? What do you see as the progression, and where are we going? >> So, you know, I mentioned it on theCUBE the last time, and I said I believe that Hadoop is the operating system of big data. And I believe there's a huge transition taking place. There was some interesting response to that on Twitter and all the other channels, but I stand behind it; I think that's really what's happening. Look at what people are engaging us to do: really start to transition away from the legacy methodologies. They're looking at not just lower-cost alternatives, but also more flexibility. And we talked about, at the summit, the notion of that revenue curve, right. Cost takeout is great, and ROI is one side of the fence here.
But I think equally and even more important is the change in the revenue curve, and the insights people are finding because of these unique channels, the omnichannel you describe. Being able to look at all these dimensions of data in one unified place is really changing the way they can go to market, the way they can engage consumers, and the way they can provide access to the analyst. >> Yeah. I mean, ultimately that's what matters most. We had Stuart Madnick on, who's written textbooks on operating systems. We probably used them; I know I did. Maybe they were gone by the time you got there, you're younger. But the point being, Hadoop as an operating system, the notion of a platform, is really changing dramatically. So I think you're right on that. Okay, so what's next for you guys? We talked about customer traction and proof points, and you're working on that. You've got great tech and an amazing team. What's next for you? >> So I think it's continuing to look at the market and being flexible with the market as the use cases develop. Obviously, as a startup, we're focused in a couple of key areas where we see a lot of early adoption and a lot of pain around a problem we can solve. But I think it's really about continuing to develop those use cases and expanding the market to become more of a holistic provider of analytic solutions on top of Hadoop. >> How's Cambridge working out for you? I mean, the company moved up from New Haven, the founders chose the East Coast, which we're obviously really happy about as East Coast people. I don't live there full time, but I might as well. So how's that working out: the talent pool, the vibrancy of the community, the young people you're able to tap? >> There's a bunch of dimensions around that.
It's hot. It's really, really hot. >> Inhumanly, yes. >> But it's been actually fantastic. And if you look at not just the talent inside the team, but around the team: if you look at our board, Jit Saxena, Chris Lynch, people who've been very successful in the database community, with decades of experience, getting folks like that onto the board; Felda Hardymon has been in this space as well for a long time. Having folks like that as advisors, providing guidance to the team, is absolutely incredible. hack/reduce is a great facility where we do things like hackathons and meetups and get the community together. So there's been a lot of positive inertia around the company just being here in Cambridge. But from a development resource or recruiting point of view, it's also been great, because you've got some really exceptional database companies in this area, and history will show you there's been a lot of success here, not only in incubating technology but in building real database companies. And we're a startup on the block that people are very interested in, and I think we show a lot of the dynamics that are changing in the market and the way the market's moving. So the ability for us to recruit talent is exceptional, right? We've got a lot of great people to pick from. We've had a lot of people join from other previously very successful database companies. The team's growing significantly in the engineering space right now. But I just can't say enough good things about the community, hack/reduce, and all the resources we get access to because we're here in Cambridge. >> hack/reduce is cool, and you guys are obviously leveraging that, doing how-tos to bring people in. It's not an incubator; it's really more of an idea cloud, a resource cloud, started by Fred Lalonde and Chris Lynch.
Essentially, people come in and they share ideas. You guys, I know, have hosted a number of how-tos, and it's basically open. We've done some stuff there. It's very cool. >> Yeah, for us it's also a great place to recruit, right? We've met a lot of talented people there, and with the university participation as well, we get a lot of talent coming in to participate in these activities. And we do things that aren't just Hadapt-related; we've had people teach Hadoop sessions and just evangelize what's happening in the ecosystem around us. Like I said, it's been a great resource pool to engage with, and I think it's been as beneficial to the community as it has been to us. We're very grateful for that. >> All right, Scott, always awesome. I knew you'd have some good practitioner perspectives on data quality. Really appreciate you stopping by. >> My pleasure. Thanks for having me. >> Take care. Okay, keep it right there, everybody, we'll be right back with our next guest. This is Dave Vellante with Jeff Kelly. This is theCUBE. We're live at the MIT Information Quality Symposium. We'll be right back.

Published Date : Jul 17 2013

