Matt Maccaux, HPE | HPE Discover 2021


 

(bright music) >> Data by its very nature is distributed and siloed, but most data architectures today are highly centralized. Organizations are increasingly challenged to organize and manage data, and to turn that data into insights. This idea of a single monolithic platform for data is giving way to new thinking, where a decentralized approach, with open cloud native principles and federated governance, will become an underpinning of digital transformations. Hi everybody. This is Dave Vellante. Welcome back to HPE Discover 2021, the virtual version. You're watching theCUBE's continuous coverage of the event, and we're here with Matt Maccaux, who's a field CTO for Ezmeral Software at HPE. We're going to talk about HPE's software strategy, and Ezmeral, and specifically how to take AI analytics to scale and ensure the productivity of data teams. Matt, welcome to theCUBE. Good to see you. >> Good to see you again, Dave. Thanks for having me today. >> You're welcome. So talk a little bit about your role as a CTO. Where do you spend your time? >> I spend about half of my time talking to customers and partners about where they are on their digital transformation journeys, and where they struggle with this sort of last phase, where we start talking about bringing those cloud principles and practices into the data world. How do I take those data warehouses, those data lakes, those distributed data systems into the enterprise and deploy them in a cloud-like manner? Then the other half of my time is working with our product teams to feed that information back, so that we can continually innovate to the next generation of our software platform. >> I've been following HP and HPE for a long, long time, as theCUBE has documented. We go back to when the company was breaking into two parts, and at the time a lot of people were saying, "Oh, HP is getting rid of their software business, they're getting out of software." I said, "No, no, no, hold on.
They're really focusing", and with the whole focus around hybrid cloud, and now as a service, you've really retooled that business and sharpened your focus. So tell us more about Ezmeral. It's a cool name, but what exactly is Ezmeral software? >> I get this question all the time. So what is Ezmeral? Ezmeral is a software platform for modern data and analytics workloads, using open source software components. We came from some inorganic growth. We acquired a company called Scytale, which brought us a zero trust approach to doing security with containers. We bought BlueData, who came to us with an orchestrator before Kubernetes even existed in the mainstream. They were orchestrating workloads using containers for some of these more difficult workloads: clustered applications, distributed applications like Hadoop. Then finally we acquired MapR, which gave us this scale-out distributed file system and additional analytical capabilities. What we've done is we've taken those components, and we've also gone out into the marketplace to see what open source projects exist to allow us to bring those cloud principles and practices to these types of workloads, so that we can take things like Hadoop, and Spark, and Presto, and deploy and orchestrate them using open source Kubernetes, leveraging GPUs, while providing that zero trust approach to security. That's what Ezmeral is all about: taking those cloud practices and principles, but without locking you in. Again, using those open source components where they exist, and then committing and contributing back to the open source community where those projects don't exist. >> You know, it's interesting, thank you for that history. When I go back, I have been there since the early days of Big Data and Hadoop and so forth, and MapR always had the best product, but they couldn't get it out. Back then it was like kumbaya, open source, and they had this kind of proprietary system, but it worked, and that's why it was the best product.
So at the same time, they participated in open source projects, because everybody did; that's where the innovation is going. So you're making that really hard-to-use stuff easier to use with Kubernetes orchestration, and then obviously, I'm presuming with the open source chops, sort of leaning into the big trends that you're seeing in the marketplace. So my question is, what are those big trends that you're seeing when you speak to technology executives, which is a big part of what you do? >> The trends, I think, are a couple-fold, and it's funny to talk about Hadoop, but I think the final nails in the coffin have been hammered into the Hadoop space now. So the leading trend, of where organizations are going, is that we're seeing organizations wanting to go cloud first. But they really struggle with these data-intensive workloads. Do I have to store my data in every cloud? Am I going to pay egress in every cloud? Well, what if my data scientists are most comfortable in AWS, but my data analysts are more comfortable in Azure? How do I provide that multi-cloud experience for these data workloads? That's the number one question I get asked, and that's probably the biggest struggle for these chief data officers and chief digital officers: how do I allow that innovation while maintaining control over my data compliance, especially when we talk about international standards, like GDPR, the ability to restrict access to data, the ability to be forgotten? In these multinational organizations, how do I sort of square all of those components? Then how do I do that in a way that just doesn't lock me into another appliance or software vendor stack? I want to be able to work within the confines of the ecosystem, use the tools that are out there, but allow my organization to innovate in a very structured, compliant way. >> I mean, I love this conversation, and you just, to me, hit on the key word, which is organization. I want to talk about what some of the barriers are.
And again, you heard my rant up front. I really do think that we've created barriers, not only from a technology standpoint, and yes the tooling is important, but so is the organization. As you said, an analyst might want to work in one environment, a data scientist might want to work in another environment. The data may be very distributed. You might have situations where they're supporting the line of business. The line of business is trying to build new products, and if I have to go through this monolithic, centralized organization, that's a barrier for me. And so we're seeing that change, as I kind of alluded to up front, but what do you see as the big barriers that are blocking this vision from becoming a reality? >> It very much is organization, Dave. The technology is actually no longer the inhibitor here. We have enough technology, enough choices out there, that technology is no longer the issue. It's the organization's willingness to embrace some of those technologies and put just the right level of control around accessing that data. Because if you don't allow your data scientists and data analysts to innovate, they're going to do one of two things. They're either going to leave, and then you have a huge problem keeping up with your competitors, or they're going to do it anyway, and they're going to do it in a way that probably doesn't comply with the organizational standards. So the more progressive enterprises that I speak with have realized that they need to allow these various analytical users to choose the tools they want, to self-provision those as they need to, and to get access to data in a secure and compliant way. And that means we need to bring the cloud to, generally, where the data is, because it's a heck of a lot easier than trying to bring the data to where the cloud is, while conforming to those data principles. And that's HPE's strategy. You've heard it from our CEO for years now: everything needs to be delivered as a service.
It's Ezmeral Software that enables that capability, such as self-service and secure data provisioning, et cetera. >> Again, I love this conversation, because if you go back to the early days of Hadoop, that was what was profound about Hadoop: bring five megabytes of code to a petabyte of data. And it didn't happen. We shoved it all into a data lake and it became a data swamp. And that's okay, that was the 1.0. Maybe in data it's like data warehouses, data hubs, data lakes, and maybe this is now a 4.0, but we're getting there. But open source, one thing's for sure, it continues to gain momentum; it's where the innovation is. I wonder if you could comment on your thoughts on the role that open source software plays for large enterprises, maybe some of the hurdles that are there, whether they're legal or licensing, or just fears. How important is open source software today? >> I think the cloud native developments, following 12-factor application and microservices-based patterns, paved the way over the last decade to make using open source technology tools and libraries mainstream. We have to tip our hats to Red Hat, right, for allowing organizations to embrace something as core as an operating system within the enterprise. But what everyone realized is that it's support that has to come with that. So we can allow our data scientists to use open source libraries, packages, and notebooks, but are we going to allow those to run in production? If the answer is, "Well, we can't get support," then we're not going to allow that. So where HPE Ezmeral is taking the lead here is, again, embracing those open source capabilities, but, if we deploy it, we're going to support it, or we're going to work with the organization that has the committers to support it. You call HPE, the same phone number you've been calling for years for tier-one, 24x7 support, and we will support your Kubernetes, your Spark, your Presto, your Hadoop ecosystem of components.
We're that throat to choke, and we'll provide support all the way up to break/fix, for some of these components and packages, giving these large enterprises the confidence to move forward with open source, knowing that they have a trusted partner to do so with. >> And that's why we've seen such success with, say, for instance, managed services in the cloud, versus throwing out all the animals in the zoo and saying, okay, figure it out yourself. But then, of course, what we saw, which was kind of ironic, was people finally said, "Hey, we can do this in the cloud more easily." So that's where you're seeing a lot of data land. However, the definition of cloud, or the notion of cloud, is changing. No longer is it just this remote set of services, "somewhere out there in the cloud," in some data center somewhere. No, it's moving to on-prem; on-prem is creating hybrid connections. You're seeing co-location facilities very proximate to the cloud. We're talking now about the edge, the near edge, and the far edge, deeply embedded. So that whole notion of cloud is changing. But I want to ask you, there's still a big push to cloud, everybody has a cloud first mantra. How do you see HPE competing in this new landscape? >> I think collaborating is probably a better word, although you could certainly argue, if we're just leasing or renting hardware, then it would be competition. But I think, again, the workload is going to flow to where the data exists. So if the data's being generated at the edge and being pumped into the cloud, then cloud is prod. That's the production system. If the data is generated via on-premises systems, then that's where it's going to be executed. That's production. And so HPE's approach is very much co-exist. It's a co-exist model: if you need to do DevTest in the cloud and bring it back on-premises, fine, or vice versa.
The key here is not locking our customers and our prospective clients into any sort of proprietary stack, as we were talking about earlier. Giving people the flexibility to move those workloads to where the data exists is going to allow us to continue to get share of wallet and mind share, and continue to deploy those workloads. And yes, there's going to be competition that comes along. Do you run this on GCP, or do you run it on GreenLake on-premises? Sure, we'll have those conversations, but again, if we're using open source software as the foundation for that, then where you run it is actually less relevant. >> So there's a lot of choices out there, when it comes to containers generally and Kubernetes specifically, and you may have answered this, you get the zero trust component, you've got the orchestrator, you've got the scale-out piece, but I'm interested in hearing, in your words, why an enterprise would or should consider Ezmeral instead of alternative Kubernetes solutions? >> It's a fair question, and it comes up in almost every conversation. "Oh, we already do Kubernetes, we have a Kubernetes standard," and that's largely true in most of the enterprises I speak to. They're using one of the many on-premises distributions or cloud distributions, and they're all fine. They're all fine for what they were built for. Ezmeral was generally built for something a little different. Yes, everybody can run microservices-based applications and DevOps-based workloads, but where Ezmeral is different is for those data-intensive and clustered applications. Those sorts of applications require a certain degree of network awareness, persistent storage, et cetera, which requires a significant amount of intelligence. Either you have to write in Golang and build your own operators, or Ezmeral can be that easy button. We deploy those stateful applications, because we bring a persistent storage layer that came from MapR.
We're really good at deploying those stateful, clustered applications, and, in fact, we've open sourced that as a project, KubeDirector, which came from BlueData. And we're really good at securing these, using SPIFFE and SPIRE to ensure that zero trust approach, which came from Scytale. And we've wrapped all of that in Kubernetes, so now you can take the most difficult, gnarly, complex, data-intensive applications in your enterprise and deploy them using open source. And if that means we have to co-exist with an existing Kubernetes distribution, that's fine. That's actually the most common scenario that I walk into. I start asking about, "What about these other applications you haven't done yet?" The answer is usually, "We haven't gotten to them yet," or "We're thinking about it," and that's when we talk about the capabilities of Ezmeral. I usually get the response, "Oh, A, we didn't know you existed, and B, well, let's talk about how exactly you do that." So again, it's more of a co-exist model rather than a compete-with model, Dave. >> Well, that makes sense. I mean, I think again, a lot of people go, "Oh yeah, Kubernetes, no big deal. It's everywhere." But you're talking about a solution, kind of taking a platform approach with capabilities. You've got to protect the data. A lot of times, these microservices aren't so micro, and things are happening really fast. You've got to be secure. You've got to be protected. And like you said, you've got a single phone number. You know, people say one throat to choke. Somebody in the media the other day said, "No, no. Single hand to shake." It's more of a partnership. I think that's apropos for HPE, Matt, with your heritage. >> That one's better. >> So, you know, thinking about this whole space, we've gone through the pre big data days, and big data was all the hot buzzword. People don't necessarily use that term anymore, although the data is bigger and getting bigger, which is kind of ironic.
Where do you see this whole space going? We've talked about that trend toward breaking down the silos, decentralization, maybe these hyper-specialized roles that we've created getting more embedded or aligned with the line of business. It feels like the next 10 years are going to be different than the last 10 years. How do you see it, Matt? >> I completely agree. I think we are entering this next era, and I don't know if it's well-defined. I don't know if I would go out on a limb to say exactly what the trend is going to be. But as you said earlier, data lakes really turned into data swamps. We ended up with lots of them in the enterprise, and enterprises had to allow that to happen. They had to let each business unit, or each group of users, collect the data that they needed, and IT sort of had to deal with that down the road. I think that the more progressive organizations are leading the way. They are, again, taking those lessons from cloud and application development, microservices, and they're allowing a freedom of choice. They're allowing data to move to where those applications are, and I think this decentralized approach is really going to be king. You're going to see traditional software packages. You're going to see open source. You're going to see a mix of those. But what I think will probably be common throughout all of that is this sense of automation, this sense that we can't just build an algorithm once, release it, and then wish it luck. We've got to treat these analytics and these data systems as living things, with life cycles that we have to support. Which means we need to have DevOps for our data science. We need CI/CD for our data analytics. We need to provide engineering at scale, like we do for software engineering. That's going to require automation, and an organizational thinking process, to allow that to actually occur. I think all of those things.
It's the sort of people, process, products thing: all three of those are going to have to come into play. But stealing those best ideas from cloud and application development, I think we're going to end up with probably something new over the next decade or so. >> Again, I'm loving this conversation, so I'm going to stick with it for a sec. It's hard to predict, but some takeaways that I have, Matt, from our conversation, I wonder if you could comment on? I think the future is more open source. You mentioned automation; Devs are going to be key. I think governance as code, security designed in at the point of code creation, is going to be critical. It's no longer going to be a bolt-on. I don't think we're going to throw away the data warehouses or the data hubs or the data lakes. I think they become a node. I like this idea, I don't know if you know Zhamak Dehghani, but she has this idea of a global data mesh where these tools, lakes, whatever, are a node on the mesh. They're discoverable. They're shareable. They're governed in a way. I think the mistake a lot of people made early on in the big data movement is, "Oh, we've got data. We have to monetize our data." As opposed to thinking about what products can I build that are based on data, that then can lead to monetization. I think the other thing I would say is the business has gotten way too technical. (Dave chuckles) It's alienated a lot of the business lines. I think we're seeing that change, and I think things like Ezmeral that simplify that are critical. So I'll give you the final thoughts, based on my rant. >> No, your rant is spot on, Dave. I think we are in agreement about a lot of things. Governance is absolutely key. If you don't know where your data is, what it's used for, and can't apply policies to it, it doesn't matter what technology you throw at it; you're going to end up in the same state that you're essentially in today, with lots of swamps. I did like that concept of a node on a data mesh.
It kind of goes back to the similar thing with a service mesh, or a set of APIs that you can use. I think we're going to have something similar with data. The trick is always, how heavy is it? How easy is it to move about? I think there's always going to be that latency issue, maybe not within the data center, but across the WAN. Latency is still going to be key, which means we need to have really good processes to be able to move data around. As you said, govern it. Determine who has access to what, when, and under what conditions, and then allow it to be free. Allow people to bring their choice of tools, provision them how they need to, while providing that audit, compliance, and control. And then again, as you need to provision data across those nodes for those use cases, do so in a well-measured and governed way. I think that's sort of where things are going. But we keep using that term governance; I think that's so key, and there's nothing better than using open source software, because that provides traceability, auditability, and this, frankly, openness that allows you to say, "I don't like where this project's going. I want to go in a different direction." And it gives those enterprises a control over these platforms that they've never had before. >> Matt, thanks so much for the discussion. I really enjoyed it. Awesome perspectives. >> Well, thank you for having me, Dave. Excellent conversation as always. Thanks for having me again. >> You're very welcome. And thank you for watching, everybody. This is theCUBE's continuous coverage of HPE Discover 2021, of course, the virtual version. Next year, we're going to be back live. My name is Dave Vellante. Keep it right there. (upbeat music)
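The SPIFFE and SPIRE identities Maccaux credits to the Scytale acquisition boil down to a URI of the form spiffe://trust-domain/workload-path issued to each workload. As a rough illustration of that shape (this is a hedged sketch, not the official go-spiffe library; the function name and example ID below are hypothetical, and the real SPIFFE specification imposes stricter character, length, and path rules):

```go
package main

import (
	"fmt"
	"net/url"
)

// parseSPIFFEID is an illustrative validator for IDs of the form
// spiffe://<trust-domain>/<workload-path>. It checks only the basic
// shape; the real SPIFFE specification is stricter.
func parseSPIFFEID(id string) (trustDomain, path string, err error) {
	u, err := url.Parse(id)
	if err != nil {
		return "", "", err
	}
	if u.Scheme != "spiffe" {
		return "", "", fmt.Errorf("scheme must be spiffe, got %q", u.Scheme)
	}
	if u.Host == "" {
		return "", "", fmt.Errorf("missing trust domain")
	}
	// SPIFFE IDs carry no userinfo, port, query, or fragment.
	if u.User != nil || u.Port() != "" || u.RawQuery != "" || u.Fragment != "" {
		return "", "", fmt.Errorf("userinfo, port, query, and fragment are not allowed")
	}
	return u.Host, u.Path, nil
}

func main() {
	// Hypothetical ID: trust domains and paths are chosen per deployment.
	td, p, err := parseSPIFFEID("spiffe://prod.example.com/payments/api")
	fmt.Println(td, p, err)
}
```

In a real deployment a SPIRE agent issues these IDs inside short-lived X.509 or JWT documents (SVIDs); the point of the sketch is only that identity is a verifiable name, not an IP address or a shared secret.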

Published Date: Jun 22, 2021



Sunil James, Sr. Director, HPE


 

(bright music) >> Welcome back to HPE Discover 2021. My name is Dave Vellante, and you're watching theCUBE's virtual coverage of Discover. We're going to dig into the most pressing topic, not only for IT, but for entire organizations, and that's cybersecurity. With me is Sunil James, senior director of security engineering at Hewlett Packard Enterprise. Sunil, welcome to theCUBE. Come on in. >> Dave, thank you for having me. I appreciate it. >> Hey, you talked about project Aurora today. Tell us about project Aurora. What is it? >> So I'm glad you asked. Project Aurora is a new framework that we're working on that attempts to provide the underpinnings for Zero Trust architectures inside of everything that we build at HPE. Zero Trust is a way of providing a mechanism for enterprises to allow for everything in their enterprise, whether it's a server, a human, or anything in between, to be verified and attested to before they're allowed to access or transact in certain ways. That's what we announced today. >> Well, in response to a spate of damaging cyber attacks last month, President Biden issued an executive order designed to improve the United States' security posture, and in that order, he essentially issued a Zero Trust mandate. You know, it's interesting, Sunil. Zero Trust has gone from a buzzword to a critical part of a security strategy. So in thinking about a Zero Trust architecture, how do you think about that, and how does project Aurora fit in? >> Yeah, so Zero Trust architecture, as a concept, has been around for quite some time now, and over the last few years, we've seen many a company attempting to provide technologies that they purport to be Zero Trust. Zero Trust is a framework. It's not one technology, it's not one tool, it's not one product. It is an entire framework of thinking and applying cybersecurity principles to everything that we just talked about beforehand.
Project Aurora, as I said beforehand, is designed to provide a way for ourselves and our customers to be able to measure, attest, and verify every single piece of technology that we sell to them, whether it's a server or anything else in between. Now, we've got a long way to go before we're able to cover everything that HPE sells, but for us, these capabilities are the root of Zero Trust architectures. You need to be able to, at any given moment's notice, verify, measure, and attest, and this is what we're doing with project Aurora. >> So you founded a company called Scytale and sold that to HPE last year, and my understanding is you were really the driving force behind the Secure Production Identity Framework, but you said Zero Trust is really a framework. That's an open source project. Maybe you can explain what that is. I mean, people talk about the NIST Framework for cybersecurity. How does that relate? Why is this important, and how does Aurora fit into it? >> Yeah, so that's a good question. The NIST Framework is a broader framework for cybersecurity that couples and covers many aspects of thinking about the security posture of an enterprise, whether it's network security, host-based intrusion detection capabilities, incident response, things of that sort. SPIFFE, which you're referring to, the Secure Production Identity Framework For Everyone, is an open source framework and technology base that we did work on when I was the CEO of Scytale, that was designed to provide a platform-agnostic way to assign identity to anything that runs in a network. And so think about yourself or myself. We have identities in our back pockets: driver's licenses, passports, things of that sort. They provide a unique assertion of who we are and what we're allowed to do. That does not exist in the world of software.
And what SPIFFE does is provide that mechanism, so that you can actually use frameworks like project Aurora, which can verify the underpinning infrastructure on top of which software workloads run, to verify those SPIFFE identities even better than before. >> Is the intent to productize this capability, you know, within this framework? How do you approach this from HPE's standpoint? >> So SPIFFE and SPIRE are, and as far as I'm concerned always will be, an open source project held by the Cloud Native Computing Foundation. It's for the world, all right? And we want that to be the case, because we think that most of our enterprise customers are not living in the world of one vendor or two vendors. They have multiple vendors. And so we need to give them the tools and the flexibility to be able to allow for open source capabilities like SPIFFE and SPIRE to provide a way for them to assign these identities and assign policies and control, regardless of the infrastructure choices they make today or tomorrow. HPE recognizes that this is a key differentiating capability for our customers, and our goal is to be able to look at our offerings that power the next generation of workloads: Kubernetes instances, containers, serverless, and anything that comes after that. And our responsibility is to say, "How can we actually take what we have and be able to provide those kinds of assertions, those underpinnings for Zero Trust that are going to be necessary to distribute those identities to those workloads, and to do so in a scalable, effective, and automated manner?" Which is one of the most important things that project Aurora does. >> So a lot of companies, Sunil, will set up a security division. But is the HPE strategy to essentially embed security across its entire portfolio? How should we think about HPE's strategy in cyber? >> Yeah, so it's a great question.
HPE has a long history in security and other domains, networking, and servers, and storage, and beyond. The way we think about what we're building with project Aurora, this is plumbing. This is plumbing that must be in everything we build. Customers don't buy one product from us and they think it's one company, and something else from us, and they think it's another company. They're buying HPE products. And our goal with project Aurora is to ensure that this plumbing is widely and uniformly distributed and made available. So whether you're buying an Aruba device, a Primera storage device, or a ProLiant server, project Aurora's capabilities are going to provide a consistent way to do the things that I've mentioned beforehand to allow for those Zero Trust architectures to become real. >> So, as I alluded to President Biden's executive order previously. I mean, you're a security practitioner, you're an expert in this area. It just seems as though, and I'd love to get your comments on this. I mean, the adversaries are well-funded, you know, they're either organized crime, they're nation states. They're extracting a lot of very valuable information, they're monetizing that. You've seen things like ransomware as a service now. So any knucklehead can be in the ransomware business. So it's just this endless escalation game. How do you see the industry approaching this? What needs to happen? So obviously I like what you're saying about the plumbing. You're not trying to attack this with a bunch of point tools, which is part of the problem. How do you see the industry coming together to solve this problem? >> Yeah. If you operate in the world of security, you have to operate from the standpoint of humility. And the reason why you have to operate from a standpoint of humility is because the attack landscape is constantly changing. 
The things, and tools, and investments, and techniques that you thought were going to thwart an attacker today are quickly outdated within a week, a month, a quarter, whatever it might be. And so you have to be able to consistently and continuously evolve and adapt toward what customers are facing at any given moment's notice. I think for us, as an industry, to tackle these issues more and more, you need to have all of us start to adopt these open source patterns. We recognize that every company, HPE included, is here to serve customers and to make money for its shareholders as well. But in order for us to do that, we have to also recognize that they've got other technologies in their infrastructure as well. And so it's our belief, it's my belief, that allowing for us to support open standards with SPIFFE and SPIRE, and perhaps with some of the aspects of what we're doing with project Aurora, allows for other people to be able to deliver the same underpinning capabilities, the plumbing, if you will, regardless of whether it's an HPE product or something else along those lines as well. We need more of that generally across our industry, and I think we're far from it. >> I mean, this sounds like a war. I mean, it's more than a battle; it's a war that actually is never going to end, and I don't think there is an end in sight. And you hear CISOs talk about the shortage of talent; they're getting inundated with point products and tools, and then that just creates more technical debt. It's been interesting to watch. Interesting maybe is not the right word. But the pivot to Zero Trust, endpoint security, cloud security, and the exposure that we've now seen as a result of the pandemic, it was all sort of rushed. And then of course, we've seen, you know, the adversaries really take advantage of that. So, I mean, what you're describing is this ongoing, never-ending battle, isn't it?
>> Yeah, yeah, no, it's going to be ongoing. And by the way, Zero Trust is not the end state, right? I mean, there were things that we called the final nail in the coffin five years ago, 10 years ago, and yet the attackers persevered. And that's because there's a lot of innovation out there. There's a lot of infrastructure moving to dynamic architectures like cloud and others that are going to be poorly configured, and are not necessarily going to have the best and brightest providing security around them. So we have to remain vigilant. We have to work as hard as we can to help customers deploy Zero Trust architectures. But we have to be thinking about what's next. We have to be watching, studying, and evolving to be able to prepare ourselves, to be able to go after whatever the next capabilities are. >> What I like about what you're saying is, you're right. You have to have humility. I don't want to say, I mean, it's hard because I do feel like a lot of times the vendor community says, "Okay, we have the answer," to your point. "Okay, we have a Zero Trust solution." Or, "We have a solution." And there is no silver bullet in this game. And I think what I'm hearing from you is, look, we're providing infrastructure, plumbing, the substrate, but it's an open system. It's got to evolve. And the thing you didn't say, but I'd love your thoughts on this, is we've got to collaborate with somebody you might think is your competitor. 'Cause they're the good guys. >> Yeah. Our customers don't care that we're competitors with anybody. They care that we're helping them solve their problems for their business. So our responsibility is to figure out what we need to do to work together to provide the basic capabilities that allow for our customers to remain in business, right? If cybersecurity issues plague any of our customers, that doesn't affect just HPE, that affects all of the companies that are serving that customer.
And so, I think we have a shared responsibility to be able to protect our customers. >> And you've been in cyber for much, if not most of your career, right? >> Correct. >> So I got to ask you, did you have a superhero when you were a kid? Did you have a sort of a, you know, save-the-world thing going? >> Did I have a, you know, I didn't have a save-the-world thing going, but I had two parents that cared for the world in many, many ways. They were both in the world of healthcare. And so every day I saw them taking care of other people. And I think that probably rubbed off in some of the decisions that I make too. >> Well, it's awesome. You're doing great work. Really appreciate you coming on theCUBE, and thank you so much for your insights. >> I appreciate that, thanks. >> And thank you for being with us for our ongoing coverage of HPE Discover 21. This is Dave Vellante. You're watching theCUBE, the leader in digital tech coverage. We'll be right back. (bright music)
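An editorial aside on the SPIFFE and SPIRE standards discussed above: SPIFFE's core idea is that every workload gets a cryptographically verifiable identity expressed as a URI of the form `spiffe://<trust-domain>/<workload-path>`. As a rough illustration only (this checks a simplified subset of the SPIFFE ID grammar, not the full specification, and the function name is our own invention), validating such an identity string might look like:

```python
from urllib.parse import urlparse

def is_valid_spiffe_id(spiffe_id: str, expected_trust_domain: str) -> bool:
    """Loosely check that a string looks like a SPIFFE ID
    (spiffe://<trust-domain>/<workload-path>) in the expected trust domain.
    Illustrative subset of the real SPIFFE ID grammar, not the full spec."""
    parsed = urlparse(spiffe_id)
    return (
        parsed.scheme == "spiffe"
        and parsed.netloc == expected_trust_domain
        and parsed.path not in ("", "/")  # must name a workload, not just a domain
        and not parsed.query              # SPIFFE IDs carry no query string
        and not parsed.fragment           # ...and no fragment
    )
```

In a real Zero Trust deployment, a SPIRE agent would issue and rotate short-lived X.509 or JWT SVIDs carrying these IDs; string checks like this are only the surface of the workload-identity plumbing the conversation refers to.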

Published Date : Jun 6 2021


Arti Garg & Sorin Cheran, HPE | HPE Discover 2020


 

>> Male Voice: From around the globe, it's theCUBE covering HPE Discover Virtual Experience brought to you by HPE. >> Hi everybody, you're watching theCUBE. And this is Dave Vellante in our continuous coverage of the Discover 2020 Virtual Experience, HPE's virtual event, theCUBE is here, theCUBE virtual. We're really excited, we got a great session here. We're going to dig deep into machine intelligence and artificial intelligence. Dr. Arti Garg is here. She's the Head of Advanced AI Solutions and Technologies at Hewlett Packard Enterprise. And she's joined by Dr. Sorin Cheran, who is the Vice President of AI Strategy and Solutions Group at HPE. Folks, great to see you. Welcome to theCUBE. >> Hi. >> Hi, nice to meet you, hello! >> Dr. Cheran, let's start with you. Maybe talk a little bit about your role. You've had a variety of roles, and maybe what's your current situation at HPE? >> Hello! Hi, so currently at HPE, I'm driving the Artificial Intelligence Strategy and Solutions group, which is currently looking at how we bring solutions across the HPE portfolio, looking at every business unit, but also at the various geos. At the same time, the team is responsible for building the strategy around AI for the entire company. We're working closely with the field, we're working closely with the teams that are facing the customers every day. And we're also working very closely with the various groups in order to make sure that whatever we build holds water for the entire company. >> Dr. Garg, maybe you could share with us your focus these days?
Yeah, sure, so I'm also part of the AI Strategy and Solutions team under Sorin as our new vice president in that role, and what I'm focused on is really trying to understand what are some of the emerging technologies, whether those be things like new processor architectures or advanced software technologies, that could really enhance what we can offer to our customers in terms of AI, and exploring what makes sense and how do we bring them to our customers? What are the right ways to package them into solutions? >> So everybody's talking about how digital transformation has been accelerated. If you're not digital, you can't transact business. AI infused into every application. And now people are realizing, "Hey, we can't solve all the world's problems with labor." What are you seeing just in terms of AI being accelerated throughout the portfolio and your customers? >> So that's a very good question, because we've been talking about digital transformation for some time now. And I believe most of our customers believed initially that the one thing they have is time, thinking that, "Oh yes, I'm going to somehow at one point apply AI, and somehow at one point I'm going to figure out how to build the data strategy, or how to use AI in my different lines of business." What happened with COVID-19 in this area is that we lost one thing: time. So I think what we see in our customers is the idea of accelerating their data strategy, moving from, let's say, an environment where they would have data center models, trying to understand how do they capture data, how do they accelerate the adoption of AI within the various business units. Why? Because they understand that currently the way they are actually doing business changed completely, they need to understand how to adapt a new business model, they need to understand how to look for value pools where there are none as well.
So most of our customers today, while initially they spent a lot of time in a never-ending POC trying to investigate where they want to go, currently do want to accelerate the application of AI models and the building of data strategies: how do they use all of this data? How do they capture the data to make sure that they look at new business models, new value pools, new customer experiences, and so on and so forth? So I think what we've seen in the past, let's say, three to six months is that we lost time. But the shift towards the adoption of analytics, AI, and data strategy has accelerated a lot, simply because customers realize that they need to get ahead of the game. >> So Dr. Garg, what if you could talk about how HPE is utilizing machine intelligence during this pandemic, maybe helping some of your customers get ahead of it, or at least trying to track it. How are you applying AI in this context? >> So I think that Sorin sort of spoke to one of the things with adopting AI: it's very transformational for a business, so it changes how you do things. You need to actually adopt new processes to take advantage of it. So what I would say is right now we're hearing from customers who recognize that the context in which they are doing their work is completely different. And they're exploring how AI can help them really meet the challenges of those contexts. So one example might be how can AI and computer vision be coupled together in a way that makes it easier to reopen stores, or ensures that people are distancing appropriately in factories. So I would say that it's the beginning of these conversations as customers, as businesses, try to figure out how do we operate in the new reality that we have? And I think it's a pretty exciting time.
And I think just to the point that Sorin just made, there's a lot of openness to new technologies that there wasn't before, because there's this willingness to change the business processes to really take advantage of any technologies. >> So Dr. Cheran, I probably should have started here, but help us understand HPE's overall strategy with regard to AI. I certainly know that you're using AI to improve IT, the InfoSite product and capability via the Nimble acquisition, et cetera, and bringing that across the portfolio. But what's the strategy for HPE? >> So, yeah, thank you. That's (laughs) a good question. So obviously you started with a couple of our acquisitions in the past, because obviously Nimble, and then we talked a lot about our efforts to bring InfoSite across the portfolio. But in the past couple of months, let's say close to a year, we've been announcing a lot of other acquisitions, and we've been talking about Tuteybens, we've been talking about Scytale, we've been talking about Cray, and so on, so forth. And now what we're doing at HPE is to bring all of this IP together into one place and try to help our customers within their region out. If you're looking at, for example, what we actually got with the Cray acquisition, it was not only the systems, but they also have a lot of software and a lot of IP around optimization and so on and so forth. Also within our own labs, we've been investigating AI around, like, for example, swarm learning or accelerators, or a lot of other activity. So right now what we're trying to help our customers with is to understand how they move from the POC stage to the production stage. So (mumbles) what we are trying to do is we are trying to accelerate their adoption of AI.
So simply starting from an optimized platform infrastructure up to the solution they are actually going to apply or to use to solve their business problems, and wrapping all of that around with services, either consumed on-prem or as a service, and so on. So practically what we want to do is we want to help our customers optimize, orchestrate, and operationalize AI. Because the problem for our customers is not to start a POC; the problem is how do I then take everything that I've been developing or working on and put it in production at the edge, right? And then keep it maintained in production in order to get insights and then actually take actions that are helping the enterprise. So basically, we want to be data-driven and cloud-enabled, and we want to help our customers move from POC into production. >> Now, you work with obviously a lot of data folks, companies and data-driven data scientists; you are hands-on practitioners in this regard. One of the challenges that I hear a lot from customers is they're trying to operationalize AI, put AI into production; they have data in silos, they spend all their time munging data. You guys have made a number of acquisitions, not the least of which is Cray, obviously MapR, the data specialist, and my friend Kumar's company Blue Data. So what do you see as HPE's role in terms of helping companies operationalize AI? >> So I think that a big part of operationalizing AI is moving away from the POC to really integrate AI into the business processes you have and also the sort of pre-existing IT infrastructure you talked about; you might already have siloed data. That's sort of something we know very well at HPE. We understand a lot of the IT that enterprises already have, the incumbent IT and those systems. We also understand how to put together systems, and integrated systems, that include a lot of different types of computing infrastructure.
So whether that be different types of servers and different types of storage, we have the ability to bring all of that together. And then we also have the software that allows you to talk to all of these different components and build applications that can be deployed in the real world in a way that's easy to maintain, scale, and grow, as your AI applications will almost invariably get more complex, involve more outputs, and involve more inputs. So one of the important things as customers try to operationalize AI is knowing that it's not just solving the problem you're currently solving. It's not just operationalizing the solution you have today, it's ensuring that you can continue to operationalize new things or additional capabilities in the future. >> I want to talk a little bit about AI for good. We talked about AI taking away jobs, but the reality is, when you look at the productivity data, for instance, in the United States, in Europe, it's declining and it has for the last several decades. And so I guess my point is that we're not going to be able to solve some of the world's problems in the coming decades without machine intelligence. I mean, you think about health care, you think about feeding populations, you think about obviously facing things like pandemics, climate change, energy alternatives, et cetera, productivity is coming down. Machines are a potential opportunity. So there's an automation imperative. And do you feel, Dr. Cheran, that people are sort of beyond that machines-replacing-humans issue? Is that still an item, or has the pandemic sort of changed that? >> So I believe it is, so it used to be a very big item, you're right. And every time we were speaking at a conference and every time you're actually looking at the future of AI, right? Two scenarios come into play, right?
The first one, where machines are here to actually take our work, and then the second one, as you know, an even darker version where the Terminator is coming, and so forth, right? So basically these are the two, the lesser evil and the greater evil, and so on and so forth. And we still see that regular thing coming over and over again. And I believe that 2019 was the year of reckoning, where people started to realize that not only can we talk about responsible AI, but we can actually create an AI that is trustworthy, an AI that is fair, and so on and so forth. We also understood in 2019, when it was highly debated everywhere, which parts of our jobs are going to be replaced, like the parts that are mundane or that can actually be easily automated, and so on and so forth. With COVID-19, what happened is that people are starting to look at AI differently. Why? Because people are starting to look at data differently. And looking at data differently, how do I actually create this core of data which is trusted, secure, and so on and so forth? And they are trying to understand that if the data is trusted and secure, somehow AI will be trusted and secure as well. Now, if I actually shift forward, as you said, and then I try to understand, for example on the manufacturing floor, how do I add more machines, or how do I replace humans with machines, simply because I need to make sure that I am able to stay in production and so on and so forth. From that perspective, I don't believe that the way people are actually looking at AI from the job-marketplace perspective changed a lot.
The view that actually changed is how AI is helping us with certain processes, how AI is helping us, for example, in health care. But the idea of AI actually taking parts of the jobs, or automating parts of the jobs, we are not actually past yet. Even if in 2018, and even more so in 2019, AI through automation replaced a number of jobs, at the same time, as I was saying, it was the first year where AI created more jobs, because once you're displacing work in one place, you're actually creating more work, more opportunities, in other places as well. But still, I don't believe the feeling changed. But we realize that AI is a lot more valuable, and it can actually help us through some of our darkest hours, but also allow us to get better and faster insights as well. >> Well, machines have always replaced humans, and now for the first time in history they're doing so in really cognitive functions in a big way. But I want to ask you guys, I'll start with Dr. Arti, a series of questions that I think underscore the impact of AI and the central role that it plays in companies' digital transformations. We talk about that a lot. But the questions that I'm going to ask you, I think, will hit home just in terms of some hardcore examples, and if you have others I'd love to hear them, but I'm going to start with Arti. So when do you think, Doctor, machines will be able to make better diagnoses than doctors? Are we actually there today already? >> So I think it depends a little bit on how you define that. And I'm just going to preface this by saying both of my parents are physicians. So I have a little bit of bias in this space. But I think that humans can bring creativity and a certain type of intelligence that it's not clear to me we even know how to model with a computer. And so diagnoses have sometimes two components. One is recognizing patterns and being able to say, "I'm going to diagnose this disease that I've seen before."
I think that we are getting to the place where there are certain examples, it's just starting to happen, where the data that you need to make a diagnosis is well understood. A machine may be able to sort of recognize those subtle patterns better. But there's another component of doing diagnosis, when it's not obvious what you're looking for. You're trying to figure out what is the actual sort of set of diseases I might be looking at. And I think that's where we don't really know how to model that type of inspiration and creativity that humans still bring to things that they do, including medical diagnoses. >> So Dr. Cheran, my next question is, when do you think that owning and driving your own vehicle will become largely obsolete? >> (laughs) Well, I believe my son is six years old now. And I believe, I'm working with a lot of companies to make sure that he will not get his driving license with his ID, right? So depending who you're asking, and depending on the level of autonomy that you're looking at, but you just mentioned the level five, most likely. So there are a lot of dates out there, so some people actually say 2030. I believe that for my son, in most of the cities in the US but also most of the cities in Europe, by the time he's 18, in let's say 2035, I'll try to make sure that I'm working with the right companies not to allow him to get a driving license. >> My next question, maybe both of you can answer. Do you think the traditional banks will lose control of payment systems? >> So that's an interesting question, because I think it's broader than an AI question, right? I think that it goes into some other emerging technologies, including distributed ledgers and sort of the more secure forms of blockchain. I think that's a challenging question to my mind, because it's bigger than the technology. It's got economic and policy implications that I'm not sure I can answer.
>> Well, that's a great answer, 'cause I agree with you already. I think that governments and banks have a partnership. It's an important partnership for social stability. But similarly, we've seen now, Dr. Cheran, in retail, obviously COVID-19 has affected retail in a major way, especially physical retail. Do you think that large retail stores are going to go away? I mean, we've seen many in chapter 11 at this point. How much of that is machine intelligence versus just social change versus digital transformation? It's an interesting question, isn't it? >> So I think most of the... Right now the retailers are here to stay, I guess, for the next couple of years. But moving forward, I think it's their capacity of adapting to stores like walk-in stores, or stores where basically we just go in and there are no shop assistants, and you don't even need the credit card to pay, you're actually able to pay either with your face, or with your phone, or with small chips, and so on and so forth. So I believe currently, in the next couple of years, obviously they are here to stay. Moving forward, then, we'll get artificial intelligence or robotics applied everywhere in the store and so on and so forth. Most likely their capacity of adapting to the new normal, which is placing AI everywhere and optimizing the walk-in, through predicting when and how to guide the customers to the shop, and so on and so forth, would allow them to actually survive. I don't believe that everything is actually going to be done online, especially from the retailer perspective. Most of the... We've seen a big shift with COVID-19. But what I was reading the other day, especially in France, now that the country has opened again, we've seen a very quick pickup in the retailers, of people actually visiting the stores as well.
So it's going to be some very interesting five to 10 years, and then most of the companies that have adapted to the digital transformation and to the new normal, I think they are here to stay. Some of them obviously are going to take some time. >> I mean, I think it's an interesting question too that you're really sort of triggering in my mind, is when you think about the framework for how companies are going to come back and come out of this, it's not just digital, that's a big piece of it, like how digital the business is, can they physically distance? I mean, I don't know how sports arenas are going to be able to physically distance, that's going to be interesting to see. How essential is the business? And if you think about the different industries, it really is quite different across those industries. And obviously, digital plays a big factor there. But maybe we could end on that, your final thoughts, and maybe any other things you'd like to share with our audience? >> So I think one of the things that's interesting anytime you talk about adopting a new technology, and right now we're happening to see this sort of huge uptick in AI adoption happening right at the same time as this sort of massive shift in how we live our lives, is sort of an acceptance, I think, that we can't just go back to the way things were. As you mentioned, there will probably be a continued sort of desire to maintain social distancing. I think that it's going to force us to rethink why we do things the way we do now, a lot. The retail environments that we have, the transportation solutions that we have, they were adapted in many cases in a very different context, in terms of what people needed to do on a day-to-day basis within their life, and then what was the sort of state of technologies available.
We're sort of being thrust and forced to reckon with, like, what is it I really need to do to live my life, and then what are the technologies I have available to answer that. And I think it's really difficult to predict right now what people will think is important about a retail experience. I wouldn't be surprised if you start to find in-person retail actually be much less technologically aided, and much more about having the ability to talk to a human being and get their opinion, and maybe the tactile sense of being able to, like, touch new clothes, or whatever it is. And so it's really difficult, I think, right now to predict what things are going to look like maybe even a year or two from now from that perspective. I think what I feel fairly confident about is that people are really starting to understand and engage with new technologies, and they're going to be really open to thinking about what those new technologies enable them to do in this sort of new way of living that we're going to probably be entering pretty soon. >> Excellent! All right, Sorin, bring us home. We'll give you the last word on this topic. >> Now, so I wanted to... I agree with Arti, because what these three months of staying at home and of business shutting down allowed us to do was to actually have a very big reset. So let's say a great reset. But basically we realized that all the things we've taken for granted, like our freedom of movement, our technology, our interactions with each other, all of a sudden we realized that everything needs to change. And the only one thing that we actually kept doing is interacting with each other remotely, interacting with our peers in the house, and so on and so forth.
But the one thing that stayed was generating data, and data was here to stay, because we actually leave traces of data everywhere we go. We leave traces of data when we put our watch on, when we are actually playing with our phone, or when we consume digital, and so on and so forth. So what these three months reinforced for me personally, but also for some of our customers, was that the data is here to stay. And even if the world shut down for three months, we did not generate less data. Data was there, on the contrary, in some cases, more data. So the data is the main enabler for the new normal, which is going to pick up, and the data will actually allow us to understand how to increase customer experience in the new normal, most likely using AI. As I was saying at the beginning, how do I actually operate a new business model? How do I find, who do I partner with? How do I actually go to market together? How do I make collaborations more secure, and so on and so forth? And finally, where do I actually find new value pools? For example, how do I actually still enjoy having a beer in a pub, right? Because suddenly, during COVID-19, that wasn't possible. I have a very nice place around the corner, but it's actually cheaply stuff. I'm not talking about beer, but in general, I mean, so the finance is different, the pools of data, the pools (mumbles) actually getting values, are different as well. So data is here to stay, and AI definitely is going to be accelerated, because it needs to use data to allow us to adopt the new normal in the digital transformation. >> A lot of unknowns, but certainly machines and data are going to play a big role in the coming decade. I want to thank Dr. Arti Garg and Dr. Sorin Cheran for coming on theCUBE. It's great to have you. Thank you for a wonderful conversation. Really appreciate it. >> Thank you very much. >> Thanks so much. >> All right. And thank you for watching, everybody.
This is Dave Vellante for theCUBE and the HPE 2020 Virtual Experience. We'll be right back right after this short break. (upbeat music)
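An editorial aside on the "POC to production" operationalization theme discussed above: two of the pieces of scaffolding that typically separate a production model from a notebook prototype are version tagging and input logging for drift monitoring. A minimal, hypothetical Python sketch (all class and method names are invented for illustration, not an HPE or Ezmeral API) might look like:

```python
import statistics

class ModelService:
    """Toy wrapper around a 'model' (here, a trivial threshold rule) adding
    the operational scaffolding production deployments need: a version tag
    and a record of inputs seen, so data drift can be monitored over time."""

    def __init__(self, version: str, threshold: float):
        self.version = version
        self.threshold = threshold
        self._inputs: list[float] = []  # retained for drift monitoring

    def predict(self, x: float) -> int:
        # Log every input before scoring, mimicking a production audit trail.
        self._inputs.append(x)
        return int(x > self.threshold)

    def drift_report(self) -> dict:
        """Summary statistics of inputs seen so far, keyed to the model version."""
        return {
            "version": self.version,
            "n": len(self._inputs),
            "mean": statistics.fmean(self._inputs) if self._inputs else None,
        }
```

For example, `ModelService("v1", 0.5)` can serve predictions while `drift_report()` reveals which model version saw which input distribution, the kind of bookkeeping that lets a team notice when live data has drifted away from what the original POC was built on.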

Published Date : Jun 23 2020
