Breaking Analysis: How JPMC is Implementing a Data Mesh Architecture on the AWS Cloud
>> From theCUBE studios in Palo Alto and Boston, bringing you data-driven insights from theCUBE and ETR. This is Breaking Analysis with Dave Vellante. >> A new era of data is upon us, and we're in a state of transition. You know, even our language reflects that. We rarely use the phrase big data anymore; rather we talk about digital transformation or digital business, or data-driven companies. Many have come to the realization that data is not the new oil, because unlike oil, the same data can be used over and over for different purposes. We still use terms like data as an asset. However, that same narrative, when it's put forth by the vendor and practitioner communities, includes further discussions about democratizing and sharing data. Let me ask you this: when was the last time you wanted to share your financial assets with your coworkers or your partners or your customers? Hello everyone, and welcome to this week's Wikibon Cube Insights powered by ETR. In this Breaking Analysis, we want to share our assessment of the state of the data business. We'll do so by looking at the data mesh concept and how a leading financial institution, JP Morgan Chase, is practically applying these relatively new ideas to transform its data architecture. Let's start by looking at what a data mesh is. As we've previously reported many times, data mesh is a concept and set of principles that was introduced in 2018 by Zhamak Dehghani, who is a director of technology at ThoughtWorks, a global consultancy and software development company. And she created this movement because her clients, who were some of the leading firms in the world, had invested heavily in predominantly monolithic data architectures that had failed to deliver desired outcomes and ROI. So her work went deep into trying to understand that problem, and her main conclusion from that effort was that the world of data is distributed, and shoving all the data into a single monolithic architecture is an approach that fundamentally limits agility and scale. Now, a profound concept of data mesh is the idea that data architectures should be organized around business lines with domain context, and that the highly technical and hyper-specialized roles of a centralized cross-functional team are a key blocker to achieving our data aspirations. This is the first of four high-level principles of data mesh. So first, again, the business domain should own the data end-to-end, rather than have it go through a centralized big data technical team. Second, a self-service platform is fundamental to a successful architectural approach, where data is discoverable and shareable across an organization and an ecosystem. Third, product thinking is central to the idea of data mesh; in other words, data products will power the next era of data success. And fourth, data products must be built with governance and compliance that is automated and federated. Now, there's a lot more to this concept, and there are tons of resources on the web to learn more, including an entire community that has formed around data mesh, but this should give you a basic idea. Now, the other point is that, in observing Zhamak Dehghani's work, she has deliberately avoided discussions around specific tooling, which I think has frustrated some folks, because we all like to have references that tie to products and tools and companies. So this has been a two-edged sword: on the one hand it's good, because data mesh is designed to be tool agnostic and technology agnostic.
On the other hand, it's led some folks to take liberties with the term data mesh and claim mission accomplished when their solution, you know, may be more marketing than reality. So let's look at JP Morgan Chase and their data mesh journey. That's why I got really excited when I saw this past week that a team from JPMC held a meetup to discuss what they called a data lake strategy via data mesh architecture. When I saw that title, I thought, well, that's a weird title, and I wondered, are they just taking their legacy data lakes and claiming they're now transformed into a data mesh? But in listening to the presentation, which was over an hour long, the answer is a definitive no, not at all, in my opinion. A gentleman named Scott Hollerman organized the session that comprised these three speakers: James Reid, who's a divisional CIO at JPMC; Arup Nanda, who is a technologist and architect; and Serita Bakst, who is an information architect, again, all from JPMC. This was the most detailed and practical discussion that I've seen to date about implementing a data mesh. And this is JP Morgan's approach, and we know they're extremely savvy and technically sound, and they've invested, it has to be billions, over the past decade on data architecture across their massive company. And rather than dwell on the downsides of their big data past, I was really pleased to see how they're evolving their approach and embracing new thinking around data mesh. So today, we're going to share some of the slides that they used and comment on how it dovetails into the concept of data mesh that Zhamak Dehghani has been promoting, at least as we understand it, and dig a bit into some of the tooling that is being used by JP Morgan, particularly around its AWS cloud. So the first point is it's all about business value. JPMC is in the money business, and in that world, business value is everything. So JR Reid, the CIO, showed this slide and talked about their overall goals, which centered on a cloud-first strategy to modernize the JPMC platform. I think it's simple and sensible, and there are three factors on which he focused. Number one, cut costs, always, for sure, you've got to do that. Number two was about unlocking new opportunities, or accelerating time to value. But I was really happy to see number three, data reuse, as a fundamental value ingredient in the slide that he's presenting here. And his commentary was all about aligning with the domains and maximizing data reuse, i.e. data is not like oil, and making sure there's appropriate governance around that. Now, don't get caught up in the term data lake; I think it's just how JP Morgan communicates internally. They've invested in the data lake concept, so they use water analogies. They use things like data puddles, for example, which are single-project data marts, or data ponds, which comprise multiple data puddles, and these can feed into data lakes. And as we'll see, JPMC doesn't strive to have a single version of the truth from a data standpoint that resides in a monolithic data lake; rather, it enables the business lines to create and own their own data lakes that comprise fit-for-purpose data products. And they do have a single source of truth for metadata; okay, we'll get to that. But generally speaking, each of the domains will own their own data end-to-end and be responsible for those data products; we'll talk about that more.
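To make that product-thinking idea a little more concrete, here is a minimal, purely illustrative sketch of what a descriptor for one of these domain-owned data products might look like. The field names, example values, and storage locations are assumptions made for the sketch, not JPMC's actual schema; the party product and its know-your-customer sub-product are simply borrowed from the examples in the talk.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataProduct:
    """Hypothetical descriptor for a domain-owned data product in a mesh."""
    name: str                  # product name, e.g. "party" or "trade-and-position"
    domain: str                # owning line of business
    owner: str                 # accountable product owner
    description: str           # what the product is fit for
    location: str              # where the data physically lives (cloud or on-prem)
    schema_ref: str            # pointer into the shared metadata catalog
    classification: str        # governance label, e.g. "confidential"
    sla_freshness_hours: int   # how fresh consumers can expect the data to be
    sub_products: List[str] = field(default_factory=list)

# Example: a party data product with a KYC sub-product, per the discussion above.
party = DataProduct(
    name="party",
    domain="wholesale-banking",
    owner="party-data-product-owner",
    description="Counterparty reference data, owned end-to-end by the domain",
    location="s3://wholesale-lob-lake/party/refined/",
    schema_ref="glue://master_catalog/party",
    classification="confidential",
    sla_freshness_hours=24,
    sub_products=["know-your-customer"],
)
```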
Now, the genesis of this was a cloud-first platform. JPMC is leaning into public cloud, which is ironic because in the early days of cloud, all the financial institutions were like, never. Anyway, JPMC is going hard after it; they're adopting agile methods and microservices architectures, and they see cloud as a fundamental enabler, but they recognize that on-prem data must be part of the data mesh equation. Here's a slide that starts to get into some of the generic tooling, and then we'll go deeper. And I want to make a couple of points here that tie back to Zhamak Dehghani's original concept. The first is that unlike many data architectures, this puts data as products right in the middle of the chart. The data products live in the business domains and are at the heart of the architecture. The databases, the Hadoop clusters, the files and APIs on the left-hand side serve the data product builders. The specialized roles on the right-hand side, the DBAs, the data engineers, the data scientists, the data analysts, and we could have put in quality engineers, et cetera, they serve the data products. Because the data products are owned by the business, they inherently have the context that is the middle of this diagram. And you can see at the bottom of the slide, the key principles include domain thinking and end-to-end ownership of the data products. They build it, they own it, they run it, they manage it. At the same time, the goal is to democratize data with self-service as a platform. One of the biggest points of contention of data mesh is governance, and as Serita Bakst said on the meetup, metadata is your friend. She kind of made a joke and said, "This sounds kind of geeky, but it's important to have a metadata catalog to understand where data resides and the data lineage and overall change management." So to me, this really passed the data mesh stink test pretty well. Let's look at data as products. CIO Reid said the most difficult thing for JPMC was getting their heads around data products, and they spent a lot of time getting this concept to work. Here's the slide they used to describe their data products as it relates to their specific industry. He said a common language and taxonomy is very important, and you can imagine how difficult that was. He said, for example, it took a lot of discussion and debate to define what a transaction was. But you can see at a high level these three product groups around wholesale credit risk, party, and trade and position data as products, and each of these can have sub-products; like party will have know your customer, KYC, for example. So a key for JPMC was to start at a high level and iterate to get more granular over time. So lots of decisions had to be made around who owns the products and the sub-products. The product owners, interestingly, had to defend why that product should even exist, what boundaries should be in place, and what data sets do and don't belong in the various products. And this was a collaborative discussion; I'm sure there was contention around that between the lines of business, and around which sub-products should be part of these circles. They didn't say this, but tying it back to data mesh, each of these products, whether in a data lake or a data hub or a data pond or a data warehouse or a data puddle, each of these is a node in the global data mesh that is discoverable and governed.
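As a hedged illustration of what "discoverable" could mean in practice on AWS, here's a rough sketch of a consumer searching a central Glue catalog for data products belonging to a domain. The database name and the "domain" and "owner" table properties are assumptions made for the example; JPMC didn't describe their tagging scheme at this level of detail.

```python
import boto3

glue = boto3.client("glue")

def find_data_products(database: str, domain: str):
    """Scan a central Glue catalog and return tables tagged with the given domain."""
    products = []
    paginator = glue.get_paginator("get_tables")
    for page in paginator.paginate(DatabaseName=database):
        for table in page["TableList"]:
            props = table.get("Parameters", {})
            if props.get("domain") == domain:
                products.append({
                    "name": table["Name"],
                    "owner": props.get("owner", "unknown"),
                    "location": table.get("StorageDescriptor", {}).get("Location", ""),
                })
    return products

# List the data products the wholesale credit risk domain has published.
for product in find_data_products("master_catalog", "wholesale-credit-risk"):
    print(product["name"], "->", product["location"])
```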
Supporting this notion, Serita said that this should not be infrastructure-bound; logically, any of these data products, whether on-prem or in the cloud, can connect via the data mesh. So again, I felt like this really stayed true to the data mesh concept. Now let's look at some of the key technical considerations that JPM discussed in quite some detail. This chart shows a diagram of how JP Morgan thinks about the problem, and some of the challenges they had to consider were: how to write to various data stores; whether and how you can move data from one data store to another; how data can be transformed; where the data is located; whether the data can be trusted; how it can be easily accessed; and who has the right to access that data. These are all problems that technology can help solve. And to address these issues, Arup Nanda explained that the heart of this slide is the data ingestor, instead of ETL. All data producers and contributors send their data to the ingestor, and the ingestor then registers the data so it's in the data catalog. It does a data quality check and it tracks the lineage. Then data is sent to the router, which persists the data in the data store based on the best destination, as informed by the registration. This is designed to be a flexible system; in other words, the data store for a data product is not fixed, it's determined at the point of inventory, and that allows changes to be easily made in one place. The router simply reads that optimal location and sends the data to the appropriate data store. Now, as you see, the schema inferrer there is used when there is no clear schema on write. In this case, the data product is not allowed to be consumed until the schema is inferred; the data goes into a raw area, the inferrer determines the schema and then updates the inventory system so that the data can be routed to the proper location and properly tracked. So that's some of the detail of how the sausage factory works in this particular use case; it was very interesting and informative. Now let's take a look at the specific implementation on AWS and dig into some of the tooling. As described in some detail by Arup Nanda, this diagram shows the reference architecture used by this group within JP Morgan, and it shows all the various AWS services and components that support their data mesh approach. So start with the authorization block right there underneath Kinesis. Lake Formation is the single point of entitlement, and there are a number of buckets, including, you can see there, the raw area that we just talked about, a trusted bucket, a refined bucket, et cetera. Depending on the data characteristics, the data catalog registration block, where you see the Glue catalog, determines into which bucket the router puts the data. And you can see the many AWS services in use here: identity, EMR, the Elastic MapReduce cluster from the legacy Hadoop work done over the years, Redshift Spectrum, and Athena. JPMC uses Athena for single-threaded workloads and Redshift Spectrum for nested types, so they can be queried independently of each other. Now remember, very importantly, in this use case there is not a single Lake Formation; rather, multiple lines of business will be authorized to create their own lakes, and that creates a challenge. So how can that be done in a flexible and automated manner? And that's where the data mesh comes into play.
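Before getting to that answer, here is a rough, hypothetical sketch of the ingest-register-route flow Arup described, expressed with boto3 against Glue, S3, and Lake Formation. The bucket names, database name, table properties, the placeholder quality check, and the consumer role ARN are all assumptions made for illustration; this is not JPMC's code, just one plausible way the pattern could be wired up on AWS.

```python
import boto3

glue = boto3.client("glue")
s3 = boto3.client("s3")
lakeformation = boto3.client("lakeformation")

# Raw / trusted / refined areas as mentioned in the talk; the names are made up here.
BUCKETS = {"raw": "lob-raw-area", "trusted": "lob-trusted-area", "refined": "lob-refined-area"}
DATABASE = "lob_lake_catalog"

def ingest(dataset: str, source_bucket: str, source_key: str) -> None:
    # 1. Register the incoming data set in the catalog (the "inventory").
    glue.create_table(
        DatabaseName=DATABASE,
        TableInput={
            "Name": dataset,
            "Parameters": {"quality_checked": "false", "area": "raw"},
            "StorageDescriptor": {"Location": f"s3://{BUCKETS['raw']}/{dataset}/"},
        },
    )

    # 2. Placeholder quality / lineage check; a real ingestor does far more here.
    quality_ok = True

    # 3. Route: persist the data in whichever area the registration points to.
    area = "trusted" if quality_ok else "raw"
    s3.copy_object(
        Bucket=BUCKETS[area],
        Key=f"{dataset}/part-0000.parquet",
        CopySource={"Bucket": source_bucket, "Key": source_key},
    )
    glue.update_table(
        DatabaseName=DATABASE,
        TableInput={
            "Name": dataset,
            "Parameters": {"quality_checked": str(quality_ok).lower(), "area": area},
            "StorageDescriptor": {"Location": f"s3://{BUCKETS[area]}/{dataset}/"},
        },
    )

    # 4. Lake Formation as the single point of entitlement: grant read access.
    lakeformation.grant_permissions(
        Principal={"DataLakePrincipalIdentifier": "arn:aws:iam::123456789012:role/data-consumer"},
        Resource={"Table": {"DatabaseName": DATABASE, "Name": dataset}},
        Permissions=["SELECT"],
    )
```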
So, to do that, JPMC came up with this federated Lake Formation accounts idea. Each line of business can create as many data producer or consumer accounts as they desire and roll them up into their master line-of-business Lake Formation account, and they cross-connect these data products in a federated model. And these all roll up into a master Glue catalog, so that any authorized user can find out where a specific data element is located. So this is like a superset catalog that comprises multiple sources and syncs up across the data mesh. So again, to me, this was a very well-thought-out and practical application of data mesh. Yes, it includes some notion of centralized management, but much of that responsibility has been passed down to the lines of business. It does roll up to a master catalog, but that's a metadata management effort that seems compulsory to ensure federated and automated governance. As well, at JPMC, the office of the chief data officer is responsible for ensuring governance and compliance throughout the federation. All right, so let's take a look at some of the suspects in this world of data mesh and bring in the ETR data. Now, of course, ETR doesn't have a data mesh category; there's no such thing as a data mesh vendor. You build a data mesh, you don't buy it. So what we did is we used the ETR dataset to select and filter on some of the culprits that we thought might contribute to a data mesh, to see how they're performing. This chart depicts a popular view that we often like to share. It's a two-dimensional graphic with net score, or spending momentum, on the vertical axis and market share, or pervasiveness in the dataset, on the horizontal axis. And we filtered the data on sectors such as analytics, data warehouse, and the adjacencies to things that might fit into data mesh. And we think these pretty well reflect participation, but data mesh is certainly not all-encompassing, and this is a subset, obviously, of all the vendors who could play in the space. Let's make a few observations. Now, as is often the case, Azure and AWS are almost literally off the charts, with very high spending velocity and a large presence in the market. Oracle, you can see, also stands out, because much of the world's data lives inside of Oracle databases. It doesn't have the spending momentum or growth, but the company remains prominent. And you can see Google Cloud doesn't have nearly the presence in the dataset, but its momentum is highly elevated. Remember that red dotted line there, the 40% line: anything over that indicates elevated spending momentum. Let's go to Snowflake. Snowflake has consistently shown itself to be the gold standard in net score in the ETR dataset. It continues to maintain highly elevated spending velocity in the data. And in many ways, Snowflake, with its data marketplace and its data cloud vision and data sharing approach, fits nicely into the data mesh concept. Now, a caution: Snowflake has used the term data mesh in its marketing, but in our view it lacks clarity, and we feel like they're still trying to figure out how to communicate what that really is. But there is really, we think, a lot of potential in that vision. Databricks is also interesting, because the firm has momentum and we expect further elevated levels on the vertical axis in upcoming surveys, especially as it readies for its IPO. The firm has a strong product and managed service, and is really one to watch.
Now, we included a number of other database companies for obvious reasons, like Redis and Mongo, MariaDB, Couchbase and Teradata. SAP as well is in there; that's not all database, but SAP is prominent, so we included them. As is IBM, more of a traditional database player, also with a big presence. Cloudera includes Hortonworks, and HPE Ezmeral comprises the MapR business that HPE acquired. So these guys got the big data movement started: Cloudera; Hortonworks, which was born out of Yahoo, the early big data, sorry, early Hadoop innovator; and MapR, which kind of went its own course; and now that's all come together in various forms. And of course, Talend and Informatica are there, two data integration companies that are worth noting. We also included some of the AI and ML specialists and data science players in the mix, like DataRobot, who just did a monster $250 million round, Dataiku, H2O.ai, and ThoughtSpot, which is all about democratizing data and injecting AI and, I think, fits well into the data mesh concept. And we put VMware Cloud in there for reference, because it really is the predominant on-prem infrastructure platform. All right, let's wrap with some final thoughts. First, thanks a lot to the JP Morgan team for sharing this data. I really want to encourage practitioners and technologists to go watch the YouTube video of that meetup; we'll include a link with this session. And thank you to Zhamak Dehghani and the entire data mesh community for the outstanding work that you're doing, challenging the established conventions of monolithic data architectures. The JPM presentation gives this real credibility; it takes data mesh well beyond concept and demonstrates how it can be, and is being, done. And you know, this is not a perfect world. You're going to start somewhere, and there are going to be some failures; the key is to recognize that shoving everything into a monolithic data architecture won't support the massive scale and agility that you're after. It's maybe fine for smaller use cases in smaller firms, but if you're building a global platform in a data business, it's time to rethink data architecture. Now, much of this is enabled by the cloud, but cloud first doesn't mean cloud only, and it doesn't mean you'll leave your on-prem data behind. On the contrary, you have to include non-public-cloud data in your data mesh vision, just as JPMC has done. You've got to get some quick wins; that's crucial so you can gain credibility within the organization and grow. And one of the key takeaways from the JP Morgan team is that there is a place for dogma, like organizing around data products and domains and getting that right. On the other hand, you have to remain flexible, because technologies are going to come and technologies are going to go, so you've got to be flexible in that regard. And look, if you're going to embrace the metaphor of water, like puddles and ponds and lakes, we suggest, maybe a little tongue in cheek, but still, we believe in this, that you expand your scope to include the data ocean, something John Furrier and I have talked about and laughed about extensively in theCUBE. Data oceans, it's huge. It's the new data lake; go transcend the data lake, think oceans. And think about this: just as we're evolving our language, we should be evolving our metrics. Much of the last decade of big data was about just getting the stuff to work, getting it up and running, standing up infrastructure and managing massive, how much data you got?
Massive amounts of data. And there were many KPIs built around, again, standing up that infrastructure and ingesting data, a lot of technical KPIs. This decade is not just about enabling better insights; it's more than that. Data mesh points us to a new era of data value, and that requires new metrics around monetizing data products, like how long does it take to go from data product conception to monetization? And how does that compare to what it is today? And what is the time to quality? If the business owns the data, and the business has the context, the quality that comes out of the chute should be, at a basic level, pretty good, and at a higher mark than from a big data team with no business context. Automation, AI, and, very importantly, organizational restructuring of our data teams will heavily contribute to success in the coming years. So we encourage you: learn, lean in and create your data future. Okay, that's it for now. Remember, these episodes are all available as podcasts wherever you listen; all you've got to do is search "breaking analysis podcast," and please subscribe. Check out ETR's website at etr.plus for all the data and all the survey information. We publish a full report every week on wikibon.com and siliconangle.com. And you can get in touch with us: email me at david.vellante@siliconangle.com, you can DM me @dvellante, or you can comment on my LinkedIn posts. This is Dave Vellante for theCUBE Insights powered by ETR. Have a great week everybody, stay safe, be well, and we'll see you next time. (upbeat music)
Breaking Analysis: NFTs, Crypto Madness & Enterprise Blockchain
>> From theCUBE Studios in Palo Alto and Boston, bringing you data-driven insights from theCUBE and ETR, this is Breaking Analysis with Dave Vellante. >> When a piece of digital art sells for $69.3 million, more than has ever been paid for works by Gauguin or Salvador Dali, making its creator the third most expensive living artist in the world, one can't help but take notice and ask, what is going on? The latest craze around NFTs may feel a bit bubblicious, but it's yet another sign that the digital age is now fully upon us. Hello and welcome to this week's Wikibon CUBE Insights, powered by ETR. In this Breaking Analysis, we want to take a look at some of the trends that may be difficult for observers and investors to understand, but that we think offer significant insights into the future, and possibly some opportunities for young investors, many of whom are fans of this program, and at how these trends may relate to enterprise tech. Okay, so this guy Beeple is now the hottest artist on the planet. That's his Twitter profile, that picture on the inset. His name is Mike Winkelmann. He is actually a normal-looking dude, but that's the picture he chose for his Twitter. This collage reminds me of the Million Dollar Homepage. You may already know the story, but many of you may not. Back in 2005, a college kid from England named Alex Tew, T-E-W, created The Million Dollar Homepage to fund his education. And his idea was to create a website with a million pixels and sell ads at a dollar for each pixel. Guess how much money he raised. A million bucks, right? No, wrong. He raised $1,037,100. How so, you ask? Well, he auctioned off the last 1,000 pixels on eBay, which fetched an additional $38,000. Crazy, right? Well, maybe not. Pretty creative, and in a way, a way-early sign of things to come. Now, I'm not going to go deep into NFTs and explain the justification behind them. There's a lot of material that's been published that can do justice to the topic better than I can. But here are the basics. NFT stands for Non-Fungible Token. NFTs are digital representations of assets that exist in a blockchain. Each token has a unique and immutable identifier, and it uses cryptography to ensure its authenticity. NFTs, by the name, are not fungible. So, unlike Bitcoin, Ethereum or other cryptocurrencies, which can be traded on a like-for-like basis — in other words, if you and I each own one bitcoin, we know exactly how much each of our bitcoins is worth at any point in time — non-fungible tokens each have their own unique values, so they're not comparable on a like-for-like basis. But what's the point of this? Well, NFTs can be applied to any property: identities, tweets, videos, and we're seeing collectibles, digital art, pretty much anything. The use cases are really unlimited. And NFTs can streamline transactions, and they can be bought and sold very efficiently without the need for a trusted third party involved. Now, the other benefit is the probability of fraud is greatly reduced. So where do NFTs fit as an asset class? Well, they're definitely a new type of asset, and again, I'm not going to try to justify their existence, but I want to talk about the choices that investors have in the market today. The other day, I was on a call with Jay Po. He is a VC and a principal at a company called Stage 2 Capital. He's a former Bessemer VC and one of the sharper investors around. And he was talking about the choices that investors have, and he gave a nice example that I want to share with you and try to apply here.
Now, as an investor, you have alternatives; of course, we're showing here a few with their year-to-date charts. As an example, you can buy Amazon stock. If you bought just about exactly a year ago, you did really well; you probably saw around an 80% return or more. But if you want to jump in today, your mindset might be, hmm, well, okay, Amazon, they're going to be around for a long time, so it's kind of low risk and I like the stock, but you're probably going to get, let's say, maybe a 10% annual return over the long term, 15%, or maybe less, maybe single digits, but it's unlikely that within any reasonable timeframe you're going to get a 10X return. In order to get that type of return on invested capital, Amazon would have to become a $16 trillion company. So you sit there, you ask yourself, what's the probability that Amazon goes out of business? Well, that's pretty low, right? And what are the chances it becomes a $16 trillion company over the next several years? Well, it's probably more likely that it continues to grow at that more stable rate that I talked about. Okay, now let's talk about Snowflake. Now, as you know, we've covered the company quite extensively. We watched this company grow from an early stage startup and then saw its valuation increase steadily as a private company. Even early last year it was valued around $12 billion, I think in February, and as late as mid September, right before the IPO, news hit that Marc Benioff and Warren Buffett were going to put in $250 million each at or just after the IPO, and it was projected that Snowflake's valuation could go over $20 billion at that point. And on day one after the IPO, Snowflake closed worth more than $50 billion. The stock opened at 120, but unless you knew a guy, you had to hold your nose and buy on day one. And maybe you got it at 240, maybe you got it at 250, you might have gotten it higher, and at the time, you might recall, I said you're likely going to get a better price than on day one, which is usually the case with most IPOs; the stock today is around 230. But you look at Snowflake today, and if you want to buy in, you look at it and say, okay, well, I like the company, it's probably still overvalued, but I can see the company's value growing substantially over the next several years, maybe doubling in the near to midterm; it hit more than a hundred billion dollar valuation as recently as December, so that's certainly feasible. The company is not likely to flame out, but because it's highly valued, I'd probably have to be patient for a couple of years. But let's say I like the management, I like the company; maybe the company gets into the $200 billion range over time and I can make a decent return, but to get a 10X return on Snowflake, it has to get to a valuation of over half a trillion. Now, to get there, if it gets there, it's going to become one of the next great software companies of our time, and frankly, if it gets there, I think it's going to go to a trillion. So, if that's what your bet is, then you would be happy with that, of course. But what's the likelihood? As an investor, you have to evaluate that; what's the probability?
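The return math being described here is simple enough to make explicit. The short sketch below just back-solves the valuation an asset must reach for a 10X return; the current-valuation figures are the rough numbers implied by this discussion, not precise market data.

```python
def target_valuation(current_valuation: float, multiple: float) -> float:
    """Valuation an asset must reach for a position taken today to return `multiple`x."""
    return current_valuation * multiple

# Approximate current valuations, back-solved from the multiples discussed above.
assets = {
    "Amazon":    1.6e12,  # ~$1.6T today, so a 10X implies ~$16T
    "Snowflake": 55e9,    # ~$55B today, so a 10X implies over half a trillion
    "Compound":  2e9,     # ~$2B today, so a 10X implies ~$20B
}

for name, valuation in assets.items():
    needed = target_valuation(valuation, 10)
    print(f"{name}: needs roughly ${needed / 1e9:,.0f}B of market value for a 10X return")
```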
So, Snowflake is a lower-risk investment, but it's maybe more likely that Snowflake, you know, runs into competition or the market shifts; maybe they get into the $200 billion range, but it really has to transform the industry and execute for you to get into that 10-bagger territory. Okay, now let's look at a different asset, a cryptocurrency called Compound, which is way more risky. Compound is a decentralized protocol that allows you to lend and borrow cryptocurrencies. Now, I'm not saying go out and buy Compound, but just as a thought exercise, you've got an asset here with a lower valuation, probably much higher upside, but much higher risk. So for Compound to get to a 10X return, it's got to get to a $20 billion valuation. Now, maybe Compound isn't the right asset for your cup of tea, but there are many cryptos that have made it that far, and if you do your research and your homework, you could find a project that's much, much earlier stage that, yes, is higher risk, but has a much higher upside that you can participate in. So, this is how investors, all investors really, look at their choices and make decisions. And the more sophisticated investors are going to use detailed metrics and analyze things like MOIC, multiple on invested capital, and IRR, which is internal rate of return, do TAM analysis, total available market, look at competition, and look at detailed company models with ARR and churn rates and so forth. But one of the things we really want to talk about today, and we brought this up at the Snowflake IPO, is that if you were Buffett or Benioff and you had, you know, a quarter of a billion dollars to put in, you could get an almost guaranteed return with your late-in-the-game but pre-IPO money. Or look, if you were Mike Speiser or one of the earlier VCs, or even someone like Jeremy Burton, who was part of the inside network, you could get stock or options much cheaper. You'd get a 5X, 10X, 50X or even north of a hundred X return, like the early VCs who took a big risk. But chances are, you're not in one of these categories. So how can you, as a little guy, participate in something big? You might remember at the time of the Snowflake IPO we showed you this picture. Who are these people? Olaf Carlson-Wee, Chris Dixon, this girl Sono, and of course Tim Berners-Lee. These are some of the folks that inspired me personally to pay attention to crypto. And I want to share the premise that caught my attention. It was this: think about the early days of the internet. If you saw what Berners-Lee was working on, or Linus Torvalds, and wanted to invest in the internet, you really couldn't. I mean, you couldn't invest in Linux or TCP/IP or HTTP. Suppose you could have invested in Cisco after its IPO; that would have paid off pretty big time, for sure. You could have waited for the Netscape IPO, but the core infrastructure of the internet was fundamentally not a candidate for direct investment by you or, really, by anybody. And Satya Nadella said the other day we have reached maximum centralization. The main protocols of the internet were largely funded by the government, and they've been co-opted by the giants. But with crypto, you actually can invest in core infrastructure technologies that are building out a decentralized internet, a new internet, you know, call it Web 3.0. It's a big part of the investment thesis behind what Carlson-Wee is doing. And Andreessen Horowitz, they have two crypto funds.
They've raised more than $800 million to invest, and you should read the firm's crypto investment thesis and maybe even take their crypto startup classes; there's some great content there. Now, one of the people that I haven't mentioned in this picture is Camila Russo. She's a journalist turned hardcore crypto author who is doing a great job explaining the white-hot DeFi space, or decentralized finance. Just read her work, educate yourself and learn more about the future, and perhaps you'll find some 10X or even hundred X opportunities. So look, there's so much innovation going on around blockchain and crypto. I mean, you could listen to Warren Buffett and Janet Yellen, who have implied this is all going to end badly. But look, while these individuals are smart people, I don't think they would be my go-to source on understanding the potential of the technology and the future of what it could bring. Now, we talked earlier, at the start here, about NFTs. DeFi is one of the most interesting and disruptive trends in FinTech, with names like Celsius, Nexo, BlockFi. BlockFi actually lets the average person participate in liquidity pools, which is quite interesting. Crypto is going mainstream: Tesla and MicroStrategy are putting Bitcoin on their balance sheets. In 2017, Jamie Dimon called Bitcoin a tulip-bulb-like fraud, yet just the other day JPM announced a structured investment vehicle to give its clients a basket of stocks that have exposure to crypto. PayPal is allowing customers to buy, sell, and hodl crypto. You can trade crypto on Robinhood. Central banks are talking about launching digital currencies; I've talked about the Fedcoin for a number of years, and why not? Coinbase is doing an IPO that will give it a value of over a hundred billion. Wow, that sounds frothy, but still, big names like Mark Cuban and Chamath Palihapitiya have been active in crypto for a while. Gronk is getting into NFTs. So it does have a little bit of that bubble feel to it. But look, often when tech bubbles burst, they shake out the pretenders, but if there's real tech involved, some contenders emerge, and they often do so as dominant players. And I really believe that the innovation around crypto is going to be sustained. Now, there is a new web being built out. So if you want to participate, you've got to do some research: figure out things like how Polkadot works, make a call on whether you think Avalanche is an Ethereum killer, dig in and find out about new projects, and form a thesis. And you may, as a small player, be able to find some big winners, but look, you do have to be careful. There was a lot of fraud during the ICO craze, so there is risk. So understand the tokenomics and, maybe as importantly, the pump-a-nomics, because they certainly loom as dangers. This is not for the faint of heart, but because I believe it involves real tech, I like it way better than Reddit stocks like GameStop, for example. Now, not to diss Reddit; there's some good information on Reddit, and if you're patient, you can find it. And there's lots of good information flowing on Discord. There are people flocking to Telegram as a hedge against big tech. Maybe this all sounds crazy. And you know what, if you've grown up in a privileged household and you have a US education, you know, maybe it is nuts and a bit too risky for you.
But if you're one of the many people who haven't been able to participate in these elite circles, there are things going on, especially outside of the US, that are democratizing investment opportunities, and I think that's pretty cool. You just have to be careful. So, this is a bit off topic from our typical focus on ETR survey analysis, so let's bring this back to the enterprise, because there's a lot going on there as well with blockchain. Now let me first share some quotes on blockchain from a few ETR Venn roundtables. The first comment is from a CIO at a diversified holdings company, who says, correctly, that blockchain will hit the finance industry first, but there are use cases in healthcare, given the privacy and security concerns, and in logistics, to ensure provenance and reduce fraud. And to that individual's point about finance, this is from the CTO of a major financial platform: "We're really taking a look at payments." Do you think traditional banks are going to lose control of the payment systems? Well, not without a fight, I guess, but look, there are some real disruption possibilities here. And the last comment, from a government CIO, says we're going to wait until the big platform players get it into their software. And so that is happening: Oracle, IBM, VMware, Microsoft, AWS, Cisco, they all have blockchain initiatives going on. Now, by the way, none of these tech companies wants to talk about crypto. They try to distance themselves from that topic, which is understandable, I guess, but I'll tell you, there's far more innovation going on in crypto than there is in enterprise tech companies at this point. But I predict that the crypto innovations will absolutely be seeping into enterprise tech players over time. For now, the cloud players want to support developers who are building out this new internet. The database is certainly a logical place to support immutable transactions, which allow people to do business one-on-one and have total confidence that the source hasn't been hacked or changed, and infrastructure to support smart contracts; we've seen that. The use cases in the enterprise are endless: asset tracking, data access, food tracking, maintenance, KYC or know your customer; there are applications in different industries, telecoms, oil and gas, on and on and on. So look, think of NFTs as a signal; the crypto craziness is a signal. It's a signal as to how IT, other parts of companies, and their data might be organized, managed, tracked, protected and, very importantly, valued. Look, today there are a lot of memes: CryptoKitties, art, and of course money as well. Money is the killer app for blockchain, but in the future the underlying technology of blockchain and the many percolating innovations around it could become, I think will become, a fundamental component of a new digital economy. So get on board, do some research and learn for yourself. Okay, that's it for today. Remember, all of these episodes are available as podcasts wherever you listen. I publish weekly on wikibon.com and siliconangle.com. Please feel free to comment on my LinkedIn posts, or tweet me @dvellante, or email me at david.vellante@siliconangle.com. Don't forget to check out etr.plus for all the survey action and data science. This is Dave Vellante for theCUBE Insights powered by ETR. Be well, be careful out there in crypto land. Thanks for watching. We'll see you next time. (soft music)
Shaun Walsh, QLogic - #VMworld 2015 - #theCUBE
San Francisco extracting the signal from the noise it's the cube covering vmworld 2015 brought to you by VM world and its ecosystem sponsors now your host Stu miniman and Brian Grace Lee welcome back this is the cube SiliconANGLE TVs live production of vmworld 2015 here in moscone north san francisco happy to have back on this segment we're actually gonna dig into some of the networking pieces Brian Grace Lee and myself here hosting it Sean Walsh repeat cube guest you know in a new role though so Sean welcome back here now the general manager of the ethernet business at qlogic thanks for joining us thank you thanks for having me alright so I mean Sean you know we're joking before we start here I mean you and I go back about 15 years I do you know those that know the adapter business I mean you know Jay and I've LJ core business on you've worked for qlogic before you did a stint in ml accent and you're now back to qlogic so why don't we start off with that you know what brought you back to qlogic what do you see is the opportunity there sure um I'll tell you more than anything else what brought me back was this 25 gig transition it's very rare and I call it the Holy trifecta of opportunity so you've got a market transition you actually have a chip ready for the market at the right time and the number one incumbent which is Intel doesn't have a product I mean not that they're late they just don't have a product and that's the type of stuff that great companies are built out of are those unique opportunities in the market and you know more than anything else that's when brought me back to qlogic alright so before we dig into some of the ethernet and hyperscale piece you know what what's the state of fibre channel Sean you know what we said is in those fiber channel the walking dead is it a cash cow that you know qlogic be a bit of milk and brocade and the others in the fibre channel business for a number years you know what's your real impression of fibre channel did that yeah so you know look fibre channel is mature there's no question about it is that the walking dead no not by any stretch and if it is the walking dead man it produces a lot of cash so I'll take that any day of the year right The Walking Dead's a real popular show so fibre channel you know it's still it's still gonna be used in a lot of environments but you know jokingly the way that I describe it to people is I look at fibre channel now is the Swiss bank of networks so a lot of web giant's by our fiber channel cards and people will look at me and go why do they do that because for all the hype of open compute and all the hype of the front end processors and all the things that are happening when you click on something where there's money involved that's on back end Oracle stuff and it's recorded on fibre channel and if there's money involved it's on fibre and as long as there's money in the enterprise or in the cloud I'm reasonably certain fibre channel will be around yeah it's a funny story I remember two years ago I think we were at Amazon's reinvent show and Andy Jesse's on stage and somebody asked you know well how much of Amazon is running amazoncom is running on AWS and its most of it and we all joke that somewhere in the back corner running the financials is you know a storage area network with the traditional array you know probably atandt touched by fibre channel absolutely i mean we just did a roll out with one of the web giants and there were six different locations each of the each of the pods for the service 
for about 5,000 servers and you know as you would expect about 3,000 on the front access servers there's about 500 for pop cash that was about 15 maybe twelve thirteen hundred for the for the big data and content distribution and all those other things the last 500 servers look just like the enterprise dual 10 gigs dual fibre channel cards and you know I don't see that changing anytime soon all right so let's talk a bit a little bit 25 gig Ethernet had an interview yesterday with mellanox actually who you know have some strong claims about their market leadership in the you know greater than 10 gig space so where are we with kind of the standards the adoption in queue logical position and 25 gig Ethernet sure so you know obviously like everyone in this business we all know each other yeah and when you look at the post 10 gig market okay 40 gigs been the dominant technology and I will tip my hat to mellanox they've done well in that space now we're both at the same spot so we have exactly the same opportunity in front of us we're early to market on the 25 we have race to get there and what we're seeing is the 10 gig market is going to 25 pretty straightforward because I like the single cable plant versus the quad cable plant the people that are at 40 aren't going to 50 they're going to transition straight to 100 we're seeing 50 more as a blade architecture midplane sort of solution and that's where at right now and I can tell you that we have multiple design win opportunities that we're in the midst of and we are slugging it out with these guys everything and it will be an absolute knife fight between us and mellanox to see who comes out number one in this market obviously we both think we're going to win but at the end of the day I've placed my bet and I expect to win all right so Sean can you lay out for us you know where are those battles so traditionally the network adapter it was an OEM type solution right I got it into the traditional server guys yeah and then it was getting the brand recognition for the enterprise customers and pushing that through how much is that traditional kind of OEM is it changing what's having service providers and those hyperscale web giants yes so there's there's three fundamental things when you look at 25 gig you gotta deal with so first off the enterprise is going to be much later because they need the I Triple E version that has backwards auto-negotiation so you know that's definitely a 17 18 pearly transition type thing the play right now is in the cloud and the service provider market where they're rolling out specific services and they're not as concerned about the backwards compatibility so that's where we're seeing the strength of this so they're all the names that you would expect and I have to say one of the interesting things about working with these guys is there n das or even nastier than our Liam India is they do not want you talking about them but it is very much that market where it's a non traditional enterprise type of solution for the next 12-18 months and then as we roll into that next gen around the pearly architecture where we all have full auto-negotiation that's where you're going to see the enterprise start to kick in yeah what what what are the types of applications that are driving this this next bump in speed what is it is it video is it sort of east and west types of application traffic is a big data what's what's driving this next bump so a couple of things you would expect which would be the you know certainly hadoop mapreduce 
you know those sorts of things are going there the beginning of migration to spark where they're doing real-time analytics versus post or processing batch type stuff so there they really care about it and this is where our DMA is also becoming very very popular in it the next area that most people probably don't think of is the telco in a vspace is the volume as these guys are doing their double move and there going from a TCA type platforms running mostly one in ten they're going to leave right to 25 and for them the big thing is the ability to partition the network and do that virtualization and be able to run deep edk in one set of partitions standard storage another set of partitions in classic IP on the third among the among the few folks that you know you would expect in that are the big content distribution guys so one of the companies that I can mention is Netflix so they've already been out at their at 40 right now and you know they're not waiting for 50 they're going to make another leap that goes forward and they've been pretty public about those types of statements if you look at some of the things that they talked about at NDF or IDF and they're wanting to have nvme and direct gas connection over i serve that's driving 100 gig stuff we did a demo at a flash memory summit with Samsung where we had a little over 3 million I ops coming off of it and again it's not the wrong number that matters but it's that ability to scale and deal with that many concurrent sessions that are driving it so those are the early applications and I don't think the applications will be a surprise because they're all the ones that have moved to 40 you know the 10 wasn't enough 40 might be too much they're going to 25 and for a lot of the others and its really the pop cash side that's driving the hunter gig stuff because you know when that Super Bowl ad goes you got to be able to take all that bandwidth it once yeah so Sean you brought up nvme maybe can you discuss a little bit you know what are the you know nvm me and some of these next-generation architectures and what's the importance to the user sure so nvme is basically a connection capability that used to run for hard drives then as intel moved into SSDs they added this so you had very very high performance low latency pci express like performance what a number of us in this business are starting to do is then say hey look instead of using SAS which is kind of running out of gas at 12 gig let's move to nvme and make it a fabric and encapsulate it so there's three dynamics that help that one is the advent of 25 50 100 the second is the use of RDMA to get the latency that you want and then the third is encapsulation I sir or the ice cozy with RDMA together and it's sort of that trifecta of things that are giving very very high performance scale out on the back end and again this is for the absolute fastest applications where they want the lowest latency there was an interesting survey that was done by a university of arizona on latency and it said that if two people are talking and if you pause for more than a quarter of a second that's when people change their body language they lean forward they tilt their head they do whatever and that's kind of the tolerance factor for latency on these things and again one of the one of the statements that that Facebook made publicly at their recent forum was that they will spend a hundred million dollars to save a millisecond because that's the type of investment that drives their revenue screen the faster they 
get clicks the faster they generate revenue so when you think of high frequency trading when you think of all those things that are time-sensitive the human factor and that are going to drive this all right so storage the interaction with networking is you know critically important especially to show like this at vmworld I mean John you and I talked for years is it wasn't necessarily you know fibre channel versus the ethernet now it's changing operational models if I go use Salesforce I don't think about my network anymore I felt sort of happen to used Ethernet it's I don't really care um hyper convergence um when somebody buys hyper convergence you know they just kind of the network comes with it when I buy a lot of these solutions my networking decision is made for me and I haven't thought about it so you know what's that trend that you're seeing so the for us the biggest trend is that it's a shifting customer base so people like new tonics and these guys are becoming the drivers of what we do and the OEMs are becoming much more distribution vehicles for these sorts of things than they are the creators of this content so when we look at how we write and how we build these things there's far more multi-threading in terms of them there's far more partitions in terms of the environment because we never know when we get plugged into it what that is going to be so incorporating our l2 and our RDMA into one set of engine so that you always have that hyper for it's on tap on demand and you know without getting down into the minutia of the implementation it is a fundamental shift in how we look at our driver architectures you know looking at arm based solutions and micro servers versus just x86 as you roll the film forward and it also means that as we look at our architectures they have to become much smaller and much lighter so some of the things that we traditionally would have done in an offload environment we may do more in firmware on the side and I think the other big trend that is going to drive that is this move towards FPGAs and some of the other things that are out there essentially acting as coprocessors from you you mentioned earlier Open Compute open compute platform those those foundations and what's going on what is what what's really going on there i think a lot of us see the headlines sometimes you think about it you go okay this is an opportunity for lots of engineering to contribute to things but what's the reality that you're dealing with the web scale folks sure if they seem like the first immediate types of companies that would buy into this or use it what's the reality of what's going on with that space well obviously inside the the i will say the web scale cloud giant space you know i think right now if you look at it you've got sort of the big 10 baidu Tencent obama at amazon web as your microsoft being those guys and then you know they are definitely building and designing their own stuff there's another tier below that where you have the ebays the Twitter's the the other sorts of folks that are in there and you know they're just now starting that migration if you look at the enterprise not a big surprise the financial guys are leading this we've seen public statements from JPM and other folks that have been at these events so you know I view it very much like the blade server migration I think it's going to be twenty twenty-five percent of the overall market whether we whether people like to admit it or not good old rack and stack is going to be around for a very long 
And there are applications where it makes a lot of sense. Where you're deploying private cloud in the managed service provider market, we're starting to see a move into that. But if you ask what the ten-year life cycle of an architecture looks like, I would say that in the cloud we're probably four or five years into it, and in the enterprise we're maybe one or two years into it.
>> All right, so what about the whole SDN discussion, Sean? How much does QLogic play into that, and what are you seeing in general? And we're at VMworld, so what about NSX — is that part of the conversation, and what do you hear in the marketplace today?
>> Yeah, it really is part of the conversation, and the interesting part is that I think SDN is getting a lot of play because of the capabilities that people want. Again, when you look at the managed service providers wanting large scale at lower cost, that's definitely going to drive it. But much like OpenStack and Linux and some of these other things, it's not going to be somebody downloading it off the web and putting it into production at AT&T — it's going to be a prepackaged solution, it's going to be embedded as part of something else. Look at what Red Hat is doing with their OpenStack release, look at what Mirantis is doing with theirs — from an enterprise perspective, and for production in the MSP and second-tier cloud, that's what you're going to see more of. So for us, SDN is critical because it lets us start to do the things we want to do for high-performance storage; it allows us to change the value proposition. If you look at Hadoop, one of the things we want to be able to do is take the storage engine module and run it on our card, with our embedded vSwitch and our next-gen chip, so that we can do zero stack copies between nodes to improve latency. So it's not just having RDMA — it's having a smart stack that goes with it, and having the SDN capability to go tell the controller, "pay no attention to this little bit of traffic over here, these are not the droids you're looking for," and then everything goes along pretty well. So it's very fundamental and strategic, but it's a market in which we're going to participate; it's not one we're going to try to write or do a distribution for.
>> Okay. Any other VMware-related activities QLogic is doing, announcements this week that you want to share?
>> This week I would have to say no. The one other thing that we're strategically working on with them, as you would expect, is RDMA capabilities across vMotion, vSAN and those sorts of things. We've been one of the leaders in doing Geneve, which is the follow-on to VXLAN for hybrid cloud and that sort of thing, and we see that as a key, fundamental partnership technology with VMware going forward.
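Sean name-checks Geneve as the follow-on to VXLAN for hybrid-cloud overlays. For readers who want to see what that encapsulation actually carries, here is a minimal, illustrative sketch of packing the fixed 8-byte Geneve base header from the IETF specification — it keeps VXLAN's 24-bit virtual network identifier but adds room for variable-length options. This is a generic sketch, not QLogic's or VMware's implementation; the protocol-type value 0x6558 assumes the inner payload is an Ethernet frame.

```python
import struct

def geneve_base_header(vni: int, opt_len_words: int = 0,
                       protocol_type: int = 0x6558,
                       oam: bool = False, critical_opts: bool = False) -> bytes:
    """Pack the fixed 8-byte Geneve base header (RFC 8926).

    vni           -- 24-bit virtual network identifier (the VXLAN-style tenant ID)
    opt_len_words -- length of the variable options that follow, in 4-byte words
    """
    assert 0 <= vni < 2**24 and 0 <= opt_len_words < 2**6
    version = 0
    byte0 = (version << 6) | opt_len_words                # Ver (2 bits) | Opt Len (6 bits)
    byte1 = (int(oam) << 7) | (int(critical_opts) << 6)   # O and C flags, then 6 reserved bits
    # Final 32-bit word: VNI in the top 24 bits, low 8 bits reserved.
    return struct.pack("!BBHI", byte0, byte1, protocol_type, vni << 8)

# Example: tenant network 5001, no options. A real encapsulator would place this
# header after an outer UDP header (Geneve's registered destination port is 6081)
# and follow it with the inner Ethernet frame.
print(geneve_base_header(vni=5001).hex())
```

The variable options that can follow this header are the main thing Geneve adds over VXLAN — they give the overlay a place to carry extensible metadata between endpoints.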
>> All right, so let's turn back to QLogic for a second. The CEO recently left, and the assumption is that there's a search going on — so give us the company update, if you will.
>> Well, actually, there isn't a search. Jean is going to run the ship forward as CEO, and we've brought in Christine King, who was on our board, as executive chairman. Christine has a lot of experience in the chip market; she understands the intimate tie we have to that Intel tick-tock model and really how you run an efficient chip-driven organization. We play at that in-between level — we're not quite the system, but we're not quite the chip — and understanding that market is part of what she does. The board has given us the green light to continue to go forward and develop what we need on the other pieces. Jean has a strong financial background — she was acting CEO for a year after Simon left — so she's got the depth, she knows the business, and for us it's kind of a non-event; everything else is continuing on as you would expect.
>> Okay, last question I have for you, Sean. The dynamics have changed — for years there were kind of duopolies in the market: Intel and Broadcom on the Ethernet side, Emulex and QLogic on Fibre Channel. It's a different conversation today. You mentioned Intel, we've talked about Mellanox, QLogic, your old friends at Emulex now under Avago — and Avago bought Broadcom and now they're called Broadcom, I think. So lay out for us where you see the horses on the track, and what excites you.
>> Yeah, so again, if you look at the 10 gig side of the business, clearly Intel has the leadership position; we're number two in the market if you look at the share data, and the Emulex part of Avago has been struggling and losing share. Then we have this 25 gig transition that came into the market, and that was driven by Broadcom. For those of us who have followed this business, I think everyone can appreciate the irony of Avago buying Emulex and then bringing them back together with Broadcom after all the years they were kept separate — we've chuckled over a few beers on that one. But then you've got this 25 gig transition, and — let me step back — the other thing about the 10 gig market is that there was a very, very clear dividing line: the enterprise was owned by the Broadcom, QLogic, Emulex side, and the cloud, the channel, the appliance business was owned by Intel and Mellanox. Now, as we go into this next generation, you've got us, Mellanox and the original Broadcom team coming in with 25 gig. We've all signed on to the consortium approach, we're all going to have an interoperability approach from there, and Intel isn't there — and we haven't seen any announcements or anything specific that Emulex has said publicly in that space. So right now we view it as a two-horse race. From a software perspective, our friends at Broadcom — or Avago, or "Bravado," whatever we want to call them — I don't think they have the software depth to run this playbook right now. What we have to do is take our enterprise strengths — load balancing, failover, the SDN tools, NPAR and all the virtualization capabilities we have — and move those rapidly into the cloud space and go after it. For us it means we have to be more open-source driven than we have been in the past, it means a different street fight for every one of these, and it represents a change in some of the sales model and how we go to market. So, not to say we've got everything wrapped up and perfect in this market, but again, right time, right place, and this will be the transition for another, we think, three to five years. And there are still a lot of interesting things happening.
Ironically, one of the most interesting things I think is going to happen in 25 gig is this use of the new low-profile connectors. I think that will do more to help the adoption of 25 gig and 100 gig, where you can use that connector — RCx, or xRC, I forget the acronym — which kind of looks like the FireWire or HDMI connectors you have on your laptops now. Now imagine a card with that connector in a form factor that's maybe a half-inch square: you've got incredible port density, and you can dynamically change between 25, 50 and 100 on the fly.
>> Well, Sean, you know, we've always talked about how there's a lot of complexity that goes on under the covers, and it's interesting to see who does a good job of making that simple and consumable — that helps drive those new architectures forward. All right, Sean, thank you so much for joining us. We'll be right back with lots more coverage, including some more in-depth networking conversation. Thank you for watching.
>> Thanks for having me.
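To put a number on the port-density point that closes the interview: the 25/50/100 flexibility Sean describes comes from ports in this generation being built out of 25 Gb/s electrical lanes, so a single four-lane cage can be carved into different logical port groupings. The sketch below is purely illustrative arithmetic — the mode names and the four-lane (QSFP28-style) assumption are generic, not a specific QLogic product spec.

```python
LANE_GBPS = 25       # per-lane signaling rate in this generation
LANES_PER_CAGE = 4   # assumed four-lane (QSFP28-style) cage

# Each breakout mode lists its logical ports as lanes-per-port.
BREAKOUT_MODES = {
    "1x100G": [4],           # all four lanes bonded into a single port
    "2x50G":  [2, 2],        # two two-lane ports
    "4x25G":  [1, 1, 1, 1],  # four single-lane ports
}

def describe(mode: str) -> str:
    ports = BREAKOUT_MODES[mode]
    assert sum(ports) <= LANES_PER_CAGE, "mode uses more lanes than the cage provides"
    speeds = [lanes * LANE_GBPS for lanes in ports]
    return f"{mode}: {len(ports)} port(s) at {speeds} Gb/s, {sum(speeds)} Gb/s total"

for mode in BREAKOUT_MODES:
    print(describe(mode))
```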