
Search Results for jaspersoft:

Nick Halsey, Okera | CUBE Conversation


 

(soft electronic music) >> Welcome to this special CUBE Conversation. I'm John Furrier here, in theCUBE's Palo Alto studio. We're here, remotely, with Nick Halsey who's the CEO of OKERA, hot startup doing amazing work in cloud, cloud data, cloud security, policy governance as the intersection of cloud and data comes into real stable operations. That's the number one problem people are figuring out right now: how to make sure that data's addressable and also secure and can be highly governed. So Nick, great to see you. Thanks for coming on theCUBE. >> It's great to be here, John, thank you. >> So you guys have a really hot company going on, here, and you guys are in an intersection, an interesting spot as the market kind of connects together as cloud is going full, kind of, whatever, 3.0, 4.0. You got the edge of the network developing with 5G, you've got space, you've got more connection points, you have more data flowing around. And the enterprises and the customers are trying to figure out, like, okay, how do I architect this thing? And oh, by the way, I've got, like, all these compliance issues, too. So this is kind of what you guys do. Take a minute to explain what your company's doing. >> Yeah, I'm happy to do that, John. So we've introduced a new category of software that we call universal data authorization, or UDA, which is really starting to gain some momentum in the market. And there're really two critical reasons why that's happening. People are really struggling with how do I enable my digital transformation, my cloud migration, while at the same time making sure that my data is secure and that I'm respecting the privacy of my customers, and complying with all of these emerging regulations around data privacy like GDPR, CCPA, and that alphabet soup of regulations that we're all starting to become aware of. >> I want to ask about the market opportunity because, you know, one of the things we see in the cloud coverage, in normal conversations, is like, "Hey, modern applications are developing." We're starting to see cloud-native. You're starting to see these new use cases, so you're starting to see new expectations from users and companies, which creates new experiences. And this is throwing off all kinds of new data approaches. And a lot of people are scratching their heads and they feel like, do they slow it down, do they speed it up? Do I get a hold of the compliance side first? Do I innovate? So it's like a real kind of conflict between the two. >> Yeah, there's a real tension in most organizations. They're trying to transform, be agile, and use data to drive that transformation. But there's this explosion of the volume, velocity, and variety of data, we've all heard about the three Vs, we'll say there're five Vs. You know, it's really complicated. So you've got the people on the business side of the house and the Chief Data Officer who want to enable many more uses of all of these great data assets. But of course, you've got your security teams and your regulatory and compliance teams that want to make sure they're doing that in the right way. And so you've got to build a zero-trust infrastructure that allows you to be agile and be secure at the same time. And that's why you need universal data authorization, because the old manual ways of trying to securely deliver data to people just don't scale in today's demanding environments. >> Well I think that's a really awesome approach, having horizontally scalable data, like infrastructure, would be a great benefit.
Take me through what this means. I'd like to get you to define, if you don't mind, what is universal data authorization. What is the definition? What does that mean? >> Exactly, and people are like, "I don't understand security. I do data over here and privacy, well, I do that over here." But the reality is you really need to have the right security platform in order to express your privacy policies, right. And so in the old days, we used to just build it into the database, or we'd build it into the analytic tools. But now, we have too much data in too many platforms in too many locations being accessed by too many, you know, BI applications and AI/ML data apps, and so you need to centralize the policy definition and policy enforcement so that it can be applied everywhere in the organization. And the example I like to give, John, is we are just like identity access management. Why do I need Okta or SailPoint, or one of those tools? Can't I just log in individually to, you know, Salesforce or to GitHub? Sure, you can, but once you have 30 or 40 systems and thousands of users, it's impossible to manage your employee onboarding and off-boarding policy in a safe and secure way. So you abstract it and then you centralize it and then you can manage and scale it. And that's the same thing you do with OKERA. We do all of the security policy enforcement for all of your data platforms, via all of your analytic tools. Anything from Tableau to Databricks to Snowflake, you name it, we support those environments. And then as we're applying the security, which says, "Oh, John is allowed access to this data in this format at this time," we can also make sure that the privacy is governed so that we only show the last four digits of your social security number, or we obfuscate your home address. And we certainly don't show them your bank balance, right? So you need to enable the use of the data without violating the security and privacy rights that you need to enforce. But you can do both, which our customers are doing at incredible scale, and then you have sort of digital transformation nirvana resulting from that. >> Yeah, I mean I love what you're saying with the scale piece, that's huge. At AWS's Reinforce Virtual Conference that they had to run because the event was canceled due to the Delta COVID surge, Stephen Schmidt gave a great keynote, I called it a master class, but he mainly focused on cyber security threats. But you're kind of bringing that same architectural thinking to the data privacy, data security piece. 'Cause it's not so much that you're vulnerable to hacking, it's still a zero-trust infrastructure for access and management, but-- >> Well, I mean, you need security for many reasons. You do want to be able to protect against external hacks. I mean, every week there's another T-Mobile, you know, you name it, so that's... But 30% of data breaches are by internal trusted users who have rights. So what you need to make sure is that you're managing those rights and that you're not creating any long tails of data access privilege that can be abused, right? And you also need, one of the great benefits of using a platform like OKERA, is we have a centralized log of what everybody is doing and when, so I could see that you, John, tried to get into the salary database 37 times in the last hour and maybe we don't want to let you do that.
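To make the masking idea above concrete, here is a toy sketch in Python of the kind of transformation Nick describes: show only the last four digits of a social security number, obfuscate a home address, and never return the bank balance. The function, field names, and rules are hypothetical illustrations, not Okera's actual policy engine.

```python
# Illustrative only: a toy column-masking pass of the kind a centralized
# policy engine might apply before returning rows to an analyst.
# Field names and rules are hypothetical, not Okera's real policy model.

def mask_row(row: dict) -> dict:
    masked = dict(row)
    # Show only the last four digits of the SSN.
    if masked.get("ssn"):
        masked["ssn"] = "***-**-" + masked["ssn"][-4:]
    # Obfuscate the home address entirely.
    if "home_address" in masked:
        masked["home_address"] = "[REDACTED]"
    # Sensitive financial fields are dropped rather than shown.
    masked.pop("bank_balance", None)
    return masked

rows = [
    {"name": "Jane Doe", "ssn": "123-45-6789",
     "home_address": "1 Main St", "bank_balance": 1042.17},
]
print([mask_row(r) for r in rows])
# [{'name': 'Jane Doe', 'ssn': '***-**-6789', 'home_address': '[REDACTED]'}]
```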
So we have really strong stakeholder constituencies in the security and regulatory side of the house because, you know, they can integrate us with Splunk and have a single pane of glass on, weird things are happening in the network and there's, people are trying to hit these secure databases. I can really do event correlation and analysis, I can see who's touching what PII when and whether it's authorized. So people start out by using us to do the enforcement but then they get great value after they've been using us for a while, using that data, usage data, to be able to better manage their environments. >> It's interesting, you know, you bring up the compliance piece as a real added value and I wasn't trying to overlook it but it brings up a good point which is you have, you have multiple benefits when you have a platform like this. So, so take me through like, who's using the product. You must have a lot of customers kicking the tires and adopting it because architecturally, it makes a lot of sense. Take me through a deployment of what it's like in the customer environment. How are they using it? What is some of the first mover types using this approach? And what are some of the benefits they might be realizing? >> Yeah, as you would imagine, our early adopters have been primarily very large organizations that have massive amounts of data. And they tend also to be in more regulated industries like financial services, biomedical research and pharmaceuticals, retail with tons of, you know, consumer information, those are very important. So let me give you an example. We work with one of the very largest global sports retailers in the world, I can't use their name publicly, and we're managing all of their privacy rights management, GDPR, CCPA, worldwide. It's a massive undertaking. Their warehouse is over 65 petabytes in AWS. They have many thousands of users in applications. On a typical day, an average day OKERA is processing and governing six trillion rows of data every single day. On Black Friday, it peaked over 10 trillion rows of data a day, so this is scale that most people really will never get to. But one of the benefits of our architecture is that we are designed to be elastically scalable to sort of, we actually have a capability we call N scale because we can scale to the Nth degree. We really can go as far as you need to in terms of that. And it lets them do extraordinary things in terms of merchandising and profitability and market basket analysis because their teams can work with that data. And even though it's governed and redacted and obfuscated to maintain the individuals' privacy rights, we still let them see the totality of the data and do the kind of analytics that drive the business. >> So large scale, big, big customer base that wants scale, some, I'll say data's huge. What are some of the largest data lakes that you guys have been working with? 'Cause sometimes you hear people saying our data lakes got zettabytes and petabytes of content. What are some of the, give us a taste of the order of magnitude of some of the size of the data lakes and environments that your customers were able to accomplish. >> I want to emphasize that this is really important no matter what size because some of our customers are smaller tech-savvy businesses that aren't necessarily processing huge volumes of data, but it's the way that they are using the data that drives the need for us. 
But having said that, we're working with one major financial regulator who has a data warehouse with over 200 petabytes of data that we are responsible for providing the governance for. And one thing about that kind of scale that's really important, you know, when you want to have everybody in your organization using data at that scale, which people think of as democratizing your data, you can't just democratize the data, you also have to democratize the governance of the data, right? You can't centralize policy management in IT because then everybody who wants access to the data still has to go back to IT. So you have to make it really easy to write policy and you have to make it very easy to delegate policy management down to the departments. So I need to be able to say this person in HR is going to manage these 50 datasets for those 200 people. And I'm going to delegate the responsibility to them but I'm going to have centralized reporting and auditing so I can trust but verify, right? I can see everything they're doing and I can see how they are applying policy. And I also need to be able to set policy at the macro level, at the corporate level, that they inherit, so I might just say I don't care who you are, nobody gets to see anything but the last four digits of your social security number. And they can do further rules beyond that but they can't change some of the master rules that you're creating. So you need to be able to do this at scale but you need to be able to do it easily, with a graphical policy builder that lets you see policy in plain English. >> Okay, so you're saying scale, and then the smaller use cases are more refined, or is it more sensitive data? Regulated data? Or more just levels of granularity? Is that the use case? >> You know, I think there's two things that are really moving the market right now. So the move to remote work with COVID really changed everybody's ideas about how do you do security, because you're no longer in a data center, you no longer have a firewall. The Maginot Line of security has gone away, and so in a zero-trust world, you know, you have to secure four endpoints: the data, the device, the user, and the application. And so this pretty radical rethinking of security is causing everybody to think about this, big, small, or indifferent. Like, Gartner just came out with a study that said by 2025, 75% of all user data in the world is going to be governed by privacy policy. So literally, everybody has to do this. And so we're seeing a lot more tech companies that manage data on behalf of other users, companies that use data as a commodity, they're transacting data, really, really understand the need for this, and when you're doing data exchange between companies, that is a really delicate process that has to be highly governed.
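Picking up the delegation idea from Nick's answer above, where corporate-level master rules are inherited by departmental policy owners who can refine but not override them, here is a purely hypothetical sketch of that layering (the rule model and names are assumptions for illustration, not Okera's actual policy engine):

```python
# Hypothetical illustration of layered policy: corporate master rules are
# always applied; department-level rules may add restrictions but can never
# remove or weaken a master rule.

CORPORATE_MASTER_RULES = {
    # Everyone, everywhere, only ever sees the last four digits of an SSN.
    "ssn": lambda v: "***-**-" + v[-4:] if v else v,
}

def build_policy(department_rules: dict) -> dict:
    """Merge department rules under the master rules; masters win on conflict."""
    policy = dict(department_rules)
    policy.update(CORPORATE_MASTER_RULES)   # master rules always take precedence
    return policy

# An HR policy owner delegated 50 datasets might add their own rule...
hr_policy = build_policy({
    "salary": lambda v: None,               # HR hides salary from this audience
    "ssn": lambda v: v,                     # ...but cannot un-mask SSNs
})

row = {"name": "Jane Doe", "ssn": "123-45-6789", "salary": 90000}
print({k: (hr_policy[k](v) if k in hr_policy else v) for k, v in row.items()})
# {'name': 'Jane Doe', 'ssn': '***-**-6789', 'salary': None}
```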
>> Yeah, I love the security redo. We asked Pat Gelsinger many, many years ago, when he was the CEO of VMware, what he thought about security, and Dave Allante, my co-host at theCUBE, said is it a do-over? He said absolutely, it's a do-over. I think it was 2013. He mused around that time frame. It's kind of a do-over and you guys are hitting it. This is a key thing. Now he's actually the CEO of Intel and he's still driving forward. Love Pat's vision on this early, but this brings up the question: okay, if it's a do-over and these new paradigms are existing and you guys are building a category, okay, it's a new thing. So I have to ask you, I'm sure your customers would say, "Hey, I already got that in another platform." So how do you address that, because when you're new you have to convince the customer that this is a new thing. Like, I don't-- >> So, look, if somebody is still running on Teradata and they have all their security in place and they have a single source of the truth and that's working for them, that's great. We see a lot of our adoption happening as people go on their cloud transformation journey. Because I'm lifting and shifting a lot of data up into the cloud and I'm usually also starting to acquire data from other sources as I'm doing that, and I may be now streaming it in. So when I lift and shift the data, unfortunately, all of the security infrastructure you've built gets left behind. And so a lot of times, that's the forcing function that gets people to realize that they have to make a change here, as well. And we also find other characteristics, like people who are getting proactive in their data transformation initiatives, they'll often hire a CDO, they'll start to use modern data cataloging tools and identity access management tools. And when we see people adopting those things, we understand that they are on a journey that we can help them with. And so we partner very closely with the catalog vendors, with the identity access vendors, you know, with many other parts of the data lake infrastructure, because we're just part of the stack, right? But we are the last mile because we're the part of the stack that lets the user connect. >> Well I think you guys are on a wave that's massive and I think it's still, it's going to be bigger coming forward. Again, when you see categories being created it's usually at the beginning of a bigger wave. And I got to ask you, because one thing I've been really kind of harping on on theCUBE and pounding my fist on the table is these siloed approaches. And you're seeing 'em everywhere, I mean, even in the consumer world. LinkedIn's a silo. Facebook's a silo. So you have this siloed mentality. Certainly in the enterprise they're no stranger to silos. So if you want to be horizontally scalable with data you've got to have it free, you've got to break the silos. Are we going to get there? Is this the beginning? Are we breaking down the silos, Nick, or is this the time, or what's your reaction to that? >> I'll tell you something, John. I have spent 30 years in the data and analytics business and I've been fortunate enough to help launch many great BI companies like Tableau and Brio Software, and Jaspersoft and AlphaBlox we were talking about before the show. Every one of those companies would have been much more successful if they had OKERA, because everybody wanted to spread those tools across the organization for better, more agile business analytics, but they were always held back by the security problem. And this was before privacy rights were even a thing. So now with UDA, and I think hand-in-hand with identity access management, you truly have the ability to deliver analytic value at scale. And that's key, you need simplicity at scale, and that is what lets you let all parts of your organization be agile with data and use it to transform the business. I think we can do that now. Because if you run in the cloud, it's so easy, I can stand up things like Hadoop, you know, like Databricks, like Snowflake. I could never do that in my on-prem data center, but I can literally press a button and have a very sophisticated data platform, press a button, have OKERA, have enforcement.
Really, almost any organization can now take advantage of what only the biggest and most sophisticated organizations used to be able to do. >> I think Snowflake's an example for all companies that you could essentially build in the shadows of the big clouds and build your own franchise if you nail the security and privacy and that value proposition of scale and good product. So I love this idea of security and privacy managed in a single platform. I'd love to get your final thought while I got you here, on programmability, because I'm seeing a lot of regulators and people in the privacy world puttin' down all these rules. You got GDPR and the right to be forgotten and I got all these things... There's a trend towards programmability around extraction of data and managing data where just a simple query could be like, okay, I want to know what's goin' on with my privacy. And we're a media company and so we record a lot of data too, and we've got to comply with all these, like, weird requests, like hey, can you, on June 10th, can you take out my data? And so that's programmatic, that's not a policy thing. It's not like a lawyer with some privacy policy. That's got to be operationalized. So what's your reaction to that as this world starts to be programmable? >> Right, well that's key to our design. So we're an API-first approach. We are designed to be a part of a very sophisticated mesh of technology and data, so it's extremely simple to just call us to get the information that you need, or to express a policy on the fly that might be created because of the current state-based things that are going on. And that's very, very important when you start to do real-time applications that require geo-fencing, you're doing 5G edge computing. It's a very dynamic environment and the policies need to change to reflect the conditions on the ground, so to speak. And so to be callable, programmable, and embeddable, that is an absolutely critical approach to implementing UDA in the enterprise. >> Well this is super exciting, I feel you guys are on, again, a bigger wave than it appears. I mean security and privacy operating system, that's what you guys are. >> It is. >> It is what it is. Nick, great to chat with you. >> Couldn't have said it better. >> I love the category creation, love the mojo, and I think you guys are on the right track. I love this vision merging data, security, and policy together into one to get some enablement and get some value creation for your customers and partners. Thanks for coming on theCUBE. I really appreciate it. >> Now, it's my pleasure and I would just give one piece of advice to our listeners. You can use this everywhere in your organization, but don't start with that. Don't boil the ocean, pick one use case like the right to be forgotten and let us help you implement that quickly so you can see the ROI, and then we can go from there. >> Well I think you're going to have a customer in theCUBE. We will be calling you. We need this. We've done a lot of digital events now with the pandemic, so lots of data that we didn't have to deal with before. But thanks for coming on and sharing, appreciate it. OKERA, hot startup. >> My pleasure, John, and thank you so much. >> So, OKERA conversation. I'm John Furrier here, in Palo Alto. Thanks for watching. (soft electronic music)
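Nick's point about an API-first, programmable approach, where a policy such as a right-to-be-forgotten rule is expressed on the fly by calling an API rather than filed as a ticket, can be pictured with a small hedged sketch. The endpoint, payload shape, and token below are invented placeholders, not Okera's actual REST API:

```python
# Hypothetical sketch of programmatically creating a policy via a REST call.
# The host, path, and payload schema are invented for illustration only.
import requests

API = "https://governance.example.com/api/v1"   # placeholder endpoint
TOKEN = "YOUR_API_TOKEN"                         # placeholder credential

policy = {
    "name": "right-to-be-forgotten-jane-doe",
    "dataset": "analytics.customers",
    "effect": "deny",                    # stop exposing this subject's rows
    "filter": {"customer_id": "12345"},  # hypothetical subject identifier
    "reason": "erasure request received June 10th",
}

resp = requests.post(
    f"{API}/policies",
    json=policy,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=10,
)
resp.raise_for_status()
print("policy active:", resp.json().get("id"))
```

The same call could just as easily be made by an automated workflow that drains a queue of privacy requests, which is the "operationalized" part John is pointing at.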

Published Date : Sep 7 2021


SENTIMENT ANALYSIS :

ENTITIES

Entity | Category | Confidence
Nick Halsey | PERSON | 0.99+
John | PERSON | 0.99+
Dave Allante | PERSON | 0.99+
Jaspersoft | ORGANIZATION | 0.99+
Palo Alto | LOCATION | 0.99+
Stephen Schmidt | PERSON | 0.99+
Pat Gelsinger | PERSON | 0.99+
June 10th | DATE | 0.99+
Nick | PERSON | 0.99+
Tableau | ORGANIZATION | 0.99+
OKERA | ORGANIZATION | 0.99+
2013 | DATE | 0.99+
37 times | QUANTITY | 0.99+
Alphablocks | ORGANIZATION | 0.99+
John Furrier | PERSON | 0.99+
30 years | QUANTITY | 0.99+
30 | QUANTITY | 0.99+
50 datasets | QUANTITY | 0.99+
30% | QUANTITY | 0.99+
Intel | ORGANIZATION | 0.99+
2025 | DATE | 0.99+
Gartner | ORGANIZATION | 0.99+
40 systems | QUANTITY | 0.99+
T-Mobile | ORGANIZATION | 0.99+
Pat | PERSON | 0.99+
both | QUANTITY | 0.99+
two | QUANTITY | 0.99+
LinkedIn | ORGANIZATION | 0.99+
two things | QUANTITY | 0.99+
200 people | QUANTITY | 0.99+
Facebook | ORGANIZATION | 0.99+
over 200 petabytes | QUANTITY | 0.99+
VMware | ORGANIZATION | 0.99+
GDPR | TITLE | 0.99+
AWS | ORGANIZATION | 0.98+
English | OTHER | 0.98+
Databricks | ORGANIZATION | 0.98+
Teradata | ORGANIZATION | 0.98+
single platform | QUANTITY | 0.98+
one | QUANTITY | 0.98+
Brio Software | ORGANIZATION | 0.98+
over 65 petabytes | QUANTITY | 0.98+
over 10 trillion rows of data a day | QUANTITY | 0.98+
Black Friday | EVENT | 0.98+
first approach | QUANTITY | 0.97+
thousands of users | QUANTITY | 0.97+
one piece | QUANTITY | 0.97+
75% | QUANTITY | 0.96+
Snowflake | ORGANIZATION | 0.96+
GitHub | ORGANIZATION | 0.96+
theCUBE | ORGANIZATION | 0.96+
Delta COVID surge | EVENT | 0.95+
Reinforce Virtual Conference | EVENT | 0.95+
single source | QUANTITY | 0.95+
first mover | QUANTITY | 0.94+
pandemic | EVENT | 0.93+
every single day | QUANTITY | 0.92+
six trillion rows of data | QUANTITY | 0.92+
Okta | ORGANIZATION | 0.91+
one thing | QUANTITY | 0.9+
single pane | QUANTITY | 0.9+
four endpoints | QUANTITY | 0.9+
CCPA | TITLE | 0.89+
UDA | ORGANIZATION | 0.89+
first | QUANTITY | 0.88+
two critical reasons | QUANTITY | 0.86+
zero | QUANTITY | 0.85+
Sale Point | ORGANIZATION | 0.85+
many years ago | DATE | 0.85+
Tableau | TITLE | 0.84+
IUDA | TITLE | 0.84+
petabytes | QUANTITY | 0.81+
thousands of users | QUANTITY | 0.81+
today | DATE | 0.8+

Daniel Heacock, Etix & Adam Haines, Federated Sample - AWS Re:Invent 2013 - #awsreinvent #theCUBE


 

Hi everybody, we are live at AWS re:Invent in Las Vegas. I'm Jeff Kelly with Wikibon.org. You're watching theCUBE, SiliconANGLE's premier live broadcast. We go out to the technology events and, as John Furrier likes to say, extract the signal from the noise. So being here at the AWS show, we're going to talk to a lot of AWS customers here, a lot about what they're doing, in this case around analytics, data warehousing, and data integration. So for this segment I'm joined by two customers: Daniel Heacock, senior business systems analyst with Etix, and Adam Haines, who's a data architect with Federated Sample. Welcome guys, thanks for joining us on theCUBE. >> Thanks. >> Your first time, so we'll promise we'll make this as painless as possible. So you guys have a couple things in common, we were talking beforehand. Some of the workflows are similar: you're using Amazon Web Services' Redshift platform for data warehousing, you're using Attunity for some of the data integration to bring that in from your operational, transactional databases, and using a BI tool on top to kind of tease out some of the insights from that data. But why don't we get started. Daniel, we'll start with you. Tell us a little bit about Etix, kind of what you guys do, and then we'll just kind of get into the use cases and talk about how you use AWS and Attunity and some of the other technologies you use. >> Sure, yeah, so the company I work for is Etix. We are a primary market ticketing company in the entertainment industry. We provide box office solutions to venues and venue owners, all types of events, casinos, fairs, festivals, pretty much you name it, and we sell tickets in that industry. We provide a software solution that enables those venue owners to engage their customers and sell tickets. >> So kind of a competitor to something like Ticketmaster, the behemoth in the industry? >> Yeah, definitely, so Ticketmaster would be the behemoth in the industry, and we consider ourselves a smaller, sexier version that's more friendly to the customer. >> Customer friendly, more agile. >> Absolutely. >> So Adam, tell us a little bit about Federated Sample. >> Sure, Federated Sample is a technology company in the market research industry, and what we aim to do is add an exchange layer between buyers and sellers. So we facilitate the transaction: when a buyer or a company like Coke would say, "Hey, we need to do a survey," we will negotiate pricing and route our respondents to their surveys, try to make that a more seamless process so they don't have to go out and find their own respondents. >> Right, everything online. >> Right, absolutely. >> Got it. So let's talk a little bit about, let's start with AWS. Obviously we're here at re:Invent, a big show, 9,000 people here. So you guys, you know, talk about agile, talk about cloud enabling kind of innovation. And I'm going to start with you: what kind of brought you to AWS? Are you using Redshift? And I think you mentioned you're all in the cloud, right? Just give us your impressions of the show, and AWS, and what that's meant for your business. >> Right, the show's been great so far. So we were originally on-premise entirely, at a data center out in California, and it just didn't meet our rapid growth. We're a smaller company, a startup, so we couldn't handle the growth. So we needed something more elastic, more agile, so we ended up moving our entire infrastructure into Amazon Web Services. And then we found that we had a need to actually perform analytics on that data, and that's when we started the transition to, you know, Redshift.
>> And so the idea being you're moving data from your transactional system, which is also on AWS, into Redshift, so using Attunity for that, their CloudBeam solution. Talk a little bit about that, and, you know, how that is differentiated from some of the other integration methods you could have chosen. >> Right, so we started with a more conventional integration method, a homegrown solution, to move our data from our production SQL Server into Redshift. And it worked, but it was not optimal, didn't have all the bells and whistles, and it was prone to bad management, being like, not many people could configure it, know how to use it. So then we saw CloudBeam from Attunity, and they offered a native solution using SQL Server replication that could tie into our native SQL Server and then push that data directly into CloudBeam at a very fast rate. >> So moving that data from the SQL Server, it is essentially a real-time replication, so that's moving that data into Redshift so that your analysts, when they're doing their reporting or doing some real ad hoc kind of queries, can be confident they've got the most up-to-date data from your SQL Server, right, your transactional system? >> Right, yeah, nearly real-time. And just to put it in perspective, the reports that we were running on our other system were taking, you know, 10, 15 minutes to run; in Redshift we're running those same reports in one to two minutes. >> Right, and if you're running those reports so quickly, you know, people sometimes forget, when you're talking about real-time or interactive queries and reporting, it's somewhat only as good as the data timeliness that you've got, the timeliness of the data you've got in that database. Because if you're trying to make some real-time decisions, you've got a lag of, depending on the workload and your use case, even 15 minutes to an hour back might really impact how you're ready to make those decisions. So, Adam talked a little bit about his use case. Is it a similar cloud architecture, are you moving from, Daniel, you're moving from on-premise? So you're actually working with an on-premise data center, it's an Oracle database. >> And so we basically ran into two limitations, one regarding our current reporting infrastructure and then, two, kind of our business intelligence capabilities. And so, as an analyst, I've been kind of tasked with creating internal feedback loops within our organization, as far as delivering certain types of KPIs and metrics to, you know, inform our different teams, our operations teams, our marketing teams. So that has been one of the kind of BI aims that we've been able to achieve because of the replication and Redshift. And then the other is actually making our reporting more, I guess, comprehensive. Now that we're using Redshift, we're able to run reports that we were previously not able to run on our on-premise transactional database. So really we're just kind of embracing the power of Redshift, and it's enabling us in a lot of different types of ways.
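As a concrete illustration of the kind of aggregate reporting query that went from 10 to 15 minutes down to a minute or two once the data landed in Redshift, here is a minimal sketch using the boto3 Redshift Data API (an interface that postdates this 2013 conversation). The cluster, database, user, and table names are placeholders, not either company's actual schema.

```python
# Hypothetical sketch: run an aggregate reporting query against a Redshift
# cluster using the boto3 Redshift Data API. All identifiers are placeholders.
import time
import boto3

client = boto3.client("redshift-data", region_name="us-east-1")

resp = client.execute_statement(
    ClusterIdentifier="analytics-cluster",      # placeholder cluster
    Database="warehouse",                        # placeholder database
    DbUser="analyst",                            # placeholder user
    Sql="""
        SELECT venue_id, SUM(ticket_price) AS revenue, COUNT(*) AS tickets_sold
        FROM ticket_sales
        WHERE sale_date >= DATEADD(day, -1, GETDATE())
        GROUP BY venue_id
        ORDER BY revenue DESC
        LIMIT 20;
    """,
)

# Poll until the statement finishes, then fetch the result set.
while client.describe_statement(Id=resp["Id"])["Status"] not in ("FINISHED", "FAILED", "ABORTED"):
    time.sleep(1)

for record in client.get_statement_result(Id=resp["Id"])["Records"]:
    print(record)
```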
>> Yeah, I mean, we're hearing a lot about Redshift at the show. Amazon says it's the fastest-growing service AWS has had from a revenue perspective in its six-, seven-year history, so clearly there's a lot of power in that platform. It removes a lot of the concerns around having to manage that infrastructure, obviously, but the performance, you know, that's something, I think when people have their own data centers, their own databases, tuning those for the type of performance you're looking for can be a challenge. Is that one of the drivers of kind of your move to Redshift? >> Oh, for sure, the performance. I'm trying to think of a good example of a metric to compare, but it's basically enabled us to develop a product, or to develop products, that would not have been possible otherwise. There were certain, I guess, the ability to crunch data, like you said, in a specific time frame is very important for reporting purposes, and if you're not able to meet a certain time frame, then a certain type of report is just not going to be useful. So it's opening the door for new types of products within our organization. >> Well, let's dig into that a little bit, the different data types we're talking about. So at Etix you're talking about customer transactions, are you talking about profiles of different types of customers? Tell us about some of the data sources that you're moving from your transactional system, which I think is an Oracle database, to Redshift, and then, you know, what are some of those types of analytic workloads, what kind of insights are you looking for? >> Sure, so, you know, we're in the business of selling tickets, and so one of our main concerns, or I guess you should say we're in the business of helping our customers sell tickets, and so we're always trying to figure out ways to improve their marketing efforts, and so marketing segmentation is one of the huge ones. Appending data from large data services in order to get customer demographic information is something that's, you know, easy to do in Redshift, and so we're able to use that information, transaction information, customer information, to, I guess, better engage our fans. >> And likewise, Adam, could you maybe walk us through kind of a use case, maybe the types of data you're looking at, that you're moving into Redshift with Attunity, and then what kind of analytics are you doing on top of that, what kind of insights are you gathering? >> Right, so our data is a little bit different than ticketing, but what we ultimately capture is a respondent's answers to questions. So we try to find the value in a particular set of answers so we can determine the quality of the supply that's sent from suppliers. So if they say that a person meets a certain demographic, we can actually verify that that person meets that demographic, and then we can actually help them improve the supply that they push down to that respondent, too. Everybody makes more money because the completion rates go up. So overall, just business analysis on that type of information so that we can help our customers and help ourselves. >> So I wonder if we could talk a little bit about kind of the BI layer on top as well. I think you're both using Jaspersoft, but, you know, beyond that, one of the topics we've been covering on theCUBE and on Wikibon is this whole analytics-for-all movement. We've been hearing about self-service business intelligence for 20-plus years from some of the more incumbent vendors like Business Objects and Cognos and others, but really, I mean, if you look at a typical enterprise, business intelligence usage or adoption rate kind of stalls out by eighteen percent, twenty percent. Talk about how you've seen this kind of industry evolve a little bit, maybe talk about Jaspersoft specifically, but what are some of the things that you think have to happen, or some of the types of tools that are needed, to really make business intelligence more consumable for analysts and more business users, people who are not necessarily trained in statistics, aren't data scientists.
>> Adam, we'll start with you? >> Yes, so one of the things that we're doing with our Jaspersoft is we're trying to figure out, you know, we have APIs and we have traditional, you know, client-server applications, which ones our customers want to use the most, because we're trying to push everybody towards an API-oriented approach. So we're trying to put that data into Redshift with Jaspersoft and kind of flip that data and look at it year-to-date, or over a period of time, to see where all of our money's coming from, where it's being driven from. And our business users are now empowered with Jaspersoft to do that themselves. They don't rely on us to pull data for them; they can just tie right into Jaspersoft, grab the data they need for whatever period of time they want, and look at it in a nice, pretty chart. >> Is that a similar experience you're having at Etix? >> Definitely, and I think one of the things I should emphasize about our use of Jaspersoft, and basically really any BI tool you choose to use on the Amazon platform, is just the ability to launch it almost immediately and be able to play with data within five to ten minutes of trying to launch it. It's pretty amazing how quickly things can come from just a thought into action. >> Well, that's a good point, because I mean, you think about not just business intelligence but the whole data warehousing world. The traditional method is, you know, the business user, a business unit, goes to IT, they say here are some of the requirements, the metrics we want on these reports, IT then goes away and builds it, comes back six months later, 12 months later: here you go, here's the report. And the next thing you know, the business doesn't remember what they asked for, this isn't necessarily going to serve our needs anymore, and it's not a particularly useful model. And Amazon really helps you kind of shorten that time frame significantly, it sounds like, between what you can do with Redshift and some of their other database products and whatever BI tool you choose to use. Is that kind of how you see this evolving? >> Oh, definitely, and the options, I guess the kind of plug-and-play workflow, is pretty amazing, and it's given us the flexibility in our organization to be able to say, well, we can use this tool for now, and there's a chance we may decide there's something different in the future that we want to use and plug in in its place. We're confident that that product will be there whenever the need is there. >> Right, well, that's the other thing, you can start to use a tool and if it doesn't meet your need you can stop using it, move to another tool. So I think that puts, you know, vendors like Jaspersoft and others, puts them on their toes; they've got to continually innovate and make their product useful, otherwise they know that AWS customers can simply press a button, stop using it, press another button, start using another tool. So I think it's good in that sense. But kind of, you know, when you talk about cloud, and especially around data, you get questions around privacy, about data ownership. Who owns the data? If it's in Amazon's cloud, it's your data, but, you know, it's there in their data centers. How do you feel about that, Adam? Are there any concerns around either privacy or data ownership when it comes to using the cloud? I mean, you guys are all in, in the cloud, so... >> Right, yeah, so we've isolated a lot of our data into virtual private clouds, so with that segment of the network we feel much more comfortable putting our data in a public space, because we do feel like it's secure enough for our type of data. So that was one of the major concerns up front, but, you know, after talking with Amazon and going through the whole process of migrating, we kind of feel way more comfortable with that. >> If you'd expand on that a little, so you've got a private instance, essentially, in Amazon's cloud? >> Right, so we have a private subnet, so it's a segmented piece of their network that's just for us. So you can't access this publicly, only within our VPN client or within our infrastructure itself. So we're segmented, we're away from everybody else. >> Interesting, so they offer that kind of service when there's more of a privacy concern or a security concern. >> Definitely.
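The isolation Adam describes, a private subnet inside a VPC that is reachable only over VPN or from inside the company's own infrastructure, can be sketched with a few boto3 calls. This is a generic, assumed setup for illustration; it is not Federated Sample's actual network configuration, and the CIDR ranges are placeholders.

```python
# Hypothetical sketch: carve out a VPC with a private subnet (no route to an
# internet gateway), so the data tier is reachable only from inside the VPC
# or over a VPN attachment. CIDR blocks and names are illustrative only.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

vpc = ec2.create_vpc(CidrBlock="10.20.0.0/16")
vpc_id = vpc["Vpc"]["VpcId"]

private_subnet = ec2.create_subnet(
    VpcId=vpc_id,
    CidrBlock="10.20.1.0/24",           # analytics/data tier lives here
    AvailabilityZone="us-east-1a",
)

# A security group that only admits traffic originating inside the VPC.
sg = ec2.create_security_group(
    GroupName="data-tier-private",
    Description="Private data tier - VPC-internal access only",
    VpcId=vpc_id,
)
ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 5439,               # Redshift's default port
        "ToPort": 5439,
        "IpRanges": [{"CidrIp": "10.20.0.0/16"}],
    }],
)
print("private subnet:", private_subnet["Subnet"]["SubnetId"])
```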
>> And of course a lot depends on the type of data, I mean, how sensitive that data is. You know, personally identifiable data obviously is going to be more sensitive than if it's just general market data that anyone could potentially access. Daniel, will you talk about your concerns around that, or did you have concerns? Is it more of a governance, people, process question than a technology question, I think? >> Well, it's definitely a technology question to a certain extent. I mean, as a transaction-based business we were obviously very concerned with security, and our CTO is very adamant about that, and so that was one of the first issues that we addressed whenever we decided to go this route. And obviously AWS has taken all the precautions. We have a very similar setup to what Adam is describing as far as our security; we are very much confident that it is a very robust solution. >> So looking forward, how do you see your use of both the cloud and kind of analytics evolving? You know, one of the things we've been covering a lot is, as use cases get more complex, you've got to orchestrate more data flows, you've got to move data from more places. You mentioned you're using Attunity to do some of that replication from your transactional database into Redshift. What are some of the other potential data integration challenges you see yourselves facing as you kind of potentially get more complex deployments, you've got more data, maybe you start using more services on Amazon? How do you look to tackle some of those data integration challenges? >> Let me start, that's a good question. One of the things we're trying to do inside of, you know, our organization is, I guess, bring data from all the different sources that we have together. We use Salesforce for our sales team, we collect information from MailChimp, from our digital marketing agency, and we'd like to tie that information together, and so that's something we're working on. Attunity has been a great help there, and, you know, their product development, as far as their capabilities of bringing in information from other sources, is growing. So we're confident that the demand is there and that the product will develop as we move forward. >> Well, I mean, it's interesting that we've got, you know, you two gentlemen up here, one with kind of an on-premise-to-cloud deployment and one all in the cloud, so clearly Attunity can kind of bridge both of those, right, on-premise and cloud, but also work in the cloud environment. Adam, if you could talk a little bit about how you see this kind of evolving as you get more complex, maybe bring in more systems. Are you looking to bring in more data sources, maybe even third-party data sources, outside data sources? How do you look at this evolving?
>> Right, presently we do have a Mongo database, so we have other sources that we're working with now. There's talk of even trying to stick that in DynamoDB, which is a regular Amazon offering, and that ties directly into Redshift, so we could load that data directly into that, using that key-value pair or however we want to use that type of data, as a data mart. But one of the things that we're trying to work out right now is just distribution and, you know, being agile, you know, elasticity; we're working those issues with our growing database. Our database grows rather large each month, so working on scalability is our primary focus. But for other data sources, we look into other database technologies that we can leverage in addition to SQL Server to help distribute that load. >> So we've got time just for one more question. I always like to ask, when we get customers and users on, if you can give some advice to other practitioners who are watching. So if you could give one piece of advice to somebody who might be in your position, they're looking at, maybe they've got an on-premise data warehouse, or maybe they're just trying to figure out a way to make better use of their data, what would the one thing be? Would it be a technology piece of advice, maybe, you know, look at something like Redshift or solutions like Attunity, or maybe it would be more of a, you know, cultural question around the use of data and making data-driven decisions? But with that, kind of one piece of advice, and I know I could be putting you on the spot. >> Okay, I would say don't try to do it yourself when the experts have done it for you. I couldn't put it any more simply than that. >> Very succinct, but very powerful. >> For me, my biggest takeaway would be just Redshift. I was kind of apprehensive to use it at first, I was so used to other technologies, but we can do so much with Redshift now at, you know, half the cost. >> So, yeah, that works, pretty compelling. Alright, fantastic. Well, Adam Haines, Daniel Heacock, thank you so much for joining us on theCUBE, appreciate it. We'll be right back with our next guests. We're live here at AWS re:Invent in Las Vegas. You're watching theCUBE.

Published Date : Nov 13 2013

**Summary and Sentiment Analysis are not shown because of an improper transcript**

ENTITIES

Entity | Category | Confidence
Daniel heacock | PERSON | 0.99+
Jeff Kelly | PERSON | 0.99+
Dave | PERSON | 0.99+
AWS | ORGANIZATION | 0.99+
Daniel Heacock | PERSON | 0.99+
Adam Cain | PERSON | 0.99+
Amazon | ORGANIZATION | 0.99+
California | LOCATION | 0.99+
amazon | ORGANIZATION | 0.99+
Daniel heacock | PERSON | 0.99+
Daniel | PERSON | 0.99+
20-plus years | QUANTITY | 0.99+
Adam | PERSON | 0.99+
jaspersoft | ORGANIZATION | 0.99+
Adam Haines | PERSON | 0.99+
two customers | QUANTITY | 0.99+
15 minutes | QUANTITY | 0.99+
eighteen percent | QUANTITY | 0.99+
etix | TITLE | 0.99+
9,000 people | QUANTITY | 0.99+
Las Vegas | LOCATION | 0.99+
twenty percent | QUANTITY | 0.99+
Amazon Web Services | ORGANIZATION | 0.98+
first time | QUANTITY | 0.98+
Oracle | ORGANIZATION | 0.98+
10 15 minutes | QUANTITY | 0.98+
each month | QUANTITY | 0.97+
John foyer | PERSON | 0.97+
one | QUANTITY | 0.97+
one more question | QUANTITY | 0.96+
5-10 minutes | QUANTITY | 0.96+
coke | ORGANIZATION | 0.96+
two gentlemen | QUANTITY | 0.96+
Ticketmaster | ORGANIZATION | 0.95+
President | PERSON | 0.95+
both | QUANTITY | 0.95+
six months later | DATE | 0.94+
12 months later | DATE | 0.93+
six seven year | QUANTITY | 0.92+
eight integration | QUANTITY | 0.91+
12 minutes | QUANTITY | 0.89+
ticketmaster | ORGANIZATION | 0.88+
lot of our data | QUANTITY | 0.88+
agile | TITLE | 0.87+
first | QUANTITY | 0.85+
one piece | QUANTITY | 0.85+
red shift | TITLE | 0.85+
two limitations | QUANTITY | 0.85+
one piece of advice | QUANTITY | 0.82+
Jasper | TITLE | 0.8+
Wikibon | ORGANIZATION | 0.78+
minutes 1 | QUANTITY | 0.77+
lot | QUANTITY | 0.75+
first issues | QUANTITY | 0.74+