Hui Xue, National Heart, Lung, and Blood Institute | DockerCon Live 2020

>> Narrator: From around the globe, it's theCUBE, with digital coverage of DockerCon Live 2020. Brought to you by Docker and its ecosystem partners.

>> Hi, I'm Stu Miniman, and welcome to theCUBE's coverage of DockerCon Live 2020. Really excited to be part of this online event. We've been involved with DockerCon for a long time, and of course one of my favorite things is always being able to talk to the practitioners. We remember how, over the years, Docker exploded onto the marketplace, with millions of people downloading and using it. So joining me is Hui Xue, who is Principal Deputy Director of Medical Signal Processing at the National Heart, Lung, and Blood Institute, which is part of the National Institutes of Health. Hui, thank you so much for joining us.

>> Thank you for inviting me.

>> So let's start. Of course, the name of your institute is very specific; I think anyone in the United States knows the NIH. Tell us a little bit about your role there and the scope of what your team covers.

>> I'm basically a researcher and developer of medical imaging technology. We are the heart, lung, and blood institute, so we work on and focus on imaging the heart. What we do is develop new and novel imaging technology and deploy it to our frontline clinical sites, and Docker plays an essential role in that process. So that's what we do at NHLBI.

>> Okay, excellent. So research in the medical field gets a lot of attention, of course, with the global pandemic. You keyed it up there: let's understand where containerization, and Docker specifically, play into the work your team is doing.

>> Maybe an example will suffice. We're working on magnetic resonance imaging, MRI; many of us may already have been scanned. We're using MRI to image the heart. What Docker does is allow us to deploy our imaging technology to clinical hospitals. We have a global deployment of around 40 hospitals, a bit more, around the world. If we develop, for example, a new AI-based image analysis for heart images, we put our model and software into a Docker image so that our collaborating sites can pull the software containing the latest technology and use it for patients, of course under a research agreement with NIH. Because Docker is so efficient and globally available, we can implement a continuous integration and testing framework and update everything based on Docker, so our collaborators always have the latest technology. In traditional medical imaging, the iteration of technology is generally pretty slow, but with the latest technology, such as Docker containers, coming into the field, which is relatively new, this paradigm has been changing over the past two to three years. It's certainly very exciting for us. It gives us a flexibility we never had before to reach our customers, to reach other people in the world and help them. They also help us, so it's been a very good experience.

>> Yeah, that's pretty powerful, what you're talking about there, rather than, you know, we install some equipment and who knows how often things get updated, or how you make sure to synchronize between different locations. Obviously the medical field is highly regulated, and you're a government agency. Talk a little bit about how you make sure you have the right version control, that security is in place. How do all of those things sort out?
>> Yes, that's an essential question. First, I want to clarify one thing: it's not NIH who endorses Docker, it's us as researchers. We use Docker and we trust its performance. This container technology is efficient, it's globally available, and it's very secure. All the communication between the container and the imaging equipment is encrypted. We also have all the paperwork in place that allows us to provide technology to our clinicians. Whenever we post the latest software, every version we put into Docker goes through an automated integration test system. So every time we make a change, the new version of the software runs through a rigorous test: something like 200 gigabytes of data runs through it, and we check that everything is still working. The basic principle is that we don't allow any version of the software to be delivered to a customer without passing the tests in Docker. This container technology lets us automate 100% of this process, which gives us a lot of freedom, so we have a rather small team here at NIH. Many people are actually very impressed by how many customers we support with such a small team. The key reason is that we rely heavily on container technology; its automation is unparalleled, certainly much better than anything I had before. That's the key to maintaining quality and continuous service to our customers.

>> Yeah, absolutely. Automation is something we've been talking about in the industry for a long time, but if we implement it properly it can have a huge impact. Can you bring us inside a little bit? What tools are you using, how is that automation set up and managed, and how does that fit into the Docker environment?

>> Let me be more specific. We are using a continuous testing framework. There are several options, and we chose a specific one to build on, an open-source Python tool, rather small actually. This tool sets up a service, and the service watches, for example, our GitHub repo. Whenever I or someone on the team makes a change, for example fixes a bug, adds a new feature, or updates a new AI model, we push to GitHub, the continuous build system notices, and it triggers the integration tests, all running inside the Docker environment. This is the key: what container technology offers is a 100% reproducible runtime environment for our customers. As the software provider, in our particular use case, we don't set customers up with uniform hardware; they buy their own servers around the world, so everyone may have slightly different hardware. We don't want that variability to leak into the software experience. Docker gives us 100% control of the runtime environment, which is essential if we want to deliver a consistent medical imaging experience, because most of these applications are rather computationally intensive; we don't want something to run for one minute at one site and maybe three minutes at another site. So Docker runs all the integration tests; if everything passes, the system packs the Docker image and pushes it to Docker Hub. Then all our collaborators around the world have the new image, we coordinate with them to find a proper time to update, and they have the newer technology in time. That's why Docker is such a useful tool for us.
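To make the pipeline described above concrete, here is a minimal sketch of such a test-then-publish gate in Python. It is an illustration only: the image name, test data path, and test entry point are all assumptions, and the interview does not name the actual open-source tool or repositories involved.

```python
import subprocess
import sys

# Hypothetical names: the actual repository, image, and test suite
# used at NHLBI are not given in the interview.
IMAGE = "nhlbi-example/recon-ai"
TAG = "candidate"

def run(cmd):
    """Run a command and raise if it exits non-zero."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

def main():
    # 1. Build the candidate image from the current checkout.
    run(["docker", "build", "-t", f"{IMAGE}:{TAG}", "."])

    # 2. Run the integration tests inside the container itself, so the
    #    tests see exactly the runtime environment customers will get.
    #    The mounted test-data volume (~200 GB in the interview) and the
    #    pytest entry point are assumptions.
    try:
        run(["docker", "run", "--rm",
             "-v", "/data/integration:/data:ro",
             f"{IMAGE}:{TAG}",
             "python", "-m", "pytest", "/opt/tests"])
    except subprocess.CalledProcessError:
        print("integration tests failed; image will not be published")
        sys.exit(1)

    # 3. Only a fully tested image is tagged and pushed to Docker Hub.
    run(["docker", "tag", f"{IMAGE}:{TAG}", f"{IMAGE}:latest"])
    run(["docker", "push", f"{IMAGE}:latest"])

if __name__ == "__main__":
    main()
```

The point of the gate is the ordering: the push in step 3 is unreachable unless the in-container test run in step 2 succeeds, which matches the stated policy that no version reaches a customer without passing the tests.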
>> Yeah, absolutely. Okay, containerization and Docker really transformed the way a lot of those computational solutions happen. I'm wondering if you can explain a little bit more of the stack you're using, for people who might not have looked at these solutions for a couple of years and think, oh, it's containers, it's stateless architectures, I'm not sure how it fits into my network environment. Can you tell us what you're doing for storage and the network?

>> We actually have rather vertical integration in this medical imaging application: we build our own server software, and its backbone is C++ for higher computational efficiency. There's a lot of Python, because these days AI models are essential. Docker provides, as I mentioned, a uniform runtime environment: we have a fixed GCC version and, if we go into that detail, specific versions of the numerical libraries and certain versions of Python, and we use PyTorch a lot; that's our AI backbone. Another way we use Docker is to deploy the same container into the Microsoft Azure cloud. That's another capability I found in Docker: we never need to change anything in our software development process, and the same container works everywhere, in the cloud and on site for our customers. This reduces development cost and also improves our efficiency a lot. Another important aspect is that it improves customer acceptance a lot, because we can go to one customer and tell them the software they are running is running identically on 30 other sites, down to the image hashes, so it's bit-for-bit consistent. This helps us convince many people; every time I describe this process, most people accept the idea. They also appreciate the way we deliver software to them, because we can always fall back. So yes, here is another aspect: we have many Docker images in Docker Hub, so if one deployment fails, a site can easily fall back. That's very important when a medical imaging application fails, because hospitals need to maintain a continuous level of service. We want to avoid this completely, but occasionally, very occasionally, some function won't work, or some new case was never covered by the tests before; then the site simply falls back. That's our policy, and the container technology makes it easy.

>> Yeah, absolutely. You brought up that many have said the container is that atomic unit, that building block, with portability around any platform environment. What about container orchestration? How are you managing these environments you talked about, in the public cloud or in different environments? What are you doing for container orchestration?

>> Actually, our setup might be the simplest case. We basically have private Docker repos, which the Institute has paid for. We have something like 50 to 100 private repos, and for every repo we have one specific Docker setup with different software versions: for example, some images are for PyTorch, others for TensorFlow, depending on the application. Maybe some customers require a rather small Docker image size, so they get a trimmed-down version of the image. In this process, because it's still a small number, like 20 or 30 active repos, we manage it semi-automatically: we have services running to push, pull, and load back images, but we trigger this process here at the Institute whenever we have something new to offer customers. Managing these Docker images on the customer side is another aspect specific to medical imaging. We had a lot of discussion with the sites about whether to set up fully automated continuous updates, but in the end they decided they would rather keep people involved. So we settled on this: we notify the customer that there is something new to update, and they decide when to update and how to test. Even with a very high level of automation from the container technology, we found it's not 100%; at some sites it's still better to have human supervision, because if the goal is to maintain 100% continuous service, in the end you need experts in the field to test and verify. That's the current stage of deployment of these Docker images. We found it rather lightweight, so even the few people on our team at NIH can manage a rather large global network, which is really exciting for us.
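The update-with-fallback policy described above can be sketched as a small site-side script: a person at the hospital decides to update, the script pulls the candidate image, runs a quick self-test, and falls back to the previous image if anything fails. All names here are hypothetical; in particular the `--self-test` entry point and the `stable` alias are assumptions for illustration.

```python
import subprocess

# Hypothetical names; the real images, tags, and checks are not public.
IMAGE = "nhlbi-example/recon-ai"
CURRENT = "v1.2"    # tag the site is running today
CANDIDATE = "v1.3"  # tag the site was notified about

def docker(*args):
    subprocess.run(["docker", *args], check=True)

def healthy(tag):
    """Run the image's self-test once; non-zero exit means unhealthy.
    The --self-test entry point is an assumption for illustration."""
    r = subprocess.run(["docker", "run", "--rm", f"{IMAGE}:{tag}", "--self-test"])
    return r.returncode == 0

def update():
    # A person at the site decides when to run this; it is not automatic.
    docker("pull", f"{IMAGE}:{CANDIDATE}")
    if healthy(CANDIDATE):
        # Point the stable alias, which the scanner integration runs,
        # at the new version.
        docker("tag", f"{IMAGE}:{CANDIDATE}", f"{IMAGE}:stable")
        print(f"updated to {CANDIDATE}")
    else:
        # Fall back: the previous image is still on disk and on Docker Hub.
        docker("tag", f"{IMAGE}:{CURRENT}", f"{IMAGE}:stable")
        print(f"{CANDIDATE} failed its self-test; staying on {CURRENT}")

if __name__ == "__main__":
    update()
```

Because every published version remains available on Docker Hub, rolling back is just re-pointing a tag, which is what makes the human-supervised update model cheap to operate.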
>> Excellent. Great. I guess final question: give us a little bit of a roadmap. You've already talked about leveraging AI in there, the various pieces. What are you looking for from Docker and the ecosystem, and for your solution, for the rest of the year?

>> I would say the future is definitely in the cloud. One major direction we are pushing is for clinical hospitals to link to the cloud and use it as a routine. Currently some sites and hospitals may be very conservative; they are afraid of the security, the connection, all kinds of issues related to the cloud. But this scenario is changing rapidly, and container technology in particular contributes a lot in the cloud; it makes the whole thing so easy, so reliable. So our next push is to move a lot of the applications to be cloud-only. The model will be, for example, that new AI applications may be available only in the cloud. If a customer wants to use them, they will have to be willing to connect to the cloud, send data there, and receive results, for example from the AI apps in our Docker images running in the cloud. What we need to do is make the Docker builds even more efficient and make the computation 100% stable, so we can utilize the huge computational power in the cloud. Also the price; the key here is the price. We currently maintain two data centers, one in Europe and another in the United States. If we have one data center and 50 hospitals using it every day, then when we run the numbers, the average cost comes to a few dollars per patient. Against the costs of the health care system as a whole, the cost of using cloud computing can be truly trivial, but what we can offer to patients and doctors has never been possible before: the computation the cloud can bring is something they have never seen or experienced. So I believe that's the future. In the old model, everyone has their own computational server and maintains it, which costs a lot of work; even if the software side is made much easier for the doctors, someone still needs to set up the hardware. Using the cloud will change all of that. So I think the future is definitely to fully utilize the cloud together with the container technology.
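The cost claim above can be made concrete with a back-of-the-envelope calculation. The interview states only the two data centers, roughly 50 hospitals, and the resulting few dollars per patient; every other figure below is an assumption for illustration.

```python
# Back-of-the-envelope cost-per-patient estimate.
# Only `hospitals` comes from the interview; the rest are assumptions.
monthly_datacenter_cost = 30_000   # USD per month, assumed
hospitals = 50                     # from the interview
scans_per_hospital_per_day = 10    # assumed
scanning_days_per_month = 22       # assumed

patients_per_month = hospitals * scans_per_hospital_per_day * scanning_days_per_month
cost_per_patient = monthly_datacenter_cost / patients_per_month
print(f"{patients_per_month} patients/month -> ${cost_per_patient:.2f} per patient")
# With these assumptions: 11000 patients/month -> $2.73 per patient,
# i.e. 'a few dollars per patient', consistent with the interview.
```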
>> Excellent. Well, we thank you so much. I know everyone appreciates the work your team is doing, and absolutely, if things can be done to allow scalability and lower the cost per patient, that would be a huge benefit. Thank you so much for joining us.

>> Thank you.

>> All right, stay tuned for lots more coverage from theCUBE at DockerCon Live 2020. I'm Stu Miniman, and thank you for watching theCUBE. (gentle music)

Published: May 29, 2020
