Listen Carefully: Is the Stethoscope on Life Support? - Jason Bellet, Co-founder and CCO of Eko


The stethoscope has been around since 1816, and its basic design hasn't changed substantially in decades. But this tried-and-true tool has been launched into the world of machine learning and artificial intelligence in recent years, largely thanks to the work of Jason Bellet and his co-founders at Eko. Today, over 50,000 clinicians in thousands of health systems across the globe are using Eko's digital stethoscopes and electrocardiograms to diagnose and monitor heart problems, and there's more change to come, as he explains to host Shiv Gaglani in this episode of Raise the Line.




SHIV GAGLANI: Hi, I'm Shiv Gaglani. Today on Raise the Line, I'm happy to be joined by Jason Bellet, the co-founder and Chief Customer Officer of Eko, a digital health company that equips providers with artificial intelligence-powered digital stethoscopes and electrocardiograms to assist them in the detection and monitoring of patients with cardiovascular disease.

Over 50,000 clinicians across 4,000 health systems are using Eko’s tools to help millions of patients around the globe. Jason, thanks so much for being with us today. 

JASON BELLET: Absolutely. Thanks for having me. It's good to see you again. 

SHIV GAGLANI: One quick piece of information for the audience: Osmosis and Eko were actually incubated by the same tech investor, DreamIt Health. It's been a couple of years since we last connected, but it's amazing to see the traction you all have achieved in that time.

JASON BELLET: Same with you. We've got to cheer on our fellow digital health founders.

SHIV GAGLANI: Totally. I know about your background, but for our audience, which is primarily health professional students and health professionals, we'd love to hear a bit about how you even got into co-founding Eko.

JASON BELLET: I'm happy to tell you a little bit about Eko, how we got started, and where we are now. We started Eko as undergraduates at UC Berkeley. It's an interesting story. My co-founder Connor was in his bioengineering senior thesis class. They had a really broad thesis assignment: identify a gap in healthcare, conduct stakeholder interviews with providers and administrators, and then, by the end of the semester, propose a solution. It couldn't have been broader in scope, but he was really interested in cardiology and, having had a heart murmur himself, really interested in murmurs as well. So he spoke with a ton of cardiologists, and one of them looked down at their stethoscope and said, "This is the icon of medicine. It's worn around the necks of 30 million providers around the world. It's a rubber tube with a metal chest piece. We're so reliant on it on the front lines. Even in the age of echocardiography and the 12-lead EKG, we still use the stethoscope to pick up the first signs of heart failure, heart disease, and pulmonary disease. Yet it's very subjective, and though cardiologists are considered the experts, even they struggle with picking up the nuances, let alone providers who aren't experts in cardiovascular care." So the light bulb went off for Connor: can we build a stethoscope with embedded AI that actually allows providers to better understand what they're listening to, and then gives them decision support so they can pick up disease earlier? We started out with a pie-in-the-sky idea, "Let's bring machine learning to the stethoscope," and over the last six years that has become a reality with the platform we built. It's been an incredible journey.

SHIV GAGLANI: Truly. It's pretty incredible to see where you've come over the last six years. What is your background like? You mentioned Connor was a bioengineer. Did you drop out of your undergraduate program?

JASON BELLET: It's a good question. Connor had this idea in his senior year at Berkeley. At the time, I was in the undergraduate business program at Haas, and our third co-founder was a mechanical engineer. The unique part about being on a college campus is that you can quickly pull together people with different experiences and different areas of study from different departments to work on a project.

Connor tapped me to build the economic model, do a market assessment and competitive assessment to figure out how we were going to bring this thing to market. The other co-founder was building it from scratch, building the prototype with 3-D printing and some off-the-shelf components, also hacking away at an iOS application to be able to pull data off the stethoscope. 

We took it all and pitched it to an angel investor as part of a campus entrepreneurship program they were starting at schools across the country. It was an angel fund called Founder.org, whose thesis was investing in first-time student entrepreneurs. We were really fortunate to get a first round of seed funding to go past that initial prototype.

At the time, we were all asking, "What are we going to do after college? Should we start looking for jobs?" As soon as we got that first sign that there was going to be some investment behind this, we all gave up our job searches, and building this thing from the ground up became our only professional experience. So, my experience is Eko.

SHIV GAGLANI: That's amazing. Can you talk to us a bit more about the history? You started off with the stethoscope, and then over time you added more products, like the electrocardiogram. You got FDA approval, I believe, or you've been working with the FDA on the AI components. I'd love to hear what that ramp-up has been like and about the product offerings you have right now.

JASON BELLET: It really started with the end goal of building an AI-assisted cardiac screening tool. Obviously, the stethoscope is the first-line screening tool, and we wanted to start with that, but the big picture is, "How do we help providers pick up previously undiagnosed, even potentially asymptomatic, valve disease, AFib, and heart failure in the clinic? Can we get those patients into the cardiologist's office so they can get diagnosed and into the right care pipeline?"

To do that, we broke it into three different components. First, we had to build the devices to capture the data. I wish there had been a Bluetooth-enabled stethoscope with mobile compatibility that we could have just layered AI on top of, but there was no equivalent of the iPhone to build the app on, so we had to start by getting into the hardware space. Then we had to build the software for iOS and Android to pull the data off the stethoscope and make it engageable at the point of care.

The third is the AI. Each of those three components took on its own roadmap. For the first three years, while we were building the AI, the training data set, and then ultimately the validation data set, we had this amazing digital stethoscope that didn't have AI. So we tried to figure out how to bring it to market now and begin providing value, rather than twiddle our thumbs for five years while we built the algorithm. That's where the Core first came to market. Its value proposition was really to help providers hear more clearly, and then it was also used for telehealth.

As telehealth was exploding, the ability to capture high-quality heart and lung sounds and stream them via our platform during telemedicine visits really became our focus, all while we were building toward the big vision, which was the decision support capability. Along the way, we realized that to provide the level of decision support we wanted at the point of care, heart and lung sounds were great, but we could get so much more if we also had ECG. We could better understand the timing. We could assess arrhythmias.

Mayo Clinic had been developing a machine learning algorithm that could screen for low ejection fraction with a single-lead ECG, which was mind-blowing. If you could combine that single-lead ECG with a stethoscope and build it into a handheld tool that could be applied during standard auscultation, that was the dream in terms of being able to really screen for a panel of conditions.

The DUO ultimately became a high-priority product for us very quickly. In about January of this year, 2020, everything came together: the DUO was FDA cleared, the Core was FDA cleared, the AI was FDA cleared, and we had about 200 health systems using our telehealth solution. Now it's all about layering the AI on top of their current implementations and bringing it to the 50,000 providers who use our devices every day.

SHIV GAGLANI: That's incredible. Your year started off really well, and then COVID hit, which has obviously accelerated a lot of the things I know you've been talking about for some time: telehealth, the need for remote screening, the need for patient-enabled screening, and consumer health. Could you tell us a bit more about how COVID has affected your growth curve?

JASON BELLET: Absolutely. It all comes back to that big picture of how we help providers better detect disease at the point of care. The big thing COVID changed is what the point of care is. Whether the doctor is side by side with the patient in the internal medicine office doing the annual wellness exam, or a hundred miles away on a video conference, the need for that internal medicine doctor to screen the patient for potentially life-threatening cardiovascular conditions, even more so now with COVID-related cardiovascular complications, hasn't changed.

What we're really focused on is, especially in the age of telehealth, “How do we help providers move beyond video?” For you and me, to be able to see and talk to each other is one thing, but as a provider, I can only do so much. 

We're not at the point yet where we can throw away the physical exam and just do a video call. So we want to equip providers with tools that can be sent directly to the patient, whether that's their home, a skilled nursing facility, or a nursing home, wherever the new point of care is, and allow that provider to diagnose not only with greater confidence, because they can hear the heart and lung sounds, but, with the AI, with greater accuracy. We really feel like that's the next frontier.

We have been talking about telehealth for a long time. This really accelerated the industry's adoption of it and also the reimbursement of it, which was a key missing component that we now have. 

SHIV GAGLANI: Yes, with the CARES Act. I know there's emergency funding for telehealth. Let's say I'm a cardiologist or a primary care doctor seeing a group of patients at a skilled nursing facility. I'm not visiting them in person, but they have your Eko devices.

What is that experience like? What do I see? Am I on a Zoom call, and then I see the Eko output in a separate app? How does that work?

JASON BELLET: Great question. Within our application, the patient location, say the skilled nursing facility, can use an iPad on a cart or a Windows PC, whatever platform they want to install our software on. The provider, on the other end, can enter a video chat with the nurse or the MA along with the patient. In some cases, it's even the patient by themselves at home using this on themselves.

In addition to the video chat, the second they turn on the device, our application on the tablet automatically picks up the stream and begins livestreaming the heart and lung sounds to the provider on the other end. So you, Shiv, as the doctor, are seeing me. You also put on headphones so you can hear my heart sounds and my lung sounds and get a single-lead ECG. Using the AI, you can click a button, capture 15 seconds of either heart sound data or ECG data, and assess it right there for AFib or the presence of a heart murmur.

Beyond that, of course, you could also do the other components of the physical exam that don't necessarily need to be livestreamed. The key component of the exam that you just cannot replicate without a live capture is the stethoscope. That's what we deliver, and it's almost as important as the real-time video.

SHIV GAGLANI: Wow. That's fantastic. Do you have any electronic health record implementations yet, or is that coming? 

JASON BELLET: We do, yes. We work with a partner called Redox, that I’m sure—

SHIV GAGLANI:  -- another DreamIt company.

JASON BELLET: Yes, exactly. Another DreamIt company, who really helped us accelerate our ability to integrate with a variety of EHRs and do custom implementations at various health systems. Now that the AI is on the market, we need to not only enable that point-of-care screening but also embed it into the patient's record and then send it to the specialist with the referral: "We heard a murmur. We documented AFib." Among patients over the age of 65, about 2.3% have asymptomatic AFib, so if you screen everyone who comes in for an exam, you're inevitably going to pick up patients you would have missed. The ability to append that to any referral you make to a cardiologist is so helpful.

SHIV GAGLANI: Totally. Do you mind giving us a bit of a sense of your users? At this point, you have 50,000 clinicians. What's the breakdown? Are they nurse practitioners, cardiologists, other primary care docs? Also, as far as building that AI library, AI is only as good as the data set it's trained on. Roughly how big is your data set at this point?

JASON BELLET: Great question. Our user base is really as horizontal as the stethoscope itself. We've got specialists, cardiologists and pulmonologists, using this at the point of care to hear better; even though they're the experts at picking up abnormalities, they still love the additional decision support we can provide.

That extends all the way to medical assistants and to nursing and medical students who are learning auscultation for the first time and can go out on some of their first rounds and finally experience what their teachers have been talking about: "Oh, I hear a murmur. It's loud and clear, and the app is telling me that, yes, indeed, this is a murmur."

The 50,000 providers really span from physicians to nurses, nurse practitioners, and physician assistants. The majority of them are practicing primary care or internal medicine providers, because they serve as the front line for picking up valve disease and need that higher level of auscultation, but we benefit anyone who uses a stethoscope.

In terms of the data science side, you're right. There wasn't a data set of annotated pathological heart sounds that we could just use to develop the machine learning algorithm, so we had to go out and partner with organizations like Northwestern Medicine and UCSF to conduct 18-month-long studies in which we captured heart and lung sounds, validated the heart sounds against echocardiography, and used that data to train the algorithm. We then ultimately validated it in a study we did at Northwestern Medicine.

SHIV GAGLANI: Wow. That's incredible. I'm going to come back to the education piece, but one quick thing I wanted to ask about: we know that COVID has a lot of respiratory impacts, but people are saying it may also be a blood vessel disorder that affects the vasculature throughout the body. As far as the respiratory effects go, is there a specific lung sound database you're now building from patients with COVID? Can you talk a bit about that?

JASON BELLET: We are working with some institutions to build a data set of lung sounds from COVID patients. What we've heard from our providers is that there's not a specific sound that necessarily corresponds to COVID, but being able to use machine learning to characterize what a physician is hearing during auscultation would be helpful in differentiating between various pulmonary conditions.

What we're focusing on is how to build an algorithm that can help with the pulmonary exam by differentiating between wheezes and rales, essentially supporting what you're already doing, which is trying to suss out the different potential pathologies, rather than trying to use pulmonary auscultation to arrive at a disease diagnosis. But COVID-19, as a pulmonary disease, has definitely escalated our research into pulmonary sounds as a category.

SHIV GAGLANI: That makes a lot of sense. Speaking of medical education, one reason I was excited about Eko when I first learned about it early on, around the time we were starting Osmosis, is that we had also worked with TEDMED to create something called the Smartphone Physical.

The whole reason for doing it was that we said, “Look, at some point in the future, patients directly, or providers, will have access to this data,” and then like you guys are doing, “Let’s add an AI or Machine Learning layer on top of it to help interpret.” 

In the process of developing the Smartphone Physical, I got to know Eric Topol really well. He's a cardiologist at Scripps and has written a book on AI in medicine. He talks about a lot of things in AI, but one thing we wrote together was an article for Academic Medicine about how devices like Eko can be used to improve the quality of education.

Do you mind talking a bit about how the medical, nursing, and PA students in schools could be using Eko today to improve their education, and also what the role of the clinician will be if AI does all the diagnostics for them?

JASON BELLET: It's fascinating to think about the future of the physical exam, and that obviously impacts how educators teach it today. One conversation that comes up a lot is the future of the stethoscope. We have handheld ultrasounds. What is the need to train auscultation and differentiate between normal and pathologic murmurs when you have an ultrasound at the bedside?

We've been really focused on this discussion, and I think where we've ultimately landed is that we want to look at what we have today: about 30 million providers with a stethoscope around their neck, using it as the front-line screening tool around the world.

We're working toward a generation of providers who will have handheld ultrasounds. But given the cost of that technology, the training that's going to be required to use it, and even the time it takes to embed into the physical exam, I think the reality is that, at least for the next decade and a half (I'm not a fortune teller), the stethoscope and the ultrasound will still feed off each other, and the stethoscope will probably be used a lot more than handheld ultrasound in the everyday wellness exam. So we want to build tools that help providers use the stethoscope more effectively. Then, if something flags, an ultrasound is right there that can be used to get to the diagnosis in a more timely manner. That's the yin and yang between those two technologies that we're really excited about.

In terms of education, we are committed to helping students understand the importance of auscultation, because the reality is, when you look at the data, the number of patients who have moderate aortic stenosis or moderate mitral regurgitation but don't flag the need for an ultrasound, and who walk right in and out of that wellness exam, is staggering. So coupling the stethoscope and the echocardiogram, both in how we train this next generation of students and in how we practice today, is really important.

What the device will be like in 15, 20 years from now is anyone's guess, but I think right now we're really focused on how we bring these together.

SHIV GAGLANI: Yes, totally. I know you're constantly innovating, so I'm sure that in five or ten or fifteen years, Eko will eventually create a cheaper ultrasound as well.

JASON BELLET: Our big premise has always been the early detection of cardiovascular disease and then, ultimately, monitoring it. Given today's healthcare environment, we felt our stethoscope was the fastest way to improve detection in every physical exam, so that's where we started. With COVID, we've even shifted to the home: how do we bring these types of tools to patients?

As we see the cost of ultrasound technology come down, along with the ability to implement it, we will have to adapt, and the nature of cardiac screening will change. But right now, and for the foreseeable 10, 15, 20 years, we're really looking at how we empower this generation of students to learn the value of auscultation and the value of the physical exam, and the stethoscope is critical to that.

SHIV GAGLANI: Totally. Speaking of students, my last question: given that our audience includes a lot of medical, nursing, PA, and other allied health professional students, do you have any advice for them about meeting the challenges of COVID, or anything related to Eko they should know?

JASON BELLET: Well, first of all, I want to express my gratitude to them. Signing up to enter medicine right now is not an easy thing to do. I have friends who are medical students just graduating, and the six months or year we have ahead of us in the thick of COVID is going to be their first year in medical practice, and it's not easy.

As a company whose community is primarily clinicians, Eko is really focused on how we innovate to support our providers. One of the things we have done, based on feedback from our providers, is rethink wireless auscultation. When you think of the stethoscope, it's 25 or 26 inches of rubber tubing separating you from the patient. You have to get very close, and if you're wearing PPE, it can often be difficult to maneuver the stethoscope and listen at the same time.

What we discovered is that the Bluetooth in our devices actually enables providers to livestream the audio to Bluetooth AirPods or headphones they're wearing. That allows them to remove the rubber tubing completely, put the headphones under their PPE, and either have the patient place the stethoscope on their own chest while the provider guides the placement, or fully encapsulate the stethoscope in a glove or a covering and perform the exam at a greater distance from the patient.

In terms of advice: providers, institutions, and healthcare technology companies need to be focused on provider and practitioner safety. We went through the PPE crisis, and we're still in it, but now we have to think about how we deliver care with patient and provider safety as one of the foremost concerns, if not the foremost concern.

I think medical students need to think about their own safety, the tools they will invest in, and the institutions they will go work for, and really prioritize that. I can't even imagine the experience of entering medicine in this era, but Eko is certainly here to find creative ways to help providers deliver the same level and caliber of care they provided before COVID, but in a much safer way, given the infection control concerns.

SHIV GAGLANI: That's super innovative, not only to protect that patient and that provider, but also the next patient the provider sees, since you're not carrying viral particles or any other particles from patient to patient.

With that, Jason, I really want to thank you for the work you do at Eko and for taking the time to be with us today.

JASON BELLET: Thanks, Shiv. It was awesome talking to you, and I look forward to staying in touch. 

SHIV GAGLANI: Awesome. With that, I'm Shiv Gaglani. Thank you to our audience for checking out today's show, and remember to do your part to flatten the curve and raise the line. We're all in this together.