ChatGPT vs the FDA

A lot of us have had engaging, challenging, enlightening conversations online. We share our opinions and respond to comments, leaving a public accounting of the exchange for others to think about — or even chime in on. But sometimes there is more to say.

Shaherah Yancy, founder and CEO of Research Lifecycle Solutions, speaking at the Capital Readiness Program

That was certainly the case for Shaherah Yancy, founder and CEO of Research Lifecycle Solutions, and Heath Naquin, vice president of Government and Capital Engagement at the Science Center, when they bantered back and forth on LinkedIn about the role of ChatGPT and other AI tools in the regulatory process for med-tech companies. So they decided to record a podcast to share what they’re seeing, sound the alarm on the overreliance on these products, and even reveal how they use ChatGPT in their own work.

Here is an edited excerpt from that discussion, but don’t miss the full episode.

Listen to the full episode


Lee Stabert [host]: This discussion really came from the two of you wanting to share some insights on regulatory strategy and how you think some startups these days are missing the mark. So was there a specific moment or incident that inspired you to start talking about LLM usage, AI usage, and companies moving through regulatory strategy?

Heath Naquin: Was there a particular moment? I think there were moments. At the Science Center, we work with life science companies and have for our 60-year history, but especially in the last two to three years, we’re seeing a lot of what we'll call “AI-informed decisions” versus “lived-experience decisions.” And that led to some concerns, honestly, for the companies that we're working with on whether they're on the right path or not.

We work with Shaherah and Research Lifecycle Solutions quite regularly. We often recommend that companies work with her group for that actual lived-experience strategy in regulatory, which is so important for them in their journey.

I don't know what we can do to stop it, but we can talk about it — and explain that the use of shortcuts in regulatory just doesn't work.

Shaherah Yancy, founder and CEO of Research Lifecycle Solutions

Shaherah Yancy: For me, I can compartmentalize this into three different categories. First, I noticed that the failure rate is so high among startup companies. Specifically, in medtech/biotech, the failure rate right now is over 85 percent. I wanted to figure out why these companies are failing at such a high rate. These innovations are great — I mean, some are lifesaving innovations, right?

So why are they not making it to market? I went on this year-long spree to figure out what's going on, talking to investors and founders who made it in that 15 percent and those who didn't make it in that 85 percent.

So that's the first piece. The second piece is working with small companies — that's what we do at Research Lifecycle Solutions — to figure out how to get them over that commercialization chasm.

And the third piece, to Heath's point, we had this epiphany: People are running to AI. Now AI is amazing. ChatGPT is amazing. But people are using AI as their strategy and that's a huge mistake.

Host: So part of this conversation is inspired by a post Shaherah made on LinkedIn. I want to read a quote from that: “Trying to launch your product using only ChatGPT is like trying to build a plane from YouTube videos.”

Can you talk about that? Tell me more about what you're seeing when it comes to people using ChatGPT and other LLMs for this really complicated process of getting a device approved and going to market.

Heath: Going back like 10 years ago, before ChatGPT, when somebody came to you with a regulatory strategy, they would at least have a regulatory consultant. It's not that you need to have a consultant, it's just that for a startup it was more cost-efficient to hire a really good person to help with regulatory strategy and the initial FDA submission.

The last two to three years, it's like, let's go to YouTube or ChatGPT, and it spits out something. And the weird part is, from what I've seen, it sounds generally correct. People often go running off in that direction without any underlying nuance. And then you start digging under the hood as an investor or in a program, and it goes sideways.

Heath Naquin, Vice President of Government and Capital Engagement at the Science Center, speaking with a group at the Capital Readiness Program

Shaherah: It's extremely true. In my recent experience, I've even had founders tell me, “So Shaherah, what can you tell me that ChatGPT can't tell me?” And that kind of blew me away. And I was like, “Hey, just throw away my past 20 years of biotech and regulatory [experience] because ChatGPT has it figured out.”

You can look at it as “Dr. Google,” right? We get a symptom and we say, “Hey Google, my head hurts. What does that mean?” And Google tells you that you’re dying tomorrow. Go see the doctor! Maybe you're just stressed. But the point is that founders, CEOs, med tech companies, they're doing that.

They will hear something through Google, ChatGPT, or even read an FDA script and think they got the answer — and we wonder why it didn't work. You need strategy, you need background, you need context, you need interpretation. There are several things you need to do to come up with a strategy.


Connect on LinkedIn

And find them at HLTH USA in the Science Center Pavilion, October 19-22.
--
Connect with Heath Naquin
Connect with Shaherah Yancy

Let us know what you thought of this podcast! Reach out