Hello and welcome to the A16Z Journal Club, our podcast where we cover recent scientific advances in bio, why they matter, and how to take them from proof of principle to practice. I'm Lauren Richardson.
Today's episode covers the topic of digital therapeutics, given that earlier this month the FDA approved its first ever prescription video game.
And since this is a journal club, we go into one of the key clinical trials underpinning this historic decision, published recently in The Lancet Digital Health by Scott Kollins and colleagues, which evaluated this game's ability to improve attention in kids with attention deficit hyperactivity disorder, or ADHD.
Our sister podcast, 16 Minutes on the News, also covered this topic, debating the bigger picture questions of what is and isn't a digital therapeutic and where pricing and regulation comes in.
You can find that episode at a16z.com forward slash 16 minutes.
A16Z bio deal team partner, MD, and entrepreneur Justin Larkin joins me in this discussion, where we cover the pros and cons of this video game compared to traditional pharmacological therapies.
The game specifically targets the attention impairments in 8 to 12 year olds by having them help an avatar navigate a digital environment in the face of other distractions.
We also discuss how the randomized controlled clinical trial, or RCT, was designed, the limitations of the study, and the open questions to be addressed. We begin with what question this paper answers.
Can a video game be a digital therapeutic? And I think this paper resoundingly answers that question: yes, a video game can target a specific mechanism of action and can have enough efficacy to cross that threshold of being considered a therapeutic.
But then, I think more importantly: can this specific video game have an impact on a child with ADHD to the point where it could be considered a tool in the toolkit of a physician treating this patient? And I would argue the resounding answer is that this is certainly something that could be considered, both because there's an obvious benefit and because the risks associated with it proved to be pretty minimal.
There's plenty more research to be done, but I think this paper does a great job at answering that and the subsequent FDA decision just went on to reinforce that even further.
My favorite part of the paper was getting to watch the supplemental file showing this kids' video game, seeing what the kid sees, and being able to connect that to the goals of the game.
If you were to watch that video outside of the context of knowing that this was coming from a digital therapeutic company or a clinical trial, you probably wouldn't know that it was a digital therapeutic.
You'd probably think it was any other video game that our kids are playing these days. I think the opportunity for digital therapeutics is that it creates an environment where a therapeutic can actually be enjoyable, can actually be entertaining.
And the really cool thing about this program is that it learns: it's able to gauge the kid's ability and then tailor the game and how hard it is based on how quickly they pick it up.
And it's always responding to that level of engagement as the kid learns to adopt those skills.
Yeah. If you think about a traditional therapeutic, like a pill, there's no opportunity for real-time iteration of the therapeutic. There's no ability to say, hey, this person is responding great, so we're going to cut this thing off, or, hey, this person isn't responding, so let's up-titrate the dose. Whereas in the game, there's this ability in real time to iterate on how the person is experiencing that therapy.
Yeah, I've known kids with ADHD, and some days they have good days and some days are more difficult. Being able to have that adjustment happen in real time seems really appropriate for this type of condition.
I think it's a really great match between kind of underlying mechanism, underlying patient characteristics and the method of delivery of the therapeutic.
So there is proof-of-concept evidence from previous studies that this therapeutic video game can help kids with ADHD. But this particular study really tests that hypothesis with a sophisticated and careful clinical trial design.
So let's dig in now to some of the important details of the study.
This built upon the previous studies by being a truly randomized controlled trial. So there's a control group, and then the interventional group getting the actual intervention being tested.
But also, importantly, all of the researchers involved were blinded to the intervention, and the families and patients themselves were blinded to whether they were in the intervention group or not.
So the control group also got a video game, a word search game. It also learned, and it was also rewarding.
It also had a lot of the same kind of game design elements, but that video game doesn't target those same pathways that are issues in ADHD.
And importantly, this study also used a method called intention to treat, meaning that all of the people initially randomized into the two groups were included in the statistical analysis, regardless of whether they completed the trial or not.
So really meeting this top level criteria, what it means to be a rigorous RCT for a therapy.
Yeah, this study was pre-registered. And the idea behind pre-registration is that you, as a researcher, lay out exactly what you're going to do before you ever do the study: how you're going to collect the data, how you're going to analyze that data, and, importantly, what you will consider a significant outcome before you look at any of the data.
And that really helps prevent that kind of post hoc reasoning: we looked at the data, we see this is significant, therefore this is important. You're saying from the start, this is what's important, and we'll only know our study is successful if it meets this particular endpoint.
Yeah, I think the study highlights the importance of having that foresight in the design. But another thing that was interesting in the study was the choice of outcome metrics.
Traditionally in ADHD research, a lot of the outcomes have been a number of different metrics, batteries of surveys, and subjective measures to assess symptoms of ADHD. Whereas this particular study used a measure called the TOVA API, that's T-O-V-A API, the Test of Variables of Attention's Attention Performance Index, which is really focused on objectively assessing attention.
They picked an outcome that really gets at some of the core impairment that comes along with ADHD, and that in a lot of circumstances is equated with some of the challenges kids might have in the classroom or in other settings.
Yeah, it's an FDA-approved, validated, computerized performance test that measures attention and inhibitory control. And using that as the readout is, as you say, a much more objective way to measure attention and control, as opposed to even having doctors evaluate kids.
One could argue, you know, are they just picking the outcome measure that is most likely to correlate with the video game?
And, you know, by definition, if you have a video game focused on attention, did they just pick the one validated, FDA-approved metric that is most likely to show that effect?
But, you know, what struck me as I read this is that a lot of times what impacts the quality of life of these kids with ADHD are the impairments that they see in the school or as they get older in the workplace.
And there has been a lot of discussion and validation around this API metric being, in a lot of ways, analogous to those impairments that cause issues in those settings. So I think it's actually a pretty insightful selection.
So the study is deemed successful or unsuccessful based on whether it achieves this change in the TOVA API score. But they also measured all of these subjective metrics as well.
One of the interesting things in this study is that on the primary outcome, they saw a statistically significant difference between the two groups, but a lot of the secondary outcomes actually didn't show major differences.
And that gets to the root question we mentioned earlier of what those outcome measures are actually measuring. The primary outcome measure is really looking at attention, which is the core target of the game, whereas a lot of the secondary measures were looking at broader definitions of impairment, impairments beyond attention issues, or symptoms more broadly, which oftentimes the clinician or the parents or the patients themselves are assessing.
When they compared on the subjective measures, the test game did about the same as the control game, and they both saw improvements, which was interesting.
So what do you think is the underlying cause of seeing improvements in both the control group and the test group?
I think it's an important nuance of the study, because for typical therapies, you would expect the RCT to be structured so that the control group gets the gold standard of care, or gets an alternative therapy. But in this case, the control group got a game, not a therapy that kids with ADHD are typically receiving.
And I think that was important in the first place, because in order to really have this be blinded to the families and to the researchers, you had to have a video game.
You couldn't just give them a pill and say, hey, here's the control group with the pill, here's the test intervention group with the video game.
The one downside of that is that there could actually be some effect from the video game in the control group.
The authors speak to this a little bit in the discussion section. They mention that the activity of the control video game likely requires additional attention and some of the same cognitive skills and functions that, although not intentionally targeted, mean that perhaps some of the lack of a more dramatic difference between the intervention and the control arm could be due to the fact that the control itself has some net benefit as well.
Yeah, the other important aspect they mentioned, in terms of why the control and the test did equally well, is that when patients and parents get a treatment, they have a perceived benefit. It's almost like a placebo effect, where you expect to benefit from it. So the parents and patients in both the control game group and the test game group had that same level of expectation that this would help.
I think it just begs the question of, can a video game that demonstrates impact on attention have the long-term impact that we hope for, or is its failure to demonstrate impact across the other secondary measures going to somehow be an impediment to the ultimate impact of this game?
That's an excellent point, because ADHD is a really complex disease with a lot of different elements that all contribute to this ADHD phenotype, if you will.
They've chosen one particular aspect of this to target with their game, but it's still unclear whether just targeting that one particular aspect is enough to make a difference in the lives of these kids, and especially we don't know how long the duration of this impact is.
While I'm very enthusiastic about the outcome of this paper, I do want to be clear that it's not a silver bullet, that not everyone was a responder to this therapy.
That being said, a statistically significantly higher proportion of people responded compared to the control. But the study was four weeks long, and there wasn't a long period of follow-up after that. And so the question is, can patients be expected to continue to engage with this game for the five sessions a day, five days a week, over time? And if not, is there a lasting effect from earlier engagement with the game?
And that brings up another really interesting point, which is one of the cool things about digital therapeutics is that they inherently have real world evidence built into them.
And this gets back to our earlier point about iteration on dose and method of delivery. With pills, there's no way to measure effect or use going forward; you have to layer on additional tools and things.
But software, by its very nature, is collecting data, presenting data, and looking at outcomes from the data it presents to patients. And so it creates a really interesting way for Akili or others to pretty quickly look at how patients are engaging over time, and in the future potentially make adjustments to this game or others based on that, obviously going through the proper regulatory processes along the way.
But it's, I think, a really cool opportunity for real-world evidence to be the default, as opposed to the kind of post-market study program you have to layer on top of a pill, which doesn't necessarily cater to that.
And arguably, you could even layer in a lot of the outcome assessments in a completely seamless process. So imagine at the end of every week, instead of a traditional Endeavor module popping up, the TOVA API module pops up. It bakes in a process that, again, caters to real-world evidence by default, as opposed to it having to be a really clunky add-on.
In the paper, they cite two studies from the USA and UK that found that most children with pediatric mental health needs do not have proper access to services.
And so a digital therapy like this, especially if it's able to integrate with the TOVA API and send that kind of information to the doctor, could really increase the accessibility of treatment and therapies for people who just don't have proper access.
And it begs the question: if we really want to realize that potential of scaling access, especially for those who otherwise might not have access to quality psychiatric care, and pediatric psychiatric care in particular, how can we build business models to enable that?
How can we build business models that allow people who are uninsured or underinsured, or who don't have a psychiatrist within driving distance, to access this in a way that the standard therapeutic go-to-market just isn't necessarily going to realize? And so again, it inspires, I think, an opportunity for a lot of creativity, but also some unique challenges that we're going to have to confront.
Yeah. And I think that links to the safety of these. So this video game had a really low risk, a low rate of adverse events: 7% of kids experienced an adverse event, and those were mainly headache or frustration.
And compare that to pharmaceuticals, the drugs for ADHD, which have 40% to 60% of kids experience adverse events.
So if they have such a low risk, you can really make them super easily available, something that can be really broadly used and distributed that may provide some real benefit to patients.
I chuckled when you said, you know, the headache and frustration, because I just envisioned my 11-year-old self playing a video game and throwing the controller at the wall or something like that, because I just lost the level.
And so I can imagine that some kids got frustrated when they didn't achieve their objectives in the game. But you're absolutely right.
Specifically for some of these mental health focused, neuropsychiatric focused interventions, I think we're more likely than not going to see risk profiles that are really low.
And I think that's amazing, because to your point, it enabled a number of them to be launched even before formal FDA approval in this pandemic era. But also, in the future, I think it suggests there could be ways for these to be distributed more broadly.
And also, I think it just allows providers to feel more comfortable prescribing these, knowing that there's not as high of a risk, whether it's side effects or diversion of the medications or something like that.
So we have talked about the background and we've talked about the study design and that they achieved this statistically significant outcome.
Let's talk about the limitations of this study and what next pieces of evidence are needed to really solidify that this is an effective therapy.
A couple of big questions come to mind for me. Is this representative? That is, is this a study that you could extrapolate to the general population, in this case the population of kids with ADHD, specifically ages 8 to 12?
But also, the bigger secondary question: are there any major statistical or methodological issues that would be red flags and lower your confidence in the results?
I think this paper sets a great example of what a model RCT can look like for a digital therapeutic. But as we alluded to earlier, I don't think it is necessarily that silver bullet.
So if you look at who they allowed into the study, the patients had to have an objective baseline deficit in attentional function, meaning that they had to have a specific score on the TOVA assessment to be included, which isn't necessarily representative of a lot of ADHD patients.
And so the implication of that is that either we need to do additional research to prove that this is more broadly generalizable, or providers are going to have a more limited subset of their patient population that they can, with confidence and conviction, prescribe this to and expect an impact from. The other key piece, I would say, is the exclusion of medications from the trial.
So patients either had to not be on medications or they had to be several days removed from their medications when they came onto the trial.
The real promise of this digital therapeutic is likely in conjunction with other therapeutic modalities, whether those are behavioral approaches or medication approaches. And so without that, it's hard to understand the impact of this in combination with another therapeutic. And then I think the last major one is the duration of the study. It was one month, and there was no follow-up period after they had wrapped up using the game.
So there was no answer to whether, when engagement with the game goes down, the impact is persistent. It would be something super interesting to see how people engage with this video game over time.
And then when they don't engage, do we continue to see durable impact from the game?
So what is the key takeaway message from this research?
I think the key take-home from this article is that the definition of what can be a therapeutic is arguably much broader than what we would imagine. Yet the standard for what should qualify as a therapeutic is quite high.
And this paper, I think, does an amazing job of illustrating the type of rigor that should go into supporting that.
Thank you, Justin, for joining me on A16Z Journal Club this week.
Yeah, it was great joining. Thank you for having me.