Ethics for Using Health Technology (Next Generation Behavioral Podcast)

[Dr. Armstrong] Hello, and welcome to “Next
Generation Behavioral Health”. [Dr. Kinn] 10-minute tips for modernizing
patient care. [music]
[Dr. Armstrong] I’m Dr. Christina Armstrong. [Dr. Kinn] And I’m Dr. Julie Kinn. Christy and I are psychologists in the Defense
Health Agency. We’re part of a team that makes health technology
for service members, veterans, their families, and their care teams. [Dr. Armstrong] That’s right. And we train clinicians on how to use those
technologies in clinical care. [Dr. Kinn] In this podcast, we answer the
common questions we hear, and we do it in 10 minutes, or you get your money back. [Dr. Armstrong] Today we’re going to be talking
about ethical issues in the use of mobile apps in clinical care. As clinicians, it’s critical for us to be
thinking about potential ethical dilemmas that may come up. [Dr. Kinn] Let me foreshadow the end here
though. Bottom line is that all the ethical principles
that apply to using health technology are the same ones you’re already familiar
with. There’s really not anything new that you need
to learn. It’s more about applying the existing principles
to using mobile apps instead of using things on paper. [Dr. Armstrong] That’s right. All of the ethical standards that we have
to follow as clinicians still apply even if it’s a new technology. [Dr. Kinn] Right. Just like when we started using telephones
in therapy. And when voicemail and recordings became
common. Those fit into our ethical standards, just
like using recordings on the Prolonged Exposure Coach app. [Dr. Armstrong] For example, when we’re doing
therapy via holograms one day [laughter], a lot of people are going to have questions. A lot of people are going to imagine a lot
of ethical dilemmas that might arise. But we do always have to go back to those
ethical guidelines that are a part of our professional practice. [Dr. Kinn] Yes. So we’ll start getting into these today with
some examples. But we’re very much hoping to hear from you
all. Give us your questions. Send us some stumpers. [Dr. Armstrong] Yes. So in the meantime, let’s think of some of the ethical dilemmas
we have heard from some providers. Let’s see, I’m going to give you one, Julie,
and I’d like to see what you would do in this situation. OK. What would you do if a provider hears about
a really cool new app for cognitive behavior therapy for insomnia, and the–? [Dr. Kinn] Like the CBT-i Coach made by the
Department of Veterans Affairs with help from T2. [Dr. Armstrong] That’s right. Yeah. So imagine you’re a provider in the VA or
the DOD, and you are treating a patient for insomnia. And although you are not trained on cognitive
behavioral therapy for insomnia, you say, “Hey, I know about this really cool app, CBT-i
Coach. And I think I’m going to use it because my
patient has insomnia, and that feels like a good fit.” So, do you see any ethical problems there? [Dr. Kinn] Why yes, I do. This would be a really short episode if I
said, “Nope. No problem. Go for it.” OK. So, again, back to what I said before: this isn’t anything new, it’s our same
ethical standards. Finding out about a really cool app, and then
starting to use it without being trained in the modality that it’s based on, is similar
to seeing a book on motivational interviewing, picking it up, flipping through it, and then
handing it off to your patient, and saying, “Yep. We’re going to do motivational interviewing,”
instead of actually being trained on it. Except in this case, it’s a mobile app, and
by you recommending it to your patient, you’re implicitly saying, “Here’s a treatment I believe
in, and that I know about. Let’s start doing this treatment.” Now, there are safety measures built into these
mobile apps made in the DOD and VA, in that they make very clear in the app that some
of these should be used with a provider, and how to find providers, etc. But that’s not always going to be the case
with other apps out on the market. So really what we’re facing is the ethical
standard of competence. [Dr. Armstrong] That’s right. Yep. Our ethical standards always–
[Dr. Kinn] Oh, did I get it right? Yes! [Dr. Armstrong] You got it right! Yes! So, at least for psychologists, which we both
are, our ethical standards require us to be competent in the therapeutic modality that
we are engaging in. So if we’re using CBT-i, Cognitive Behavior
Therapy for Insomnia, to treat our patients with insomnia, we have to be trained on that
therapy. I want to be trained on CBT-i before I’m using
CBT-i Coach. [Dr. Kinn] Can I throw one at you now? [Dr. Armstrong] Yes. I’m ready! [Dr. Kinn] So let’s say a patient is using
a self-assessment, either in a mobile app or on a website. And they indicate some tendency toward self-harm
or some really severe depression ratings. Now, all the mobile apps that we make in the
VA and DOD don’t transmit this information to providers. It all lives on the patient’s device. So no one’s actually getting that information
in order to jump in and help with immediate care. [Dr. Armstrong] That’s right. [Dr. Kinn] Let me jump into the nightmare terrain. Let’s say a patient has an incident of self-harm. Is that provider liable? Were they ethically responsible for checking
the responses on that self-assessment? [Dr. Armstrong] That’s a really good question. And so there are two ethical standards that
are applicable in that scenario. The first one is confidentiality, and the
second one is informed consent. So informed consent, I see as the bigger,
more important issue here, because it can prevent issues in the future. So in this scenario, it’s really important,
throughout the informed consent process, that you let the patient know, “Hey, we’re
going to be using this mobile application as a part of our treatment. I’m not getting any of the information from
that app that is on your phone. And as a part of our treatment, I’m going
to do a standard suicide assessment,” which you would do with all patients. But you’re also going to do a safety plan
if it needs to be put in place. And a part of that safety plan is going to
be, who are they going to call in case of an emergency? All of those things. But also being very, very clear: “If you
have thoughts of hurting yourself, I want you to contact 911, or my office,” or whatever
the safety plan is for that patient. And of course, document, document, document. [Dr. Kinn] And yet another reason why we
always ask you to kick the tires first before recommending any application to your patients. [Dr. Armstrong] That’s right. I’ve got another scenario for you, Julie. You ready? [Dr. Kinn] Yes. [Dr. Armstrong] OK. So a patient comes into a session, and secretly
records the therapy session using the voice recorder on their phone. Then they take the audio and cut a new dance
track with their psychologist’s vocals on it. [Dr. Kinn] Did that really happen? [Dr. Armstrong] The first half did happen
[laughter]. The second half didn’t happen. [Dr. Kinn] Dr. Armstrong! [Dr. Armstrong] So there have been a couple of
instances in military treatment facilities of patients coming in and recording the session
without the provider’s knowledge. I don’t know what they did with those audio
tracks afterward. But hopefully, not a dance track. It would be a pretty boring dance track, I
think. [Dr. Kinn] Maybe [laughter]. OK. So, that is an invasion of the therapeutic
relationship. So, I think there’s a couple different ways
of looking at this. One is just therapeutically, you would want
to have a conversation about what this means, why they wanted to record. It’s possible that they want to remember what
you’re saying, and it didn’t even occur to them that it would be something that bothered
you. But the ethical issues here are numerous. First of all, confidentiality. Because they are possibly, without knowing
it, impairing their own confidentiality by essentially generating a new record of their
therapy session. Although it’s not HIPAA information until
you, the therapist, own it, it’s still something that could potentially land them in some hot
water later on, or make life uncomfortable for them. So, this one strikes me more as a therapeutic
issue, and also a matter of helping inform the patient, along with wanting to protect your right
to practice without being secretly recorded. In many places it’s illegal to record without
someone’s knowledge, but not everywhere. All the more reason to just have some open,
frank discussions about the expectations for technology as soon as you can. [Dr. Armstrong] Absolutely. [Dr. Kinn] I feel like we could easily go
on about this for a couple of hours. In fact, I know we can, because that’s what
we do in our trainings [laughter]. [Dr. Armstrong] That’s right. [Dr. Kinn] We’ve got to cut it off, but we can
return to this subject. [Dr. Armstrong] Take-home notes from today
are: always put evidence-based practices that you’ve been trained on first. Also, have a well-thought-out informed consent,
and have conversations ahead of time with your patient so that they understand the expectations
for treatment. And the last take-home point is to maintain
competency in the use of mobile health in clinical care through training on core competencies. And also, get information from all the
resources available to you. [Dr. Kinn] So, folks, I think we went over
our 10 minutes this time. But it’s an extremely important topic, and
one that we know is difficult to talk about. So thanks for lending us your ear. You can send me a bill for your extra few
minutes [laughter]. [Dr. Armstrong] Thanks for joining us today
on “Next Generation Behavioral Health.” Let us know what you thought of today’s episode,
and if you encountered any ethical issues when using mobile apps in clinical care. [Dr. Kinn] [music] Connect with us on Facebook
and Twitter @MilitaryHealth. Thank you so much for rating us and subscribing
on iTunes, or wherever you get podcasts. Be sure to check out our other shows like
“A Better Night’s Sleep” and the “Military Meditation Coach.” And also, our free mobile apps and websites
for the military community. “Next Generation Behavioral Health” is
produced by the Defense Health Agency.
