Week of Women STEM Panel Discussion: Smart Tech


KAREN WEATHERMON:
Since we are at 4:30, I’m going to go ahead
and get us started. If you’re saying, I don’t
think your mic is on, it’s because it’s
for the Livestream. So you’re not going
to hear me amplified. But this is so that folks
watching from afar can hear us. So, good afternoon. I’m Karen Weathermon. I’m the director of the
Common Reading Program. And together with the Office
of Undergraduate Education, for which I work,
and also the planning group for the Women in STEM
Week, which we’re celebrating this week, I want to
welcome you to this event about smart technology. I’m especially excited
about this presentation, because our more normal
Common Reading presentations– and probably more
normal, more regular university presentations
generally– are typically a single speaker talking from
his or her own perspective from that particular field. And this is such
exciting work to me, because it is a
way of highlighting how different disciplines
can really illuminate questions and problems in bigger and often more exciting ways, fuller ways, to see
what questions are the most important to address and
what kinds of solutions might present themselves
for those questions. This is part of a year-long
look at technology, emerging technologies related to
this year’s Common Reading book Soonish. This, today’s topic, especially
connects to the book’s section on programmable matter. But as I said, it’s
also a wonderful example of the power of
collaborative thinking. Looking ahead to upcoming
Common Reading events, there will be a talk this
Thursday by Sue Paish. She’s the director of
structures for Boeing. That will be at 4:00
PM in Spark 335. This talk is also part of our
week celebrating women in STEM. And next week, we have,
on Monday, the Showcase for Undergraduate Research
and Creative Activities, otherwise known as
SURCA, from 3:30 to 5:00 in the CUB Senior Ballroom. That's a wonderful
opportunity to see the kind of innovative work
being done by undergraduates across the university– in fact, across our
different campuses. Students from various
campuses will also be traveling to Pullman
to show off their work. So it’s a great chance to see
what undergraduate research can look like and, I hope,
inspire you to think about participating yourselves. And on Tuesday, there will be
a talk by a panel of faculty on gene editing and ethics,
another hot topic that is sort of on the
forefront of technologies. And that’s another
panel of folks both from philosophy and
from our reproductive science programs. So more about those events can
be found on the Common Reading website and also on the
Common Reading CougSync page. If you are attending for
Common Reading credit, see me at the end
of today’s event, and I’ll card swipe you in. Just to let you
know, today’s event has a required post-event survey
with just a few questions that help give us some useful
information and also the Women in STEM program
some useful information for us to be able to program forward. So that survey will come to
whatever email you have linked to your CougSync account. So whatever account
that is, your WSU account or another one, that
survey will arrive there. It’s just a few questions. It’ll take you just a few
moments to fill that out. And when you have filled
it out, then this event will populate into
your involvement page. If you have any
questions about how to do that, you can’t
find the survey, you don’t know how to
do that, there’s also instructions for that
on our website. So now to introduce
today’s speakers. Diane Cook is a Regents
Professor and the Huie-Rogers Chair Professor in the School
of Electrical Engineering and Computer Science here in our
Voiland College of Engineering and Architecture. She received her
bachelor of science from Wheaton College and her
master of science and PhD from the University of Illinois. Before arriving at WSU in
2006, Dr. Cook’s experience included research
with IBM, NASA, the National Center for
Supercomputing, among others. Her research interests include
artificial intelligence, machine learning,
data mining, robotics, and smart environments. She is the director of
several things here at WSU. She’s the director
for the WSU Center for Advanced Studies
in Adaptive Systems, with the neat acronym CASAS. Very clever, since
it involves, often, adaptive technology in homes. And she’s also the director for
WSU’s Artificial Intelligence Laboratory and director for the
National Institute of Health’s training program in geron– no, gerontechnology. Did I get that right? Shelley Fritz is a
College of Nursing faculty member at our Vancouver campus. Her nursing background
is in public health, emergency nursing,
administration, and nursing education. Her research focuses on the
application of technology and the delivery of health
care and human-computer interactions. She earned her
bachelor of science in nursing at Walla Walla
University, her master of science at Walden University,
and her PhD in nursing from WSU. The third member of
this team, Dr. Maureen Schmitter-Edgecombe, is
unable to be with us today because of a death
in her family. She is the HL Eastlick professor
in psychology, specializing in clinical neuropsychology. And the perspective that
she brings to this team will be carried by
her team members who are able to be here today. So together, this
trio of researchers bring the lenses of
technology, of health care, and of neuropsychology
to questions about how our older
population and others can live more independently. For their innovative
research, they have been awarded
recently nearly $3 million in grants from the National
Institute of Nursing Research and the National
Institutes of Health. So please help me welcome
our speakers this afternoon. [APPLAUSE] DIANE COOK: Well,
thank you, Karen. Thank you for inviting us. So I mean– you need
me to wear this? Sure. I’m Diane, and this is Shelley. And it’s been fun to
work in this group, because we are very
multidisciplinary. And the smart tech we’re
going to talk about today centers on smart homes. So let me first
poll the audience to see what background
you’re coming from. How many of you
live or have lived in smart homes or homes that
have smart technology in them? Any show of hands? Any smart appliances? SHELLEY FRITZ: Anybody
with Alexa or a Samsung Hub refrigerator? One person? DIANE COOK: What about a
smartwatch or a smartphone? And for those with mobile
technologies, how many of you have ever used an
activity tracking app? Now, for the same
group, how many of you have parents or
grandparents that also use smartwatch/smartphone
with an activity tracking app? SHELLEY FRITZ: A couple. DIANE COOK: There are a couple. SHELLEY FRITZ: Four or five. DIANE COOK: All right. So this is our
mission– to understand what data from those
sources can do in order to understand and
assess somebody’s health and to use that information to
intervene to help people stay healthy and aging in their
own homes and environments. And clearly, there’s a
need for this because of the aging of America. And while this slide
says aging of America, this age wave is global. But the statistics I
have focus on the US. For example, every day, one– or is it 6,000
Americans celebrate their 65th birthday,
3,000 Americans celebrate their 85th birthday. And I like this stat the best– American seniors now outnumber
the entire population of Canada at about 35 million. And this is a change that we
are going to have to respond to. What this graphic shows
is how the distribution of the population in the US has
changed over the last 50 years and how it’s going to change
over the next 30 years. If you look, when it
started in the ’50s, the biggest category
was the 0 to 5 years, and there was a sharp decrease
when it got to 5 to 10 years. But then with
improvement in medicine, these babies were able to
live through toddlerhood and into adulthood. And then, as you know, with
the coming of the baby boomers, life has been
extended even further. So what started out as
kind of a triangle shape, as you move forward to–
we're looking at the 1990s– that big, fat part is moving higher and higher into the 65 age group, which is what I find particularly interesting in this graph. And by the time we go past
where we’re at in 2019 and on to 2030, it’s going to
become one of the largest age groups. And that’s going to
present some crises that we have to deal with. So, what kind of crises? Well, if there’s an increase
in expected age due to benefits of medical technology,
then there’s a corresponding increase
in the cost of care and the need for care. However, the number
of people that are available to
provide that care is remaining near constant. So there’s an
ever-increasing gap between what we can provide
for older adults who are going to have multiple
chronic conditions, who may be failing in terms of
their memory and cognitive performance, and the
resources that we have to take care of them. There won’t be enough
physical resources. There won’t be
enough caregivers. So what we’re hoping is that
we can design technology to fill that gap and
address this question– how are we going to keep this
aging population functionally independent in their own home? And by functionally
independent, we don’t mean just
alive and existing, but able to perform
their daily activities with minimal intervention
so that we’re not putting the burden on this
increasingly small percentage of people that are
young and healthy enough to take care of them. But also, so that the costs are
kept within reason and people do not need to move to
assisted care facilities. So in terms of
the smart home, we are designing technology that
acts kind of like a prosthesis, almost like you would
think of a prosthesis, as an artificial limb. In our case, it's
whatever is needed– a prosthesis to keep
older adults mobile. It could be a
cognitive prosthetic to help them where they’re
having some memory gaps, or wherever it’s needed in
order to keep them functionally independent in their own homes. And so I first got
interested in smart homes because I had done research
in machine learning. And maybe like other women
who are interested in STEM disciplines, or
engineers in general, I wanted to move from looking
at just theoretical design of algorithms to
using machine learning for an application that
had societal impact. And I was talking to my
in-laws, and they said, well, are you familiar
with smart homes? Because they worked in
the real estate industry. And I said, no, but the Texas
State Fair is one city over, and they say they have the
home of the future there. So I went and visited
the home of the future, and it had a lot of
really neat gadgets. It had a washing
machine that you could start from your computer. This is a long time ago. It had a refrigerator
with a barcode scanner, so when you run out of
milk, you scan the bar code for your milk carton, and it
would create a grocery list. And back then, they had a
company called homegrocer.com that you could automatically
send your grocery list to. And they would
assemble the groceries and deliver them at
the door, so you never have to leave your home. It’s awesome. And I thought, that is
certainly a connected home, and it’s one that has a
lot of neat gadgets in it. But coming from artificial
intelligence and machine learning, I wouldn’t
call it a smart home. Because when we think
of a smart home, we think of one that
is able to sense what’s going on in the environment,
to reason about it in terms of what is its
goal maybe in terms of serving the people that live
there, and then act in such a way to reach that goal. And even if you have
a connected home with a lot of neat
gadgets, it’s still the resident that has to
make all the decisions and decide what is going
on inside of the home and what is the need. So I was interested in
seeing, can we actually build a home that is smart? So based on this definition
that I just gave, here’s what we need
to make a home smart. We need for it to be
able to sense what’s going on in the home and with
the residents that live there, to be able to identify
specific patterns. And if we’re using this for a
health goal, then from that, we will assess the health of
people that are living there and, finally, intervene. And the reason this is
a cycle is that if we design some intervention, we want to be able to
see the impact that it has on what’s going on in
the home and the health of the people that live there
and adjust the intervention correspondingly. So this is our goal
for a smart home. And at first, it seems a
little like a pipe dream, like a George Jetson futuristic home. But this is something that the technologies are there to do. But we also want them, at
the same time, to be subtle. So a lot of times, we will
have individuals come to campus and say, show me
your smart home. Because we have a
test bed smart home in one of the graduate
student apartments. And I think that they’re hoping
that the robot butler will open the door, and there
will be flashing lights. But remember, this
is a technology that we want to be
accepted by older adults and to pervade our homes,
and the last thing they want is a big yellow light saying,
I have need for assistance. So what we want is for them to
walk into this smart apartment and see an apartment. And the technology just
disappears into the home. Now, what we have in this
picture is not quite that. It is our smart
apartment, but you will see that there are sensors
throughout the apartment. So the goal of disappearing
into the fabric of the home is still underway. But it does do these
steps of sense, identify, assess, and intervene. So I’ll talk just a little
bit about what these are. In terms of sensing, we
flood homes with sensors, because that provides data
that our computer programs can use to identify what’s going on
in the home, what are patterns. Are there changes that we need to be aware of? And how can we use this to
assess the state of somebody’s health and intervene? This is just a collection of a
subset of some of the sensors that we work with. So going from left
to right, we see two of the people in our
lab that have opened up an infrared motion sensor. And we put them on the ceiling
of our homes and apartments, and we just connect them
with, like, command strips so that they’re
connected with adhesive that you can remove when
you don’t want them anymore. But these are infrared
motion sensors, so they’re not cameras. So they don’t require
lighting, and many people consider them a little bit less
invasive in terms of privacy. But they’re looking for
heat-based movement, because they’re infrared. So you find them, for example,
when you’re at the airport and you get near the
door and it opens. It’s because there’s
an infrared motion sensor that detects that
you’re walking in that area. So if there’s heat-based
movement of a certain mass, then it lets us
know that there’s some movement in the
area that it’s sensing. And when we get to
the challenges part, we will revisit this, because this
heat-based movement can present a challenge for us. It looks for heat-based
movement of a certain size. So if you have a
cat or dog, this drives us crazy,
because the cat kind of teleports through the house. Sometimes it shows up,
and sometimes it doesn’t. So infrared motion
sensors are really valuable, because
they let us know what’s going on in that
area, is there some movement. The next one to the right
is a magnetic door sensor. And you can see that
there’s two sides of it, because it’s actually
just forming a magnet. And when the two ends line
up and the circuit closes, then the sensor will send
our computer a text message saying that the door’s closed. When they no longer line
up, it sends the message that the door is open. And you can imagine putting
this on external doors if you have a family member
who might be prone to wandering and you want to know when
they’ve left the home. It also lets us know when
they’re performing activities that you really want to track,
like accessing the medicine cabinet, because they might
be taking their medicine. Then, further to the
right are object sensors. And you can put just little
tags on objects that monitor when you’re moving an object. So you can see if
somebody is actually accessing their medicine
bottle, if they’re accessing a bottle of water so
that they’re staying hydrated, have they gotten out of
bed, whatever, you could monitor in terms of motion. This is all within the
home, but there’s no reason that we’re limited to the home. So many of you said that
you’ve used a smartwatch, you’ve used a phone, you’ve
used an activity tracker. That’s another source of
information that we use. And finally, in
the very lower left is just a floor plan
of an apartment. And it shows all these red
dots where we will put sensors throughout the home. So we kind of try to
cover the whole home and see what’s going on. While we’re looking
at smart homes, I did put in one
video here that shows an example of the kind of
data that can be collected. This is my husband, and
he and I designed an app to collect sensor data
from a smartphone. And we have a little
plug in there, because you can actually
download this app and put it on your phone,
either Android or iOS, and tell it what activity
you’re doing at any given time. And it starts to learn
over time and suggest, oh, are you actually going
for a run right now? Are you sleeping? So you might want to
turn it off sometimes. But as he is moving
the phone around, you can see that the sensors
on the right side of the screen are updating the numbers. And all these numbers can be
sent to a central computer, or they can be
kept on the phone. And it’s our job to make
sense of those numbers and see what you’re
doing and see if there are changes in
your lifestyle patterns that we need to watch
out for if we’re trying to monitor your health. So similarly to
the previous video that showed what
sensor readings might look like as you
move a phone around, these sensors in a smart
home generate text messages that we have to
analyze to determine what’s going on in the home. Here, I’m showing a video
of one of the students in our lab, Jess, who
has entered the home and is just performing
a series of activities. She’s entered the home. She’s putting away
groceries, and then she’s going to walk in the living
room to do something else. In the lower left, I’m
showing all these motion sensors that light up as
she moves through the home. And then there are
green rectangles that show doors that light
up when the door opens. But what the
computer program sees are the text messages that
are in the upper left. And you see it’s just
a bunch of numbers. It’s just showing the date and
the time, a sensor identifier, and an on/off, motion/no motion. So the question is, given a file of information like this, what can we tell about a person's health? It seems a little bit cryptic, but that's the fun challenge for us in terms of designing computer programs.
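To make that concrete, here is a minimal sketch, in Python, of what reading one of those event files might look like. The timestamps, sensor names, and field layout below are invented for illustration; they are not the actual message format the lab's software uses.

```python
from collections import Counter
from datetime import datetime

# Each line is assumed to look like:
#   2019-03-01 03:12:45 BedroomMotion ON
# a date, a time, a sensor identifier, and an on/off state.
sample_events = """\
2019-03-01 03:12:45 BedroomMotion ON
2019-03-01 03:12:52 BedroomMotion OFF
2019-03-01 03:13:10 BathroomMotion ON
2019-03-01 03:14:02 BathroomDoor OPEN"""

def parse_event(line):
    """Split one raw sensor message into a timestamp, a sensor ID, and a state."""
    date, time, sensor, state = line.split()
    timestamp = datetime.strptime(f"{date} {time}", "%Y-%m-%d %H:%M:%S")
    return timestamp, sensor, state

events = [parse_event(line) for line in sample_events.splitlines()]

# Even a simple count of events per sensor starts to hint at where, and how
# much, someone is moving around the home.
print(Counter(sensor for _, sensor, _ in events))
```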
The first thing we try to do once we have a sequence of those text messages that are sent to our computer is to try to identify what activity a person is performing. Once you look at enough sensor data– and Shelley has done
not trained in engineering and they help us out, provide
labels for activities, and they actually start
to get emotionally attached to the person
that they’re labeling. Because you can see what’s
going on in their life. For an example, if
you were looking at motion sensors and it
was 3:00 in the morning and there was a sensor above the
bed that was going off probably every 10 minutes–
because it only registers whenever there’s movement. And then, suddenly,
there’s a flurry of activity, followed
by motion sensors in the bathroom going off. And then back to
bed, and then it goes back to every 10 minutes. You can probably gather–
it’s 3:00 in the morning– that they were sleeping, they
got up to use the restroom, and then they went
back to sleep. This is an important
thing to identify, because these bathroom trips
in the middle of the night are one of the most hazardous
times for older adults, a very likely time that
they're going to fall. And so being able to identify what activity is going on is kind of the vocabulary for us doing health assessment and intervention.
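One common way to picture that identification step– and this is only a toy sketch with invented features and labels, not the lab's actual system– is to cut the sensor stream into short windows, describe each window with a few numbers, and train a classifier on windows that a person has labeled. The sketch assumes scikit-learn is installed:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical features for each short window of sensor events:
#   [hour of day, bedroom events, bathroom events, kitchen events]
X = np.array([
    [3, 12,  0,  0],   # middle of the night, movement only near the bed
    [3,  4, 10,  0],   # a night-time bathroom trip
    [7,  2,  3,  9],   # morning movement in the kitchen
    [12, 1,  1, 14],   # midday cooking
])
y = ["sleep", "bathroom_trip", "breakfast", "cook_lunch"]  # human-provided labels

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Ask the trained model to label a new, unlabeled window.
new_window = np.array([[3, 3, 11, 0]])
print(clf.predict(new_window))
```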
So from this data that you collect on your phone or in the home, we might be able to tell: are you sleeping, are you doing house chores, are you working, are you socializing– all those other components that you need in order to stay healthy and functionally independent. So we sense, we identify,
and then we assess. So if we can express
your behavior in terms of these
activity vocabulary terms, then we have a basis
for determining what your health status is. And to provide
evidence for that, we ran a study in which we
collected data from smart homes for multiple years– so in some cases,
seven, eight years. And these were for
older adults living in retirement communities,
and the average starting age was 85. So we knew, at
that age, that they were likely to undergo
health changes. And we wanted to
see can we pick up on that in terms of changes
in their daily behavior? So we collected data non-stop. And this is those text
messages from a smart home. There’s about 1,000
of those a day. So that’s how many data
points we’re looking at. And then Maureen’s group,
Dr. Schmitter-Edgecombe, would visit these
individuals twice a year and do a set of
standardized tests. So, example of a
standardized test. There’s the TUG test,
the Timed Up and Go test. And that checks
for your mobility. So you start sitting in a chair. You get up, walk
forward 10 meters– SHELLEY FRITZ: 10 feet. DIANE COOK: 10 feet. Turn around and come
back and sit down and see how quickly you can do that. So that shows some
indication of mobility. And then RBANS is
a set of questions that you answer to determine
what your cognitive health is. And we were interested in seeing whether we could actually predict what these standardized scores would be for mobility and for cognition. And surprisingly, we could.
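One way to picture that prediction step– again, only an illustrative sketch with made-up numbers, not the study's actual model or data– is a regression from weekly behavior features to a clinical score such as the TUG time, assuming scikit-learn is available:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical weekly behavior features:
#   [hours of sleep per night, outings per week,
#    hours out of the home per day, routine-variability score]
X = np.array([
    [7.5, 6, 4.0, 0.10],
    [7.0, 5, 3.5, 0.15],
    [6.0, 3, 2.0, 0.30],
    [5.5, 1, 1.0, 0.45],
])
# Made-up Timed Up and Go times, in seconds, from the twice-a-year testing.
y = np.array([9.0, 9.5, 12.0, 15.5])

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Estimate the TUG time that would go with a new week of behavior.
print(model.predict([[6.5, 2, 1.5, 0.40]]))
```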
And it wasn't just a question of how well do they sleep, how many times did they go out of the home, how much time did they spend socializing, but it was everything. It was their complete lifestyle. And actually, one of
the greatest indicators of change in health
were just changes in day-to-day variability
of their routine. Once their routine started
to kind of crumble, that was an indication that
maybe they were undergoing some cognitive changes. So being able to collect
all this data to get a sense of somebody’s
entire life pattern, identify those
patterns, and then look for changes in those
patterns was critical for us to discern that there were
changes in health status. So that’s an indication
of kind of slow changes in health status over
multiple months or years. The question is, if you’re going
to design computer programs, can you also identify
more immediate changes in health status that are
critical, like a fall? I mean, certainly, if
you look at this graph here, which shows movement
or activity level, just based on number of motion
sensor events within the home
on a daily basis– and it goes from
January to March. You might wonder what’s going
on in this person’s life, because clearly, there’s
a downward trend. They are getting a little more
sedentary for some reason. And maybe that’s a concern,
and you want to look into it. Can you also detect if there’s
a break from routine that is only a few minutes long? So this is an area that we
are looking at right now. An example of what
we can do is we can look for changes from
normal and see if they line up with known changes in health. So here’s one example,
and I think Shelley will talk about falls later. In this case, this
was a resident of one of our smart homes. And we used a time when
we knew she was healthy and had normal
behavior as a baseline. And then we looked
at every week after that to try to
determine when there might be a significant change
from baseline in her behavior. And the graph at the top
shows how much each week changed from that baseline. And whenever it goes above the red line, that means a statistically significant change.
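A very simplified version of that kind of test– purely illustrative, not the statistical method used in the study– is to compare each later week's daily sensor-event counts against the baseline weeks and flag any week whose average is unlikely under the baseline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical daily motion-event counts: eight baseline weeks of normal
# behavior, then two later weeks, one normal and one much quieter.
baseline_days = rng.normal(1000, 60, size=8 * 7)
later_weeks = [rng.normal(1000, 60, size=7),   # still close to baseline
               rng.normal(820, 60, size=7)]    # a week with far less movement

mu = baseline_days.mean()
sigma = baseline_days.std(ddof=1)

for i, week in enumerate(later_weeks, start=1):
    # z-score of this week's average daily count against the baseline
    z = (week.mean() - mu) / (sigma / np.sqrt(len(week)))
    flagged = abs(z) > 1.96   # roughly the "red line" for a significant change
    print(f"week {i}: z = {z:+.2f}, significant change: {flagged}")
```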
So if we look back at her health history to try to identify what's going on, it turns out that this is an older adult, an 86-year-old female, who was diagnosed with lung cancer. And that small blip that
you see above the red line is when she received
the diagnosis, and that bigger blip is
when she started treatment. And then, because
we automatically identify activities,
we can break it down and look at changes in each
of the individual activities. So the two bottom graphs are
what we would call heat maps. And they're showing one rectangle for each hour of the day, and each row is a separate week.
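Building that kind of picture is straightforward once activities have been labeled: count how many minutes of, say, sleep fall in each hour of each week, and plot the resulting week-by-hour matrix. Here is a small sketch with random stand-in data, assuming NumPy and matplotlib are installed:

```python
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(1)

# Rows are weeks, columns are hours of the day; each cell holds the minutes
# of sleep detected in that hour (random stand-in values, not real data).
weeks, hours = 20, 24
sleep_minutes = rng.integers(0, 10, size=(weeks, hours))
sleep_minutes[:, 0:6] += 45    # plenty of night-time sleep in every week
sleep_minutes[11:, 0:6] -= 30  # much less night sleep after "week 11"

plt.imshow(sleep_minutes, aspect="auto", cmap="Blues")
plt.xlabel("hour of day")
plt.ylabel("week")
plt.colorbar(label="minutes of sleep")
plt.title("Sleep heat map (illustrative data)")
plt.show()
```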
So you can see that there's a lot of very dark coloring late at night, early in the
morning before that week when she started treatment
for the sleep activity, because that’s when she
was doing a lot of sleep, and it was very regular. After this green
vertical line, week 11 when she started
treatment, there’s very little sleep in the
middle of the night, which one would expect. And if we looked at
dozing or sleeping out of bed in the middle of the
day, which is another category, we would see a lot more
of that during the day. So we can identify
and understand what the nature of
those changes is. Similarly, this is the enter-home activity on the bottom right. Before she started
treatment, there’s some occasions when it’s
not perfectly white, indicating that it does happen. But after she started
treatment, there’s a lot of people entering the
home, most likely providing care and support for her. So this is what we want to do. We want to be able to identify
when there are changes, understand the nature
of those changes, and then, ultimately, intervene. So here’s just two
interventions that we have right now going on. And then I will turn
this over to Shelley. The first one is on the left,
and it’s called a Digital Memory Notebook. And I will show a
short video on that. This is Maureen’s creation. She wanted to have a
tablet interface that older adults could use. So we designed it with
input from older adults, so it used high contrasting
colors and large fonts. And it allows them to keep track
of appointments and medicine information. But the other thing
it does is it partners with the smart home. And an individual who has
some memory difficulties can be very anxious
about whether they have performed certain
activities each day or not. They just can’t remember. So the smart home will
actually populate the notebook with what they’ve done that day
to give them some reassurance. Furthermore, because we
are designing machine learning programs,
we can anticipate when the person would
normally, in their routine, perform their next activity. And it’s not just time of
day, because taking medicine, if you take it with dinner, it
may not always be at 6:00 PM. So we need the smart home to
recognize that they’re eating. And then if they’re eating but
they haven’t taken medicine, this tablet will
prompt them to take the medicine, kind of acting as that cognitive prosthesis.
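The prompting rule itself can be stated very simply. This is only a toy sketch of the idea– the activity names and the rule are invented, not the notebook's actual code: once the smart home reports an eating activity, check whether a medicine-taking activity has also been seen, and if not, issue a reminder.

```python
def medication_prompt(recognized_activities):
    """Return a reminder if an eating activity was seen without medicine.

    `recognized_activities` is the list of labels the smart home has
    recognized so far today, e.g. ["sleep", "eat_dinner", "take_medicine"].
    """
    if "eat_dinner" in recognized_activities and "take_medicine" not in recognized_activities:
        return "You're having dinner - remember to take your evening medicine."
    return None

print(medication_prompt(["sleep", "eat_dinner"]))                   # reminder text
print(medication_prompt(["sleep", "eat_dinner", "take_medicine"]))  # None, no prompt
```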
So I was– [VIDEO PLAYBACK] – So this is a digital memory notebook. So it's a task management app. And so what you do is you
can schedule different tasks you have throughout the day. It can also remind you
to do certain tasks. One of the cool features
that it has is it also has a partnership
with the smart home. DIANE COOK: If you were falling
asleep, you’re not anymore. – And it can also predict
when it thinks you’re going to complete activities. [MUSIC PLAYING] DIANE COOK: So the
notebook recognizes that she’s taking
medicine, but it's prompting her to eat food with it. [MUSIC PLAYING] – Please go to the notebook app. DIANE COOK: So it
detected medicine. And it– [COUGHING] [MUSIC PLAYING] [END PLAYBACK] So this is one intervention. And we first tested
it with individuals in the on-campus
smart apartment, and now we’re testing it with
older adults in their homes. And it’s letting us know what
the impact of this technology is. Because as with all research
projects, it has bugs. And so our first
pilot participant, when the app crashed on her,
as they’re likely to do, we needed it back
so we could fix it. And she wanted to drive back
with a Digital Memory Notebook, because she didn’t want
to be separated from it. So she wanted to drive two
hours back and have us fix it, and she could bring
it back with her. So we’re happy that some of this
technology is meeting this need and, hopefully, playing this
role of a prosthetic to help people stay independent. And the second
intervention technology is much more futuristic
and much more in progress. That is a robot that
partners with the smart home. Because a robot can provide
some physical assistance that a tablet cannot. Eventually, it should be
able to automate some actions and retrieve objects. But in this case, once again, it
is partnering with a smart home to recognize what activity
a person is doing. And if they’re having difficulty
completing that activity, the smart home
should detect that. The robot will
approach the person and ask them if they need help. And if they say
they do, then it’ll show them a video of what
they’re supposed to be doing, either step by
step or as a whole. And if they’re having
continued difficulty, it can actually lead
them to the object that they need in order
to complete that activity. And I'll show a quick
video of this. And it’s a neat project. It will have a lot
of impact eventually. But you’ll see, right now,
just what the status of robots is, as well, and why we don’t
need to be too scared of them. So this is RAS, our
Robotic Activity Support. [? Anisha ?] is taking
out some medicine. But once again,
she forgot a step. So here comes RAS, asking
her if she needs some help. And then she can
interact with the tablet to say yes, I don’t know
what step I’m skipping. And it’ll show it
to her, and it will be a video of herself
having done it or a video of the
entire activity. So as I transition
this to Shelley talking about the research
that she is doing in nursing as part
of this project, you can imagine that
there's many challenges to clinical translation
of this work. As engineers, we’re really
interested in seeing what can we develop next. We’re not necessarily
quite so interested in making them usable
or user friendly, because we just think
it’s really cool to have a robot in our smart home. But if we’re looking
at older adults, then we need to
think about the fact that this is a
very rickety robot. And if a person has
mobility difficulties, are they going to
grab on to this robot to try to steady themselves, and
they’re both going to go over? So we need to make technologies
acceptable to the user and safe to the user, which
is a big gap for people coming from engineering. We need to accurately
detect a need. When is there a need, and when
is it just a false positive? You know, maybe they’re
just going out for the day. Costs and privacy. Privacy of sensor data
is a huge issue. When you are doing your activity
tracking apps, as many of you raised your hand for, that data
requires, often, that you’re turning on location services. And all that data is
captured by your provider– Apple or any other
provider that you have, maybe the app designer. That can be used
for other purposes. So how do we provide
these services while keeping your identity
and your lifestyle private? Sensor battery life– when
we run studies with watches and we turn on location
services, this watch typically will
run for four hours before running out of battery. So how can we extend
the battery life and still provide
all those services and not be asking older adults
to constantly replace the watch and charge it, which if they’re
having cognitive difficulties, is unlikely to be
done consistently? Creating reliable and
valid health tech. Even if we have achieved
a high enough success level to publish
papers, it’s going to take a lot more large
clinical trials for it to be adopted by the
health community. So we are a long way from that. And then, eventually,
we need to demonstrate the value added to clinicians. And that’s a good point for me
to turn it over to you, to see is there a value added. SHELLEY FRITZ: Well,
we’re working on that, so thank you for that. I’ll add to this list that we
do want to augment and assist older adults in the
various ways that can help them stay independent longer. But we also have to be careful
that we don’t over-assist and create dependence. And that’s one of the things
that we work on in nursing and we pay attention to when
we’re assisting older adults, is– or anyone with a disability
or children, for that matter. We don’t want to over-assist. We want people to do
as much for themselves as they possibly can. And you can imagine, there’s
a lot of reasons around that. So if you don’t mind
moving to the next slide. So I’ll start by
sharing that I joined this team and this research
after 20 years in nursing practice. And as you heard,
my background’s in emergency nursing
and public health, but I’ve done a wide
variety of other things, like start up a home
infusion company and do a lot of training
of nurses over the years on using technology
that becomes available. And so, let’s see, I saw
my first patient in 1988. So a lot of technology changes
have happened since then. And so I have many
years of experience of taking a technology that’s
been designed by engineers, and it comes to us as
this really cool thing, and then figuring out
how to actually make it work with the patient. So we use technology
with patients to assist with treatments,
assist with diagnosing, providing comfort, and then
sometimes just plain old saving their lives in a code. So there’s quite a bit of
technology around that. So that’s where my
background comes from. And when we talk about this
idea of clinical translation, it’s really what
I’m interested in. It’s also probably a
really difficult area to be doing research
and studying, because that jump from
technology to actually having it be added value
to the clinician is important. A piece of this that
becomes really important is this idea of trust. If nurses don’t trust a
technology, if it fails them, they’re going to back
off from use of that and rely on their own judgment
and their own clinical skills and any non-technology
things that they can do if they’ve learned
to not trust the technology. So those false positives
or those failures become super important
in translating that into clinical practice. That also becomes really
important for the patients, too, because many
patients are being taught to use their own
technologies in their home. I have home health
patients who are learning to do their
own kidney dialysis at home. And they have the equipment
that is attached to their body, and they’ve learned how to hook
up that equipment themselves, push the right buttons. And this is considered
smart technology, when you’re doing in-home dialysis. So that piece of
having the patient be able to trust the technology,
too, is really important. Pardon me. I’m getting over a URI. That’s an Upper
Respiratory Infection. So in the environment of
smart homes and technology and where nursing plays a role
in this and with older adults– go ahead and push– one more time. There’s three concepts that are
really important that we find written about in the
literature, but also, I can tell you, is important from
my years of clinical practice. And that is, people are really
interested in quality of life. And included in that idea
of quality of life is– go ahead and hit the
button– this idea of health and improving health. So this is one of the things–
as we’re looking at identifying people’s normal motion
patterns, their normal behaviors and activities, and then
being able to identify what’s abnormal that might
be clinically relevant, that’s where we can look at
providing interventions that can improve health, by either
mitigating instances where they might have an event– like anticipating that there
might be a fall or that sometime in the
next couple of hours they’re really at risk for fall. Because of over a
variety of things that, as a nurse
on a unit, I might notice that the patient has
not had enough fluids today, that instead of standing
up in one fell swoop, they do what we call a stand
with one or a stand with two. OK? So there’s is a
difference in how someone’s feeling
and their strength level depending on how
they move with that. And we might then
get extra assistance. So how can we train
a computer to be able to understand
those kinds of movements so that we could
mitigate when we understand a fall might happen? The other thing is that if
we have improved health, we will have extended
independence. And what older adults are saying
is that what quality of life means to me is that
I can live in my home longer independently, not
be a burden to anyone, and in order to do that,
I need good health. So this is where
nursing comes in. Nursing kind of owns the
field of the human response to illness. So you may have
heard this before, but they often say
that doctors see the broken leg
and the nurse sees the person with the broken leg. And that is the
difference, really, in the focus on the human
response to illness. That’s really, really
important, to have people who are expert at broken legs. My husband could tell you
that after last summer, when a tree fell on his leg. So we do want people who are
expert at fixing broken legs. But we also need
people who are expert in understanding how is
that broken leg going to impact someone’s life. And if that broken leg
becomes a broken back and now they’re a paraplegic,
how does that impact– how do we still have
quality of life, good health, and independence? So this figure really
represents what we call the clinician in the loop. And if you look at
the top, there’s the smart home with the
sensors that Diane has already described to you, a
wide variety of them. And following the figure
around, the computer labels the activities that it
identifies that are normal. And then the clinician
looks at identifying data sets that are abnormal. So this is where
nursing really starts to come into play in that
the data for any good algorithm has to be annotated by a
human for the machine to be trained to identify
something as a human would. So we have a team of nurses
who are doing weekly telehealth visits and monthly in-home
home health assessment visits. And they are using
that information that they gain from
actually listening to the heart and lungs of the
patient and watching them move and having them do different
activities like grip strength, et cetera– just like we would assess
someone in the hospital or as a home health nurse. We use that information to
then go into the data sets and find something
that we believe represents a change
in health state. And we provide that
information, that data set that’s annotated
with what we believe is going on, to the
engineering team, who then trains the computer. From there, we can figure out
how to provide interventions, once we understand a clinically
relevant abnormal activity. And then we can assess whether
the interventions that we provide are effective or whether
the patient is being adherent. And those are really
important concepts when it comes to
cost in health care. We spend one of the highest percentages of GDP in the world on health care. And so our government
is very interested, especially with older
adults, 99% of whom are on Medicare– so that's
our taxpayer dollars paying for those health care instances. Pardon me. And so CMS, the Centers for
Medicare and Medicaid Services, is saying, we need
to see what they’re calling value-based care. Value-based care
means that we are providing care that is low
cost but with good outcomes. And in order to understand
whether we’re doing that or not, we have to
measure something, right? So this provides us this really
good measurement for that. So next one. So this area that
you see highlighted is really the area where
nurses are involved. And this is where, even though
nursing is not a STEM field– and certainly not one where women are in the minority, because I think we have 10% males in our field now– nurses working in technology and the development of technology are extremely rare. I think less
than 1% of nurses in the nation have a PhD. And I can tell you that
way less than that work with technology in the
design and development phase. So kudos to Diane for
having the insight to have a nurse at the table. We’re excited about that. Next one. So this is an example
of the similar data that you saw earlier when
Jess was in the smart home and moving around. But this is data
from the participant that you saw had fallen
in his kitchen earlier. And so what we
will do is identify when that fall
happened, what were the activities
around that, what was the health status around
that, based on our nursing assessment. And then go ahead
and hit the button. I have a lot of
animations in here. So we’ll identify for
the engineering team and annotate, basically,
when the fall happened. That’s called ground truth,
providing a ground truth for training the machine. And in nursing, we're calling that clinical ground truth, because that's exactly what we're providing.
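In practice, a clinical ground truth label can be as simple as a structured record tying a clinician's judgment to a span of sensor time. The field names and values below are invented for illustration; they are not the project's actual annotation format.

```python
import json
from datetime import datetime

# A hypothetical annotation a nurse might attach to the sensor record,
# marking when the fall happened and who made the judgment.
clinical_ground_truth = {
    "participant_id": "example-001",   # made-up identifier, not a real person
    "label": "fall",
    "event_start": datetime(2019, 2, 14, 10, 5).isoformat(),
    "event_end": datetime(2019, 2, 14, 10, 20).isoformat(),
    "source": "monthly in-home nursing assessment",
    "annotator_role": "nurse practitioner",
}

# A record like this travels with the raw sensor file to the engineering
# team, so the machine learning model can be trained against it.
print(json.dumps(clinical_ground_truth, indent=2))
```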
So go ahead and hit the button. Over here, you'll see a data visual that the engineers have
created for us to look at. So this is the
translation piece. So I tell the engineers that
my nurse practitioner team will look at a new
kind of data to decide what they’re going
to do to treat a patient in about 60 seconds. We might review the chart
for two to five minutes. If it’s a complex
chart, we’ll probably review it for five minutes
before we go in the room. If it’s not a very
complex chart, we’re going to review that
chart in less than two minutes. There’s going to be
a chief complaint. That’s what we’re going
in there to see them for. And so this kind of data has
to be presented to us in a way that we can absorb
it very quickly or we’re not going to
use it in practice. So go ahead and hit the– here, you can see that
the person who fell then had very low activity
after the fall. And this is across four days. And so this is a clinically relevant abnormal event.
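A back-of-the-envelope version of spotting that pattern– purely illustrative, not the project's detection code– is to flag any stretch of several consecutive days whose motion-event counts fall far below the person's usual level:

```python
import numpy as np

# Hypothetical daily motion-event counts: a normal stretch, then four very
# quiet days such as might follow a fall, then a return to normal.
daily_counts = np.array([980, 1010, 995, 1050, 300, 280, 310, 295, 990, 1005])

typical = np.median(daily_counts)
quiet = daily_counts < 0.5 * typical   # under half the usual daily movement

# Flag the first run of at least three consecutive quiet days.
run = 0
for day, is_quiet in enumerate(quiet):
    run = run + 1 if is_quiet else 0
    if run == 3:
        print(f"Sustained low activity beginning around day {day - 2}")
```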
And so think about if this wasn't a fall. If this was your
grandparent and they were moving around their home
normally and all of a sudden they were hardly moving
around their home at all– which would indicate
they’re spending most of their time on
their bed or on the sofa– would you be wanting to
call your grandparent? Would you be interested
in that kind of thing? I think the answer
to that probably is yes. Somebody who normally is up and
about and then all of a sudden is down for four days–
think about yourself. If you don’t move for four
days, how do you feel? What is your human response
to your illness if you’re not moving for four days? Very clinically relevant. So this is an
example of a visual that we’re working
on to provide to nurse practitioners and doctors around this. So another thing
that nurses need to do in working in
this high-tech field is we need to disseminate the
information and the knowledge that we’re gaining to
our own discipline. Because remember what I said– there’s a lot of
nurses who don’t work in tech, specifically in
the design of tech or knowing what’s in the pipe. I know that in the
20 years that I was working at the bedside,
when a new technology arrived, it arrived. And we were told,
you got to use it. You got to learn it. And so that’s what we did. But to have nurses know
what’s in the pipes so that they can have a voice
in the design and development of that will mean that
there’s a lot less swearing at the bedside. So the other thing
that we need to do is we need to express
our ideas and what we think is important
regarding technology outside of our discipline, as well. So this is an example of an
article with the three of us who were here on the panel today
with engineering and computer science, neuropsychology
and nursing, disseminating that
cross-disciplinary idea generation in a
computer journal, as a featured article
in a computer journal. So how is this relevant
to clinical practice? We talked about some
challenges to clinical practice when we transitioned
between Diane and me, but this specifically is
looking at nursing practice. Currently, a nurse
gets her information about the patient from the
electronic health record. You might have heard
it referred to also as electronic medical record. In that record, we
have a ton of data. We have pictures of the patient. We might have an MRI. We might have a CAT scan. We might have an X-ray
of that broken leg. We will also have vital signs. We’ll have labs. And we’ll have
trends that are shown to us on what your
blood sugar is or what your blood pressure is. We’ll also know your history,
all the surgeries you’ve had, every time you’ve
seen the doctor, all of those kinds of things. Those are all data
points that we currently use to help us know
how to treat you when you’ve come to
us with complaint or come to us injured or sick. But I really believe
that in the future, we will add a new kind of
data to this electronic health record, and that will
involve the data that comes from the
environmental sensors from a smart home and/or the
wearables that we all wear. I can tell you that
the last time– so I’ve been in either hospital
administration or academia since 2006. But in 2006, when it was
my last shift in the ER, we had some people with smart
watch stuff starting right about that time. And we had a gentleman come in. I think it was one of the really
early Kardia AliveCor ones. And he was in
atrial fibrillation, which if you catch that within
a certain period of time, you can shock a person back
into a normal heart rhythm. But if it’s been
too long, you can’t do that, because then you run
the risk of, when you shock the heart, creating a
stroke because of a clot that may have
developed inside of one of the chambers of the heart. And so we were able to tell
when this person went into afib and know that they were
within the safe window to cardiovert them. And that is an example of
how these things can really impact not only care in
the emergency department, but certainly care when you come
see your doctor or your nurse practitioner or your PA at
your typical care points, whether that’s every six months
or every year, every two years, depending on what your
health conditions are. So this kind of
data can really help us intervene in all the life
and all the illness that happens in between
those care points so that we can provide
care between the care. Pardon me. So I show this to
you because it’s an example of why I believe
that this remote monitoring that we’re talking about
doing with the smart homes can be very clinically relevant. This is a virtual hospital
that actually exists in Sioux Falls, South Dakota. It’s in an industrial park
in an industrial building. And this hospital
has employees that are board certified
ER physicians, ICU nurses, pharmacists. And they are monitoring
patients that are in either hospitals
in the urban centers or in rural America and
seeing people’s live heart rates, their cardiac rhythms,
their blood pressures, their lab values, et cetera. And those people
from Sioux Falls might be monitoring
somebody at UMass General. They might be
monitoring somebody at University of
Washington Medical Center. They might be monitoring
somebody in Kenya. So interestingly enough, the
World Health Organization representatives were at this
hospital on this bridge– they call it a bridge– the day before I was
there, because they’re really interested
in looking at this as a model for how to
provide modern care in third-world countries. So the idea that we can remote
monitor people is very real, and there’s a lot of people
very interested in it. This is monitoring from
hospital to hospital. In this case, some of the people
that are boots on the ground are ICU nurses at UMass. Some of the people are boots
on the ground paramedics– not nurses, not doctors,
trained paramedics. Because that’s the
only thing that exists in their community
in rural America. And these people with the
education and the board certification are backing
them up and helping them know what to do. So we know that remote
monitoring works, and we know that our government,
as well as other governments across the planet, are
interested in this model of care. That’s an example of hospital. But you can see, based
on all the things that we’ve talked
about, that certainly, environmental and wearable
sensor monitoring– and we didn’t even
talk about biosensors, because I have a whole
other lecture on that. Implantables. How that can really impact how
we provide convenient, timely, and effective care. So I think we
transition here, right? We were told to talk a little
bit about women in STEM. DIANE COOK: Yeah,
so, women in STEM. This is WiSTEM week, right? And I’m asked often what it’s
like to be a woman engineer. I have to say, I
know many stories of others who have gone through
a lot of difficult times being in the minority
in engineering, but that’s not my story. I’ve been supported by
everybody around me, and I have thoroughly enjoyed
being a woman and an engineer. I think though one
point I can make is that I did not start
out being an engineer. It’s not that I was against it. It just wasn’t on my radar. I was going to be a lawyer. And my dad was a professor in
a small liberal arts school, and he took me to meet
with the law profs. And they said, if you
want to go to law school, don’t major in psychology
or political science. Major in a science and/or math. So I became a math major. And as a math major,
we were required to take a computer
science class. And I took it over the summer,
and it was so much fun. It was a way to be creative
and get the computer to do your bidding. And I never realized
how enjoyable it could be to be an
engineer, and you could still be creative. And so I immediately switched
into computer science. And the problem was that by
the time I graduated, I only had four computer
science classes, so I went on to
grad school simply because I didn’t know
enough to get a job. And in grad school, then,
I got exposed to research, and I have kept going, because
this job is like getting paid to do my hobby. So I would say that, for me,
being a woman and an engineer has not been a trial. It’s really been just
an exposure to an area I hadn’t originally considered,
but from which I would never go back, because
it’s a lot of fun. What has it been like for
you working with engineers? SHELLEY FRITZ: Yeah, it’s
a really unique place to be for nurses, because– actually, I was trying to
think of other nurses who were working in this field who
were published in the field, and I know three besides myself. That’s what I’ve
encountered since 2012, when I started working with you. I think it was 2012 when I
entered the PhD program here at WSU. Graduated in 2015 and
immediately was hired on that fall by the university. It’s been a really
unique place to be. I think it’s been more of
a gentle transition for me, because I’m working with
an all-woman team, which is super cool. Diane’s really proud of that. DIANE COOK: And super unusual. SHELLEY FRITZ: Yeah, it’s
pretty unusual in the field. Like to have that computer
journal article, feature article, by all women
was a proud moment, I guess we could say. DIANE COOK: Yeah. SHELLEY FRITZ: Yeah. Yeah. So for me, the exposure
in nursing to technology and being one of the
people who are constantly the trainer on the
smart equipment, I guess I had probably
more exposure with IT and engineering along the way. And I definitely learned–
well, two things I’ve learned. Well, one, communicating
with engineers is unique. And the second thing I
learned is, basically, I’ve got to learn how to
communicate with engineers. They think differently. They communicate– I don’t think
they would want to communicate. If they never had to talk to
anyone, if they could just text/email people, that’s
what they would do. So I think those
years of working in the hospital with the IT– and as the electronic
health record emerged, this field of informatics
emerged, which was a new field in health care. And some nurses moved into that. And some people who work
in informatics and health– which is considered a pretty
high-tech area within health care– are not nurses at all. They’re trained now in
informatics and data science. So I think the other
way that I learned how to communicate
with engineers was I’ve been married
to one for 33 years. Successfully, which is
a feat in and of itself. DIANE COOK: It is. It is. Well, when we designed
this talk with Maureen, we had, like, 15
minutes for Q&A, and I think we talked so
much that we have none. But I have been told
to include this QR code and ask you to complete
the survey for WiSTEM week. And do we have
time for questions, or do we need to be done? KAREN WEATHERMON: Are
there any questions to ask? SHELLEY FRITZ: Any questions
about the smart tech or about the careers? DIANE COOK: All right, well,
we will be around afterward– SHELLEY FRITZ: We will be. DIANE COOK: –if you do. KAREN WEATHERMON: And again, if
you need Common Reading credit, make sure you come see me. We’ll [INAUDIBLE]. There will be that other short
survey that we email to you. So you have two
surveys happening. This is WiSTEM, and then if
you want Common Reading credit, there’ll be a second short
survey emailed to you. DIANE COOK: Thanks for coming. SHELLEY FRITZ: Thank you. KAREN WEATHERMON:
Thank you so much. [MUSIC PLAYING]
