Patients can feel isolated from their loved ones while in the hospital. Nausea, pain, and reduced mobility can all impair their ability to use mobile technology. To address this, we created Convey, a phone dock that helps patients communicate with their loved ones through a form that is easy to touch and a voice-controlled interface for making and receiving calls.
UX Design, Prototyping
Catherine Jou, Chris Chung, Amy Roberts, Xiao Yan
This project started off as an investigation into designing a technological solution to help in the event of a pandemic, such as an Ebola outbreak in a rural community.
To account for all the stages and touchpoints of an emergency health visit, we created a matrix and visualization (shown above) of the patient journey and the patient's emotional state at each step of the process.
Our investigation was centered around a case study of a pandemic occurring in the rural town of Walla Walla, Washington. In order to design with the background and motivation of key stakeholder populations in mind, we identified three representative personas: agricultural workers, students, and the elderly. We were able to use these personas to gain empathy for our target population.
To gain deeper insight into the challenges of preparing for a pandemic outbreak in a rural town, we talked to Dr. Phil Green, a physician who leads the Ebola Outbreak Response Team in Walla Walla, Washington.
Some problems identified include:
- Panic at the onset of a pandemic outbreak, causing misinformation and bottlenecks.
- Lack of awareness and preparedness; people are unaware that pandemic response guidelines differ from state to state.
- Doctors need up-to-date training as well as an emergency response system that can be rapidly put into place.
- Patients may feel isolated and scared in the ICU; they need a channel of communication with their loved ones.
- Scalability of response given the limited resources and geographic isolation of a small town.
Initial concepts included a doctor training status database, amber-alert-style notifications, virtual reality training, a pandemic response wiki, a home diagnosis blood test, and a bedside flower with an LCD for patient communication.
We evaluated our ideas according to positive value criteria (scalability, feasibility, and affordability) as well as user value criteria (ease of use, effectiveness, and user satisfaction). From these results, we found that bedside patient communication rated the highest on average for both these criteria sets.
Our final concept was a mobile phone docked within a protective snowglobe cover programmed to respond to voice and touch inputs.
- Medical staff pre-enters a list of important contacts for the patient.
- Snowglobe is placed beside the bed and activated by a simple touch.
- Touch commands to wake the system up or pick up an incoming call.
- Glow from the snowglobe creates a warm and calming ambiance.
- Protective barrier allows technology to be kept sanitary and reused.
While our phone dock idea was initially proposed for a patient with Ebola, we saw that it could apply to a broader range of situations, such as aiding communication for people with mobility impairments or providing a barrier for less severe contagious illnesses. We wanted to create a testable prototype to see whether our idea was a valid approach to this space.
We designed our prototype to test whether touch input and voice commands would be effective and desirable for users. Although a future goal would be to test with actual patients, in this iteration we focused on determining whether our prototype could be used to make and receive calls and on evaluating our users' reactions to the experience. Our design goals for the prototype were to make a system that would be accessible, emotionally comforting, and hygienic.
When we decided to build a prototype of our proposed smartphone dock, we chose to focus on its look and feel, using 3D printing so that users could physically touch our design. We first developed the concept as a 3D model in SolidWorks, which allowed us to export the model as an .stl file for MakerBot Desktop, the tool we used to print it.
During the process, we learned of the many intricacies that 3D printing entails.
- When we proceeded to print the model, we found that its dimensions exceeded the build plate of the Flashforge Pro 3D printers that were available. To remedy this, we split the model into five parts (two for the base, three for the cover).
- Other issues we worked through included the print bed being uneven at first, which kept the PLA from sticking, and the filament getting clogged.
Despite the roadblocks, our final result turned out well. Because we split the parts into long, thin, even slices, each layer was flat and parallel to the build surface and easy for the printer to lay down. The total print time was 40 hours, even after printing at lower quality (higher layer height and lower infill) to reduce print time, which means it would be difficult to scale up production of this product using this method.
A possible alternative method for building the prototype is slipcasting, which could be scaled easily, though the visual design of the model would not be the same. We also specifically chose 3D printing because it can produce translucent models that allow light from LEDs to pass through. This was imperative to the overall finish of our prototype, as LED components inside provided ambient visual feedback through the translucent plastic.
We wanted to create a user interface that was easily visible and simple enough to use for someone in a hospital bed, who might have an illness or motor impairment. The interface would complement both voice and touch interactions, making things easier for patients of varying ability who might not be able to communicate easily. We created task flows simulating an outgoing call and an incoming call. In our design, we used large text and high contrast to provide clarity, along with simple diagrams, large enough to view through the screen, that help the user understand how the touch interactions work.
Because there was no straightforward way to control an iPhone screen from the computer, we used an app called Skala to emulate a phone call by mirroring a screen to the iPhone from Adobe Photoshop. For the voice interface, although we discussed solutions involving Arduino and playing pre-recorded sounds, we decided to simply place a second phone inside the housing with a call already in progress. This allowed us to respond to the user’s interactions during behavioral testing and provide appropriate sounds in real time.
I took charge of the electronics component of the prototype, building a responsive system of cues to alert the user to changes in system status. The system was built on an Arduino Uno platform with additional hardware components.
We started by deciding on a sensor to detect when a user had touched and activated the device. We explored several possibilities, including proximity sensors, touch sensors, infrared sensors, and light sensors. In the end, we chose the light sensor (an LDR), as it provided the most unobtrusive method of detection on the device's end. Implementing a proximity, infrared, or touch sensor would have required cutting a hole at the top of the device, which would detract from the prototype's form. By avoiding this step, we stayed true to our original goal of keeping the phone completely quarantined within a full encasing.
However, we faced several obstacles getting the light sensor working in all situations, since lighting conditions can vary. We resolved this by taking several light readings when the device starts up, and averaging them to determine a baseline and threshold.
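The calibration idea can be sketched as follows, as a Python simulation of the Arduino routine; the function names, sample count, and margin value are illustrative, not the exact values we used.

```python
def calibrate(read_ldr, samples=10, margin=0.8):
    """Average several light readings at startup to set a baseline,
    then derive a touch threshold as a fraction of that baseline.
    Covering the shell blocks ambient light, so a reading below the
    threshold counts as a touch."""
    readings = [read_ldr() for _ in range(samples)]
    baseline = sum(readings) / len(readings)
    threshold = baseline * margin  # margin is an illustrative tuning value
    return baseline, threshold

def is_touched(reading, threshold):
    # A hand over the shell darkens the sensor, dropping the LDR reading
    return reading < threshold
```

Because the baseline is sampled in the room where the device actually sits, the same sketch adapts to a bright ward or a dim one without recompiling.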
Once the sensor was optimized, we worked on the LED and audio feedback sequence. We originally planned to use only simple LEDs, but ran into issues with flickering as the LEDs transitioned between states, since the light sensor readings fluctuate and the LEDs use PWM to adjust brightness. We switched to a BlinkM LED and an Adafruit NeoPixel LED strip, which were brighter and did not have this issue. The NeoPixel strip also gave us fine-tuned control over each of its 17 LEDs, so we could easily specify color change effects. After soldering the pin header of the NeoPixel strip, we wrapped it around the inner base of the prototype.
Our system cycled through six different stages:
- Stage 1: Hand touches shell to wake up device
- Stage 2: Hand is removed and device is awake
- Stage 3: Hand touches shell to initiate call
- Stage 4: Hand removed from shell and call initiated
- Stage 5: Hand touches the shell to hang up
- Stage 6: Hand removed from shell and call terminated
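The six stages above advance on touch edges: each press or release of the shell moves the device one step forward. A minimal Python sketch of that cycle (the stage labels and class name are illustrative; the real logic lived in our Arduino sketch):

```python
class ConveyStateMachine:
    """Simulates the six-stage touch cycle: alternating touches and
    releases walk the device from wake, to call, to hang up."""

    STAGES = [
        "stage 1: touch to wake",
        "stage 2: awake",
        "stage 3: touch to initiate call",
        "stage 4: call initiated",
        "stage 5: touch to hang up",
        "stage 6: call terminated",
    ]

    def __init__(self):
        self.index = -1       # -1 means asleep, before stage 1
        self.touching = False

    def update(self, touched):
        """Advance one stage on each touch edge (press or release)."""
        if touched != self.touching:
            self.touching = touched
            # Wrap back to stage 1 after stage 6, returning to sleep
            self.index = (self.index + 1) % len(self.STAGES)
            print(self.STAGES[self.index])  # serial-monitor-style debug log
        return self.STAGES[self.index] if self.index >= 0 else "asleep"
```

Driving `update()` with the sequence touch, release, touch, release, touch, release walks through all six stages in order.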
Each of these stages was printed out to the serial monitor upon completion of the stage, which enabled us to keep a clear understanding of what stage the Arduino was in, for debugging purposes. Additionally, we could tell which stage the device was currently in by the different LED feedback that was programmed into the device.
- Stages 1+2: the BlinkM LED and NeoPixel strip glow blue.
- Stages 3+4: the buzzer plays a melody, the BlinkM LED glows purple, and the NeoPixel strip pulsates purple after some lights chase each other around the circumference of the base twice.
- Stages 5+6: the buzzer plays two short tones, the strip runs one light chase, and then all the LEDs turn off, resetting the device to its original sleeping state.
Lastly, we experimented with incorporating a microphone amp sensor to detect sound levels during the in-call stage of the sequence, and output voltage to the BlinkM LED based on the volume of sound detected. This would provide an additional piece of visual feedback from the system to the user.
However, we ran into complications such as varying baselines as well as the encasing muffling the sound. In the end we decided that this functionality was nonessential to the goal of our device, and set this exploration aside to iterate on in the future.
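The mapping we experimented with can be sketched in Python (the function name, ranges, and linear scaling are illustrative of the approach, not the exact Arduino code):

```python
def volume_to_brightness(level, baseline, max_level, max_brightness=255):
    """Map a microphone amp reading above the ambient baseline to an
    LED brightness value, clamped to [0, max_brightness]. Readings at
    or below the baseline leave the LED dark."""
    if max_level <= baseline:
        raise ValueError("max_level must exceed baseline")
    span = max_level - baseline
    scaled = (level - baseline) / span * max_brightness
    # Clamp so noisy readings outside the expected range stay valid
    return max(0, min(max_brightness, int(scaled)))
```

The clamping matters precisely because of the complications we hit: a shifting baseline or a muffled signal would otherwise push the output outside the LED's valid brightness range.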
Throughout our prototyping process, we conducted informal tests with others to gauge the effectiveness of our prototype. The insights we gained by watching people interact with it gave us direction for our next iteration. People liked the glowing lights in our first iteration but wanted a stronger visual cue, so we added an LED lighting strip. They also expressed interest in having a ringtone play during an incoming call, so we added that as well.
We evaluated our final prototype by conducting three behavioral user tests with people who were unfamiliar with our system. After giving users a brief overview of how Convey works, we gave them instructions for making a call. As the user interacted with the phone dock, we had someone sitting behind a laptop changing the screens accordingly. We then asked each person a short series of questions to learn what they liked and didn't like about the experience. Because we were not able to test with actual patients, we opened with questions about their previous hospital experiences to better frame the interview.
When asked about their overall experience interacting with the prototype, we received generally positive feedback.
- Many users commented positively on the form of the device, such as the curved aesthetic and the visual LED feedback.
- One suggestion was to eliminate the touch-to-confirm step when initiating a call; some users felt this step was unnecessary given the voice control.
- Users viewed the hospital experience as negative when forced to stay for an extended period, due to isolation from their normal social lives. One possibility for improvement was making the experience more personal.
- Users felt that the Convey prototype was a great way of making a hospital room more personal, without having to make too many changes. The light and sound feedback was a welcome change compared to the normally dull hospital room.
This project began as an exploration into designing an artifact that would help tackle problem spaces in a pandemic situation. The result of this exploration was the design concept of Convey, a piece of technology with accessibility features built in to help a patient stay connected with their loved ones during recovery in a hospital.
As an outcome, our team was able to build a testable prototype that met our goal of creating an accessible, emotionally comforting, and hygienic form of patient communication. Through our user testing, we found that the form was ergonomic to touch and that the simplicity of the user interface, paired with the voice and touch commands, made the prototype easy to use.
The next steps would be to conduct more in-depth testing to specifically address each of our design goals: accessibility, emotional comfort, and hygiene. Testing users of varying abilities (motor impairments, weak grip, soft voice, blindness, or another disability) would help us tweak the features to make our prototype easier to use. A diary study would allow us to track and gauge the emotional comfort the device provides over a period of time. While further testing would reveal clearer design directions, some ideas for future work involve investigating whether UV lights could be used to sanitize the phone dock, integrating a video chat option for a more realistic experience, integrating real-time voice interactions, and adding features for patient entertainment.