Consider User Context
Even as devices cross over between the two settings, a hospital will never be a home.
Emmet Connolly, a designer at Google, believes the most effective apps will follow Google's dictum that computing must fade into the background and “have the right information appear automatically based solely on context.” From a health and medical perspective, this means having a wearable device that can automatically sense whether a user is active or sedentary, whether they are at a certain location such as work, home, or outdoors, and even whether they are in an emergency situation.
Connolly points to an Android Wear running app as an example. Rather than having a user activate an individual app, a context-sensitive wearable could know when you start running, based solely on inputs of time, location, habits, and movement, and launch your tracking app, and even start playing music, automatically.
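The kind of inference Connolly describes can be sketched as a simple rule over a handful of sensor inputs. The snippet below is a minimal illustration, not a real Android Wear API: the field names, thresholds (a 140 steps-per-minute cadence, a morning habit window), and action strings are all hypothetical, chosen only to show how time, location, habit, and movement might combine into a "start running" signal.

```python
from dataclasses import dataclass

@dataclass
class SensorSnapshot:
    hour: int                # local hour of day
    at_usual_run_spot: bool  # e.g. inside a geofence around the user's park
    steps_per_minute: int    # cadence from the accelerometer/pedometer

def infer_running(s: SensorSnapshot) -> bool:
    """Rough heuristic: habitual time + habitual place + running cadence."""
    habitual_time = 6 <= s.hour <= 9          # assumes the user usually runs mornings
    running_cadence = s.steps_per_minute >= 140
    return habitual_time and s.at_usual_run_spot and running_cadence

def on_sensor_update(s: SensorSnapshot) -> list[str]:
    """Actions a context-aware wearable might trigger automatically."""
    if infer_running(s):
        return ["start_run_tracker", "start_music"]
    return []
```

A real system would learn these habits rather than hard-code them, but the shape is the same: fuse cheap signals, then act without asking.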
As sensors for devices grow more robust, Connolly feels this functionality will be easily obtainable. “The real interesting thing happens when we take the combined total of all this sensor data and put it together into one single rich picture of the user's situation.”
“So as developers we can look at this and say 'how can we present the user with useful information that can help them?' ” Connolly said. He again encouraged developers to think about simplicity in delivering this information and stressed the importance of “one simple, clear piece of info showing up at a time – prioritized by importance.” The Android Wear and Google Glass interfaces, for example, will allow users to arrange and rank simple screens so that users can dictate what information is important to them. Again, this becomes important from a medical perspective when thinking about which health metrics to deliver and which would be most helpful for a particular patient.
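The "one piece of info at a time, prioritized by importance" idea can be sketched as a per-patient ranking over a set of metric cards. Again, this is an illustration, not the Android Wear or Glass card API: the card names and the priority map are hypothetical, standing in for whatever health metrics a particular patient cares about most.

```python
def prioritize_cards(cards: list[str], patient_priority: dict[str, int]) -> list[str]:
    """Order cards so the patient's most important metric surfaces first.

    Unranked cards fall to the back (priority 99) rather than erroring out.
    """
    return sorted(cards, key=lambda name: patient_priority.get(name, 99))

def top_card(cards: list[str], patient_priority: dict[str, int]) -> str:
    """The single card a small wearable screen would show first."""
    return prioritize_cards(cards, patient_priority)[0]

# Hypothetical example: a diabetic patient ranks glucose above everything else.
cards = ["steps", "heart_rate", "glucose"]
diabetic = {"glucose": 0, "heart_rate": 1, "steps": 2}
```

The design point is that the ranking lives with the patient, not the app: the same cards reorder themselves for a cardiac patient who puts heart rate first.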