
The first sensors we’re made aware of in life are our bodies. They’re highly articulated instruments, as they can feel, interpret, and respond to stimuli almost simultaneously. As children, we’re exposed to sensors in a doctor’s office, having our blood pressure taken and wondering why the scratchy black fabric of the monitor gets so tight we can feel our hearts beating in our arms. We get older and go through security and experience metal detectors. The concept of technology measuring the invisible is something we accept at a fairly young age.

But where we get skittish is when sensors begin to track us. We don’t mind if air quality is measured for negative emissions, or if thermal patterns are tracked to portend weather patterns. But when things get personal, we flinch at first. It’s a bit like scraping your knee and seeing blood for the first time; it feels like something that’s supposed to stay inside of you has come out. Data has an unexpected intimacy upon revelation. It also cries out for comparison and action. You measure your resting heart rate knowing you’ll gauge the resulting number against an elevated reading during exercise.

And what about Hacking H(app)iness? Can we use technology to identify and predict emotion? Would we want to if we could?

Affective computing is a multidisciplinary field deriving from the eponymous paper[2] written by Rosalind W. Picard, director of Affective Computing Research at the MIT Media Lab. Picard’s work has defined the idea of measuring physical response to quantify emotion. While it’s easy to focus on the creepy factor of sensors or machines trying to measure our emotions, it’s helpful to see some of the ways this type of technology can improve, and is already improving, people’s lives.

In the New York Times article “But How Do You Really Feel? Someday the Computer May Know,” Karen Weintraub describes a prototype technology focused on autism, created by Picard and a colleague, that helped people with Asperger’s syndrome better deal with conversation in social settings. The technology featured a pair of glasses outfitted with a tiny traffic light that flashed yellow and red, alerting the wearer to visual cues they couldn’t recognize due to Asperger’s (things like yawning that indicate the person you’re speaking to is not interested in what you have to say).[3]

It’s easy to imagine this type of technology being created for Google Glass. The famous American psychologist Paul Ekman classified six emotions that are universally expressed by humans around the globe: anger, surprise, disgust, happiness, fear, and sadness. Measuring these cues via facial recognition technology could become commonplace within a decade. Cross-referencing GPS data with measurement of these emotions could be highly illuminating—what physical location has the biggest digital footprint of fear? Should more police be made available in that area?

Picard’s boredom-based technology would also certainly be useful in the workplace. Forget sensitivity workshops; get people trained on this type of platform, where a colleague looking away while you’re speaking triggers a big text message on the inside of your glasses that reads, “Move on, sport.” Acting on these cues would also increase your reputation, with time stamps noting when you helped someone increase their productivity by getting back to work versus waxing rhapsodic about the latest episode of Downton Abbey.

“The Aztec Project: Providing Assistive Technology for People with Dementia and Their Carers in Croydon” is a report from 2006 documenting sensor-based health solutions for dementia and Alzheimer’s patients in South London. The report starts off with the harrowing statistic that “there are currently some twenty-four million [Alzheimer’s disease] sufferers, a number that will double every twenty years until 2040.”[4] Counting patients’ families, the affected population is far larger still, and the financial burden on all parties involved places significant stress on health-care costs.

My grandmother had Alzheimer’s, so I identify with the scenarios described in the report. Wandering is a standard behavior, with patients not recognizing they’ve left or entered a room or even their home. Accidents in the kitchen are common, as is forgetting to eat for days on end. The report identified that previous solutions, including things like locking doors to keep patients from wandering or having them wear bulky lanyards outfitted with alarms, were ineffective. When lucid, patients felt trapped or tagged and resented feeling so scrutinized in their own homes.

Even in 2006, advances in technology provided solutions that brought great comfort to patients, their families, and caregivers. For instance, instead of having a patient wear an arm or leg band outfitted with a sensor that detected when they went beyond the radius of their home or property (a practice associated with criminals and upsetting to patients), an early form of geo-fencing technology was used instead, sending a warning text to caregivers when patients crossed a virtual perimeter around their property. Sensors were also placed on doors and acted as simple alarms when patients left their houses unattended.
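
The geo-fencing piece of this is conceptually simple: compare the patient’s current position with a virtual perimeter around the home and notify a caregiver only when that line is crossed. The sketch below is a minimal illustration of the idea; the home coordinates, radius, and notify_caregiver() stub are hypothetical assumptions of mine, not the Aztec Project’s actual system.

    # Minimal sketch of a geo-fence check. The coordinates, radius, and
    # notify_caregiver() stub are hypothetical, not the Aztec Project's system.
    from math import radians, sin, cos, asin, sqrt

    HOME = (51.3762, -0.0982)  # example latitude/longitude near Croydon
    RADIUS_METERS = 150        # the virtual perimeter around the home

    def distance_meters(a, b):
        """Great-circle (haversine) distance between two (lat, lon) points."""
        lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
        h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371000 * asin(sqrt(h))

    def check_position(position, notify_caregiver):
        """Send a warning text only when the patient crosses the perimeter."""
        if distance_meters(HOME, position) > RADIUS_METERS:
            notify_caregiver("Patient has crossed the home perimeter.")

    # A reading just over 150 meters north of home triggers the alert.
    check_position((51.3776, -0.0982), notify_caregiver=print)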

A more recent implementation of sensors to help treat Alzheimer’s patients is taking place in Greece via a technology in development called Symbiosis.[5] Pioneered by a team from the Aristotle University of Thessaloniki’s Department of Electrical and Computer Engineering, Symbiosis has a number of components to help patients and their families and caregivers. SymbioEyes incorporates the automatic taking of photographs via a mobile app also outfitted with GPS tracking and emergency detection capabilities. The app is worn by patients as a way to monitor location, and the pictures are also viewed at the end of the day as a way to inspire memory retention and curb the onset of dementia. SymbioSpace utilizes augmented reality to create digital content that reminds patients of simple behaviors. For instance, the image of a plate triggers a text reminding patients how to eat with a spoon. Along with the pragmatic benefits of these reminders, they are designed to make a patient feel they are “surrounded by a helpful environment that provides feedback and seems to interact with him/her, responding to his/her needs for continuous reminding and memory refreshing.”

Kat Houghton is cofounder and research director for Ilumivu, a “robust, patient-centered software platform designed to capture rich, multimodal behavioral data streams through user engagement” (according to their website). I asked Kat her definition of affective computing and why sensors are so central to her work:

Affective computing is the attempt to use systems and devices (including sensors) to identify, quantify, monitor, and possibly simulate states of human affect. It is another way in which we humans are attempting to understand our own emotional experiences. Sensors, both wearable and embedded in the environment, combined with ubiquitous wireless computing devices (e.g., smartphones), offer us a large data set on human behavior that has never before been possible to access.[6]

A great deal of Kat’s work is focused on autism, where wearable sensors are being utilized for preemptive or just-in-time intervention delivery. “Using data from sensors to generate algorithms that allow us to accurately predict a person’s behavior could radically change our ability to facilitate behavior change much more quickly and effectively,” she says. Sensors can also play a role in identifying and preemptively intervening in states of what is known as “dis-regulation”:

When a person with autism (actually all of us to some degree) is in a state of physiological dis-regulation, they are more likely to engage in challenging behaviors (tantrums, self-injury, aggression, property destruction, etc.), which cause a lot of stress to themselves and their caregivers and significantly restrict the kinds of learning opportunities available to that individual. We are using wearable sensors that monitor autonomic (involuntary) arousal levels in combination with momentary assessments from caregivers and data from sensors in the immediate environment to see if we can identify triggers of dis-regulation. If we can do this, then we are in a position to be able to experiment with providing preemptive interventions to help people with autism maintain a regulated state by providing input before they become dis-regulated. Right now the only option to caregivers is to try to offer support after the fact.[7]

As Kat notes, being able to know ahead of time what is going on for a person with autism could be a significant game changer for many of the more challenged people on the autism spectrum, along with their loved ones and caregivers. Sensors are providing a unique portrait of behavior invisible before these new technologies existed.
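
To give a sense of what “preemptive” could mean in practice, here is a minimal sketch of a just-in-time prompt that fires when autonomic arousal is both well above baseline and still climbing. The baseline, threshold, window, and prompt_caregiver() stub are hypothetical assumptions of mine; Ilumivu’s actual algorithms are undoubtedly more sophisticated.

    # Minimal sketch of a just-in-time prompt on rising autonomic arousal.
    # Baseline, threshold, and prompt_caregiver() are hypothetical, not
    # Ilumivu's actual algorithm.

    def check_arousal(samples, baseline, prompt_caregiver,
                      threshold=1.5, window=3):
        """Prompt a preemptive intervention when the last few arousal samples
        (e.g., skin conductance) are well above baseline and still rising."""
        recent = samples[-window:]
        elevated = all(s > baseline * threshold for s in recent)
        rising = all(later > earlier for earlier, later in zip(recent, recent[1:]))
        if elevated and rising:
            prompt_caregiver("Arousal is elevated and rising; offer support now.")

    # Example: skin-conductance readings (microsiemens) climbing past a 2.0 baseline.
    check_arousal([2.1, 2.3, 3.2, 3.6, 4.1], baseline=2.0, prompt_caregiver=print)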

To find out more about the idea of interventions involving sensors and the tracking of emotions, I interviewed my friend Mary Czerwinski, research manager of the Visualization and Interaction for Business and Entertainment (VIBE) Research Group at Microsoft.

Can you please describe your most recent work?

For the last three years we have been exploring the feasibility of emotion detection for both reflection and for real-time intervention. In addition, we have been exploring whether or not we can devise policies around the appropriateness and cadence of real-time interventions, depending on personality type and context. Interventions we are exploring include those inspired by cognitive behavioral therapy, positive psychology, and such practices, but also from observations of what people naturally do on the Web for enjoyment anyway.

How would you define “emotion tracking”?

Emotion tracking involves detecting a user’s mood through technologies like wearable sensors, computer cameras, or audio analysis. We can determine a user’s mood state, after collecting some ground truth through self-reports, by analyzing their electrodermal activity (EDA), heart-rate variability (HRV), [and] activity levels, or from analyzing facial and speech gestures. Machine-learning algorithms are used to categorize the signals into probable mood states.
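
To make that pipeline concrete, here is a minimal sketch in the spirit of Mary’s description: physiological features labeled with self-reported moods, and a machine-learning classifier that maps a new reading to a probable mood state. The numbers are synthetic and the feature set is my own simplification, not her group’s model.

    # Minimal sketch of mood classification from physiological features.
    # The data are synthetic and the features simplified; this is an
    # illustration of the approach, not Microsoft's system.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Each row: [mean EDA (microsiemens), HRV as RMSSD (ms), activity (steps/min)].
    features = np.array([
        [2.1, 62.0, 10.0], [2.4, 58.0, 12.0],  # readings self-reported as calm
        [6.8, 28.0, 35.0], [7.2, 25.0, 30.0],  # readings self-reported as stressed
        [3.0, 55.0, 80.0], [3.3, 52.0, 75.0],  # readings self-reported as happy
    ])
    labels = ["calm", "calm", "stressed", "stressed", "happy", "happy"]

    # Train on the self-reported "ground truth," then categorize a new
    # reading into a probable mood state.
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(features, labels)

    new_reading = [[5.9, 31.0, 20.0]]  # high EDA, low HRV, moderate activity
    print(model.predict(new_reading)[0])  # most likely: "stressed"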

How has emotion tracking evolved in the past few years, in general and in your work?

Because of the recent advent of inexpensive (relatively speaking) wearable sensors, we have gotten much better at detecting mood accurately and in real time. Also, the affective computing community has come up with very sophisticated algorithms for detecting key features, like smiling or stress in the voice, through audio and video signal analysis.

Do you think emotion tracking will have its own “singularity”? Meaning, will emotion tracking become so articulated, advanced, and nuanced that technology could get to know us better than we know ourselves?

That’s a very interesting question and one I’ve been thinking about. What is certainly clear is that most of us don’t think about our own emotional states that often, and perhaps aren’t as clued into our own stress or anxiety levels as we sometimes should be. The mere mention of a machine being able to sense one’s mood pretty accurately makes some people very uncomfortable. That is why we focus so much on the hard human-computer research questions around what the technology should be used for that is useful and appropriate, given the context one is in.[8]


Along with trying to demonstrate how technology is helping map and quantify our emotions, my bigger goal is to give you permission for reflection. As Mary points out, many of us don’t think about our emotional states, which makes them harder to change or improve. Note there’s a huge difference between observing emotions and experiencing them. Observation implies objectivity, whereas in the moment it’s pretty hard to note, “Gosh, I’m in a blind rage right now.” So while it may take us some time to get used to tracking our emotions and understanding how sensors reveal what we’re feeling, a bigger adjustment needs to happen in our lives outside of technology for the greatest impact to take place.[9]

Sensor-tivity

“We’re really in the connection business.” Iggy Fanlo is cofounder and CEO of Lively, a platform that provides seniors living at home with a way to seamlessly monitor their health through sensors that track health-related behavior. “Globalization has torn families apart who now live hundreds of miles from one another,” he noted in an interview for Hacking H(app)iness. “Our goal is to connect generations affected by this trend.”[10]

After years of study, Fanlo came to realize that older people care more about why they’re getting out of bed than how. So he focused on finding the emotional connections that would empower seniors while bringing a sense of peace to the “sandwich parents” (people with kids who are also dealing with elderly parents) concerned about their parents’ health. Sadly, a primary reason for friction in these child-to-parent caregiver situations is that kids tend to badger parents to make sure they’re taking their medications or following other normal daily routines. The nagging drives the parents to resent their kids and potentially avoid calling, even if they have a health-related incident that needs attention.

The company’s tech is surprisingly simple to use, although its Internet of Things sensor interior is state-of-the-art. The system contains a hub, a white orb-shaped device the size of a small toaster. It plugs into the wall and works over a cellular connection, as many seniors don’t have Wi-Fi or don’t know how to reboot a router if it goes down. A series of sensors, each about two inches in diameter, has adhesive backing so they can be stuck in strategic locations around the house (a simple sketch of this kind of rule-based monitoring follows the list):

  • Pillboxes—sensors have accelerometers that know when the pills are picked up, serving as a proxy for parents having taken their meds.
  • Refrigerator door—a sensor knows when the door is opened and closed, serving as a proxy for parents eating regular meals.
  • Silverware drawer—a sensor knows when the drawer opens and closes, serving as a secondary proxy for the number of meals eaten.
  • Back of the phone receiver—a sensor knows if the phone hasn’t been lifted, and after a few days a message is sent to the adult child of the elderly person living at home, reminding them to give their parent a call.
  • Key fob—this features geo-fencing technology and indicates if a parent has left the house in the past few days.
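
The logic behind sensors like these amounts to a handful of “quiet for too long” rules. A minimal sketch follows; the sensors, timestamps, thresholds, and alert function are hypothetical assumptions of mine, meant to illustrate the idea rather than Lively’s actual implementation.

    # Minimal sketch of proxy-behavior monitoring: alert when a sensor has
    # been quiet longer than its allowed window. All values are hypothetical,
    # not Lively's actual rules.
    from datetime import datetime, timedelta

    last_event = {                              # most recent event per sensor
        "pillbox": datetime(2014, 3, 10, 8, 30),
        "fridge": datetime(2014, 3, 12, 12, 5),
        "phone": datetime(2014, 3, 8, 19, 0),
        "key_fob": datetime(2014, 3, 11, 10, 15),
    }

    quiet_limits = {                            # how long silence is acceptable
        "pillbox": timedelta(days=1),   # meds likely missed
        "fridge": timedelta(days=1),    # no meals prepared
        "phone": timedelta(days=3),     # time to remind the kids to call
        "key_fob": timedelta(days=4),   # parent hasn't left the house
    }

    def check_sensors(now, alert):
        for sensor, last_seen in last_event.items():
            if now - last_seen > quiet_limits[sensor]:
                alert(f"No activity from {sensor} since {last_seen:%b %d, %H:%M}.")

    # Checking on the evening of March 12 flags the pillbox and the phone.
    check_sensors(datetime(2014, 3, 12, 20, 0), alert=print)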