Moodstone with Google

An emotional object beyond the smartphone

Consider a weekend visit to the beach. As the sun sinks below the horizon, a companion folds up the blanket and scrapes the sand from between their toes. You take a last stroll along the shoreline, looking for a little souvenir to bring back to the city.

You find a stone. Smooth, slightly flat, unremarkable in color, it fits just so in the palm of your hand. When you make a fist, it disappears completely. You slip it into your pocket, and head home.

Over the coming hours, days, months or years, you find yourself reaching for it, holding it, and tracing its contours. You draw comfort from it somehow, even though it was never designed for such a purpose. A very simple and primitive connection forms between you and this object. But affect only travels in one direction — you feel the object, but it can’t let you know that it feels you back.

Objects you may find

Whatever this object is — perhaps it’s a twig, a coin or a button — imagine for a moment that feeling could travel in both directions and that it could connect with us in an empathetic way. Every time it was touched, it could record our emotional state, and later share that record back to us — a record of emotions over a day, a month or a lifetime.

At the beginning of 2013 we began to design this object. Aware of the dangers of tasking a tool with this kind of responsibility, we set out to make one which was calm, familiar and modest. This is the story of our design process, our experience of trying to quantify experiences, and our questions about the type of relationship we want to have with objects.

Our experiences were so unexpected, and the questions we were forced to ask ourselves along the way became so complex, that we decided to share our design challenges here.

The Brief

Our investigation began with Tea Uglow, creative director at Google’s Creative Lab in Sydney, Australia. She had just finished an experiment with London product designers Berg in creating “lamps that see,” and her curiosity had shifted from sight to touch. Her brief to us read:

Touch is an expressive form. Smartphone and tablet apps rarely make use of this. Emotion tracking is generally unemotional and data-led. How can we allow users to express themselves through touch to create a diary (or data-set) of their emotion and then represent that to them in a way that is either profound (i.e. art) or helpful (i.e. a tool) or timely (i.e. a diary)?

We wanted to know what a successful outcome would look like, and Tea replied:

Inspiring. …and it would be nice if the outcome was useful to an organization working in mental health.
Drawing of Judith Moskowitz
Judith Moskowitz

Tea connected us with Judith Moskowitz, then a professor at the Department of Medicine at the University of California, San Francisco, researching chronic stress. She saw a convergence in our goals. Prof. Moskowitz uses qualitative and quantitative methods to study how people can invoke positive emotions under stressful conditions, such as being newly diagnosed with HIV. What she needed for this kind of study was a tool which her participants could use to record their emotional state: a self-reporting tool which could be used discreetly and easily in private and public spaces. She wrote to us: “I would love to use a device to get a behavioural signal of an emotion so you can understand how they [participants] are feeling without having to ask them.”

The Value in Noticing

Recording emotion is difficult. As Prof. Moskowitz said to us, mental health research already uses “skin conductance, heart rate, facial expression, brain activity” to derive emotion. But this kind of information is only assumed to reflect real emotion. These biometric methods are still paired with and vetted against the gold standard in mental health research: self-reporting, where a person documents their emotional state as it’s felt, or shortly after.

“We give people the opportunity to record, quantify and check in on their own experiences,” said Michael Cohn, one of Prof. Moskowitz’s research partners, over the phone from his lab at the University of California, San Francisco. “We talk to them about the value in noticing and valuing positive experiences. They do that every day in a journal format.”

Cohn and his peers are continually testing a variety of self-reporting tools, each with its own set of limitations. In Prof. Moskowitz’s studies, “frequent testing, embedded in participants’ real lives” is required to sufficiently identify and analyze triggers and responses. Some researchers use daily questionnaires administered via paper or simple web forms. Cohn has been using simple text messages to gather spontaneous emotional assessments from his participants. Others have built self-reporting apps into smartphones or tablets.

Drawing of the conspicuous tablet

In our survey of these tools, the smartphone seemed the most logical delivery system, but after tapping our way through a few existing emotion logging apps, something felt off. There were too many steps involved (unlocking the phone, closing the last app, opening the right app, etc.) and there were too many distractions within the phone (our friends, our work, our vices, etc.). “Their phone buzzes, they pull it out, they have to unlock it, if we text them, they have to open up the text message, they have to read it, they have to respond in whatever way,” said Cohn, who is also concerned about the steps smartphones introduce. “The fact that it’s such a general purpose tool means that it kind of sucks still.”

Tablets pose other problems. In 2009 a team of Dutch researchers presented a paper titled “Emotion measurement platform for daily life situations” at the Affective Computing and Intelligent Interaction and Workshops conference (ACII 2009) in Amsterdam, where they explained some of the other drawbacks to using a tablet for self-reporting emotion. In their study, participants found the size of the tablet and its keypad inconvenient. But perhaps more troubling was the discomfort they felt if they took the tablet out in public and the people around them were “informed about the fact that they experienced an emotion.” Screen-enabled touch devices were bringing the wrong kind of friction to our simple task of expression.1

Drawing of the compressed space of the smartphone

The Space Behind the Glass

The smartphone isn’t the first tireless recorder of our lives, but no other device has augmented the daily experience of so many, so deeply, so quickly. There are well over a billion smartphones in use, and by some estimates 50% of the world’s 4.55 billion mobile phone users will use a smartphone by 2017.2

Over its short history, the smartphone has become a canvas of near infinite dimensions, the broadest expressive medium we have invented for ourselves. Its glass surface, combined with a palm full of sensors and networked services, is infinitely rewritable — shape-shifting into a notebook, a library, a gathering place. This mutability has triggered a massive conversion of tools from hardware to software, all stuffed into the space behind the smartphone’s glass.

The convergence of tools behind smartphone glass has lightened our loads, but also led to unexpected side-effects. Smartphones invite a relentless chain of mode shifting — from killing zombies to joining meetings to breaking up — that’s drawing us toward a cognitive load beyond anything we experienced with the personal computer. Meanwhile, trends in interface design have stripped software of most remaining references to its physical predecessors, leaving us to negotiate these modes via a stripped-down shorthand of nouns, verbs and symbols floating on featureless color fields.

Designers are again searching for new forms, interactive objects and service layers, which can be meaningfully adopted into our daily lives. Devices like Google Glass and the Pebble Smartwatch reveal — as much in their naming as in their form — a value assertion of familiar materiality as they struggle to break out of the smartphone.

This was the first major challenge we faced. To impart a quiet clarity in this intimate act of recording emotional state, we needed to extract it from the smartphone’s dense convergence of emotional contexts – love, belonging, responsibility, anxiety and pleasure – to pull it out from behind the glass.

Drawing device forms

We experimented with notches to orient the object in your hand without looking at it.

We decided that the function of the object we were designing demanded a discrete physical form: a compact, screenless, networked object, capable of generating a simple data set. Superficially (like the smartphone) it would be a touch device, but we wanted something which could be touched in a different way.

“If you want people to use it consistently and to feed awareness,” said Cohn, “it needs to have a smooth, immediate feel, and be something which people don’t have other associations with, that doesn’t interrupt what they’re doing to record on it. One of the downsides of any questionnaire is that in order to answer them, you have to interrupt the situation that the person is in.”

An illustration of the power of touch

Sense Making

Touch is deeply connected to emotional well-being and affect (the physiological experience of emotion). Touch is unique among the human senses in the richness of input and output capabilities it affords us. Through our bodies we are capable of communicating a vast set of information with a smudge, knead, tickle, tap, caress, twist, or grasp, and we can receive information through position, shape, texture, heat and motion.

As a touched object, the smartphone affords us only a narrow subset of these input/output capabilities. As an input device it understands time, distance, rhythm, direction and velocity. As output it communicates through warmth and the crude grunt of its internal vibrations. We could do better.

We envisioned people using the device to bookmark everyday moments of emotional punctuation, each bookmark consisting of only a time and an emotional state. Over hours, weeks or years, the accumulated bookmarks could be sliced across timelines and geographies which might correlate with personal patterns and environmental stimuli. At its best, this tool would heighten sensing in order to trigger sense-making.
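To make the bookmark idea concrete, here is a minimal sketch of the kind of record and timeline slicing we had in mind. The field names, the signed -5 to +5 valence scale and the hour-of-day grouping are illustrative assumptions, not a specification of the prototype.

    from collections import defaultdict
    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional, Tuple


    @dataclass
    class Bookmark:
        """One emotional bookmark: when it was felt, how strongly, and (optionally) where."""
        timestamp: datetime
        valence: int                                    # signed intensity, e.g. -5 (negative) to +5 (positive)
        location: Optional[Tuple[float, float]] = None  # (latitude, longitude), if available


    def average_valence_by_hour(bookmarks):
        """Group bookmarks by hour of day, one simple slice for spotting daily patterns."""
        buckets = defaultdict(list)
        for b in bookmarks:
            buckets[b.timestamp.hour].append(b.valence)
        return {hour: sum(vs) / len(vs) for hour, vs in buckets.items()}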

Explorations in form

Forgotten Objects

Our first prototype was a silent, somewhat bulky 3D-printed plastic enclosure. It was inspired by the angular handles of traditional Japanese kitchen tools like fish knives, highlighted by contemporary designers like Kenya Hara for their uniquely open ergonomics — use is not prescribed, but lightly suggested. Inside the enclosure were a GPS chip for location, a motor for vibration, storage for a week of data, and a battery large enough for a day of use.

Emotional states were recorded along a positive/negative axis. Touch the object one way and it would record a positive emotion; touch it the opposite way and it would record a negative emotion. The intensity of the touch represented the strength of the emotion. We chose to break the complex and contested world of emotional categories down into such a simple dichotomy as a means of reducing friction.

To our horror, these early prototypes often found their way to the bottom of a handbag or were abandoned at home during casual lending trials. Some testers compared the object’s appearance to a weapon or a sex toy. While we were confident that shrinking the object and improving its aesthetic would improve adoption somewhat, users reported being unsure of what kind of moments were “worth” recording. Relying on the user not only for the emotional self-evaluation but also for its timing created a layer of anxiety.

While our object could fairly be classified as an artifact of the “quantified self,” as a self-reporting tool, we did not have the luxury of the “turn on, check back later” interaction model enjoyed by the many personal wellness tools that currently fill the genre. Our first prototype gave us the painful realization that our object must be remembered in order to remain useful.

As we weighed strategies for increasing use, we grew concerned that the design characteristics required by Prof. Moskowitz’s research — data consistency, volume and fidelity — might run counter to the user experience needs of her research participants, such as calm, familiarity and modesty. A lean too far toward the former could result in device-related anxiety, while a lean toward the latter could mean participants didn’t feel compelled to self-report frequently, rendering the data unusably sparse.

Sketches of the device imagined as a coin, a pencil, and a watch

We identified two closely related types of anxiety, familiar to anyone with a smartphone, which we would need to address in our design: first, anxiety from the addictive stream of beeps, vibrations and alerts, and second, separation anxiety. In our early design phase, we explored ways of reducing both. We considered disposable, tag-like sensors that could be placed anywhere on the body or in the environment in order to obviate the anxiety of a single, precious device, but could not find an implementation that didn’t require a larger host device. We also considered disguising the device as a pencil or coin, but concluded that the benefits of familiarity were outweighed by the increased risk of misplacement.

Discussing swipe gestures

We had a suspicion that the final form and materials would help make the device less likely to be forgotten and concluded that natural objects — wood, stones and shells, the type of things washed up on a beach or found in a forest — were more likely to be remembered and engaged with, without aggressive prompting. Users might become invested in such an object. We agreed with what Jonathan Chapman wrote in 2005, “objects that evolve slowly over time build up layers of narrative by reflecting traces of the user’s invested care.”3

We worked under the assumption that an ideal object would remain quiet until the user chose to interact with it, but began to accept the likely reality that some user prompt, at least in the short term, might be a necessary concession to test any of our other hypotheses.

Meaningful Friction

Mitsuru Takizawa at work

In December of 2013, we began building our second prototype, coalescing all our aesthetic ideas, research, and testing into a single object. The result was a step closer to a stone washed up on a beach; hand-carved from a small block of Japanese cherry wood, it rests comfortably on the inside of the hand. A row of five thin, silver protrusions sits along the top surface of the object (several testers later described it as a “cat’s paw”).

Emotions were recorded by simply swiping a thumb back and forth. Brushing a thumb left-to-right across the protrusions feels pleasantly smooth and records a positive emotion. Brushing right-to-left, the protrusions dig slightly against the skin in resistance, recording a negative emotion. Brushing more of the protrusions records a progressively stronger emotion, on a scale of one to five. We chose this negative/positive scale for a number of reasons. A research paper from 2007 titled “The sensual evaluation instrument: Developing a trans-cultural self-report measure of affect” highlights the difficult relationship between language and emotion. “Language is wonderful for summarizing and categorizing and processing one’s emotions after an experience,” write the researchers, “but might sometimes be a clumsy tool for communicating affect in the fleeting moment of interaction, particularly if one is feeling a jumble of half-resolved emotions-in-progress.”4 By reducing the emotional assessment to a single tactile scale, we reduced latency and increased frequency, allowing use of the device without looking at it, or even needing to remove it from a pocket.
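For those curious how such a swipe becomes a number: a minimal sketch, assuming the firmware simply notes the swipe direction and counts how many of the five protrusions the thumb crossed. The function name and the signed encoding are our illustration, not the prototype’s actual firmware.

    def encode_swipe(direction: str, protrusions_crossed: int) -> int:
        """Map a thumb swipe to a signed emotional value.

        direction: "ltr" for left-to-right (positive) or "rtl" for right-to-left (negative).
        protrusions_crossed: how many of the five protrusions were brushed (1 to 5),
        read as the strength of the emotion.
        """
        strength = max(1, min(5, protrusions_crossed))
        return strength if direction == "ltr" else -strength

    # encode_swipe("ltr", 3) -> 3 (moderately positive); encode_swipe("rtl", 5) -> -5 (strongly negative)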

The final object

In March, we conducted a simple series of trials using the second prototype.

During testing, the prototype remained tethered to an external enclosure housing its electronic components, limiting its mobility. To ensure we were available to provide technical support for the still fragile assembly, we designated our Tokyo design studio as the primary testing environment, though once we began tests, our participants were also invited to take the object home overnight.

We recruited five participants from our studio for a quick round of trials: two women and three men from their mid-twenties to early forties, four Japanese and one Russian, ranging in occupation from engineering to sales. We asked each participant to log their emotional state every 10 minutes over two eight-hour work days, prompted by a vibration alarm on their smartphone. Each log would appear immediately as a numerical value in a simple text document on their own computer.
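As a side note, that “numerical value in a simple text document” might look something like the sketch below; the timestamped line format and file name are assumptions on our part, included only to show how lightweight the logging side can be.

    from datetime import datetime

    def append_log(value: int, path: str = "emotion_log.txt") -> None:
        """Append one self-report as a timestamped line, e.g. '2014-03-12T14:10:05  +3'."""
        line = f"{datetime.now().isoformat(timespec='seconds')}  {value:+d}\n"
        with open(path, "a", encoding="utf-8") as f:
            f.write(line)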

After the second day, we interviewed the participants. As we conducted each interview, the successes and failures of our object revealed themselves in unexpected ways.

Photo of our test user during an interview

Kono Hito

In just two days, our test users formed a personal attachment to the form of the object beyond anything we could have imagined. Several users wanted to bring it home after the test, and were looking forward to how its color would change as the oils from their hands rubbed into the wood surface. Others described the unnamed object as kono hito, “this person” in Japanese.

Reactions to the method of interaction were more mixed. Several users described the tactile feedback of the protruding sensors as “natural” and “seamless,” while also expressing uncertainty that data had really been recorded. Some users became numbed or overwhelmed by the prompts, and felt that they were too frequent. Some felt that the input was too reductive. One commented “I would have liked to input a single word each time, which would give the data more meaning when I reflected upon it later.”

The biggest surprise, however, was not related to form or interaction, but to reflection and insight. We had initially hypothesized that users would wait until a day or a few days of data had accumulated before attempting to view their collected responses on a computer, reflect upon their emotional experiences or draw any conclusions. In our trials, however, users reflected upon their state, drew conclusions, and in some cases adapted their behavior immediately after each log. Several reported responding to dips in their emotional state by “eating sweets,” “taking a short break” or “making tea.” One user said using the object was cathartic: a way to offload emotions.

Emotional Objects

Our formative trials are insufficient to draw conclusions about the long-term value of this object as a tool for improving emotional health, but speaking with Prof. Moskowitz’s research colleague made us wonder about applications. “I definitely think it would produce valuable data,” he assured us. “Most studies that prompt people frequently are prompting them five or six times a day. If you can get them 50 or 100 times a day, that would be really interesting. I don’t know if that’s something anyone has tried before.”

Our goals with the object were originally aligned with clinical research, but our observations revealed a potential applicability among a much wider population. Perhaps a new class of objects, protocols, software and services purpose-built for raising emotional awareness is just a few years away. Such a stack of technologies might provide a platform for individual and community change – an entire ecosystem of analytical activities, self-serving and guided programs, leading to individual decisions as well as public policy and investment in mental health services.

Whether the vitality of such an ecosystem will be expressed in public policy, personal bests or unit sales, and who will define its data models, interoperability standards and user experience conventions, is yet to be seen. Digital tools for emotional well-being can learn much from the successes and failures of their physical health counterparts, but they will ultimately need to adhere to their own principles of user experience, because the data is quantitatively different, and because people’s awareness of and ability to interpret their own emotional health is highly subjective.

A year has passed since we first read Tea Uglow’s proposal and began imagining an empathetic object, one which could “allow users to express themselves through touch to create a diary (or data-set) of their emotion and then represent that to them in a way that is either profound (i.e. art) or helpful (i.e. a tool) or timely (i.e. a diary),” as Tea wrote. Though we began from a starting point of utility in mental health research, our final prototype left us wondering about the possibility of a new type of object.

We imagine something like a smooth stone, which never needs charging, which quietly reminds you of its presence, and with which you can store simple emotions by moving a thumb or finger across its surface. It’s a device with applications far beyond individuals with chronic stress, potentially capable of opening up new ways of measuring and raising our emotional intelligence and general health. As one participant in our trial reflected, “emotion is the entry point for health.” We imagine the art of self-reporting migrating from the laboratory into an empty space in your pocket — somewhere between your smartphone and that stone you found, washed up on the beach.

A trail of moments

Credits

  • Design: AQ with Jan Rod
  • Prototype production: Mitsuru Takizawa and Jan Rod
  • Essay: Chris Palmieri with Jan Rod

Many thanks to Tea Uglow, Judith Moskowitz and Michael Cohn for their guidance, encouragement and inspiration.

Footnotes

1. Westerink, Joyce, et al. “Emotion measurement platform for daily life situations.” 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops (ACII 2009), Amsterdam, 2009.

2. eMarketer

3. Chapman, Jonathan. Emotionally Durable Design: Objects, Experiences and Empathy, 2005.

4. Isbister, Katherine, Kia Höök, Jarmo Laaksolahti and Michael Sharp. “The sensual evaluation instrument: Developing a trans-cultural self-report measure of affect.” International Journal of Human-Computer Studies 65.4 (April 2007): 315-328.