Robots: the new consumer truth serum
Honesty is the currency of human interaction, and those caught bending the truth live in fear of punishment, both criminal and social. Yet anyone who has sat waiting for a friend or partner outside a changing room knows that telling the truth is far from straightforward. As the curtain twitches, you brace yourself for the hardest question known to humanity: “So, what do you think?”
But it would seem we are not always trapped by our social conventions. A growing number of conversations contain nothing but the stark, unfiltered truth: an honesty of unheard-of purity, where the answer to “what do you think?” is truthfully “you look like a hot mess”. It seems we don’t lie to our robots.
The human inability (perhaps unwillingness is a better word) to tell the truth is closely linked to how we form social groups. Degrees of truth are tightly controlled and allocated based on trust and closeness to the person we are interacting with. Ask yourself, are you ever 100% open, even with your closest confidant?
Let’s be honest: we are all guilty of the odd white lie, commonly told to grease the wheels of interaction or to spare our own or someone else’s blushes. Yet this inability to tell the truth, whether driven by shame, embarrassment or good intentions, is a significant challenge for brands and the medical professions, for whom the only way in is what we are willing to let out.
Trusting the tech
Simplicity of interaction is key to robots being afforded the highest level of truth clearance. But it’s not that our robots are our new best friends; unlike their fleshy counterparts, they hold no social value and carry no social risk. The trust we place in Siri, Alexa or any of the million chatbots filling our lives exists because they simply don’t matter, letting us escape the most significant risk of being truthful: judgement and its social consequences.
An early example of our willingness to open up to robots is Eliza, a very basic chatbot created by Joseph Weizenbaum at MIT in the mid-1960s. Eliza simulated a conversation by matching simple patterns in the user’s inputs and responding with open questions. Weizenbaum’s secretary, one of its first users, famously asked him to leave the room, stating that the conversation she was having was private. Transcripts later revealed a level of honesty that even psychologists rarely encounter, and users reported the freedom brought about by a perceived lack of judgement and an assumption of secrecy. They were unaware the transcripts would later be read by Weizenbaum.
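Eliza’s trick was mechanical: match a keyword pattern in the input, reflect first-person words back as second-person, and slot the fragment into an open question. Here is a minimal Python sketch of that idea; the patterns and responses are invented for illustration and are not Weizenbaum’s original script.

```python
import random
import re

# Pronoun reflections applied to the captured fragment before it is
# echoed back ("I feel sad about my job" -> "sad about your job").
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "i'm": "you're", "mine": "yours",
}

# Keyword patterns paired with open-question templates; {0} is filled
# with the reflected fragment the pattern captured. The final catch-all
# keeps the conversation going when nothing matches.
RULES = [
    (re.compile(r"i feel (.*)", re.I), ["Why do you feel {0}?",
                                        "How long have you felt {0}?"]),
    (re.compile(r"i am (.*)", re.I),   ["Why do you say you are {0}?"]),
    (re.compile(r"my (.*)", re.I),     ["Tell me more about your {0}."]),
    (re.compile(r".*", re.I),          ["Please, go on.",
                                        "What does that suggest to you?"]),
]

def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(user_input: str) -> str:
    """Return the first matching rule's template, filled with the
    reflected fragment, so the reply mirrors the user's own words."""
    for pattern, templates in RULES:
        match = pattern.match(user_input.strip())
        if match:
            fragment = reflect(match.group(1)) if match.groups() else ""
            return random.choice(templates).format(fragment)

if __name__ == "__main__":
    # e.g. "Why do you feel anxious about your health?"
    print(respond("I feel anxious about my health"))
```

There is no understanding here at all, which is precisely the point: the machine cannot judge what it cannot comprehend, and users sensed that.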
Roll forward to 2017 and a new app, Replika, created by Eugenia Kuyda to help fill the void left by the death of her closest friend. Although infinitely more complex than Eliza, it still trades on the same feelings of trust and freedom from judgement. And unlike Eliza, which lived on a single machine, Replika travels in our pockets and has amassed 2.5 million users, all developing deep and meaningful relationships based on unfiltered truth never to be shared… we hope.
Accurate data delivery
Gaining access to valid data is the Holy Grail for any brand wanting to better understand its customers, but there is perhaps a more exciting area of our lives that will benefit, one long overdue for technological disruption: medicine.
So what does this mean for brands and for medicine?
Attitudes regarding the role of chatbots, interactive voice response and voice technology need to change. Instead of focusing on trying to make the technology as humanlike as possible, we need to recognise and embrace some elements of robotic interaction, namely the lack of judgement and our resulting willingness to be truthful.
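To make that principle concrete, here is a hypothetical sketch of what embracing the robotic might look like in a health-intake chatbot: every answer receives the same flat acknowledgement, so there is no reaction to read judgement into. The questions and wording are invented for illustration, not drawn from any real product.

```python
# A hypothetical sketch of a deliberately non-humanlike intake bot.
# Design principle from the text: never react to the content of an
# answer, so the user has no approval or disapproval to perform for.

INTAKE_QUESTIONS = [
    "How many units of alcohol do you drink in a typical week?",
    "How many cigarettes do you smoke in a typical day?",
    "Is there anything about your health worrying you at the moment?",
]

NEUTRAL_ACK = "Noted."  # never "That's a lot" or "Well done"

def run_intake() -> dict:
    """Ask each question in turn and record the raw, unjudged answers."""
    answers = {}
    for question in INTAKE_QUESTIONS:
        answers[question] = input(question + " ")
        print(NEUTRAL_ACK)
    return answers

if __name__ == "__main__":
    responses = run_intake()
```

The deliberate flatness is the feature: a human interviewer could never keep their face that still.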
Let’s move from the changing room to the waiting room. You are called into the doctor’s office and sit opposite another real-life human who looks you in the eye and asks, “How can I help you?” What follows is another bizarre dance, in which a bottle of wine becomes a glass and you spontaneously forget that you have ever smoked more than very occasionally.
The role of honesty when it comes to our health cannot be overstated and matters far more than social truth. Our reticence carries several significant medical consequences. Firstly, we take far longer than we should to report our problems and, when we do pluck up the courage, we still give a poor account of the details. The frightening result is that this costs lives: numerous reports, covering everything from cancer to mental health, cite a lack of early diagnosis as a primary cause of preventable deaths.
The second consequence for medicine relates to diagnosis itself and, more pointedly, to the reported symptoms attributed to various ailments and diseases. How we feel, what feels right or wrong or strange or unusual, will always be a profoundly personal matter, and it is not always easy to articulate. Any way to stimulate a more open, frank and regular conversation about how we feel should be a priority. The resulting data will give us further insight into the causes, and potential cures, of the things that get in the way of us living our best lives.
The trust we place in technology is growing as it becomes ubiquitous and critical to our everyday lives. This familiarity already gives the brands and medical institutions that recognise its value an opportunity to engage with their users on a whole new level.
A final point to consider is that these interactions go both ways. Anyone comfortable and willing to disclose their deepest truths is likely to build a relationship with a brand or service that goes well beyond even that of our most committed brand advocates.
Irrespective of the motivations, the prospect of asking a customer or patient a question and getting a truthful answer is an exciting one. And if technology can help us get there, and it can, brands will need to invest time and money to benefit from the invaluable insight truthful answers can give them.