Where is the Fitbit for mental health?
If technology can improve lives, there's no better application than personal wellness. You've probably heard some variation of the saying that improvement requires measurement, and automated measurement makes finding actionable insights much, much easier.
Imagine: wouldn't it be cool if a device could tell you exactly how much water to drink at any given moment for peak hydration, or if an implant could tell you how your sobriety would change before you consumed a given amount of a mind-altering substance?
That future isn't too far away. But remember: physical health is only one part of overall wellness.
So far, most wearables and fitness apps have been firmly grounded in the physical: the "shell" rather than the "ghost." There's been unfortunately little innovation in mental health technology.
Let's break down why.
The limitations of mental health understanding
According to Mental Health America:
A mental illness is a disease that causes mild to severe disturbances in thought and/or behavior, resulting in an inability to cope with life’s ordinary demands and routines.
Symptoms may include changes in mood, personality, personal habits and/or social withdrawal... Mental illnesses may be caused by a reaction to environmental stresses, genetic factors, biochemical imbalances, or a combination of these.
This definition is problematic. To establish "disturbances" in thought and/or behavior, doctors first have to establish what "normal" thought and behavior looks like, not to mention what counts as "life's ordinary demands and routines." Never forget that misapplied mental illness diagnoses have a long history of marginalizing perfectly healthy communities. Consider this: penis envy, homosexuality, and "moral insanity" were all once considered mental illnesses, worthy of hospitalization and disruptive "treatment."
Well aware of psychiatry's problematic history, psychiatrists have moved to a checklist-style approach to diagnosis.
Consider the evolving definition of depression according to the DSM (repurposed from Wired):
| DSM edition | Definition of depression |
|---|---|
| DSM-I (1952) | Depressive reaction: Classified as a psychoneurotic disorder characterized by anxiety. |
| DSM-II (1968) | Depressive neurosis: No longer considered a form of anxiety; now explained as a reaction to internal conflict or the loss of a beloved object or person. |
| DSM-III (1980; revised 1987) | Major depression: Now a category of disorder. An exception is created for bereavement following the loss of a loved one, which is called a “normal reaction.” |
| DSM-IV (1994; revised 2000) | Major depressive episode: The bereavement exception is limited: only if a griever’s symptoms last less than two months are they considered normal. |
| DSM-5 (2013) | Major depressive episode: The bereavement exception is removed, since “evidence does not support” distinguishing grief from other “stressors.” |
The consistent evolution of mental health diagnoses undermines psychiatry as a whole; for a practice to have any kind of authority, a doctor must be able to prove that he or she can accurately diagnose why a patient is suffering. You wouldn't see a cardiologist who isn't sure what a regular heartbeat sounds like, and the same applies to patients seeking psychological treatment. As one psychiatrist put it fifty years ago, “there is a terrible sense of shame among psychiatrists, always wanting to show that our diagnoses are as good as the scientific ones used in real medicine.” That same sentiment holds true today.
But that doesn't mean that mental illness doesn't exist. David L. Rosenhan, author of On Being Sane in Insane Places (PDF), explains:
To raise questions regarding normality and abnormality is in no way to question the fact that some behaviors are deviant or odd. Murder is deviant. So, too, are hallucinations. Nor does raising such questions deny the existence of the personal anguish that is often associated with “mental illness.” Anxiety and depression exist. Psychological suffering exists. But normality and abnormality, sanity and insanity, and the diagnoses that flow from them may be less substantive than many believe them to be.
Doctors and researchers know that mental illness exists, but without reliable labels they struggle to define or monitor it.
What does this have to do with mental health technology?
Suppose we could come up with a comprehensive understanding of mental illness, or at least that the psychological community agreed the DSM is "good enough." There doesn't need to be definitional perfection to start treatment.
That still doesn't address the checklist method. There are no quantitative tests for ADHD, PTSD, or depression, just symptom monitoring, and there is no available technology that can help diagnose or monitor symptoms without relying on self-reporting.
And therein lies the next problem. Mental health tech, as it exists now, needs end-user involvement. There's no "set and forget" way to monitor your emotions, thoughts, behaviors, and coping mechanisms. Apps like Breathe to Relax, Lantern, and Workguru, while popular and clinically effective, all come with a caveat about user reliability and engagement.
That's problematic when the user base is likely to be unreliable to begin with, precisely because of the disorders these apps are meant to address.
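To see why self-reporting is such a fragile foundation, here's a minimal, purely hypothetical sketch of a daily check-in in Python. The names and scoring are assumptions for illustration only; the point is that every data point depends on the user responding, and a skipped prompt simply leaves a gap, because there's no passive fallback.

```python
from datetime import date
from typing import Dict, Optional

# Hypothetical daily check-in: every data point depends on the user answering.
mood_log: Dict[date, Optional[int]] = {}  # date -> self-reported mood (1-10), or None if skipped

def record_checkin(today: date, response: Optional[str]) -> None:
    """Store a self-reported mood score, or a gap if the user ignored the prompt."""
    if response is None or not response.strip():
        mood_log[today] = None  # no passive fallback exists; the day is simply missing
    else:
        mood_log[today] = max(1, min(10, int(response)))

# Simulate a week where the user only answers some of the prompts.
for day, answer in enumerate(["7", None, "4", None, None, "6", "5"], start=1):
    record_checkin(date(2018, 6, day), answer)

coverage = sum(v is not None for v in mood_log.values()) / len(mood_log)
print(f"Days with usable data: {coverage:.0%}")  # ~57% in this toy example
```

Every missed prompt degrades the data, and the app has no way to fill the hole on its own.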
Physical health technology has already permeated the consumer market, quantifying health indicators like daily steps, heart rate, and blood oxygen levels. These devices don't require much manual effort: a Fitbit straps onto your wrist and records your heart rate and daily steps, the K'Watch monitors glucose levels without blood tests, and the Ava bracelet provides health insights tied to the menstrual cycle.
Equivalent mental health technology, even at the enterprise or hospital grade, simply doesn't exist yet.
There are a few exceptions, of course, often under the umbrella of clinical neuroscience. For example, recent neuroimaging research suggests that combining findings from SPECT, PET, and fMRI studies may predict whether a patient has depression and whether that patient will respond to SSRI antidepressants. Those kinds of insights are great for quantitative diagnostics, but they still have a long way to go before they're useful at the daily consumer level.
That said, there are certainly a few companies just now breaking into the space, largely in the field of stress monitoring and management. Products like Thync, WellBe, and Spire are all great first steps for the future of mental health technology.
So is there really no hope for deeper daily mental health monitoring?
Between developments in the Internet of Things, smart homes, citizen data science, and potential artificial intelligence applications to mental health care, I see a ten- to twenty-year waiting period before daily mental health technology can really take off. Users would install monitoring apps into their environments to create massive data sets, which business intelligence applications would then interpret. That kind of technology, along with clinical neuroscience, needs to advance far beyond where it is now.
(I should add the caveat that HIPAA requirements may have to shift for the tech to be truly viable.)
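To make the idea concrete, here's a rough, purely hypothetical sketch of the kind of interpretation layer such a pipeline might offer: passive signals (sleep, movement, social contact) gathered ambiently by a smart home or phone, compared against the user's own baseline. Every signal name, threshold, and structure here is an assumption for illustration; none of it reflects an existing product or clinical standard.

```python
import statistics
from typing import Dict, List

# Hypothetical passive signals a smart home or phone might collect without user effort.
# Field names and thresholds are illustrative assumptions, not real product APIs.
DailySignals = Dict[str, float]  # e.g. {"sleep_hours": 6.5, "steps": 4200.0, "social_minutes": 35.0}

def deviation_flags(history: List[DailySignals], today: DailySignals,
                    z_threshold: float = 2.0) -> List[str]:
    """Flag any metric that drifts more than z_threshold standard deviations
    from the user's own baseline -- the kind of interpretation a business
    intelligence layer could run on top of ambient monitoring data."""
    flags = []
    for metric, value in today.items():
        baseline = [day[metric] for day in history if metric in day]
        if len(baseline) < 7:
            continue  # not enough history to say anything meaningful
        mean = statistics.mean(baseline)
        stdev = statistics.pstdev(baseline) or 1e-9  # avoid dividing by zero
        if abs(value - mean) / stdev > z_threshold:
            flags.append(metric)
    return flags

# Two weeks of fairly stable days, followed by one markedly different day.
history = [{"sleep_hours": 7.0 + 0.1 * (i % 3),
            "steps": 7500.0 + 200 * (i % 5),
            "social_minutes": 80.0 + 5 * (i % 4)} for i in range(14)]
print(deviation_flags(history, {"sleep_hours": 4.0, "steps": 1500.0, "social_minutes": 5.0}))
# -> ['sleep_hours', 'steps', 'social_minutes']
```

The value of this approach is that the baseline is personal rather than population-wide, but the hard part, which signals actually matter clinically, is exactly what we don't yet know.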
That said, it's important to return to the original problem with mental health, especially when introducing machine learning. Because people don't fully understand mental illness, AI trained on our definitions will necessarily be limited (e.g., fifty years ago it would likely have been programmed to treat homosexuality as a diagnosis). And if AI produces insights that humans aren't comfortable with, it won't be taken seriously.
Like this post?
If you're interested in more future tech discussions, please follow me at @RachelBurger and upvote to let me know you liked this article!
Do you use physical health trackers, or invest in other quantified self technology? Do stress monitors deserve more applause? Were there parts of this article that resonated and inspired action—or made you roll your eyes?
I look forward to hearing and responding to your reactions in the comments.
Interesting post. Isn't part of the problem of why there are no quantitative tests or clear diagnoses of mental health problems the fact that we still are far from a complete understanding of how the brain, mind, and consciousness work?
I would really like to make a substantive reply to this, but I am a little lost for words on how to go about it, and out of time 'cause I have an interview on MSPwaves radio coming up.
yay for me
:)
But I did enjoy your article and the thought processes you have sparked.
If I can think how to say what I would like to say, I will come back.
big hugs