Is artificial intelligence the answer to some of Canada’s mental health obstacles?


Until recently, artificial intelligence (AI) was a largely misunderstood concept in popular culture. What began as futuristic sci-fi fantasy has evolved into real-world applications, such as ads pushed by social media apps or curated playlists on Spotify. Current research, however, has found uses for it in the mental health sector, especially around depression, anxiety, and post-traumatic stress disorder.

Backlogs in mental health care in recent years have heightened the need for therapy that is accessible, affordable, and manageable, while still allowing the medical system to treat people in a timely fashion. These are some of the problems researchers hope AI can help solve when used in tandem with traditional therapy.

When asked what that might look like, Kelly Parke, a professor and artificial intelligence design architect at York, says that many people are already using AI for their physical and mental health.

“We are working with wearable technology (see Fitbit) to understand stress and sleep patterns. By monitoring a person’s bio readings in a passive way (wearing a watch sensor) a person can monitor their own stress levels and see a change over time. We can also measure heart rate variability and blood oxygen levels for a deeper understanding of how a person reacts to stress. So the treatments can vary from person to person but the key is being able to measure stress over time. That measurement will lead to a greater understanding for the individual.”

Aside from wearable technology, smartphones and other digital devices have proven helpful, particularly apps that feature chatbots or communication links to registered therapists. Simon D’Alfonso’s 2020 study of AI uses within mental health highlights both the pros and cons of app and chatbot therapy.

“While the arrival of an AI agent capable of replicating a human therapist is not on the near horizon, if at all, more advanced AI agents are able to simulate a modest conversation employing therapeutic techniques. While not intended to replace the human therapist, such therapeutic chatbots can provide their own form of interaction with users. They can be available at any time to communicate, can be used by individuals who experience stigma or discomfort with seeing a therapist, and can be accessed by those with limited access to traditional mental health services.”

D’Alfonso touches on a key issue surrounding access to mental health care: the need to keep therapy and treatment private. This is a primary concern within the field, as Parke also comments that “most of the ethical questions centre around keeping data private to the individual. Controlling how the data is used is critical.”

While York is researching the application of AI across the health sector, it is also researching its governance and privacy. Partnerships with other institutions, such as MIT and Harvard, will enable researchers to develop more sophisticated and nuanced care that is safe and, hopefully, more accessible.

About the Author

By Jeanette Williams

Photo/Video Editor

Jeanette is in her third year at York University, double majoring in Film and English, with a keen interest in science and technology. She loves to write and aspires to be a showrunner, a lead writer for a TV series, or a documentary filmmaker. When Jeanette isn’t writing or studying, she is watching documentaries on anything related to politics, the health industry, or true crime.
