Personal sensing data could help monitor and alleviate stress among resident physicians, although privacy concerns over who sees the information and for what purposes must be addressed, according to collaborative research from Cornell Tech.
The study was published in the Proceedings of the ACM on Human-Computer Interaction.
Burnout in all types of workplaces is on the rise in the US, where the “Great Resignation” and “quiet quitting” have entered the lexicon in recent years. This is especially true in the healthcare industry, which has been strained beyond measure by the COVID-19 pandemic.
Stress is physical as well as mental, and evidence of stress can be measured through the use of smartphones, wearables and personal computers. But data collection and analysis – and the larger questions of who should have access to that information, and for what purpose – raise myriad sociotechnical questions.
“We’ve looked at whether we can measure stress in workplaces using these types of devices, but do these individuals actually want this kind of system? That was the motivation for us to talk to those actual workers,” said Daniel Adler, co-lead author with fellow doctoral student Emily Tseng.
Adler and Tseng worked with senior author Tanzeem Choudhury, the Roger and Joelle Burnell Professor in Integrated Health and Technology at the Jacobs Technion-Cornell Institute at Cornell Tech. Contributors came from the Zucker School of Medicine at Hofstra/Northwell Health and Zucker Hillside Hospital.
The resident physician’s work environment differs from the traditional workplace in that it is also an apprenticeship: the supervisor, the attending physician, doubles as a mentor. That can blur the lines between oversight and support.
“That’s a new context,” Tseng said. “We don’t really know what the actual boundaries are there, or what it looks like when you introduce these new technologies, either. So you need to try and decide what those norms might be to determine whether this information flow is appropriate in the first place.”
Choudhury and her group examined these issues in a study of resident physicians at an urban hospital in New York City. In hourlong Zoom interviews, residents and their attendings were shown mockups of a Resident Wellbeing Tracker, a dashboard with behavioural data on residents’ sleep, activity and time working; self-reported data on residents’ levels of burnout; and a text box where residents could characterize their well-being.
Tseng said the residents were open to the idea of using technology to enhance well-being. “They were also very interested in the privacy question,” she said, “and how we could use technologies like this to achieve those positive ends while still balancing privacy concerns.”
The study featured two intersecting use cases: self-reflection, in which the residents view their behavioural data, and data sharing, in which the same information is shared with their attendings and program directors for purposes of intervention.
Among the key findings: Residents were hesitant to share their data without assurance that supervisors would use it to enhance their well-being. Anonymity was also a concern; it becomes easier to preserve as more residents participate and their data are aggregated. But greater anonymity would limit the programme’s usefulness, since supervisors would not be able to identify which residents were struggling.
“This process of sharing personal data is somewhat complicated,” Adler said. “There is a lot of interesting continuing work that we’re involved in that looks at this question of privacy, and how you present yourself through your data in more-traditional mental health care settings. It’s not as simple as, ‘They’re my doctor, therefore I’m comfortable sharing this data.’”
The authors conclude by referring to the “urgent need for further work establishing new norms around data-driven workplace well-being management solutions that better centre workers’ needs, and provide protections for the workers they intend to support.”
The research was supported by grants from the National Institute of Mental Health, the National Science Foundation and the Digital Life Initiative at Cornell Tech.