Assessing Interventions on Crowdsourcing Platforms to Nudge Patients for Engagement Behaviors in Primary Care Settings: Randomized Controlled Trial

J Med Internet Res. 2023 Jul 13;25:e41431. doi: 10.2196/41431.

Abstract

Background: Engaging patients in health behaviors is critical for better outcomes, yet many patient partnership behaviors are not widely adopted. Behavioral economics-based interventions offer potential solutions, but it is challenging to assess the time and cost needed for different options. Crowdsourcing platforms can efficiently and rapidly assess the efficacy of such interventions, but it is unclear whether web-based participants respond to simulated incentives in the same way they would respond to actual incentives.

Objective: The goals of this study were (1) to assess the feasibility of using crowdsourced surveys to evaluate behavioral economics interventions for patient partnership by examining whether web-based participants responded to simulated incentives in the same way they would have responded to actual incentives, and (2) to assess the impact of 2 behavioral economics-based intervention designs, psychological rewards and loss framing, on simulated medication reconciliation behavior in a simulated primary care setting.

Methods: We conducted a randomized controlled trial with a between-subject design on a crowdsourcing platform (Amazon Mechanical Turk) to evaluate the effectiveness of behavioral interventions designed to improve medication reconciliation during primary care visits. The study included a control group representing participants' baseline behavior and 3 simulated interventions: monetary compensation, a status effect as a psychological reward, and a loss frame as a modification of the status effect. Participants' willingness to bring medicines to a primary care visit was measured on a 5-point Likert scale. A reverse-coded question was included to ensure response intentionality.

Results: A total of 569 study participants were recruited: 132 in the baseline group, 187 in the monetary compensation group, 149 in the psychological reward group, and 101 in the loss frame group. All 3 nudge interventions significantly increased participants' willingness to bring medicines compared with the baseline scenario. Monetary compensation increased willingness by 17.51% (P<.001), the status-based psychological reward increased it by 11.85% (P<.001), and the loss frame applied to the psychological reward increased it by 24.35% (P<.001). Responses to the reverse-coded question were consistent with the willingness questions.

Conclusions: In primary care, bringing medications to office visits is a frequently advocated patient partnership behavior that is nonetheless not widely adopted. Crowdsourcing platforms such as Amazon Mechanical Turk allow researchers to efficiently and rapidly reach large groups of individuals to assess the efficacy of behavioral interventions. We found that crowdsourced survey-based experiments with simulated incentives can produce valid simulated behavioral responses. A psychological status-based reward design, particularly with a loss framing approach, can effectively enhance patient engagement in primary care. These results support the use of crowdsourcing platforms to augment and complement traditional approaches to studying behavioral economics for patient engagement.

Keywords: Amazon Mechanical Turk; MTurk; behavioral interventions; crowdsourcing; medication safety; patient engagement; primary care.

Publication types

  • Randomized Controlled Trial
  • Research Support, U.S. Gov't, P.H.S.

MeSH terms

  • Behavior Therapy
  • Crowdsourcing* / methods
  • Humans
  • Motivation*
  • Patient Participation*
  • Primary Health Care
  • Surveys and Questionnaires