Methods to Recruit Healthcare Providers for Virtual Advisory Boards in Drug Development

Tufts CSDD study addresses barriers in recruitment.

Jennifer Kim

Emily Botto

Background

Healthcare Provider (HCP) and Patient Advisory Boards (PAB) or Community Advisory Boards (CAB) have become common practice1,2 among clinical trial sponsors to evaluate the validity and feasibility of study protocols. The expertise of HCPs is essential to the outcomes of these advisory boards (AdBoards); however, recruitment of HCPs for research studies can be challenging, and not much has been written about how to tackle these challenges. The nature of the patient-facing healthcare profession often means that carving out time for AdBoard participation may be more of a barrier for HCPs than the average patient targeted for such discussions.

The COVID-19 pandemic has further impeded attempts at gathering data from HCPs, including physicians, nurses, and physician assistants. The well-documented burnout effect of the pandemic and the increase in requests for surveys or focus groups investigating COVID-19 experiences3,4 further contribute to survey and videoconference fatigue. Such difficulties in the recruitment of HCPs can hinder successful implementation of AdBoards. Hence, recommendations and strategies to attract more eligible HCPs to participate in these AdBoards are essential to the support of drug development research.

The increasing prominence of the internet, and by extension the ability to host AdBoards, focus groups, and interviews virtually, has reduced geographic and time-related barriers that previously prevented recruitment of a wider, more diverse sample of participants.5 Although research around the transition from in-person to virtual AdBoards began well over a decade ago,6 the uptake in the medical field has been relatively slow and publications around recruitment strategies and methodology for virtual studies have remained noticeably thin.7 The COVID-19 pandemic, however, forced a rapid shift towards virtual interactions, catalyzing a few studies around the pandemic’s effect on qualitative research methods and best practices. According to a 2021 study by MphaR, 20% of advisory boards had already moved online prior to the COVID-19 pandemic, but 2020 saw this number jump to 80%.8 Despite some research emerging on the topic, indicating increasing interest, the literature on best methodologies for conducting virtual qualitative research in medical science remains scant. The shift in how we recruit and incentivize HCP research participants, in particular, requires additional research for adaptive strategies.

Thus, the purpose of this article is to share learnings related to our effort to recruit HCPs to a virtual AdBoard during the pandemic. We outline the challenges to recruitment, the adaptive strategies used in response to these challenges, and suggest recommendations for future studies.

Methods

The aim of this study was to gather a group (n = 4-6) of US-based, patient-facing healthcare providers with experience treating patients with multiple myeloma to provide input on a Phase III clinical trial protocol for an experimental drug. The inclusion criteria were: (1) US-based, patient-facing HCP and (2) currently treating patients with multiple myeloma. Recruitment began following exemption by an independent ethical review board.

Recruitment efforts were conducted virtually and lasted approximately 2 months. A personalized email containing a screener link to determine eligibility was created. This email contained logistical details related to the duration of the AdBoard (two 2-hour videoconference sessions), participants’ responsibilities (i.e., reviewing the study protocol and providing feedback), and compensation (market rate). In the same email, participants were asked to share the study with qualified colleagues who might be interested in participating. Participants’ input was reviewed by the research team to determine their eligibility. Eligible participants were then contacted regarding their availability.

To ensure that we reached a diverse population of HCPs, we used several outreach outlets. During the first outreach effort (Wave 1), we used an existing purchased list of 733 US-based HCPs comprising physicians, nurses, and physician assistants in oncology. Recipients who had not clicked on the link or who had not finished the screener were sent a reminder email one week after the initial email, and a second follow-up two weeks after the first. Individuals who chose to click the “unsubscribe” link, or reached out directly to request removal, were immediately removed from the contact list.
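
A minimal sketch of this follow-up logic in Python; the `Contact` record and its fields are hypothetical illustrations, not part of any mailing platform used in the study:

```python
from dataclasses import dataclass

@dataclass
class Contact:
    """Minimal per-recipient status record (hypothetical fields)."""
    email: str
    completed_screener: bool = False
    unsubscribed: bool = False  # clicked "unsubscribe" or asked to be removed

def reminder_recipients(contacts: list[Contact]) -> list[Contact]:
    # Reminders go only to recipients who have neither finished the
    # screener nor opted out; opt-outs leave the contact list immediately.
    return [c for c in contacts if not c.completed_screener and not c.unsubscribed]

# Example: only b@example.org is still due a reminder.
roster = [
    Contact("a@example.org", completed_screener=True),
    Contact("b@example.org"),
    Contact("c@example.org", unsubscribed=True),
]
print([c.email for c in reminder_recipients(roster)])  # ['b@example.org']
```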

The above recruitment strategy was supplemented with a more targeted strategy (Wave 2a), which involved searching for email addresses of potential participants from publicly available sources, including PubMed, ClinicalTrials.gov, and multiple myeloma research and healthcare institution websites. We considered Wave 2a a pilot run with the intention to expand the strategy depending on the response rate. Our initial search using this method yielded 106 contacts. No other elements of the recruitment strategy were altered for Wave 2a. The following search criteria were used on both PubMed and ClinicalTrials.gov: “multiple myeloma” OR “hematology.” Filters were applied on PubMed to exclude any article published before 2017. We also noted participants’ gender and racial background where possible. This targeted approach resulted in a larger, more diverse pool of eligible participants and an increased enrollment rate, as detailed in the Results section.
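
For readers who prefer a programmatic route, the same search can be expressed against NCBI’s public E-utilities API. The sketch below is a minimal Python example: the query terms and the 2017 cutoff come from the text, while `retmax` and the JSON response format are our own choices.

```python
from datetime import date
import requests

# NCBI E-utilities search endpoint for PubMed (public API).
ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

params = {
    "db": "pubmed",
    "term": '"multiple myeloma" OR "hematology"',  # Wave 2 search criteria
    "datetype": "pdat",                            # filter on publication date
    "mindate": "2017",                             # exclude articles before 2017
    "maxdate": str(date.today().year),
    "retmax": 500,                                 # arbitrary cap on returned IDs
    "retmode": "json",
}
pmids = requests.get(ESEARCH, params=params, timeout=30).json()["esearchresult"]["idlist"]
print(f"Retrieved {len(pmids)} PubMed IDs matching the search criteria")
```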

Following the initial trial run (Wave 2a), we expanded our effort using this method (Wave 2b), which yielded 677 contacts who were sent an email via Qualtrics. In this wave, we made two adjustments: 1) the introduction text was revised to include session dates and was considerably shortened and 2) as opposed to a personalized link, which was used in Wave 1 and Wave 2a, an anonymized link was sent, making it easier for recipients to share the screener with their network. One reminder email was distributed one week after the initial invitation to those who had not unsubscribed from the list.

Lastly, a small number (n = 20) of professional contacts within the research team’s immediate network were also contacted for recruitment.

Participants who qualified for the study were sent an official invitation, consent form, and a Zoom link. A reminder was sent 24 hours prior to the first session to ensure attendance. Overall, two 2-hour AdBoard sessions were held with 5 US-based multiple myeloma healthcare providers. Sessions were recorded and transcribed.

Results

Market list (Wave 1)

The 733 physicians, nurses, and physician assistants on the pre-purchased oncology list were contacted in late October 2021 during the first phase of recruitment. Of these 733 recipients, 20 unsubscribed (2.7% unsubscribe rate) and 51 emails bounced (7.0% bounce rate). An additional 4 recipients reached out to the research team directly to clarify that they did not treat multiple myeloma and were therefore not interested in the study.

Twelve invitees opened the screener link (“clicks”), all of whom completed the screener. Two additional screener responses were recorded from referrals. One screener respondent specified that they were unable to participate due to scheduling constraints. Of the 14 potential participants, 3 were deemed eligible and 2 were enrolled in the study.

Public sources (Waves 2a-b)

Wave 2a was sent to 106 contacts collected from publicly available sources, resulting in 2 clicks and 1 screener completion. The individual who completed the screener was deemed eligible and was enrolled in the study.

In Wave 2b, 677 contacts gathered from PubMed and ClinicalTrials.gov were contacted between early and mid-November 2021. Of the 677 contacts, 9 invitees opened the screener link and 7 finished filling out the screener. Six additional recipients reached out via email to inform the research team that they were not available during the session dates, or that they would not be able to devote 4 hours to the project regardless of the scheduled times. Though these recipients did not click on the screener link, they expressed interest in participating. One additional screener response was recorded from referrals. A second referral never filled out the screener but was deemed eligible based on known background and experience; however, this individual was not able to participate due to scheduling conflicts. Of the 9 potential participants, 6 were deemed eligible and 2 were enrolled in the study.

Four recipients unsubscribed in this wave, and 1 reached out to the research team requesting removal from the list. See Table 2 and Figure 1 below for a recruitment summary of both waves.
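
As a cross-check on the figures in Table 2, the headline rates quoted for each wave follow directly from the counts reported above; the short calculation below reproduces them (a verification sketch, not part of the study workflow).

```python
def pct(numerator: int, denominator: int) -> float:
    """Rate as a percentage, rounded to one decimal as reported in the text."""
    return round(100 * numerator / denominator, 1)

# Wave 1 (purchased list): 733 contacted, 12 clicks, 20 unsubscribes, 51 bounces.
assert pct(12, 733) == 1.6   # click rate
assert pct(20, 733) == 2.7   # unsubscribe rate
assert pct(51, 733) == 7.0   # bounce rate

# Wave 2a (public sources): 106 contacted, 2 clicks.
assert pct(2, 106) == 1.9    # click rate
```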

Overall, 5 individuals were identified for AdBoard participation, and all completed the full 4 hours of the AdBoard. The original recruitment goal was 6 to 8 participants, so the final group fell short of this quota; however, 5 participants proved sufficient for this AdBoard. Enrolled participant characteristics are provided in Table 1 below.

Table 1. Participant Characteristics

Table 2. Response Metrics - Method 1 & 2

Figure 1. Comparison of Recruitment Methods

Discussion

Recruitment methods in this study

The screener distribution through the purchased list received a slightly higher volume of responses and a slightly higher response rate overall compared to gathering contacts from publicly available sources. However, the higher response rate did not yield a higher participation rate: only 21.4% of screener respondents from the purchased list were eligible, compared to 70.0% of those identified on PubMed and ClinicalTrials.gov. This problem could be avoided by purchasing a more specialized list; however, such lists can be harder to generate depending on how narrow eligibility requirements may be. For non-physician HCPs, the availability of specialized lists narrows further, as many nurse and physician assistant contact lists are not specific to treatment specialty.9

In terms of cost, many list providers require a minimum expenditure, the lower end of which typically falls between $1,000 and $2,000 for an annual license, with the cost per contact varying by number of uses, type, and size of the overall purchase. For research teams whose work spans many topics, a specialized contact list is difficult to reuse on future projects, and being required to buy a large number of contacts that cannot be repurposed further decreases the benefit of purchasing a list. While some companies offer a single-use option, which can decrease the overall cost, this gives researchers less control over the database and may restrict the ability to send reminder or follow-up emails. Another limitation of purchased lists is that their viability cannot be guaranteed despite their high cost; in this study, for example, the purchased list carried a 7.0% bounce rate.

The distribution using contacts manually collected from PubMed and ClinicalTrials.gov proved more fruitful. This method allowed the research team to generate a contact list of highly specialized HCPs using targeted keywords (i.e., “multiple myeloma”), which increased the eligibility rate of respondents despite click and referral rates comparable to the purchased list distribution. Further, we were able to track participants’ demographics (i.e., gender, race, and provider type: physician, nurse, etc.), allowing us to create a more diverse pool of potential participants. The slightly lower click rate and screener completion rate for the PubMed/ClinicalTrials.gov distribution may reflect decreased availability among recipients, since individuals in Wave 2b were contacted closer to the session dates and were therefore more likely to have scheduling conflicts. This supposition is supported by two factors:

  • Wave 2a (publicly available sources), which was distributed during the same time period as Wave 1 (purchased list), carried a slightly higher click rate (1.9%) than the purchased list (1.6%)
  • An increased number of individuals during Wave 2 directly contacted the research team regarding scheduling conflicts

Although Wave 2 also resulted in a high bounce rate (17.0%), the fact that the research team was still able to enroll more eligible participants with this method suggests that a high bounce rate may not heavily affect the success of this recruitment method. However, to avoid collecting out-of-date contact information, we recommend applying a 5-year “Publication Date” filter to the PubMed search criteria to account for turnover within institutions. ClinicalTrials.gov likewise offers a filter that restricts search results to studies currently recruiting, which would also produce more up-to-date contacts. The likelihood of turnover within target institutions has been especially high in the last few years due to the COVID-19 pandemic, making the application of these filters important for the viability of contacts.10

During recruitment, manually gathering contacts from PubMed and ClinicalTrials.gov was more time-consuming than using a purchased list, as it required combing through webpages and hand-recording each entry. The research team conducted further research following the conclusion of this study and found that software packages in languages such as R and Python allow for easier gathering of emails. One such package is R’s easyPubmed, which allows researchers to retrieve PubMed records, including names, affiliations, and emails where available. However, this is generally only possible for US contacts, as data protection regulations in some ex-US countries (primarily those within the European Economic Area) prevent this data scraping method internationally. With appropriate keywords, such a package can replicate the manual platform search while automatically gathering names and contact information.
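
Since easyPubmed is R-based, the sketch below shows an analogous workflow in Python against the same public E-utilities service: fetch MEDLINE-format records for a set of PubMed IDs (e.g., those returned by the search sketched in the Methods section) and harvest any email addresses appearing in the affiliation fields. The email pattern is a deliberate simplification.

```python
import re
import requests

# NCBI E-utilities fetch endpoint (public API); MEDLINE-format text includes
# AD (affiliation) fields, which sometimes contain author email addresses.
EFETCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi"
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def harvest_emails(pmids: list[str]) -> list[str]:
    resp = requests.get(EFETCH, params={
        "db": "pubmed",
        "id": ",".join(pmids),
        "rettype": "medline",
        "retmode": "text",
    }, timeout=60)
    # MEDLINE affiliations usually end with a period, so strip trailing dots.
    return sorted({m.rstrip(".") for m in EMAIL.findall(resp.text)})

# Usage (PMIDs would come from the esearch step sketched earlier):
# print(harvest_emails(["12345678"]))  # placeholder PMID, for illustration only
```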

Another method involves entering keywords into the PubMed advanced search function and downloading the full results as a PubMed-format text file, from which email addresses can then be extracted. This method, however, does not capture first or last names,11 making it difficult to personalize emails, which some studies deem more effective in reaching potential participants,12 though others have found that personalization has little to no effect on recruitment and enrollment.13 This method is simpler and may be preferable for research teams without experience in R programming.
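
A minimal sketch of that file-based approach, assuming the search results were exported via PubMed’s “Save” option in PubMed format; the filename is hypothetical:

```python
import re

# Hypothetical filename for a PubMed-format export of the search results.
with open("pubmed-export.txt", encoding="utf-8") as f:
    text = f.read()

# Collect unique addresses; affiliation lines end with a period, so strip it.
emails = sorted({m.rstrip(".") for m in re.findall(r"[\w.+-]+@[\w-]+\.[\w.-]+", text)})
print(f"Extracted {len(emails)} unique email addresses")
```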

Utilizing publicly available resources such as PubMed and ClinicalTrials.gov can be helpful in AdBoard, focus group, or other qualitative studies where (1) eligibility requirements are strict; (2) the research team can manually comb through records or scrape webpages for contact information; and/or (3) the study budget does not allow for purchased lists or other common recruitment methods such as social media advertisements.

Ethical considerations

We note that automatic extraction of email addresses from publicly available sources can be controversial and that some may perceive unsolicited emails from any source as spam.14 However, we believe that using such methods for academic purposes that are directly related to the HCPs’ field of interest is less likely to result in complaints from email recipients compared to the use of a general market list. Further, the low unsubscribe rate from our own study is a potential indication that our recruitment efforts were not perceived as spam by most of the screener recipients. Notably, this effort can be a viable way for academics and scholars operating under a limited budget to achieve enrollment goals given that purchased lists tend to be expensive and, at times, controversial as well.15 

Nevertheless, it is important to contact only those whose expertise is directly related to the given study and therefore may be most likely to benefit from participation. The study team was mindful to use a targeted approach by only recording emails and contacting individuals from Pubmed and ClinicalTrials.gov who would be most relevant for our study.

Recommendations for future strategies

Incorporation of asynchronous elements

Recent survey research by MphaR suggests that asynchronous discussion forums, defined in that research as formats in which participants engage with the content (i.e., relevant publications and materials) on their own time and consider their comments and feedback individually, in place of or in addition to video interaction, may be a more productive AdBoard medium. In this survey, 50% of HCPs preferred asynchronous discussion only, while 30% preferred asynchronous discussion in addition to video interaction. Ten percent reported a preference to return to face-to-face formats, and 10% preferred video only.8 Although our AdBoard was conducted synchronously, asynchronous discussion may be an advantage for recruitment because it overcomes one of the most difficult aspects of HCP recruitment: scheduling. Multiple recipients of invitation emails in this study showed interest in participating but stated that they could not allocate 4 hours to the project due to limited bandwidth or schedule conflicts.

An asynchronous discussion eliminates many of these issues, as participation can take place at any time. With a hybrid method that combines synchronous and asynchronous discussion, the synchronous session could be pared down to a much more manageable length (e.g., 1 hour instead of 4) that may be more accommodating to the average HCP. Several studies have successfully implemented an asynchronous focus group method for HCP qualitative data collection, further supporting the use and validity of this method among this population.16-18

Recruitment using social media

Though social media recruitment has been studied extensively, few studies concern the recruitment of HCPs.19 Additionally, quantifying the effectiveness of social media recruitment has proved difficult because previous studies have defined and measured success with different metrics (e.g., views, clicks, or participant enrollment).20 Pizzuti et al. (2020) found that Facebook was the most commonly used social media platform among US-based HCPs compared to seven other platforms (Snapchat, Google+, Pinterest, Twitter, Reddit, LinkedIn, Instagram).21 Yet Facebook ads did not always yield high success rates compared to other social media platforms.22 In fact, our review of the literature revealed that a platform’s success often depended on the target population. Thus, when the recruitment strategy involves social media advertisements, the research team should have a thorough understanding of their target audience, particularly which platform(s) that audience most commonly uses (e.g., Facebook vs. TikTok).

As with platform choice and general effectiveness, the cost effectiveness of social media recruitment varies widely across studies and target populations. In a review of 35 global studies between 2012 and 2017, Whitaker et al. (2017) found that the average cost of Facebook recruitment for health research was $14.41 per participant at $0.51 per click.23 In a separate randomized controlled trial, the reported cost per randomized participant reached $172.76 before the researchers discontinued the social media advertisement due to low engagement.24 Studies that successfully recruited their target sample reported low overall cost and improved cost effectiveness25 compared to other strategies such as email and print ads. Those unsuccessful in social media recruitment, by contrast, often canceled their paid advertisements once the effort proved unfruitful24 or hit a cost limit,26 making it difficult to estimate what social media recruitment would cost if run until the full sample size was achieved.

Few published sources report recruitment methodology for focus group studies; those reporting cost data consist primarily of cross-sectional studies or randomized controlled clinical trials. Although some studies identified during this literature review reported the costs of recruiting HCPs via social media,27 few concerned focus group studies,28 and none addressed both. That said, based on our review of studies reporting the cost of social media recruitment, targeting accuracy rather than cost appears to be the more significant limitation.

To get around this limitation, researchers could post in professional groups or associations relevant to their target participants. However, as with any cold-call method, response rates may be low. Professional associations receive a large number of requests and are limited in how many research studies they can promote. During the initial phase of our study, for example, five hematology and oncology associations were contacted regarding the possibility of advertising our study, yet none responded. Often, having professional ties to someone in those organizations can be an effective way to recruit niche populations for qualitative studies. For example, Dalessandro et al. (2018) cited difficulties recruiting LGBTQ participants for a qualitative study but managed to ask a contact to share the study on a private Facebook group, which helped the research team reach their recruitment numbers.29 Another study cited “partner organizations’ social media outlets” as a recruitment tool.30 This reliance on professional networks is a noted barrier, with McRobert et al. (2018) stating that social media recruitment “relies on pre-existence of a diverse and functioning social network.”19

Our institution conducts a wide array of research in a variety of fields and topics, making it difficult to have such contacts available, particularly for studies on niche topics. However, as we demonstrated in this study, recruiting HCPs on a specialized topic without the research team having pre-existing professional ties to HCPs in particular healthcare specialties is still possible. It simply requires additional time, effort, and a bit of creativity. Additionally, though initial recruitment using cold emailing methods can be time-consuming, once the relationships between the research organization and the HCPs have been established, they can lead to improved future recruitment opportunities.

Lastly, using a combination of recruiting methods (e.g., cold emailing, word of mouth, and social media) provides more viable ways to recruit a diverse pool of participants than using one method alone. For example, social media recruitment is associated with a less diverse sample (skewing young, female, white, educated, and economically advantaged) compared to traditional recruitment methods.30 Thus, a combination of strategies may be ideal to produce a diverse sample that reaches the study’s target sample size.

Conclusions

In this study, collecting multiple myeloma HCP emails from publicly available sources helped the research team find more eligible participants than using a purchased list. Additionally, packages such as R’s easyPubmed, as well as Khalifa’s strategy of scraping emails from PubMed, make it possible to generate contact lists suited to researchers’ needs at little to no cost to the institution. Scheduling was a major challenge faced by researchers during this study; it could potentially be mitigated by utilizing asynchronous virtual AdBoards to simplify scheduling and make AdBoard or focus group participation easier for HCPs from all backgrounds. Although social media has been studied as an alternative recruitment medium, not much is known about recruiting HCPs with this method, and most studies conclude that its usefulness and cost-effectiveness depend on the target population and the professional contacts of the research team and institution. For small, non-profit institutions, particularly those that cover a range of research topics and target populations, cold emailing using contacts gathered from publicly available sources is one of the most cost-effective and accurate options for the recruitment of HCPs.

Emily Botto, BA, and Jennifer Kim, PhD; both with the Tufts Center for the Study of Drug Development, Boston, MA, USA

References

  1. Michaels, DL, Lamberti, MJ, Pena, Y, et al. Assessing biopharmaceutical company experience with patient centric initiatives. Clin Ther 2019; 41: 1427-1438. DOI: 10.1016/j.clinthera.2019.07.018.
  2. Eder, M, Evans, E, Funes, M, et al. Defining and measuring community engagement and community-engaged research: CTSA institutional practices. Progress in Community Health Partnerships: Research, Education, and Action 2018; 12(2): 145–156. DOI: 10.1353/cpr.2018.0034.
  3. De Koning, R, Egiz, A, Kotecha, J, et al. Survey fatigue during the COVID-19 pandemic: an analysis of neurosurgery survey response rates. Frontiers in Surgery 2021; 8. DOI: 10.3389/fsurg.2021.690680.
  4. Morgantini, LA, Naha, U, Wang, H, et al. Factors contributing to healthcare professional burnout during the COVID-19 pandemic: A rapid turnaround global survey. PLoS ONE 2020; 15(9): e0238217. DOI: 10.1371/journal.pone.0238217.
  5. Boydell, N, Fergie, G, McDaid, L, et al. Avoiding pitfalls and realising opportunities: reflecting on issues of sampling and recruitment for online focus groups. International Journal of Qualitative Methods 2014; 13(1): 206-223. DOI: 10.1177/160940691401300109.
  6. Kenny, AJ. Interaction in cyberspace: An online focus group. J Adv Nurs 2005; 49: 414–422. DOI: 10.1111/j.1365-2648.2004.03305.x.
  7. Archibald, MM, Ambagtsheer, RC, Casey, MG, et al. Using zoom videoconferencing for qualitative data collection: perceptions and experiences of researchers and participants. International Journal of Qualitative Methods. 2019. DOI: 10.1177/1609406919874596.
  8. Denisova, N. Virtual advisory boards: what pharma learned in 2020 and new trends for the post-COVID era. MphaR, https://m-phar.com/virtual-advisory-boards-what-pharma-learned-in-2020-and-new-trends-for-the-post-covid-era/ (2021, accessed March 2022).
  9. Harrison, JM, Germack, HD, Poghosyan, L, et al. Surveying primary care nurse practitioners: an overview of national sampling frames. Policy, Politics, & Nursing Practice 2021; 22(1): 6-16. DOI: 10.1177/152715442097608.
  10. Mitchell, EJ, Ahmed, K, Breeman, S, et al. It is unprecedented: trial management during the COVID-19 pandemic and beyond. Trials 2020; 21(784). DOI: 10.1186/s13063-020-04711-6.
  11. Khalifa, M. Using PubMed to generate email lists of participants for healthcare survey research: a simple and practical approach. Health Informatics Vision: From Data via Information to Knowledge 2019; 262: 348-351. DOI: 10.3233/SHTI190090.
  12. Short, CE, Rebar, AL, and Vandelanotte, C. Do personalised e-mail invitations increase the response rates of breast cancer survivors invited to participate in a web-based behaviour change intervention? A quasi-randomised 2-arm controlled trial. BMC Med Res Methodol 2015; 15:66. DOI: 10.1186/s12874-015-0063-5.
  13. Hennrich, P, Arnold, C, and Wensing, M. Effects of personalized invitation letters on research participation among general practitioners: a randomized trial. BMC Med Res Methodol 2021; 21: 247. DOI: 10.1186/s12874-021-01447-y.
  14. Thomas, B. E-mail address harvesting on PubMed--a call for responsible handling of e-mail addresses. Mayo Clin Proc. 2011; 86(4):362. DOI: 10.4065/mcp.2010.0817.
  15. Kowalewicz, R. Why you should never buy an email list. Forbes, https://www.forbes.com/sites/forbesagencycouncil/2020/12/03/why-you-should-never-buy-an-email-list/?sh=17f308836511. (2020, accessed March 2022).
  16. LaForge, K, Gray, M, Stack, E, et al. Using asynchronous online focus groups to capture healthcare professional opinions. International Journal of Qualitative Methods 2022; 21: 1-9. DOI: 10.1177/16094069221095658.
  17. Ferrante, JM, Friedman, A, Shaw, EK, et al. Lessons learned designing and using an online discussion forum for care coordinators in primary care. Qual Health Res 2016; 26(13): 1851-1861. DOI: 10.1177/1049732315609567.
  18. Williams, S, Clausen, MG, Robertson, A, et al. Methodological reflections on the use of asynchronous online focus groups in health research. International Journal of Qualitative Methods 2012; 11(4): 368-383. DOI: 10.1177/160940691201100405.
  19. McRobert, CJ, Hill, JC, Smale, T, et al. A multi-modal recruitment strategy using social media and internet-mediated methods to recruit a multidisciplinary, international sample of clinicians to an online research study. PLoS ONE 2018; 13(7):e0200184. DOI: 10.1371/journal.pone.0200184.
  20. Arigo, D, Pagoto, S, Carter-Harris, L, et al. Using social media for health research: Methodological and ethical considerations for recruitment and intervention delivery. Digital Health 2018; 4: 2055207618771757. DOI: 10.1177/2055207618771757.
  21. Pizzuti, AG, Patel, KH, McCreary, EK, et al. Healthcare practitioners’ views of social media as an educational resource. PLoS ONE 2020; 15(2): e0228372. DOI: 10.1371/journal.pone.0228372.
  22. Topolovec-Vranic, J, Natarajan, K. The use of social media in recruitment for medical research studies: a scoping review. J Med Internet Res 2016; 18(11):e286. DOI: 10.2196/jmir.5698.
  23. Whitaker C, Stevelink S, and Fear N. The use of Facebook in recruiting participants for health research purposes: a systematic review. J Med Internet Res 2017; 19(8):e290. DOI: 10.2196/jmir.7071.
  24. Heffner, JL, Wyszynski, CM, Comstock, B, et al. Overcoming recruitment challenges of web-based interventions for tobacco use: the case of web-based acceptance and commitment therapy for smoking cessation. Addict. Behav. 2013; 38 (10): 2473-2476. DOI: 10.1016/j.addbeh.2013.05.004.
  25. Geist, R, Militello, M, Albrecht, JM, et al. Social media and clinical research in dermatology. Curr Derm Rep 2021; 10(4): 105-111. DOI: 10.1007/s13671-021-00350-5.
  26. Kapp, JM, Peters, C, and Oliver, DP. Research recruitment using Facebook advertising: big potential, big challenges. J Canc Educ 2013; 28(1): 134-137. DOI: 10.1007/s13187-012-0443-z.
  27. Khatri, C, Chapman, SJ, Glasbey, J, et al. Social media and internet driven study recruitment: evaluating a new model for promoting collaborator engagement and participation. PLoS ONE 2015; 10(3): e0118899. DOI: 10.1371/journal.pone.0118899.
  28. Townsend, A, Leese, J, Adam, P, et al. eHealth, participatory medicine, and ethical care: a focus group study of patients’ and health care providers’ use of health-related internet information. J Med Internet Res 2015; 17(6):e155. DOI: 10.2196/jmir.3792.
  29. Dalessandro, C. Recruitment tools for reaching millennials: the digital difference. International Journal of Qualitative Methods 2018; 17(1): 1-17. DOI: 10.1177/1609406918774446.
  30. Benedict, C, Hahn AL, Diefenbach, MA, et al. Recruitment via social media: advantages and potential biases. Digital Health 2019; 5:2055207619867223. DOI: 10.1177/2055207619867223.

