
Distinguish ABA from other psychological fields by identifying which of the articles is behavior analytic and which is not, and provide an explanation for your choice.

  • Identify which of the seven dimensions of ABA are present in the behavior analytic article.
  • Analyze why the other article is not behavior analytic. How do you know the seven dimensions are not present?
  • Differentiate ABA from the other branches of behavior analysis by explaining the differences and connections between ABA, behaviorism, and EBA.
  • Which aspects are shared and which aspects make them different from each other?




Currents in Pharmacy Teaching and Learning


Experiences in Teaching and Learning

Evaluation of a mock interview session on residency interview skills

Kelsey Buckleya, Samantha Karra, Sarah A. Nislyb, Kristi Kelleyc,⁎

a Midwestern University College of Pharmacy – Glendale, 19555 N 59th Avenue, Glendale, AZ 85308, United States
b Wingate University School of Pharmacy, 220 North Camden Road, Wingate, NC 28174, United States
c Auburn University Harrison School of Pharmacy, 1323 Walker Building, Auburn University, AL 36849, United States

ARTICLE INFO

Keywords: Pharmacy residency; Education; Pharmacy; Interview; Postgraduate residency training

ABSTRACT

Background and purpose: To evaluate the impact of student pharmacist participation in a mock interview session on confidence level and preparation regarding residency interview skills.

Educational activity and setting: The study setting was a mock interview session, held in conjunction with student programming at the American College of Clinical Pharmacy (ACCP) Annual Meeting. Prior to the mock interview session, final year student pharmacists seeking residency program placement were asked to complete a pre-session survey assessing confidence level for residency interviews. Each student pharmacist participated in up to three mock interviews. A post-session survey evaluating confidence level was then administered to consenting participants. Following the American Society of Health-System Pharmacists (ASHP) Pharmacy Resident Matching Program (RMP), a post-match electronic survey was sent to study participants to determine their perception of the influence of the mock interview session on achieving successful interactions during residency interviews.

Findings: A total of 59 student pharmacists participated in the mock interview session and completed the pre-session survey. Participants completing the post-session survey (88%, n = 52) unanimously reported an enhanced confidence in interviewing skills following the session. Thirty responders reported a program match rate of 83%. Approximately 97% (n = 29) of the respondents agreed or strongly agreed that the questions asked during the mock interview session were reflective of questions asked during residency interviews.

Discussion: Lessons learned from this mock interview session can be applied to PGY1 residency mock interview sessions held locally, regionally, and nationally.

Summary: Students participating in the ACCP Mock Interview Session recognized the importance of the interview component in obtaining a postgraduate year 1 (PGY1) pharmacy residency.

Background and purpose

The recognition of the value of formal postgraduate training by major pharmacy organizations and the evolving role of pharmacists in direct patient care likely influence student pharmacists to pursue residency and fellowship training.1 Compared with previous research, surveyed pharmacy residents and fellows indicated that they understood such training to be a prerequisite for certain jobs when questioned about motivating factors for pursuing residency and fellowship training.1 Despite the increasing number of residency

https://doi.org/10.1016/j.cptl.2017.12.021

⁎ Corresponding author.
E-mail addresses: [email protected] (K. Buckley), [email protected] (S. Karr), [email protected] (S.A. Nisly), [email protected] (K. Kelley).

Currents in Pharmacy Teaching and Learning 10 (2018) 511–516

1877-1297/ © 2017 Elsevier Inc. All rights reserved.


programs and positions over the last several years, pharmacy residency program placement continues to be highly competitive, resulting in a large number of unmatched candidates annually.2 Nationwide, the success rate of securing a postgraduate year 1 (PGY1) residency was approximately 64% in 2014.2 A focus on residency application and interview preparation efforts has been occurring among both clinicians and academicians in pharmacy practice in recent years.

A survey study by Dunn et al.3 found that colleges of pharmacy have a variety of informal programs and/or information sessions in place (lecture seminars, panel discussions, small group activities) promoting residency training. Recent efforts have also focused on residency interview preparation. Data have indicated that a higher number of interview offers may increase the likelihood that an applicant will match with a PGY1 program.4 The interview component of the application provides an opportunity for residency program directors and preceptors to gather holistic information, such as nonacademic qualities of candidates, not otherwise evident from the application packet.5–7 Several interview preparation efforts have been published.

In 2012, Phillips et al.8 described an elective course using a variety of teaching methods (short lectures, group discussions, mock match, and mock interview) at the University of Georgia College of Pharmacy. Post-semester surveys (n = 36) demonstrated a statistically significant increase in students' abilities to understand not only the purpose and components of a residency training program, but also the actual steps in the residency application process. Student participation in residency application preparation sessions also appears to have a positive correlation with American Society of Health-System Pharmacists (ASHP) Resident Matching Program (RMP) match rates. In 2012, Caballero et al.9 described a residency interviewing preparatory seminar elective at Nova Southeastern University College of Pharmacy, with five of the ten course hours dedicated to interview preparation. Survey results demonstrated improvement in students' (n = 10) confidence and ability to interview and prepare for the ASHP Midyear Clinical Meeting (MCM). Seven of ten students (70%) participating in the elective secured ASHP-accredited residencies. At Drake University College of Pharmacy and Health Sciences, a faculty-led mock residency interview exercise was described by Koenigsfeld et al.10 Twenty-seven (of 28) students participated in a post-ASHP RMP survey, with 25 (92.6%) indicating they had secured a residency position. Recently, Rider et al.11 described a collaborative approach to residency preparation programming between students and faculty from The Ohio State University College of Pharmacy and pharmacy residents and residency preceptors from The Ohio State University Wexner Medical Center. Of the four programming components (Curriculum Vitae Critique, Mock Residency Interviews, Residency 101, and Midyear to Match), Mock Residency Interviews received the highest ranking, in terms of value, by students completing an anonymous post-programming survey (n = 57). There were 26 survey participants seeking a residency, and 20 of these (77%) obtained a position.

The Education and Training (EDTR) Practice and Research Network (PRN) of the American College of Clinical Pharmacy (ACCP) was developed to provide an opportunity to network with others who share similar interests and to work collaboratively to develop programs and projects to advance pharmacy education and training. Each year since 2004, the EDTR PRN has held a mock interview session at ACCP's Annual Meeting. The purpose of the mock interview session is to provide current student pharmacists seeking PGY1 residency positions, PGY1 pharmacy residents seeking PGY2 residency positions, or residents and fellows seeking employment the opportunity to participate in mock interviews with an ACCP member. Interviewers are typically active ACCP members who are seasoned practitioners and educators. Additionally, many are also PGY1 or PGY2 directors representing various residency programs in the United States.

No specific information on a mock interview session conducted at a national level was found in the literature. Thus, the purpose of this research was to evaluate the impact of final year student pharmacist participation in the mock interview session on confidence level and self-reported preparedness regarding residency interviewing skills. In addition, this study set out to explore whether an improvement in confidence and preparedness for interviews correlated with ASHP RMP applicant match rates.

Educational activity and setting

Advertisement of the program

Student pharmacists, residents, and fellows who attended the 2014 ACCP Annual Meeting were eligible to participate in the mock interview session. Interested participants were made aware of the mock interview session via advertisements on the electronic registration webpage for the Annual Meeting, flyers handed to pharmacist trainees at the meeting registration booth, and verbal announcements made to students who attended the national meeting residency preparatory session. The Student College of Clinical Pharmacy (SCCP) chapters of ACCP at schools and colleges of pharmacy in the United States were also provided with information regarding the mock interview session to share with their student members.

Recruitment of interviewers began approximately one month prior to the Annual Meeting via email. The email requests came directly from many PRN chairs and/or vice chairs, requesting volunteers from within their PRN membership. Volunteers were eligible to participate as interviewers if they were pharmacists or pharmacy educators. To maximize the student experience within the mock interview session, the study aimed to recruit a number of volunteer interviewers equal to the number of student participants. In recognition of the need to recruit a large number of interviewers, the time commitment involved in interviewing, and the networking opportunities missed by interviewers who could not attend other PRN sessions during the same time slot, interviewer participation was supported in various ways. In addition to recruitment by PRN leadership and members, interviewers attending the mock interview session were encouraged to enter a raffle for one of two gift cards in the amount of $50.



Consent and randomization

This study was approved by the Institutional Review Boards (IRBs) of all universities with which the study investigators were affiliated or employed. Student pharmacists, residents, and fellows were eligible to participate in the mock interview session. However, only final year student pharmacists seeking PGY1 residency program placement were eligible to participate in the survey component of this research. The subsequent details surrounding the mock interview session are illustrated in Fig. 1. For all perception questions, a 4-point Likert scale (strongly disagree, disagree, agree, strongly agree) was utilized. This was done to limit neutrality and gather concrete opinions from participants.

Before receiving a pre-session survey, a written statement approved by the IRBs was provided to all interested research participants. The statement notified student pharmacists that participation in the survey indicated informed consent for research. Study investigators then administered a pre-session survey to assess initial confidence for residency interviews (Appendix 1). Surveys requested a self-generated study code for each participant (utilizing the last four digits of the participant's mobile phone number) in order to match pre-session, post-session, and post-match survey data while maintaining subject anonymity.

Preparing for the session

Each student participant was provided with an interviewing tip sheet (developed by members of the EDTR PRN in 2008) for guidance on how to prepare for an interview, actions to take and to avoid on the day of interviews, and what to expect and follow up on after an interview (Appendix 2). The volunteer interviewers were provided with a preparatory sheet for mock interviewers, also compiled by members of the EDTR PRN (Appendix 3). The preparatory sheet provided guidance to interviewers on general interviewing tips, sample questions to ask residency applicants, and topics to avoid (e.g., religion, marital status). Interviewers were encouraged to ask additional questions, beyond those on the preparatory sheet, as time permitted.

After providing consent, taking the pre-session survey, and receiving the interviewing tip sheet, each student participant then received a random assignment to an interviewer at a numbered table. Students were encouraged to avoid interviewing with faculty or preceptors that they knew. In the case of random assignment to a known interviewer, students were reassigned.
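The reassignment rule described above can be sketched as a small routine: each student draws from a shuffled pool of interviewer tables and skips any interviewer they already know. This is only an illustration of the logic under stated assumptions; the student and table names are hypothetical, not the study's actual roster or procedure.

```python
import random

def assign(students, interviewers, known, rng):
    """Randomly assign each student a table, skipping known interviewers."""
    pool = list(interviewers)
    rng.shuffle(pool)
    assignment = {}
    for student in students:
        # First remaining interviewer this student does not already know.
        table = next(t for t in pool if t not in known.get(student, set()))
        pool.remove(table)
        assignment[student] = table
    return assignment

students = ["student-A", "student-B", "student-C"]
interviewers = ["table-1", "table-2", "table-3", "table-4"]
known = {"student-A": {"table-2"}, "student-C": {"table-1"}}  # hypothetical

result = assign(students, interviewers, known, random.Random(0))
print(result)
```

Seeding the generator here only makes the sketch reproducible; in practice the draw would be unseeded.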

Conducting the mock interview session

After randomization, interview sessions commenced. Designed to simulate one-on-one postgraduate program interviews, each session lasted approximately ten minutes, followed by interviewers providing participants with approximately five minutes of immediate feedback on interviewing skills. Participants were then randomly assigned to the next interviewer. Student pharmacists could participate in up to three mock interviews.

Fig. 1. Flow Chart of Mock Interview Session. ASHP=American Society of Health-System Pharmacy; min=minutes.



Post-interview session

After completing interview sessions, participants were asked to complete a post-session survey (Appendix 4). This survey attempted to evaluate the student pharmacist's confidence level in interview skills, assessment of interview questions asked, and opinion of the helpfulness of guidance received during the mock interview session.

Post-match survey

A subsequent survey utilizing Qualtrics software (version 9506381) was sent via email to study participants following release of results of the ASHP RMP match in March 2015 (Appendix 5). This anonymous post-match survey attempted to determine perceptions of the EDTR PRN Mock Interview Session after completing residency interviews. To encourage participation, students could enter into a drawing upon completion of the post-match survey. Two participants were randomly selected to receive one of two $50 gift cards.

Descriptive statistics were used to analyze nominal data. Non-parametric data were compared across surveys using the Wilcoxon signed-rank test or Spearman's correlation coefficient, as appropriate. IBM SPSS software (version 22) was used to analyze data. All survey data were kept confidential, and no name or contact information was recorded on any survey instrument. The student-generated survey code was used to match pre-session, post-session, and post-match survey data.
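The matching-and-correlation step above can be sketched as follows. The study used SPSS, so this pure-Python version is only an illustration under stated assumptions: the participant codes and Likert scores (1 = strongly disagree to 4 = strongly agree) are hypothetical, and Spearman's rho is computed by ranking the data and applying Pearson's formula to the ranks.

```python
def ranks(values):
    """Average 1-based ranks, with ties sharing the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of 1-based positions i..j
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman's correlation: Pearson's formula applied to ranks."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical pre- and post-session Likert scores keyed by study code.
pre = {"1234": 2, "5678": 3, "9012": 2, "3456": 1}
post = {"1234": 4, "5678": 4, "9012": 3, "3456": 3}

# Match surveys on the self-generated code, as in the study design.
codes = sorted(set(pre) & set(post))
rho = spearman_rho([pre[c] for c in codes], [post[c] for c in codes])
print(f"matched pairs: {len(codes)}, Spearman rho = {rho:.2f}")
```

In practice a statistics package (SPSS, or SciPy's `spearmanr` and `wilcoxon`) would be used instead of hand-rolled ranks; the sketch only shows how the code-based matching feeds paired scores into a non-parametric measure.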

Findings

A total of 59 final year student pharmacists participated in the mock interview session and 59 (100%) completed the pre-session survey. Throughout the study period, the majority (>75%) of responders were female with an average age of 27 years. Pre-session respondents noted the primary reason for attending the EDTR PRN's Business and Networking meeting was to participate in the mock interview session (95%, n = 56). Survey completion diminished slightly following the mock interview session, with a total of 52 student responders (88%). Of those completing the post-session survey, 83% (n = 43) reported completing at least two rounds of mock interview sessions and 30% (n = 16) reported completing three rounds of interviews. Participants unanimously reported an enhanced confidence in interviewing following the session. From the open-ended question on the post-session survey regarding what was most valuable about the mock interview session, the highest percentage of participants (25%) mentioned receiving specific, constructive feedback from interviewers and the next highest percentage (8%) mentioned the opportunity for multiple mock interviews. From the open-ended question regarding suggested areas of improvement for the mock interview session, the highest percentage of participants (30%) indicated nothing and the next highest percentage (10%) indicated the wait time before starting interviews and between interviews. All respondents reported having interest in receiving additional residency preparatory information from ACCP.

Survey data captured following the ASHP RMP match was the lowest of the three surveys with 30 total responders (51%). The program match rate for responders was reported at 83% (n = 25), with the majority experiencing success during Phase I of the ASHP RMP process (n = 24). Applications submitted per respondent were consistent with national averages; over 50% (n = 18) of respondents applied to ten or more programs. During the pre-session survey, participants (n = 29) reported intentions of applying to seven to nine residency programs. This changed during the post-match survey to 10 or more programs, although the change was non-significant (p = 0.058). On the post-match survey, approximately 97% (n = 29) of the respondents agreed or strongly agreed that the questions asked during the mock interview session were reflective of questions asked during residency interviews, which was a slight decrease from 100% (n = 52) of respondents who agreed or strongly agreed for the same question on the post-session survey. This decrease was statistically significant (p = 0.003) when survey matched pairs (n = 27) were analyzed. Additionally, all respondents indicated they would recommend the session to other students interested in pursuing a residency. Respondents overwhelmingly used multiple preparatory activities to prepare for interviews (Table 1). Importantly, there was no statistically significant correlation between the number of preparatory activities completed and student confidence in interview skills prior to the mock interview session. However, when asked if the mock interview session was beneficial for residency preparation, 80% (n = 24) agreed or strongly agreed. Lastly, approximately 97% (n = 29) of the respondents on the post-match survey agreed or strongly agreed that the interview portion of the residency application process is significant to obtaining a residency.

Table 1
Percentage of respondents participating in additional interview preparatory activities.
ACCP = American College of Clinical Pharmacy; ASHP = American Society of Health-System Pharmacy; MCM = Midyear Clinical Meeting.

Discussion

The students who participated in the ACCP EDTR PRN Mock Interview Session in Fall 2014 had characteristics similar to those of students applying for residencies, with the study population weighted more toward females (83%).4 The average age (27 years) was consistent with other studies investigating residency preparation among student pharmacists.9

The growing number of students pursuing residencies is creating an even more competitive ASHP RMP process. In the post-match survey, most participants reported that they had applied to ten or more PGY1 programs. As such, student pharmacists are looking for opportunities to increase their chances of matching to a residency position. In a recent national survey assessing mentor involvement with student pharmacists pursuing postgraduate residency training, at least half of the overall student sample indicated a desire for increased mentorship in the area of interviewing.12 Mock interviews are one method to help students prepare. In this study, the majority (97%) of students agreed or strongly agreed on the post-match survey that the interview component of the residency application process is an important element in obtaining a residency position. Reports indicate that interviews continue to be included in the PGY1 selection process and remain the top factor in selecting candidates for a residency program.5–7 If aware of the available resources at the local, state, and national level, students seeking residencies may be more likely to pursue opportunities to participate in mock interviews. Eighty-three percent of respondents in our study successfully obtained a residency position following the ASHP RMP match. This compares favorably with the nationally reported success rate of securing a PGY1 residency of approximately 65% in 2015.2 Although there was a high ASHP RMP match rate for those students who participated in the EDTR PRN's Mock Interview Session, it is unclear whether this participation or alternative preparatory activities had the largest impact on ASHP RMP match rates. On the post-match survey, respondents indicated that they participated in additional interview preparatory activities after attending the mock interview session. This could indicate the mock interview session encouraged them to seek out additional preparation for residency interviews. A confounding factor is that students who participate in mock interviews may be more motivated candidates. Students who attend pharmacy meetings may also have strengths, such as leadership experience, which make them more qualified candidates for residency positions.

When student pharmacists who participated in this research were asked about the most valuable aspect of the EDTR PRN's Mock Interview Session, almost half identified receiving specific, constructive feedback from the interviewers as most valuable. On the post-session survey, all participants responded agree or strongly agree that the questions asked during the mock interview session were reflective of actual residency interview questions. It should be noted, however, that participants had no residency interview experiences at the time of post-session survey completion. A slight decrease was noted on the post-match survey, with approximately 97% (n = 29) of the respondents indicating agree or strongly agree that the questions asked during the mock interview session were reflective of questions asked during residency interviews. This suggests that on-the-spot interviewing without dedicated interviewer time to review candidate materials in advance can still provide a valuable experience for the learner.

Although students unanimously reported enhanced confidence in interviewing following the session, they identified some specific areas for improvement in the mock interview session, similar to what ACCP members who had conducted the mock interviews had identified. Members have been conducting mock interviews at the ACCP Annual Meeting for over ten years, demonstrating the organization's commitment to providing this opportunity for student pharmacists, residents, and fellows. Similarly, other national and state organizations as well as schools and colleges of pharmacy offer mock interviews in preparation for pharmacy residency program application.9,10,14 In 2015, Powell et al.13 described a mock interview session held in conjunction with the Arizona Pharmacy Association Annual Convention. Although an objective method to measure residency preparation was not identified, post-session surveys (n = 33) indicated that more students felt confident in their ability to interview, as compared to pre-session surveys. A subsequent study in 2017 by Ulbrich and Boyle14 found that the mock interview included in their institution's local residency boot camp activity was deemed the most helpful program aspect by the majority (n = 5) of students participating.

Interestingly, when students participating in the EDTR PRN Mock Interview Session were asked about their participation in local chapter or state organization efforts, few respondents had participated in these preparatory activities despite the overwhelming use of alternative residency application preparatory activities. The procedures described in this research study could be replicated by local student chapters and pharmacy organizations, providing resources to student pharmacists and residents unable to travel to national conferences. Additionally, there may be available resources at the local or state level for students to secure additional opportunities to participate in mock interviews over the course of their final professional year, leading up to actual residency interviews. Having multiple chances to participate in mock interviews may also allow students to prepare for mock interviews ahead of time.

This analysis is not without limitations. Students from across the country were included, although the limited sample size and decreasing participation limit external validity. The 4-point Likert scale in this study did not include an option for neutrality. Forcing either positive or negative responses without offering a neutral option could introduce bias; however, it has been shown that the overall difference in response between a 4-point and 5-point Likert scale is negligible.15 Lastly, the need to recruit a sufficient number of qualified interviewers may limit applicability to smaller organizations.

The lessons learned from student surveys in this study can be applied to local or state organization efforts as well. Students reported that they would like to have a longer time to interview than the allotted ten minutes with an additional five minutes for feedback. This length was consistent with the time specified in Rider and colleagues'11 description of mock interviews that involved a similar number of students. Although time constraints are always an issue, students may prefer to increase the amount of interview time to a total of 12–15 min with only three minutes for feedback. Additionally, if a different environment is available, the time for each individual mock interview could be lengthened to 20–30 min, and a quieter space utilized rather than the large ballroom traditionally provided for the EDTR PRN Mock Interview Session. Often, the length of the mock interview limits the number of mock interviews per participant. Given the nature of a national mock interview session, it can be logistically challenging to provide resources to students ahead of time. However, at the local/state level, holding a formal session to prepare students for interviews or provide interview tips may ease students' minds and increase their comfort level going into mock interviews. Even though a majority of participants responded agree or strongly agree that the questions asked during the mock interview session were reflective of residency interview questions, a statistically significant reduction was noted when the same paired survey questions were compared between post-session and post-match responses. This may suggest that some participants did not perceive mock interview questions to be representative of actual questions received during residency interviews. Future residency preparatory efforts and research may aim to more closely align mock interview questions with those asked in residency interviews.

Summary

Student pharmacists participating in this study recognized the importance of the interview component in successfully matching with a PGY1 pharmacy residency. As such, they were eager to participate in the mock interview session and reported that receiving specific, constructive feedback from interviewers was helpful in preparing for the residency application process. Findings from this mock interview session research can be helpful to other organizations offering similar sessions locally, regionally, and nationally. These findings may also be useful for PGY2 residency candidates or others seeking post-graduate employment opportunities.

Conflict of interest and financial disclosure

All authors have served in leadership roles within the Education and Training (EDTR) Practice and Research Network (PRN) of the American College of Clinical Pharmacy (ACCP). This research did not receive any grant funding from agencies in the public, commercial, or not-for-profit sectors.

Disclosure statement

Research findings have been presented as a poster in October 2015 at the ACCP Global Conference on Clinical Pharmacy.

Acknowledgements

The authors wish to thank Drs. Eliza Dy-Boarman, Lori Hornsby, Alex Isaacs, Amy Leung, Haley Phillippe, Cynthia Phillips, and Nathaniel Poole for their help in conducting this research.

Appendix A. Supporting information

Supplementary data associated with this article can be found in the online version at http://dx.doi.org/10.1016/j.cptl.2017.12.021.

References

1. McCarthy Jr BC, Weber LM. Update on factors motivating pharmacy students to pursue residency and fellowship training. Am J Health-Syst Pharm. 2013;70(16):1397–1403.
2. National Matching Service Inc., American Society of Health-System Pharmacists. ASHP resident matching program for positions beginning in 2014. Available at ⟨https://natmatch.com/ashprmp/stats/2014applstats.html⟩. Accessed 23 December 2017.
3. Dunn BL, Ragucci KR, Garner S, Spencer A. Survey of colleges of pharmacy to assess preparation for and promotion of residency training. Am J Pharm Educ. 2010;74(3):43.
4. Phillips JA, McLaughlin MM, Rose C, et al. Student characteristics associated with successful matching to a PGY1 residency program. Am J Pharm Educ. 2016;80(5):84.
5. Mersfelder TL, Bickel RJ. Structure of postgraduate year 1 pharmacy residency interviews. Am J Health-Syst Pharm. 2009;66(12):1075–1076.
6. Oyler DR, Smith KM, Elson EC, Bush H, Cook AM. Incorporating multiple mini-interviews in the postgraduate year 1 pharmacy residency program selection process. Am J Health-Syst Pharm. 2014;71(4):297–304.
7. Oyler DR. Getting the most from residency interviews. Am J Health-Syst Pharm. 2013;70(23):2082–2085.
8. Bryles Phillips B, Bourg CA, Guffey WJ, Phillips BG. An elective course on postgraduate residency training. Am J Pharm Educ. 2012;76(9):174.
9. Caballero J, Benavides S, Steinberg JG, et al. Development of a residency interviewing preparatory seminar. Am J Health-Syst Pharm. 2012;69(5):400–404.
10. Koenigsfeld CF, Wall GC, Miesner AR, et al. A faculty-led mock residency interview exercise for fourth-year doctor of pharmacy students. J Pharm Pract. 2012;25(1):101–107.
11. Rider SK, Oeder JL, Nguyen TT, Rodis JL. A collaborative approach to residency preparation programming for pharmacy students. Am J Health-Syst Pharm. 2014;71(11):950–955.
12. Hammond DA, Garner SS, Linder MA, Cousins WB, Bookstaver B. Assessment of mentor involvement with pharmacy students pursuing post-graduate residency training. Curr Pharm Teach Learn. 2018;8(1):18–23.
13. Powell AD, Yip S, Hillman J, et al. Preparing pharmacy graduates for interviews: a collaborative statewide mock interview session to improve confidence. Curr Pharm Teach Learn. 2015;7(5):684–690.
14. Ulbrich TR, Boyle JA. Preparing students for residency interviews through a residency interview boot camp. Curr Pharm Teach Learn. 2017;9(4):671–682.
15. Chang L. A psychometric evaluation of 4-point and 6-point Likert-type scales in relation to reliability and validity. Appl Psych Meas. 1994;18(3):205–215.



  • Evaluation of a mock interview session on residency interview skills
    • Background and purpose
    • Educational activity and setting
      • Advertisement of the program
      • Consent and randomization
      • Preparing for the session
      • Conducting the mock interview session
      • Post-interview session
      • Post-match survey
    • Findings
    • Discussion
    • Summary
    • Conflict of interest and financial disclosure
    • Disclosure statement
    • Acknowledgements
    • Supporting information
    • References



Journal of Organizational Behavior Management

Journal homepage: www.tandfonline.com/journals/worg20

Using Web-Based Behavioral Skills Training to Teach Online Interview Skills to College Students

Davis E. Simmons, Nicole Gravina, Andressa Sleiman & Faris R. Kronfli

To cite this article: Davis E. Simmons, Nicole Gravina, Andressa Sleiman & Faris R. Kronfli (2024) Using Web-Based Behavioral Skills Training to Teach Online Interview Skills to College Students, Journal of Organizational Behavior Management, 44:2, 88-112, DOI: 10.1080/01608061.2023.2219466

To link to this article: https://doi.org/10.1080/01608061.2023.2219466


Published online: 01 Jun 2023.


Using Web-Based Behavioral Skills Training to Teach Online Interview Skills to College Students

Davis E. Simmons, Nicole Gravina, Andressa Sleiman, and Faris R. Kronfli

Department of Psychology, University of Florida, Gainesville, FL, USA

ABSTRACT Behavioral Skills Training (BST) is an effective procedure for teaching new skills, such as interview skills. BST typically includes instruction, modeling, rehearsal, and feedback. Multiple studies have demonstrated that BST can be used in a web-based context, but no studies, to our knowledge, have extended the literature by using BST to teach online interview skills. This study extended and replicated previous research on teaching interview skills with BST by using this procedure to teach interview skills to college students in a synchronous web-based video format. We evaluated the intervention using a multiple baseline design across targets, with follow-up sessions testing for maintenance and generalization. All participants' performance improved from baseline to post-training across all targeted dependent variables. Participants also rated social validity measures highly, aligning with previous research. Implications, limitations, and future directions are discussed.

KEYWORDS BST; interview skills; training

A commonly held belief is that obtaining a college degree leads to greater success and ease in obtaining a good job. Despite a consistent employment rate of 70–80% for those with a four-year degree over the last 20 years, individuals may interview and fail to secure a job many times before finding the right fit (U.S. Bureau of Labor Statistics, 2000–2020). Issues that might prolong the job search include ineffective communication of relevant skills in the hiring process and a lack of professionalism during interviews (Campbell & Roberts, 2007; see also Bye et al., 2013; Paulhus et al., 2013; Uhlmann et al., 2013). For example, the Association of American Colleges and Universities (2021) found that only 40% of employers indicated that recent college graduates effectively communicated the skills and knowledge they gained in college, aligning with similar findings from previous reports (e.g., Chronicle of Higher Education and Marketplace, 2013) and indicating a continued need to train college students in interview skills. Interviews continue to be the most common form of prospective employee evaluation (Macan, 2009) and are, therefore, critical in the job-seeking process. Interviews are also a typical

CONTACT Nicole Gravina [email protected] School of Psychology, 945 Center Dr, Gainesville, FL 32611 This manuscript was in fulfillment of the first author’s master’s thesis.

Supplemental data for this article can be accessed online at https://doi.org/10.1080/01608061.2023.2219466.

JOURNAL OF ORGANIZATIONAL BEHAVIOR MANAGEMENT 2024, VOL. 44, NO. 2, 88–112 https://doi.org/10.1080/01608061.2023.2219466

© 2023 Taylor & Francis

component of the Graduate School Admission process, especially with an increasing focus on holistic review (Michel et al., 2019).

One intervention that has been used to improve interview skills is behavioral skills training (BST; Stocco et al., 2017; see also Barker et al., 2019; Edgemon et al., 2020; Roberts et al., 2020; Wirantana et al., 2019). BST is a training process comprised of instructions, modeling, rehearsal, and feedback, which is repeated until mastery is achieved (Ward-Horner & Sturmey, 2012). BST is typically conducted with individual learners and is effective at training a wide variety of skills, including motor and verbal behaviors, making it a good choice for training interview skills (Davis et al., 2019). Researchers have found that skills trained using BST often maintain over time and generalize to new environments (Brock et al., 2017; Erath et al., 2021; Kirkpatrick et al., 2019; Slane & Lieberman‐Betz, 2021; W. J. Higgins et al., 2017).

Stocco et al. (2017) used BST to improve the interview skills of college students in an analog face-to-face setting. Specifically, they targeted question answering, question asking, and non-vocal behaviors, and the data suggest that all targeted skills improved following training. The average training time using BST was eleven hours, and participants rated procedural acceptability highly. Barker et al. (2019) replicated and extended Stocco et al. (2017) by examining the effects of immediate and delayed feedback on interview skills. Although all participants' responding improved with both forms of feedback, immediate feedback was more efficient, saving an average of 123.4 min (about 2 hr) of training to mastery, and was more effective (Barker et al., 2019). Procedural acceptability was also rated highly, and all post-training participant interview videos received high-performance ratings from university career center staff. It is important to emphasize that these and other studies using BST to train interview skills (Wirantana et al., 2019) have reported high measures of social validity.

Despite the success of BST in training in-person interview skills, no research to date has evaluated the effects of BST conducted online to train interview skills in a virtual format. Web-based interviews may present unique challenges for interviewees by introducing technological barriers (e.g., internet connectivity issues, a less natural social medium), possibly leading to communication disruption, reduced social and contextual cues, and greater difficulty engaging in appropriate dimensions (e.g., frequency, latency, and magnitude) of verbal and non-vocal behaviors (Basch et al., 2020). Despite these potential barriers, web-based interviews are becoming more common. In a survey of over 700 executives by FutureStep, 71% reported using real-time video interviewing (Futurestep & Kornferry, 2015). Gartner (2020) polled 334 human resources (HR) leaders and discovered that 86% reported that their organizations are using virtual interviews to vet candidates for jobs. Lauren Smith, Vice President of the Gartner HR practice, states, “. . . virtual interviewing may become the new standard for recruiting leaders and candidates long after social distancing guidelines are lifted” (Gartner, Inc., 2020).

Furthermore, respondents to Deloitte's Global Human Capital Trends survey predicted an almost 50% increase in the use of technology for job interviewing, with even higher predictions across other components of the hiring process (Deloitte, 2019). Web-based interviews are one component of an exponential trend since the early 1990s to use technology in the hiring process, with increasingly complex and artificial intelligence (A.I.)-driven Applicant Tracking Systems (ATS) and Recruiting Management Systems (RMS) used to source, screen, rank, and manage job candidates throughout the hiring process (Harvard Business School, 2021, Automation of the Hiring Process section). Despite continued limitations with these systems, a myriad of initial benefits regarding equity, efficiency, and cost-effectiveness have been established (Chapman & Webster, 2003). As the anticipation of remote workplaces increases following COVID-19 and organizations focus on expanding their searches to a global talent pool (Deloitte, 2021), the efficiency and effectiveness of remote interviewing are being realized. Thus, there is a need to continue research on training interview skills and to extend prior research on teaching interview skills to an online videoconferencing interview format.

Applying BST in an online format has become increasingly common in recent years (Ferguson et al., 2019; LeBlanc et al., 2020; Tomlinson et al., 2018). For example, BST has been implemented fully online to train early intensive behavioral intervention skills (EIBI; Fisher et al., 2014, 2020), self-care skills (Boutain et al., 2020), recreational and independent living skills (Pellegrino & DiGennaro Reed, 2020), communication skills (Carnett et al., 2020), functional communication training (FCT; Clay et al., 2021), acceptance and commitment therapy (ACT; Magnacca et al., 2021), and functional analysis (FA; Lloveras et al., 2022). These studies demonstrate that it is feasible to conduct BST on a fully web-based platform, and this research could be extended by evaluating BST to teach interview skills in a fully online format.

Therefore, the purpose of this study was to replicate and extend Stocco et al. (2017) by improving the interview skills of college students in a fully online, synchronous web-based training and interview format. This study also included stricter mastery criteria during BST, updated vocal criteria for certain measures to fit the online format and improve scoring, and added environmental variables unique to an online format.


Method

Participants and setting

We recruited seven college students from a large public university by sending flyers to undergraduate classes. Three participants withdrew or were dropped from the study due to scheduling difficulties (see discussion). The four remaining participants identified as women. Table 1 displays these participants' demographic and background questionnaire responses. We did not collect age; however, anecdotally, all participants were within the age range of a typical undergraduate student (18–26 years old). Furthermore, it is important to note that reported goals included both vocational and academic opportunities for two participants and focused on academic opportunities for the other two. To be included in the study, participants needed to be at least 18 years old and have consistent access to a computer, web camera, internet connection, and a space free from distractions. After the first baseline session, participants needed to perform at or below 60% on at least three dependent variables to continue in the study. All participants met that criterion.

Participants attended 5–10 experimental sessions on Zoom® lasting 30 min to 2 hr (see discussion of time variation). They received $25 at three points in time, after baseline, after training, and after the study was completed, for a total of $75 in compensation. Payments were delivered as Amazon® gift card codes.

Table 1. Demographic Survey Questions and Background Questionnaire Responses.

Participant 1: gender: Woman; race/ethnicity: Asian or Asian American; disability: Emotional or Psychological; academic major: Psychology and Political Science; career interest: Analyst in High Tech or Professor in Academia; goals: Minimum Master's Degree, Interest in Ph.D.

Participant 2: gender: Woman; race/ethnicity: White; disability: None; academic major: Pre-Health (Nursing); career interest: Physician's Assistant; goals: Minimum Master's Degree.

Participant 3: gender: Woman; race/ethnicity: Asian or Asian American; disability: None; academic major: Psychology; career interest: Clinical Psychologist; goals: Minimum Master's Degree, Interest in Ph.D.

Participant 4: gender: Woman; race/ethnicity: Asian or Asian American, American Indian or Alaskan Native, Black or African American, and White; disability: Autism/autism spectrum; academic major: Psychology; career interest: Clinical/Counseling Psychologist; goals: Minimum Master's Degree, Interest in Ph.D.


Dependent variables

The two main types of dependent variables were vocal and non-vocal responses, mostly consistent with Stocco et al. (2017). We also tracked the duration of sessions.

Vocal responses
Vocal responses consisted of answering questions appropriately and asking appropriate questions.

Appropriate responding to questions. There were seven types of questions, each with different criteria, maintaining definitions from Stocco et al. (2017). Participants needed to satisfy all criteria for a question type for the answer to be scored as correct. We used the percentage of correct answers across questions and the percentage of correct criteria for each question type to measure participant responses. Specifically, we divided the number of criteria satisfied by the total number of criteria, then multiplied by 100 to obtain a percentage correct. The question types and criteria are listed below.

Explaining interest. The participant (1) complimented the business (e.g., the company's services, the type of people they work for) or school (e.g., the school's presence at conferences or in professional journals, the success of previous graduates) and (2) mentioned their personal goals. The answer focused on (3) how the position/program helps the participant achieve, or work toward, the stated personal goals. An example question for explaining interest is, “Why are you interested in attending this graduate program?”

Explaining relevant experience. The answer focused on (1) experience related to the job or field (e.g., if applying for graduate school, a focus on research or clinical experience) and (2) how those experiences helped develop skills that match those required for the applied position or prepared them for graduate school (e.g., social skills if the position requires interacting with people). One example question for explaining relevant experience is, “What can you offer us that someone else cannot?”

Conflict resolution. If participants had little experience with these types of problems, then they could describe (1) a proactive approach, (2) specific preventative skills (e.g., clear expectations of group members), and (3) how these skills prevent problem scenarios.

If participants had experience with these scenarios, then they could describe (1) a reactive approach, (2) specific reactive skills (e.g., met with a group member and provided feedback on their performance), and (3) how these skills rectified the problem or what they learned from that experience. An example question for conflict resolution is, “Have you ever been in a real dilemma at work? What did you do?”

Soft skills. The answer described (1) a specific task or leadership role (e.g., a term paper or work project, a leadership position), and the student emphasized (2) organizational skills (e.g., breaking large tasks into smaller tasks; pre-planning) and (3) the outcome of their efforts (e.g., completing a research project and manuscript). One example of a soft skills question is, “How do you accommodate last-minute changes that have to be incorporated into your work?” We added “or leadership position” to the first criterion for this question type.

Get to know you. The answer focused on (1) a few appropriate interests to discuss in a professional environment (e.g., sports, outdoor activities, movies, travel, bicycles). An example question for this question type is, “Tell me about yourself.” We added a new question to this question type, “Tell me about what you like to do when you're not working,” in place of Stocco's “Tell me about a time that you made a mistake.”

Future orientation. The answer focused on (1) achieving personal goals and (2) progressing beyond the individual's current position within the organization or, if the position was temporary (e.g., an internship or graduate school), progressing within the field (e.g., starting my own business). If the question was salary-related, the answer also (3) described why the skills related to the position would support that salary amount. An example question for future orientation is, “What is your future dream job?” We added the third, conditional criterion (describing why the skills related to the position would support that salary amount) for salary-related questions.

Interpersonal skills. The answer highlighted (1) the benefits of working in groups, working alone, or establishing rapport with others. Participants described (2) specific skills related to working in groups or independently or establishing rapport with coworkers. The skills mentioned (3) matched the requirements of the position. One example of an interpersonal skills question is, “What steps do you take to establish rapport with others?”

Appropriate question asking. We recorded each question asked as appropriate or inappropriate using a yes/no dichotomy. Participants met the mastery criterion when they asked between two and four appropriate questions (a common general recommendation) and no inappropriate questions. We coded a question as appropriate across four appropriate question-asking categories (demonstrating research by the interviewee, requiring elaboration by the interviewer, checking for “good fit,” and relating to advancement opportunities) using definitions from Stocco et al. (2017). We coded a question as inappropriate if it did not meet the definition of an appropriate question, across two inappropriate question-asking categories (relating to general information available on the website, demonstrating a lack of genuine interest). For example, questions that indicated interest in pay, vacation time, or proximity to a large city instead of qualities of the company/program would demonstrate a lack of genuine interest.

Non-vocal responses
Non-vocal responses consisted of appropriate body language and the environmental set-up visible to the camera. These dependent variables differed from Stocco et al. (2017).

Body language. Body language variables were measured using 30-s momentary time sampling with a 3-s observation window.
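As a rough illustration of this scoring approach, the percent of intervals correct under momentary time sampling reduces to a simple proportion. The sketch below is hypothetical code (not the study's materials); the function name and data layout are our own:

```python
# Hypothetical sketch of scoring 30-s momentary time sampling.
# `observations` holds one True/False judgment per 30-s mark,
# each made within the 3-s observation window at that mark.

def percent_intervals_correct(observations):
    """Return the percentage of sampled intervals scored correct."""
    if not observations:
        return 0.0
    return 100.0 * sum(observations) / len(observations)

# A 5-min interview yields roughly ten sampled moments:
scores = [True, True, False, True, True, True, False, True, True, True]
```

With eight of ten sampled moments scored correct, the function returns 80.0 for this example.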

Facial orientation. We defined appropriate face orientation as (1) maintaining a face orientation toward the screen when the interviewer was speaking and no complete face disorientation away from the screen (no portion of the face oriented toward the screen) for the interview duration. We chose facial orientation instead of eye contact because eye contact would be difficult to measure in an online environment and may not be a culturally sensitive measure (Haensel et al., 2021; Hsu et al., 2019).

Posture. We defined posture as correct if the participant had (1) a visibly upright torso, with (2) the head and neck in line with the torso, (3) shoulders visibly relaxed with no observable tension, (4) the camera positioned at eye level, and (5) hands not visible throughout the interview.

Environmental set-up. Environmental set-up was measured in the first and last minute of each session because it was unlikely to vary much across the interview. To be scored as correct for each non-vocal variable, any sub-dichotomies of these variables had to meet 100% of criteria at each interval, and at least 90% (mastery criterion) of intervals had to be correct across an interview.

Lighting. We defined appropriate lighting as (1) any daylight (natural light) or artificial warm or white light that (2) illuminated the participant's face more than the background, allowing all facial features to be visible, and (3) originated from in front of the screen, facing the participant. Inappropriate lighting was insufficient lighting that left the participant's face in shadow or partial shadow, or lighting that originated from behind the screen facing the participant, such as through a window.

Visual background. We defined an appropriate visual background as (1) a background that was visibly clean and orderly, with visuals (e.g., an orderly bookshelf, a neutral picture) that were not overly distracting (e.g., colorful posters), no bed or piles of laundry in the background, and no generally obscene or polarizing images (e.g., a political poster).

Body placement on screen. We defined appropriate body placement on-screen as (1) the visible placement of the interviewee's torso from the midriff or top of the armpit to the top of their head, that was (2) screen-facing, (3) centered on the screen (equal distance from the left and right edges), and (4) with the camera centered on the face (i.e., not above or below the head).

Interobserver agreement

Two trained observers independently viewed and coded interviews for interobserver agreement (IOA). IOA was collected for all skills for 36.36% (24/66) of interviews. We calculated IOA using the trial-by-trial method for criterion data; specifically, we divided the number of items in agreement by the total number of items, then multiplied by 100 to obtain a percentage of agreement between the raters. IOA for answering questions was 81.83% (range: 60.87% to 100%), for asking questions 95.59% (range: 50.00% to 100%), for body language 84.77% (range: 61.12% to 100%), and for environmental set-up 82.07% (range: 60.47% to 95.35%). The 50% score for question asking occurred during one IOA session with a discrepancy of one out of two questions recorded.
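The trial-by-trial calculation described above can be expressed compactly. This is a hypothetical sketch with our own names, not the authors' code:

```python
# Hypothetical sketch of trial-by-trial interobserver agreement:
# two observers score the same list of items; agreements are divided
# by the total number of items and multiplied by 100.

def trial_by_trial_ioa(observer_a, observer_b):
    """Percent agreement between two equal-length lists of item scores."""
    if len(observer_a) != len(observer_b):
        raise ValueError("Both observers must score the same items")
    agreements = sum(a == b for a, b in zip(observer_a, observer_b))
    return 100.0 * agreements / len(observer_a)
```

For example, a session with two recorded questions and one discrepancy yields 50.0, matching the outlying score reported above.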

Procedures

All sessions were conducted and recorded over the video conferencing platform Zoom® by the first author, alone in a separate room to ensure confidentiality. Participants received a link and password for entry into the Zoom® interview before the interview was scheduled to start. The training procedure was evaluated using a multiple-baseline design across performance targets. A follow-up and a generalization probe were conducted 2 and 4 weeks following post-training.

Pre-work
Following informed consent, participants completed the background questionnaire with the experimenter and then were given the first social validity survey and interview homework.


Background questionnaire. The informal background questionnaire consisted of a meeting and a brief baseline survey on interview confidence and anxiety ratings administered through Qualtrics®. During the meeting, the experimenter gathered relevant background information for the study (e.g., academic major, career interests, previous interview experiences, goals), which was used to develop the sequence of target skills and the individualized wording of interview questions for each participant. For example, participants were encouraged to use academic majors and career interests to guide completion of the homework requirements. Previous interview experience and participant goals were used to ensure certain components were included during intervention.

Interview homework materials. After completing the informal assessment, participants were given 30 min of homework through a link. Participants were asked to select at least three positions they would apply for from companies or academic programs of interest, and to provide a valid link to the entities' websites. Information from the background questionnaire and homework materials was used to customize the interview questions. For example, if an individual was interested in pursuing graduate school, and received the question “Why are you interested in attending this graduate program?” for the first question category, the question was customized to include one of the specific graduate programs that was of interest. We would ask, “Why are you interested in attending [specific program] at [specific school]?” The question format remained the same across all participants, but aspects were customized to the individual.

Evaluative simulated interviews (baseline)
Upon verified and submitted completion of the homework, baseline evaluative interviews were scheduled. Participants were directed to prepare for these interviews as they usually would and had at least a two-day preparation period before the first interview. All evaluative interviews were conducted as described by Stocco et al. (2017). The interviewer asked one question from each of the seven categories and provided time at the end of the interview for participants to ask questions. All questions came from a question bank and were randomly presented without repetition until all questions of that question type had been asked across evaluative interviews. Following each component (i.e., appropriate answers, appropriate questions, environmental set-up, body language) of interview training, an evaluative simulated interview session was conducted to measure changes in performance from baseline to post-training.
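Random presentation without repetition, as described above, amounts to sampling each category's question bank without replacement. Below is a minimal sketch under assumed data structures (hypothetical code, not the study's materials):

```python
import random

# Hypothetical sketch: shuffle each category's question bank once,
# then pop one question per evaluative interview until the bank is
# exhausted, so no question repeats within a category.

def make_question_drawer(question_bank, seed=None):
    rng = random.Random(seed)
    shuffled = {category: rng.sample(questions, len(questions))
                for category, questions in question_bank.items()}

    def draw(category):
        """Return the next unused question from the given category."""
        return shuffled[category].pop()

    return draw
```

Drawing three times from a three-question category returns each question exactly once, in a randomized order.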

Behavioral skills training
The training was individualized based on the background questionnaire results, the homework materials returned, and stable baseline interview performance. If participants demonstrated skill deficits across all dependent variables, the three lowest-scoring variables were targeted first during intervention before moving to the other variables if time allowed. A deficit was defined as any performance at or below the 60% criterion for skill training, to prevent a ceiling effect. Generally, dependent variables were targeted in order of complexity (i.e., vocal behavior before non-vocal behavior, answering questions before asking questions, body language before environmental set-up), with easier skill groups targeted first. Individual preference was also considered in the sequence in which dependent variables were targeted. Taken together, these three influences culminated in different dependent variables being targeted in a different order across participants, providing an informal counterbalancing of exposure.

Training was implemented across vocal and non-vocal responding variables until participants (a) reached a 90% criterion level across three consecutive brief interviews with no apparent downward trend or (b) completed four training sessions for a single component. If participants demonstrated variable responding after full training had occurred for a skill, a booster session of rehearsal and feedback was conducted for that specific skill. We chose the 90% criterion level in hopes that the higher criterion (compared with two sessions with no downward trend in Stocco et al., 2017) would enhance generalization and maintenance online (Fienup & Carr, 2021; McDougale et al., 2020). Similarly, three consecutive trials are typically used to demonstrate stability (Cooper et al., 2020). Breaks during training could occur at any time, with a minimum 5-min break between sessions or whenever the participant indicated they were ready to continue.
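The stopping rule above can be made concrete. In this hypothetical sketch (names are our own), the authors' visual judgment of "no apparent downward trend" is simplified to non-decreasing scores across the last three interviews:

```python
# Hypothetical sketch of the mastery/stopping rule: training on a
# component ends at (a) >= 90% across three consecutive brief
# interviews with no downward trend, or (b) four training sessions.

def mastery_met(scores):
    """scores: percent-correct values from successive brief interviews."""
    if len(scores) < 3:
        return False
    last = scores[-3:]
    at_criterion = all(s >= 90 for s in last)
    # Simplification of the visual "no apparent downward trend" judgment:
    no_downward_trend = last[0] <= last[1] <= last[2]
    return at_criterion and no_downward_trend

def training_complete(scores, sessions_run, max_sessions=4):
    return mastery_met(scores) or sessions_run >= max_sessions
```

Note that an actual single-case design would rely on visual analysis rather than this mechanical trend check; the sketch only illustrates the stated thresholds.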

At the beginning of each training session, the interviewer provided a scripted rationale and a screen-shared visual (PDF slide) for the target skill, written and verbal instruction on skill use, and then modeled the skill with exemplars (including a scripted sample for vocal components). During rehearsal and feedback, the participant demonstrated the target skill and received feedback on their performance from the experimenter. Specifically, for vocal responses, the participant asked or answered an interview question by typing a response within the Zoom® chat, reading it aloud for the experimenter (similar to Stocco et al., 2017), and then received feedback on correct responding relative to the criteria for that vocal response. A record of the Zoom® chat was sent to the participant following this training. During appropriate answers training, we used at least one of two possible exemplars for each question type, and these pairs were the same for every participant. The modeling of correct responding could vary qualitatively across trainings, but all modeling fulfilled the criteria of the question type. This ensured the training format and presentation remained the same across participants but allowed the model to differ when given to the same participant during repeated exposure to a specific question type. This also ensured we did not train every single question in the question bank. The behavior of the experimenter during rehearsal was the same as in simulated interviews. More specifically, each question type was trained separately during appropriate answers training, with a rationale, instruction, and rehearsal-feedback loop implemented across all seven question types. Appropriate question asking, body language, and environmental set-up training were conducted similarly. Overall, the format of the training and the presentation of training content were held constant across all participants.

Each training session ended with a brief simulated evaluative interview and a self-evaluation process. The interviewees typed the self-evaluations by reviewing the main training topics of that day, including their opinions of how they performed during the evaluative interviews. This evaluation process was conducted through the survey platform Qualtrics® to receive and catalog responses in real time across a secure platform and to allow participants to save these evaluations for their records. The self-evaluation format was free response under the categories “Main Topics of Today's Training” and “Self-Evaluation of Performance.” In the latter category, participants were directed to evaluate what they did well and possible improvements. When finished, participants were prompted to screen share. The interviewer reviewed the participant's evaluation and gave performance feedback, encouraging discussion and clarification of the main points and previous feedback. If the participant's report conflicted with objective performance, we provided constructive feedback. Following this, participants were prompted to save their responses to their devices for storage and later use.

Post-training interviews and follow-up plus generalization probe
Following training, evaluative simulated interviews were conducted, including one interview (a generalization probe) with a novel interviewer, similar to baseline. If performance was variable or returned to baseline levels, a booster session was conducted.

Procedural integrity

Procedural integrity was collected for all skills for 36.36% (24/66) of interviews using independent coders. We calculated procedural integrity using the trial-by-trial method for criterion data; specifically, we divided the number of items in agreement by the total number of items, then multiplied by 100 to obtain a percentage of agreement between the reviewers. Procedural integrity was M = 92.00% (range: 80.00% to 100%) for baseline interviews, M = 97.33% (range: 80.00% to 100%) for post-training interviews, M = 80.00% (range: 60.00% to 100%) for booster training interviews, and M = 95.00% (range: 80.00% to 100%) for follow-up interviews.


Social validity

A pre-measure was administered at the beginning of the study as part of the background questionnaire through Qualtrics®. As in Stocco et al. (2017), the measure consisted of two ten-point Likert scales covering individuals' initial confidence and anxiety ratings.

The social validity measure was administered immediately after the first follow-up evaluative interview or after the booster training and generalization evaluative interviews. The measure included the confidence and anxiety Likert scales from the pre-measure plus three additional seven-point Likert scales covering the acceptability of our assessment procedures, the acceptability of our training procedures, and satisfaction with improvement in interview skills. The satisfaction items covered each targeted variable.

Neutral rater

A neutral reviewer with a Ph.D. in Behavior Analysis who has experience interviewing job candidates and supervising employees served as a neutral rater and was asked to rate the quality of the interviews using the rating scale. The neutral reviewer viewed a pre-training interview and a post-training interview for each participant but was blind to whether each interview occurred in baseline or post-training, and the video order was randomized. The questions asked were adapted from Stocco et al. (2017). Interviews were scored using a Qualtrics® survey with seven Likert-scale items: "quality of answers to questions, quality of questions asked, appropriateness of body language, appropriateness of environment/setting, appearing confident, appearing anxious, and the likelihood of hiring this individual." A free-response question was included for any additional feedback. The neutral rater's feedback was shared with participants.

Results

Interview skills

Figures 1 and 2 display baseline, post-training, and follow-up results for all participants. In baseline, relatively low and stable levels of interview behaviors were observed across all targeted variables for each participant (answering 0–3 questions appropriately, asking 0–1 questions appropriately, exhibiting about 50% of intervals correct for body language for Participants 2 and 4, and exhibiting a mean of 33.33% or fewer intervals correct for environmental set-up for Participants 1, 3, and 4). Following training, interview behaviors improved across all targeted variables for all participants (answering 4–7 questions appropriately, asking at least two questions appropriately, exhibiting at least a mean of 70.03% of intervals correct for body language for Participants 2 and 4, and exhibiting at least a mean of 52.78% of intervals correct for environmental set-up for Participants 1, 3, and 4).

JOURNAL OF ORGANIZATIONAL BEHAVIOR MANAGEMENT 99

Due to the variability and change in level for environmental set-up and the decreasing trends observed for answering questions by Participants 1 and 4, booster sessions were conducted. Following booster training, appropriate answers and questions were maintained at follow-up and generalized to a novel interviewer, but environmental set-up was not maintained for Participant 1. Participant 4's performance suggests generalization to a novel interviewer and maintenance across all skill components following booster training. Participant 2's follow-up performance suggested that improvements generalized to a novel interviewer and were maintained across time. Participant 3's follow-up performance suggested that improvements generalized to a novel interviewer and were maintained for appropriate questions and answers across time. However, performance was not maintained for environmental set-up, so we conducted a booster training session. After booster training, Participant 3's environmental set-up improved to the levels previously observed post-training.

Figure 1. Change in Interview Skills as a Function of BST. Note. Interview performance for each dependent variable for participants 1 and 2 are depicted in graphs. BoT = Booster Training; FLU = Follow-up Probe; X = an inappropriate question; Open circles = generalization probe.
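The percent-intervals-correct measures reported for body language and environmental set-up are computed by dividing the number of observation intervals scored correct by the total number of intervals. A minimal sketch (names are ours, not the study's code):

```python
def percent_intervals_correct(intervals):
    """intervals: list of booleans, one per observation interval,
    True if the target behavior (e.g., correct body language) was
    scored correct for that interval."""
    return sum(intervals) / len(intervals) * 100

# Example: 6 of 18 intervals scored correct
print(round(percent_intervals_correct([True] * 6 + [False] * 12), 2))  # 33.33
```

Session means are then averages of these per-interview percentages, which is how summary values like "a mean of 33.33% or fewer intervals correct" arise.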


Answering-question errors were variable within and across participants, with each participant missing different criteria of different question types across training sessions. The two inappropriate questions asked by Participants 2 and 3 in baseline requested information that was readily available on the company website. Maintaining an upright torso with the head and neck in line with the torso was the most frequently missed combination of body-language criteria across participants in baseline and for those participants whose body language was a target variable for intervention. Regarding environmental set-up, lighting was the most common error (partial shadowing across a portion of participants' faces), with body placement on screen (being visually centered on screen) being another common error across participants in baseline and for participants for whom environmental set-up was a target for intervention. Major differences in individual responding are addressed in the discussion.

Figure 2. Change in Interview Skills as a Function of BST. Note. Interview performance for each dependent variable for participants 3 and 4 are depicted in graphs. BoT = Booster Training; FLU = Follow-up Probe; X = an inappropriate question; Open circles = Generalization Probe.


Social validity measures

Tables 2 and 3 display participants' social validity survey responses. In Table 2, all four participants rated the acceptability of the training and assessment procedures at 7 (out of 7). Overall satisfaction with skills was also rated highly (6, 6, 6, and 7 out of 7). Table 3 shows subjective ratings of confidence and anxiety across participants before and after training. All four participants reported either an increase in confidence (+1, +2, and +4 points for Participants 1, 3, and 4) or a decrease in anxiety (−2, −7, and −2 points for Participants 2, 3, and 4).

Table 4 displays the neutral rater's scores across all participants. Despite the variation, all participants were rated as improving on multiple indices.

Duration and number of simulated interviews

Table 5 depicts the training duration and number of brief interviews (trials) across each training component for each participant. Appropriate answers training took the longest, at over 5 hours for every participant and close to 7 hours on average. Appropriate questions training took over an hour for all participants. Body language training took close to 2 hours for Participants 2 and 4. Environmental set-up training took over 1 hour for Participants 1, 3, and 4. Simulated interviews, including baseline and post-training sessions but not follow-up sessions, averaged 3 hr and 10 min across participants. Overall, the total time spent across training and simulated interviews averaged 13 hr, 53 min, and 3 s.

Table 2. Intervention Acceptability Ratings.

Questions                                               Participant 1  Participant 2  Participant 3  Participant 4
How acceptable were the training procedures to you?           7              7              7              7
How acceptable were the assessment procedures to you?         7              7              7              7
How satisfied are you with your interview skills after
taking part in this study?
  Answering questions appropriately                           6              7              6              7
  Asking questions appropriately                              7              7              6              7
  Displaying appropriate body language                        –              7              –              5
  Displaying appropriate environmental manipulations          7              –              6              6
  Overall satisfaction with skills                            6              6              6              7

Note: Possible scores ranged from 1–7, with 7 being the most acceptable/satisfied.

Table 3. Confidence and Anxiety Ratings.

Questionnaire Items by Participant                                       Baseline  Post Training  Change
Participant 1
  How confident do you feel about your performance during interviews?        7           8          +1
  How anxious do you feel about your performance during interviews?          9           9           0
Participant 2
  How confident do you feel about your performance during interviews?        7           7           0
  How anxious do you feel about your performance during interviews?          8           6          −2
Participant 3
  How confident do you feel about your performance during interviews?        6           8          +2
  How anxious do you feel about your performance during interviews?         10           3          −7
Participant 4
  How confident do you feel about your performance during interviews?        4           8          +4
  How anxious do you feel about your performance during interviews?          9           7          −2

Note: Possible scores ranged from 1–10.

Table 4. Neutral Rater Scores.

Neutral Rater Items by Participant       Baseline  Post Training  Change
Participant 1
  Quality of Answers to Questions            6           6            0
  Quality of Questions Asked                 5           7           +2
  Appropriateness of Body Language           6           7           +1
  Appropriateness of Environment             6           7           +1
  Appearing Confident                        6           7           +1
  Appearing Nervous                          6           6            0
  Likelihood of Hiring Individual            6           6            0
Participant 2
  Quality of Answers to Questions            2           6           +4
  Quality of Questions Asked                 4           7           +3
  Appropriateness of Body Language           6           7           +1
  Appropriateness of Environment             6           7           +1
  Appearing Confident                        2           6           +4
  Appearing Nervous                          4           6           −2
  Likelihood of Hiring Individual            3           6           +3
Participant 3
  Quality of Answers to Questions            2           4           +2
  Quality of Questions Asked                 1           6           +5
  Appropriateness of Body Language           5           2           −3
  Appropriateness of Environment             6           5           −1
  Appearing Confident                        2           3           +1
  Appearing Nervous                          2           3           −1
  Likelihood of Hiring Individual            2           4           +2
Participant 4
  Quality of Answers to Questions            5           5            0
  Quality of Questions Asked                 1           6           +5
  Appropriateness of Body Language           6           6            0
  Appropriateness of Environment             3           4           +1
  Appearing Confident                        5           6           +1
  Appearing Nervous                          6           6            0
  Likelihood of Hiring Individual            5           6           +1

Note: Possible scores ranged from 1–7 with 7 being the most favorable.

Table 5. Duration of Training and Simulated Interviews.

Components of Program     Participant 1   Participant 2   Participant 3   Participant 4   Mean
Training
  Appropriate Answers      7:45:25 (2)     8:11:13 (6)     6:32:27 (8)     5:22:29 (8)    6:57:53
  Appropriate Questions    1:52:55 (3)     1:05:05 (3)     1:58:00 (4)     1:17:12 (3)    1:37:22
  Environmental Set Up     1:04:48 (5)     –               1:06:15 (5)     1:25:52 (4)    1:12:56
  Body Language            –               2:20:41 (9)     –               1:40:01 (6)    2:00:21
  Total                   10:43:08 (10)   11:36:59 (18)    9:36:42 (17)    9:45:34 (21)  10:25:36
Simulated Interviews       3:06:14 (15)    3:02:44 (15)    2:47:42 (15)    4:53:11 (18)   3:10:06
Grand Total               13:49:22        14:39:43        12:24:24        14:38:45       13:53:03

Note. Number of brief interviews in parentheses.
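The per-component means in Table 5 are simple averages of the listed H:MM:SS durations (ignoring dash entries for untrained components). A small sketch of that computation, with helper names of our own and fractional seconds truncated:

```python
def to_seconds(hms):
    """Convert an 'H:MM:SS' string to total seconds."""
    h, m, s = (int(part) for part in hms.split(":"))
    return h * 3600 + m * 60 + s

def mean_duration(durations):
    """Mean of 'H:MM:SS' strings, returned in the same format."""
    secs = int(sum(to_seconds(d) for d in durations) / len(durations))
    return f"{secs // 3600}:{secs % 3600 // 60:02d}:{secs % 60:02d}"

# Appropriate Answers row from Table 5 (four participants):
print(mean_duration(["7:45:25", "8:11:13", "6:32:27", "5:22:29"]))  # 6:57:53
```

This reproduces the 6:57:53 mean reported for appropriate answers training.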


Discussion

This study provides additional evidence that BST is an effective training procedure for teaching interview skills to college students. Furthermore, this study extends prior research by conducting the training and interviews entirely online. Because of the online format, this study also included environmental set-up and body language as dependent variables relevant to online video interactions. All participants' performance improved from baseline to post-training across all targeted dependent variables (three dependent variables for three participants, four for one participant), demonstrating that BST can improve online interview skills when delivered in an online format. The neutral rater scores corroborated this improvement. The training took roughly 14 hr, which can be loosely compared with Stocco et al. (2017), who reported that participants spent an average of 11 hr in training and simulated interviews. The difference in training time could be due to the online arrangement, differences in variables targeted, general procedural differences in interview format and training, or the stricter mastery criterion. Future researchers could directly examine and compare variables that affect training time.

Participant 1 did not maintain their environmental set-up performance during follow-up, even after the booster training. Various factors could have contributed to this result, including an environment that limited meeting the criterion (e.g., a dorm room with low lighting), too much response effort, and not considering the target worthwhile. More generally, students may not be able to move to another location with better lighting or may have insufficient funds or resources to purchase tools like a ring light. Ring lights are relatively inexpensive, and such tools may be a viable strategy to improve poor lighting conditions in professional web-based environments. Future researchers and interviewers could provide tools like a light and background screen to foster adherence. This is especially important for individuals with low income, to better ensure an equitable interview experience. It might also be important to train interviewers to focus on the content of the interview and not on background and lighting. If there is concern that background variables could influence candidate ratings, prospective employers could provide local space for interviewees (e.g., a hotel room or library room reservation) and ensure interviews occur during times the candidates will be working and have access to childcare and other support.

Three of four participants reported a decrease in anxiety ratings, and three of four reported an increase in confidence ratings; Participant 1 did not report a reduction in their anxiety score. Physiological variables outside the control of the experimental context may have influenced this outcome, as this participant reported a diagnosed emotional or psychological disability. Although this training may be effective on objective performance measures, it may not ensure a change in subjective ratings of anxiety or confidence for all individuals. Future studies could explore how reducing physiological experiences of anxiety might improve training adherence and outcomes, especially for individuals who experience high levels of anxiety at baseline or in general. Despite no decrease in the anxiety rating, Participant 1 reported receiving a position they interviewed for during the study and described the training as helping prepare them with the skills and experience to garner success. Relatedly, two of our four final participants anecdotally reported participating in job interviews during the study, both receiving and accepting these job offers.

The experimenters excluded one participant from this study, and two participants withdrew; their data were not included. For the participant excluded by the experimenters, behavior change was demonstrated across two components of training; the third, appropriate answers, was not completed, with training extending to the point of scheduling the fourth termination-criterion session (see supplemental materials for graphs). After a continuous cycle of rescheduling (i.e., sessions occurring 4–6 weeks apart) and intermittent responses to e-mail, we decided to exclude the participant from the study. This portion of the study occurred during an academic semester, and the participant may have had difficulty attending sessions while in school full-time. Additionally, this participant reported obtaining a job during their participation in the study, and thus their interest in further developing interview skills after obtaining a job might have waned. Finally, this participant reported high anxiety (8 out of 10) on the pre-measure, which might have contributed to the rescheduling.

Two participants withdrew from the study, and several variables might explain their departure. First, both participants reported high anxiety on the pre-measure. It is possible that regular interview practice during baseline, without feedback, was aversive. Furthermore, their sessions were conducted over summer break, and participation in this study involved a substantial amount of time and effort, which is a limitation of this study and procedure. Although Stocco et al. (2017) reported an average of 11 hr as an indication of a non-time-intensive program of instruction, our replication, at an average of 14 hr, was time-intensive, especially the sessions that ran the entire two-hour session length.

Session length likely depends on the format of the BST component, the specific skill component being trained, and individual differences across participants. Therefore, training adherence might be improved if the structure of baseline and training can be made more efficient while maintaining effectiveness. There are many possibilities regarding the overall format and structure of BST components. For example, participants could be provided with the question prompts and criteria before training to complete these steps outside the training session. They could video record their responses to questions outside of the session and review them for feedback in session. Another possibility is using video instructions and modeling as homework before engaging in rehearsal and feedback via videoconference (Erath & DiGennaro Reed, 2020; Erath et al., 2021). This would allow participants to progress at their own pace and watch video models multiple times if needed. Finally, training sessions could be scheduled for a shorter, more intensive period (e.g., one week) to improve completion rates.

Modifying or changing skill components may also be beneficial. While this and prior studies used specific criteria for each question type, it might be more efficient and generalizable to teach a question-answering strategy. The S.T.A.R. method answers interview questions by describing a situation, a task, an action, and the result (M. Higgins, 2014). The P.A.A.R.L.A. method involves describing a problem, one's analysis and action, the result, what was learned, and how the learning was applied (Asher, 2009). Future research could compare the efficacy and efficiency of using BST to teach the S.T.A.R. or P.A.A.R.L.A. methods against one another and against the current methodology. If one approach could be applied to several types of questions, it could reduce training time. Relatedly, it could be worthwhile to explore whether teaching a method of answering questions, instead of following numerous criteria, differentially impacts confidence and anxiety ratings. For example, if a general answering strategy contributes to more generalizable and effective interviewee behavior, it could enhance confidence ratings and reduce anxiety ratings. Including fewer rules by using a general strategy may also reduce anxiety ratings.

There are a few limitations of this study that could be addressed in future research. First, not all skills reached the mastery criterion in training, with two participants encountering the fourth-training-session termination criterion when training appropriate answers. Some participants took longer to type their practice answers in appropriate answers training, leading to fewer brief interviews and less rehearsal and feedback for these participants. Future iterations of the training could assign writing responses as homework. Holding training sessions closer together in time (e.g., daily) might also improve mastery outcomes. As a replication of Stocco et al. (2017), we attempted to stay as close to the original study as possible, including typing responses to answers and a self-evaluation process adapted to fit the online environment. Future researchers could evaluate whether these aspects of the experimental set-up are artifacts that confound the effects of BST on these skills, or when and if they are necessary or most useful to include (e.g., only if training is spaced very far apart in time).

Second, answering questions at 100% mastery (7/7; all criteria correct) did not guarantee participants had a high-quality vocal response. For example, with appropriate answers, participant responses may have also included vocal fillers, like "um," or lacked concision. The neutral rater evaluation supports this concern: while the rater's scores improved for two participants following training, they were unchanged for the other two. Beyond a question-answering method, an additional strategy to remedy this issue could be embedding habit reversal into training (Montes et al., 2021) and having participants score videos of their responses for clarity and concision before additional training sessions.

Another important consideration in a web-based format may be that participants had difficulty discriminating correct posture, despite real-time video feedback. Sigurdsson and Austin (2008) increased safe posture using real-time visual feedback and self-monitoring, but when engaging in multiple behaviors concurrently, real-time feedback may not be enough to self-monitor effectively. Multiple studies have used video feedback to increase skill performance requiring specific control of body movement, including posture (BenitezSantiago & Miltenberger, 2016; Boyer et al., 2009). A future direction may be providing video feedback for individuals struggling with poor body language.

The participants in this study were relatively homogeneous, the sample size was small, and attrition was high. Given the effectiveness of the procedure, a reasonable next step might be to refine the procedures to reduce the time required, in an attempt to reduce attrition. The study could then be replicated with a larger and more diverse group of participants.

The following two limitations are broader, encompassing this research topic, including the present study, and are also important to consider generally. A critical limitation is that the norms and expectations of professionalism are based on a White, gendered, cis-heteronormative standard (Okun's workbook, 2000, as adapted in Clare et al., 2019; see also Huffcutt, 2011; McCarthy & Cheng, 2014). Because of the different positionalities of the interviewer and interviewee, there is the possibility that no matter how exceptionally the interviewee performs, they will not be offered the position, irrespective of their objective performance in the interview or relevant qualifications. Bias is critical to consider in an unstructured job interview and is also essential to consider when selecting socially acceptable intervention targets. For example, for stereotypically gendered positions, a different amount of smiling may be important and may be beneficial or harmful relative to hiring outcomes (McCarthy & Cheng, 2018). This consideration also raises the potential for an inequitable selection of training targets for individuals of differing positionality when training interview skills and demonstrates the necessity of a closer analysis of the social importance of the effects of training these skills. Conversely, in the future, interviewer behaviors that lead to more equitable interviews could be a more socially appropriate target for change. In the present study, we attempted to target variables that had applicability across industries and jobs and to individualize based on the participants' career goals.


Although aligning with contemporary practice, the variables we selected may still be skewed by the normative standard in the workplace.

A final limitation is that most of what is used to identify behaviors and skills sought after in the job interview and in job interview training is based on subjective professional ratings. There is scant research that includes a direct measure of interview behaviors and a causal relation to hiring outcomes in real-world environments (Barrick et al., 2009; Posthuma et al., 2002; Van De Mieroop et al., 2019). In this study, we used a neutral rater to approximate the likelihood of hiring, but ongoing research should consider whether interview training leads to better job outcomes. Future researchers could use a group design to evaluate whether a group exposed to the training yields better job outcomes. Ultimately, a job interview aims to identify the best candidate for the job, so training practices should facilitate interviewees' ability to showcase their skills and potential and interviewers' ability to use interview information to select the best candidate.

One crucial area for further research is evaluating the procedures used in this study to train a variety of verbal behavior targets such as supervision, teaching, cultural responsiveness, bedside manner, and rapport building. Recent work has begun using BST to train soft skills and communication skills like active listening and assertiveness (Rohrer et al., 2021; Vostal et al., 2021). Improving these targets and other communicative skills may have multiple benefits for practitioners and researchers, including greater treatment acceptance and adherence by clients (Taylor et al., 2018), maintaining observation accuracy, and improving treatment integrity for interventions involving interpersonal interactions (Matey et al., 2021). The research on interview skills offers a framework for training other complex workplace skills. Despite limitations, the current study successfully replicated Stocco et al. (2017) and extended teaching interview skills to college students into a web-based format. We echo Stocco et al. in voicing the opportunity to address this skill deficit for students at the collegiate level. Furthermore, as our world moves toward employing more technology and web-based tools, our training procedures can incorporate those tools to build accessible and effective training for a variety of work-related skills.

Acknowledgments

The authors would like to thank Isabella Tassistro, Justin Santos, Anthony Mennella, Xikeria van der laan, Anise Wooten, Farah Contractor, and Andrew Smith for their assistance with interobserver agreement and treatment integrity. The authors would like to thank Jessica Nastasi for reviewing an earlier version of the manuscript.

Disclosure statement

No potential conflict of interest was reported by the authors.


ORCID

Davis E. Simmons  http://orcid.org/0000-0001-6350-1986
Nicole Gravina  http://orcid.org/0000-0001-8210-7159
Andressa Sleiman  http://orcid.org/0000-0001-8927-6203
Faris R. Kronfli  http://orcid.org/0000-0001-9301-2083

References

Asher, D. (2009). How to get any job: Life launch & relaunch for everyone under 30 (or how to avoid living in your parents’ basement). Ten Speed Press.

Barker, L.-K., Moore, J. W., Olmi, D. J., & Rowsey, K. (2019). A comparison of immediate and post-session feedback with behavioral skills training to improve interview skills in college students. Journal of Organizational Behavior Management, 39(3–4), 145–163. https://doi.org/10.1080/01608061.2019.1632240

Barrick, M. R., Shaffer, J. A., & DeGrassi, S. W. (2009). What you see may not be what you get: Relationships among self-presentation tactics and ratings of interview and job performance. Journal of Applied Psychology, 94(6), 1394–1411. https://doi.org/10.1037/a0016532

Basch, J. M., Melchers, K. G., Kurz, A., Krieger, M., & Miller, L. (2020). It takes more than a good camera: Which factors contribute to differences between face-to-face interviews and videoconference interviews regarding performance ratings and interviewee perceptions? Journal of Business & Psychology, 36(5), 921–940. https://doi.org/10.1007/s10869-020-09714-3

BenitezSantiago, A., & Miltenberger, R. G. (2016). Using video feedback to improve martial arts performance. Behavioral Interventions, 31(1), 12–27. https://doi.org/10.1002/bin.1424

Boutain, A. R., Sheldon, J. B., & Sherman, J. A. (2020). Evaluation of a telehealth parent training program in teaching self‐care skills to children with autism. Journal of Applied Behavior Analysis, 53(3), 1259–1275. https://doi.org/10.1002/jaba.743

Boyer, E., Miltenberger, R. G., Batsche, C., Fogel, V., & LeBlanc, L. (2009). Video modeling by experts with video feedback to enhance gymnastics skills. Journal of Applied Behavior Analysis, 42(4), 855–860. https://doi.org/10.1901/jaba.2009.42-855

Brock, M. E., Cannella-Malone, H. I., Seaman, R. L., Andzik, N. R., Schaefer, J. M., Page, E. J., Barczak, M. A., & Dueker, S. A. (2017). Findings across practitioner training studies in special education: A comprehensive review and meta-analysis. Exceptional Children, 84(1), 7–26. https://doi.org/10.1177/0014402917698008

Bye, H. H., Horverak, J. G., Sandal, G. M., Sam, D. L., & van de Vijver, F. J. R. (2013). Cultural fit and ethnic background in the job interview. International Journal of Cross Cultural Management, 14(1), 7–26. https://doi.org/10.1177/1470595813491237

Campbell, S., & Roberts, C. (2007). Migration, ethnicity and competing discourses in the job interview: Synthesizing the institutional and personal. Discourse & Society, 18(3), 243–271. https://doi.org/10.1177/0957926507075474

Carnett, A., Hansen, S., Tullis, C., & Machalicek, W. (2020). Using behavioural skills training via telehealth to increase teachers' use of communication interventions and increase student use of speech-generating devices in a high school functional skills classroom. Journal of Intellectual Disability Research, 65(2), 133–148. https://doi.org/10.1111/jir.12794

Chapman, D. S., & Webster, J. (2003). The use of technologies in the recruiting, screening, and selection processes for job candidates. International Journal of Selection and Assessment, 11(2–3), 113–120. https://doi.org/10.1111/1468-2389.00234


Clare, E. Y., Briones, E., Page, K., & Angers-Trottier, P. (2019). White supremacy culture in organizations. The Centre for Community Organizations (COCo). Retrieved from https://coco-net.org/wp-content/uploads/2019/11/Coco-WhiteSupCulture-ENG4.pdf

Clay, C. J., Schmitz, B. A., Balakrishnan, B., Hopfenblatt, J. P., Evans, A., & Kahng, S. W. (2021). Feasibility of virtual reality behavior skills training for preservice clinicians. Journal of Applied Behavior Analysis, 54(2), 547–565. https://doi.org/10.1002/jaba.809

Cooper, J. O., Heron, T. E., & Heward, W. L. (2020). Applied Behavior Analysis (3rd ed.). Pearson.

Davis, S., Thomson, K., & Connolly, M. (2019). A component analysis of behavioral skills training with volunteers teaching motor skills to individuals with developmental disabilities. Behavioral Interventions, 34(4), 431–450. https://doi.org/10.1002/bin.1688

Edgemon, A. K., Rapp, J. T., Brogan, K. M., Richling, S. M., Hamrick, S. A., Peters, R. J., & O’Rourke, S. A. (2020). Behavioral skills training to increase interview skills of adolescent males in a juvenile residential treatment facility. Journal of Applied Behavior Analysis, 53(4), 2303–2318. https://doi.org/10.1002/jaba.707

Erath, T. G., & DiGennaro Reed, F. D. (2020). A brief review of technology-based antecedent training procedures. Journal of Applied Behavior Analysis, 53(2), 1162–1169. https://doi.org/10.1002/jaba.633

Erath, T. G., DiGennaro Reed, F. D., & Blackman, A. L. (2021). Training human service staff to implement behavioral skills training using a video-based intervention. Journal of Applied Behavior Analysis, 54(3), 1251–1264. https://doi.org/10.1002/jaba.827

Ferguson, J., Craig, E. A., & Dounavi, K. (2019). Telehealth as a model for providing behaviour analytic interventions to individuals with autism spectrum disorder: A systematic review. Journal of Autism & Developmental Disorders, 49(2), 582–616. https://doi.org/10.1007/s10803-018-3724-5

Fienup, D. M., & Carr, J. E. (2021). The use of performance criteria for determining “mastery” in discrete‐trial instruction: A call for research. Behavioral Interventions, 36(4), 756–763. https://doi.org/10.1002/bin.1827

Fisher, W. W., Luczynski, K. C., Blowers, A. P., Vosters, M. E., Pisman, M. D., Craig, A. R., Hood, S. A., Machado, M. A., Lesser, A. D., & Piazza, C. C. (2020). A randomized clinical trial of a virtual‐training program for teaching applied‐behavior‐analysis skills to parents of children with autism spectrum disorder. Journal of Applied Behavior Analysis, 53(4), 1856–1875. https://doi.org/10.1002/jaba.778

Fisher, W. W., Luczynski, K. C., Hood, S. A., Lesser, A. D., Machado, M. A., & Piazza, C. C. (2014). Preliminary findings of a randomized clinical trial of a virtual training program for Applied Behavior Analysis technicians. Research in Autism Spectrum Disorders, 8(9), 1044–1054. https://doi.org/10.1016/j.rasd.2014.05.002

Fuller, J., Raman, M., Sage-Gavin, E., & Hines, K. (2021, September). Hidden workers: Untapped talent. Harvard Business School Project on Managing the Future of Work and Accenture. https://www.hbs.edu/managing-the-future-of-work/Documents/research/hiddenworkers09032021.pdf

Futurestep, & Kornferry. (2015, July 15). Korn Ferry executive survey: Video interviewing becomes a mainstay; companies are implementing new video recruiting tactics. Korn Ferry. Retrieved September 27, 2021, from https://www.kornferry.com/about-us/press/Korn-Ferry-executive-survey-video-Interviewing-becomes-a-mainstay-companies-are-implementing-new-video-recruiting-tactics

Gartner, Inc. (2020). Virtual interviews to hire candidates during COVID. Retrieved September 27, 2021, from https://www.gartner.com/en/newsroom/press-releases/2020-04-30-gartner-hr-survey-shows-86–of-organizations-are-cond

110 D. E. SIMMONS ET AL.

Haensel, J. X., Smith, T. J., & Senju, A. (2021). Cultural differences in mutual gaze during face-to-face interactions: A dual head-mounted eye-tracking study. Visual Cognition, 29(10), 1–16. https://doi.org/10.1080/13506285.2021.1928354

Higgins, M. (2014). Using the STAR technique to shine at job interviews: A how-to guide. Retrieved from https://www.theguardian.com/careers/careers-blog/star-technique-competency-based-interview

Higgins, W. J., Luczynski, K. C., Carroll, R. A., Fisher, W. W., & Mudford, O. C. (2017). Evaluation of a telehealth training package to remotely train staff to conduct a preference assessment. Journal of Applied Behavior Analysis, 50(2), 238–251. https://doi.org/10.1002/jaba.370

Hsu, C.-F., Wang, Y.-S., Lei, C.-L., & Chen, K.-T. (2019). Look at me! Correcting eye gaze in live video communication. ACM Transactions on Multimedia Computing, Communications, and Applications, 15(2), 1–21. https://doi.org/10.1145/3311784

Huffcutt, A. I. (2011). An empirical review of the employment interview construct literature. International Journal of Selection and Assessment, 19(1), 62–81. https://doi.org/10.1111/j.1468-2389.2010.00535.x

Kirkpatrick, M., Akers, J., & Rivera, G. (2019). Use of behavioral skills training with teachers: A systematic review. Journal of Behavioral Education, 28(3), 344–361. https://doi.org/10.1007/s10864-019-09322-z

LeBlanc, L. A., Lerman, D. C., & Normand, M. P. (2020). Behavior analytic contributions to public health and telehealth. Journal of Applied Behavior Analysis, 53(3), 1208–1218. https://doi.org/10.1002/jaba.749

Lloveras, L. A., Tate, S. A., Vollmer, T. R., King, M., Jones, H., & Peters, K. P. (2022). Training behavior analysts to conduct functional analyses using a remote group behavioral skills training package. Journal of Applied Behavior Analysis, 55(1), 290–304. https://doi.org/10.1002/jaba.893

Macan, T. (2009). The employment interview: A review of current studies and directions for future research. Human Resource Management Review, 19(3), 203–218. https://doi.org/10.1016/j.hrmr.2009.03.006

Magnacca, C., Thomson, K., Marcinkiewicz, A., Davis, S., Steel, L., Lunsky, Y., Fung, K., Vause, T., & Redquest, B. (2021). A telecommunication model to teach facilitators to deliver acceptance and commitment training. Behavior Analysis in Practice, 15(3), 730–751. https://doi.org/10.1007/s40617-021-00628-x

Matey, N., Sleiman, A., Nastasi, J., Richard, E., & Gravina, N. (2021). Varying reactions to feedback and their effects on observer accuracy and feedback omission. Journal of Applied Behavior Analysis, 54(3), 1188–1198. https://doi.org/10.1002/jaba.840

McCarthy, J. M., & Cheng, B. H. (2014). Through the looking glass: Employment interviews from the lens of job candidates. Oxford Handbooks Online. https://doi.org/10.1093/oxfordhb/9780199764921.013.015

McDougale, C. B., Richling, S. M., Longino, E. B., & O’Rourke, S. A. (2020). Mastery criteria and maintenance: A descriptive analysis of applied research procedures. Behavior Analysis in Practice, 13(2), 402–410. https://doi.org/10.1007/s40617-019-00365-2

Michel, R. S., Belur, V., Naemi, B., & Kell, H. J. (2019). Graduate admissions practices: A targeted review of the literature. ETS Research Report Series, 2019(1), 1–18. https://doi.org/10.1002/ets2.12271

Montes, C. C., Heinicke, M. R., Guendulain, M. A., & Morales, E. (2021). A component analysis of awareness training for reducing speech disfluencies. Journal of Applied Behavior Analysis, 54(2), 770–782. https://doi.org/10.1002/jaba.795

JOURNAL OF ORGANIZATIONAL BEHAVIOR MANAGEMENT 111

Paulhus, D. L., Westlake, B. G., Calvez, S. S., & Harms, P. D. (2013). Self-presentation style in job interviews: The role of personality and culture. Journal of Applied Social Psychology, 43(10), 2042–2059. https://doi.org/10.1111/jasp.12157

Pellegrino, A. J., & DiGennaro Reed, F. D. (2020). Using telehealth to teach valued skills to adults with intellectual and developmental disabilities. Journal of Applied Behavior Analysis, 53(3), 1276–1289. https://doi.org/10.1002/jaba.734

Posthuma, R. A., Morgeson, F. P., & Campion, M. A. (2002). Beyond employment interview validity: A comprehensive narrative review of recent research and trends over time. Personnel Psychology, 55(1), 1–81. https://doi.org/10.1111/j.1744-6570.2002.tb00103.x

Roberts, K., DeQuinzio, J. A., Taylor, B. A., & Petroski, J. (2020). Using behavioral skills training to teach interview skills to young adults with autism. Journal of Behavioral Education, 30(4), 664–683. https://doi.org/10.1007/s10864-020-09389-z

Rohrer, J. L., Marshall, K. B., Suzio, C., & Weiss, M. J. (2021). Soft skills: The case for compassionate approaches or how behavior analysis keeps finding its heart. Behavior Analysis in Practice, 14(4), 1135–1143. https://doi.org/10.1007/s40617-021-00563-x

Sigurdsson, S. O., & Austin, J. (2008). Using real-time visual feedback to improve posture at computer workstations. Journal of Applied Behavior Analysis, 41(3), 365–375. https://doi.org/10.1901/jaba.2008.41-365

Slane, M., & Lieberman-Betz, R. G. (2021). Using behavioral skills training to teach implementation of behavioral interventions to teachers and other professionals: A systematic review. Behavioral Interventions, 36(4), 984–1002. https://doi.org/10.1002/bin.1828

Stocco, C. S., Thompson, R. H., Hart, J. M., & Soriano, H. L. (2017). Improving the interview skills of college students using behavioral skills training. Journal of Applied Behavior Analysis, 50(3), 495–510. https://doi.org/10.1002/jaba.385

Taylor, B. A., LeBlanc, L. A., & Nosik, M. R. (2018). Compassionate care in behavior analytic treatment: Can outcomes be enhanced by attending to relationships with caregivers? Behavior Analysis in Practice, 12(3), 654–666. https://doi.org/10.1007/s40617-018-00289-3

Tomlinson, S. R., Gore, N., & McGill, P. (2018). Training individuals to implement applied behavior analytic procedures via telehealth: A systematic review of the literature. Journal of Behavioral Education, 27(2), 172–222. https://doi.org/10.1007/s10864-018-9292-0

Uhlmann, E. L., Heaphy, E., Ashford, S. J., Zhu, L., & Sanchez-Burks, J. (2013). Acting professional: An exploration of culturally bounded norms against nonwork role referencing. Journal of Organizational Behavior, 34(6), 866–886. https://doi.org/10.1002/job.1874

U.S. Bureau of Labor Statistics. (2000–2020). Labor force statistics from the Current Population Survey. Retrieved February 14, 2022, from https://data.bls.gov/pdq/SurveyOutputServlet

Van De Mieroop, D., Clifton, J., & Schreurs, C. (2019). The interactional negotiation of the rules of the employment interview game: Negative remarks about third parties and “doing” trust. International Journal of Business Communication, 56(4), 560–585. https://doi.org/10.1177/2329488416673816

Vostal, B. R., Mrachko, A. A., Vostal, M., & McCoy, A. (2021). Effects of group behavioral skills training on teacher candidates’ acquisition and maintenance of active listening. Journal of Behavioral Education, 31(4), 679–698. https://doi.org/10.1007/s10864-021-09431-8

Ward-Horner, J., & Sturmey, P. (2012). Component analysis of behavior skills training in functional analysis. Behavioral Interventions, 27(2), 75–92. https://doi.org/10.1002/bin.1339

Wirantana, V., Stocco, C. S., & Kohn, C. S. (2019). The implementation and adoptability of behavioral skills training in a university career center. Behavioral Interventions, 35(1), 84–98. https://doi.org/10.1002/bin.1692

  • Abstract
  • Method
    • Participants and setting
    • Dependent variables
      • Vocal responses
        • Appropriate responding to questions
        • Explaining interest
        • Explaining relevant experience
        • Conflict resolution
        • Soft skills
        • Get to know you
        • Future orientation
        • Interpersonal skills
        • Appropriate question asking
      • Non-vocal responses
        • Body language
        • Facial-orientation
        • Posture
        • Environmental set up
        • Lighting
        • Visual background
        • Body placement on screen
    • Interobserver agreement
    • Procedures
      • Pre-work
        • Background questionnaire
        • Interview homework materials
      • Evaluative simulated interviews (Baseline)
      • Behavioral skills training
      • Post training interviews and follow-up plus generalization probe
    • Procedural integrity
    • Social validity
    • Neutral rater
  • Results
    • Interview skills
    • Social validity measures
    • Duration and number of simulated interviews
  • Discussion
  • Acknowledgments
  • Disclosure statement
  • ORCID
  • References