1. Briggs AM, Zohr SJ, Harvey OB. Training individuals to implement discrete-trial teaching procedures using behavioral skills training: A scoping review with implications for practice and research. J Appl Behav Anal. 2024;57:86-103. PMID: 37772639. DOI: 10.1002/jaba.1024.
Abstract
Behavioral skills training (BST) is an evidence-based approach for training individuals to implement discrete-trial teaching procedures. Despite the effectiveness of this approach, implementing BST can be time- and resource-intensive, which may interfere with a clinical organization's adoption of this training format. We conducted a scoping review of studies using BST components to train discrete-trial teaching procedures in peer-reviewed articles published between 1977 and 2021. We identified 51 studies in 46 publications involving 354 participants. We coded descriptive data on (a) participant characteristics, (b) study characteristics, (c) training conditions (including instructions, modeling, rehearsal, and feedback), and (d) training outcomes. The results indicated that studies have primarily attempted to improve the efficacy and efficiency of BST by modifying or omitting common training components. We provide best-practice considerations for using BST to teach discrete-trial teaching procedures and offer a research agenda to guide future investigation in this area.
Affiliation(s)
- Adam M Briggs
- Department of Psychology, Eastern Michigan University, Ypsilanti, MI, USA
- Samantha J Zohr
- Department of Psychology, Eastern Michigan University, Ypsilanti, MI, USA
- Olivia B Harvey
- Department of Psychology, Eastern Michigan University, Ypsilanti, MI, USA
2. Tamrazi S, Wiskow KM. Effects of omission and commission errors during tact instruction. J Appl Behav Anal. 2023;56:720-728. PMID: 37644662. DOI: 10.1002/jaba.1020.
Abstract
The purpose of the current study was to compare the effects of omission and commission errors of reinforcement during tact instruction delivered via telehealth with three children, 6 to 7 years of age, who were diagnosed with an autism spectrum disorder. We used an adapted alternating-treatments design to evaluate acquisition of target stimuli across high-integrity, commission-error, and omission-error conditions. The high-integrity condition reached mastery criterion in fewer sessions than the integrity-error conditions in four of six comparisons, and the omission condition reached mastery criterion in fewer sessions than the commission condition in five of six comparisons.
Affiliation(s)
- Katie M Wiskow
- California State University, Stanislaus, Turlock, CA, USA
3. Bergmann S, Long BP, St Peter CC, Brand D, Strum MD, Han JB, Wallace MD. A detailed examination of reporting procedural fidelity in the Journal of Applied Behavior Analysis. J Appl Behav Anal. 2023;56:708-719. PMID: 37572025. DOI: 10.1002/jaba.1015.
Abstract
Few reviews of procedural fidelity (the degree to which procedures are implemented as designed) provide enough detail to gauge the quality of fidelity reporting in behavior-analytic research. This review focused on experiments in the Journal of Applied Behavior Analysis (2006-2021) with "integrity" or "fidelity" in the abstract or body. When fidelity data were collected, the coders characterized measurement details (e.g., description of calculation, report of single or multiple values, frequency of fidelity checks, checklist use). The researchers found increasing trends in describing the calculation(s), reporting multiple values, and stating the frequency of measurement. Few studies described using a checklist. Most studies reported fidelity as a percentage, with high obtained values (M = 97%). When authors acknowledged not collecting fidelity data as a limitation, they rarely provided a rationale for the omission. We discuss recommendations for reporting procedural fidelity to increase the quality of and transparency in behavior-analytic research.
Affiliation(s)
- Samantha Bergmann
- Department of Behavior Analysis, University of North Texas, Denton, TX, USA
- Brian P Long
- Department of Psychology, West Virginia University, Morgantown, WV, USA
- Claire C St Peter
- Department of Psychology, West Virginia University, Morgantown, WV, USA
- Denys Brand
- Department of Psychology, California State University, Sacramento, CA, USA
- Marcus D Strum
- Department of Behavior Analysis, University of North Texas, Denton, TX, USA
- Justin B Han
- Department of Child and Family Studies, University of South Florida, Tampa, FL, USA
- Michele D Wallace
- Department of Special Education & Counseling, California State University, Los Angeles, CA, USA
4. Han JB, Bergmann S, Brand D, Wallace MD, St Peter CC, Feng J, Long BP. Trends in reporting procedural integrity: A comparison. Behav Anal Pract. 2023;16:388-398. PMID: 37187851. PMCID: PMC10169953. DOI: 10.1007/s40617-022-00741-5.
Abstract
Procedural integrity refers to the extent to which an independent variable is implemented as described. Measuring procedural integrity is one important factor when considering the internal and external validity of experiments. Experimental articles in behavior-analytic journals have rarely reported procedural-integrity data. The purpose of this study was to update previous reviews of whether articles published in the Journal of Applied Behavior Analysis (JABA) reported procedural integrity, spanning 1980 to 2020, and to compare reporting in JABA with recent reviews of studies published in Behavior Analysis in Practice (2008-2019) and the Journal of Organizational Behavior Management (2000-2020). Procedural integrity continues to be underreported across all three journals, but an increasing trend in reporting is evident in the Journal of Applied Behavior Analysis and Behavior Analysis in Practice. In addition to our recommendations and implications for research and practice, we provide examples and resources to assist researchers and practitioners with recording and reporting integrity data.
Affiliation(s)
- Samantha Bergmann
- Department of Behavior Analysis, University of North Texas, 1155 Union Circle #310919, Denton, TX 76203-5017, USA
- Denys Brand
- California State University, Sacramento, Sacramento, CA, USA
- Jennifer Feng
- California State University, Los Angeles, Los Angeles, CA, USA
5. Bergmann S, Niland H, Gavidia VL, Strum MD, Harman MJ. Comparing multiple methods to measure procedural fidelity of discrete-trial instruction. Education and Treatment of Children. 2023;46:1-20. PMID: 37362029. PMCID: PMC10208552. DOI: 10.1007/s43494-023-00094-w.
Abstract
Procedural fidelity is the extent to which an intervention is implemented as designed and is an important component of research and practice. There are multiple ways to measure procedural fidelity, and few studies have explored how procedural fidelity varies based on the method of measurement. The current study compared behavior technicians' adherence to discrete-trial instruction protocols with a child with autism when observers used different procedural-fidelity measures. We collected individual-component and individual-trial fidelity with an occurrence-nonoccurrence data sheet and compared these scores to global fidelity and to all-or-nothing, 3-point Likert scale, and 5-point Likert scale measurement methods. The all-or-nothing method required that all instances of a component or trial be implemented without error to be scored correct. The Likert scales used a rating system to score components and trials. At the component level, we found that the global, 3-point Likert, and 5-point Likert methods were likely to overestimate fidelity and mask component errors, whereas the all-or-nothing method was unlikely to mask errors. At the trial level, we found that the global and 5-point Likert methods approximated individual-trial fidelity, the 3-point Likert method overestimated fidelity, and the all-or-nothing method underestimated fidelity. The occurrence-nonoccurrence method required the most time to complete, and all-or-nothing by trial required the least. We discuss the implications of measuring procedural fidelity with different methods of measurement, including false positives and false negatives, and provide suggestions for practice and research. Supplementary information: The online version contains supplementary material available at 10.1007/s43494-023-00094-w.
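The contrast this abstract draws between global and all-or-nothing scoring can be sketched in a few lines. This is an illustrative example only, with hypothetical data and function names; it is not the study's materials or code.

```python
# Illustrative sketch of two fidelity-scoring methods described in the
# abstract above. Data and function names are hypothetical.
# Each trial is a list of booleans: True = component implemented correctly.

trials = [
    [True, True, True, True],   # all four components correct
    [True, True, False, True],  # one component error
    [True, True, True, True],
    [True, False, False, True], # two component errors
]

def global_fidelity(trials):
    """Percentage of all component opportunities implemented correctly."""
    components = [c for trial in trials for c in trial]
    return 100 * sum(components) / len(components)

def all_or_nothing_by_trial(trials):
    """Percentage of trials in which every component was correct."""
    return 100 * sum(all(trial) for trial in trials) / len(trials)

print(global_fidelity(trials))         # 81.25
print(all_or_nothing_by_trial(trials)) # 50.0
```

On the same observation, the global method yields 81.25% while all-or-nothing by trial yields 50%, mirroring the abstract's finding that all-or-nothing scoring is stricter at the trial level than a global percentage.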
Affiliation(s)
- Samantha Bergmann
- Department of Behavior Analysis, University of North Texas, 1155 Union Circle #310919, Denton, TX 76203, USA
- Haven Niland
- Department of Behavior Analysis, University of North Texas, 1155 Union Circle #310919, Denton, TX 76203, USA
- Kristin Farmer Autism Center, University of North Texas, Denton, TX, USA
- Valeria Laddaga Gavidia
- Department of Behavior Analysis, University of North Texas, 1155 Union Circle #310919, Denton, TX 76203, USA
- Kristin Farmer Autism Center, University of North Texas, Denton, TX, USA
- Marcus D. Strum
- Department of Behavior Analysis, University of North Texas, 1155 Union Circle #310919, Denton, TX 76203, USA
- Michael J. Harman
- Department of Psychology, Briar Cliff University, Sioux City, IA, USA
6. Falakfarsa G, Brand D, Bensemann J, Jones L, Miguel CF, Heinicke MR, Mason MA. A parametric analysis of procedural fidelity errors following mastery of a task: A translational study. J Appl Behav Anal. 2023. PMID: 37157109. DOI: 10.1002/jaba.992.
Abstract
Procedural fidelity is defined as the extent to which the independent variable is implemented as prescribed. Research using computerized tasks has shown that fidelity errors involving consequences for behavior can hinder skill acquisition. However, studies examining the effects of these errors once skills have been mastered are lacking. Thus, this translational study investigated the effects of varying levels of fidelity following mastery of a computerized arbitrary matching-to-sample task. A group design (consisting of five groups) was used in which college students initially completed 250 trials during which no programmed errors (i.e., perfect fidelity) were arranged, followed by an additional 250 trials with consequences delivered across various levels of fidelity (i.e., 20, 40, 60, 80, and 100% of trials administered without errors). The results showed that participants assigned to higher fidelity conditions performed better (on average). These results extended the findings of previous studies by demonstrating how errors involving consequences affect behavior across various stages of learning.
Affiliation(s)
- Galan Falakfarsa
- Department of Psychology, California State University, Sacramento, CA, USA
- Denys Brand
- Department of Psychology, California State University, Sacramento, CA, USA
- Joshua Bensemann
- School of Computer Science, University of Auckland, Auckland, New Zealand
- Lea Jones
- Department of Psychology, California State University, Sacramento, CA, USA
- Caio F Miguel
- Department of Psychology, California State University, Sacramento, CA, USA
- Megan R Heinicke
- Department of Psychology, California State University, Sacramento, CA, USA
- Makenna A Mason
- Department of Psychology, California State University, Sacramento, CA, USA
7. Jones SH, St Peter CC. Nominally acceptable integrity failures negatively affect interventions involving intermittent reinforcement. J Appl Behav Anal. 2022;55:1109-1123. PMID: 35822271. DOI: 10.1002/jaba.944.
Abstract
The finding that differential reinforcement of alternative behavior (DRA) is efficacious at 80% integrity when continuous reinforcement is programmed for alternative responding may have contributed to a perception that integrity at 80% or above is acceptable. However, research also suggests that other interventions (e.g., noncontingent reinforcement) may not remain effective at 80% integrity. The conditions under which 80% integrity is acceptable for common behavioral interventions remain unclear. Therefore, we conducted two human-operant studies to evaluate the effects of 80% integrity for interventions with contingent or noncontingent intermittent reinforcement schedules. During Experiment 1, we compared noncontingent reinforcement (NCR) and DRA when implemented with 80% integrity. During Experiment 2, we compared two variations of DRA, which included either a ratio or an interval schedule to reinforce alternative behavior. Results replicated previous research showing that DRA with an FR-1 schedule programmed for alternative responding resulted in consistent target-response suppression, even when integrity was reduced to 80%. In contrast, neither NCR nor interval-based DRA was consistently effective when implemented at 80% integrity. These results demonstrate that 80% integrity is not a uniformly acceptable minimal level of integrity.
8. Kodak T, Bergmann S, Waite M. Strengthening the procedural fidelity research-to-practice loop in animal behavior. J Exp Anal Behav. 2022;118:215-236. PMID: 35789486. DOI: 10.1002/jeab.780.
Abstract
Procedural fidelity is the extent to which components of an intervention are implemented as designed. Procedural fidelity can be measured as a dependent variable or manipulated as an independent variable, and in research and practice, procedural-fidelity data should be collected, monitored, and reported. Procedural fidelity as an independent variable has been investigated in humans using parametric analyses, and the current article summarizes some of the research on the effects of procedural-fidelity errors in behavior-reduction and skill-acquisition interventions. Connections are drawn to applied animal research and the work of animal behavior practitioners, yielding implications for practice with human and animal clients and suggestions for future research. Further, there are multiple ways to measure procedural fidelity, and different conclusions can be drawn depending on the measure and computation method. The current article describes the procedural-fidelity measures most applicable to animal behavior researchers and professionals.
Affiliation(s)
- Mindy Waite
- Department of Psychology, University of Wisconsin, Milwaukee, WI, USA
9. Berdeaux KL, Lerman DC, Williams SD. Effects of environmental distractions on teachers' procedural integrity with three function-based treatments. J Appl Behav Anal. 2022;55:832-850. PMID: 35377494. DOI: 10.1002/jaba.918.
Abstract
Past research has demonstrated the effectiveness of teacher-implemented, function-based treatments for problem behavior, but no studies have evaluated the impact of distractions on teachers' procedural integrity. In this proof-of-concept study, the experimenters employed a laboratory analog to examine the impact of distractions on levels of integrity when five teachers implemented three different treatments. Although integrity was similar across treatments when the setting was free of distractions, integrity declined for all teachers in the presence of student-driven distractions. In general, distractions had a greater impact on the integrity of differential negative reinforcement of alternative behavior (DNRA) than on differential negative reinforcement of other behavior (DNRO) and noncontingent escape (NCE), particularly for the delivery of reinforcement. However, teachers tended to have lower levels of integrity when responding to problem behavior during DNRO. These findings support the potential viability of this approach for studying factors that impede procedural integrity in the classroom.
Affiliation(s)
- Kally L Berdeaux
- Department of Clinical, Health, and Applied Sciences, University of Houston-Clear Lake, Houston, TX, USA
- Dorothea C Lerman
- Department of Clinical, Health, and Applied Sciences, University of Houston-Clear Lake, Houston, TX, USA
10. Falakfarsa G, Brand D, Jones L, Godinez ES, Richardson DC, Hanson RJ, Velazquez SD, Wills C. Treatment integrity reporting in Behavior Analysis in Practice, 2008–2019. Behav Anal Pract. 2021;15:443-453. DOI: 10.1007/s40617-021-00573-9.
11. Hranchuk KS, Williams MJ. Addressing the feasibility of the teacher performance rate and accuracy scale as a treatment integrity tool. Behavioral Interventions. 2021. DOI: 10.1002/bin.1774.
Affiliation(s)
- Kieva S. Hranchuk
- Scottsdale Children's Institute, Scottsdale, AZ, USA
- Department of Psychology, Arizona State University, Tempe, AZ, USA
12. St Peter CC, Shuler NJ, Jones SH, Bradtke S, Hull SL, Browning B, VanGilder S, Petitto C. Comparing training methods to improve volunteer skills during therapeutic horseback riding: A randomized control trial. J Appl Behav Anal. 2021;54:1157-1174. PMID: 33730397. DOI: 10.1002/jaba.823.
Abstract
Although in-vivo behavioral skills training (BST) is often effective, it may be too resource-intensive for organizations that rely on volunteers. Alternatives to in-vivo BST include video models or interactive computer training (ICT), but the utility of these procedures for training volunteers remains largely unknown. We used a randomized control trial to teach multiple skills to new volunteers at a therapeutic riding center. A total of 60 volunteers were assigned to one of three groups using block randomization. Depending on group assignment, volunteers received instructions and modeling through in-vivo interactions, a video model, or ICT. All volunteers completed in-vivo role plays with feedback. Skills were measured by unblinded observers during role plays. There were no statistically significant differences in accuracy of role-play performance between volunteers in the in-vivo BST and ICT groups, but both outperformed the video-model group. The ICT and video model required statistically significantly less time from a live instructor than did in-vivo training. Thus, although in-vivo BST remains best practice, ICT may be a viable alternative when training resources are limited.
13. Bottini S, Morton H, Gillis J, Romanczyk R. The use of mixed modeling to evaluate the impact of treatment integrity on learning. Behavioral Interventions. 2020. DOI: 10.1002/bin.1718.
Affiliation(s)
- Summer Bottini
- Psychology Department, Binghamton University, Binghamton, NY, USA
- Hannah Morton
- Psychology Department, Binghamton University, Binghamton, NY, USA
- Jennifer Gillis
- Psychology Department, Binghamton University, Binghamton, NY, USA
14. Clayton M, Headley A. The use of behavioral skills training to improve staff performance of discrete trial training. Behavioral Interventions. 2019. DOI: 10.1002/bin.1656.
Affiliation(s)
- Michael Clayton
- Department of Psychology, Missouri State University, Springfield, MO, USA
- Ali Headley
- Department of Psychology, Missouri State University, Springfield, MO, USA
15. Neely L, Rispoli M, Boles M, Morin K, Gregori E, Ninci J, Hagan-Burke S. Interventionist acquisition of incidental teaching using pyramidal training via telehealth. Behav Modif. 2018;43:711-733. PMID: 29938528. DOI: 10.1177/0145445518781770.
Abstract
We investigated the effects of a telehealth pyramidal training package on participants' implementation of incidental teaching. A total of eight adults worked with eight children with autism. Coaches were first taught to implement incidental teaching and then taught subsequent interventionists. The training package consisted of an online module and delayed video-based feedback provided via videoconferencing. Following the telehealth training program, coaches and interventionists reached the preset performance criteria and implemented incidental teaching with high fidelity. All of the child participants increased mands above baseline levels. Results suggest that interventionists can be trained via telehealth in behavior analytic interventions.
Affiliation(s)
- Leslie Neely
- The University of Texas at San Antonio, San Antonio, TX, USA
- Margot Boles
- The University of Texas at San Antonio, San Antonio, TX, USA
- Kristi Morin
- The University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
16. Brand D, Elliffe D, DiGennaro Reed FD. Using sequential analysis to assess component integrity of discrete-trial teaching programs. European Journal of Behavior Analysis. 2017. DOI: 10.1080/15021149.2017.1404392.
Affiliation(s)
- Denys Brand
- Department of Applied Behavioral Science, University of Kansas, Lawrence, KS, USA
- Douglas Elliffe
- School of Psychology, University of Auckland, Auckland, New Zealand
17. Bergmann SC, Kodak TM, LeBlanc BA. Effects of programmed errors of omission and commission during auditory-visual conditional discrimination training with typically developing children. The Psychological Record. 2016. DOI: 10.1007/s40732-016-0211-2.