1
Pasula EY, Brown GG, McKenna BS, Mellor A, Turner T, Anderson C, Drummond SPA. Effects of sleep deprivation on component processes of working memory in younger and older adults. Sleep 2018; 41:zsx213. [DOI: 10.1093/sleep/zsx213]
Affiliation(s)
- Elissa Y Pasula
- Monash Institute of Cognitive and Clinical Neuroscience, School of Psychological Sciences, Monash University Clayton, Clayton, Australia
- Gregory G Brown
- Department of Psychiatry, University of California San Diego, San Diego
- Alix Mellor
- Monash Institute of Cognitive and Clinical Neuroscience, School of Psychological Sciences, Monash University Clayton, Clayton, Australia
- Travis Turner
- Department of Neurology, Medical University of South Carolina, Charleston
- Clare Anderson
- Monash Institute of Cognitive and Clinical Neuroscience, School of Psychological Sciences, Monash University Clayton, Clayton, Australia
- Sean P A Drummond
- Monash Institute of Cognitive and Clinical Neuroscience, School of Psychological Sciences, Monash University Clayton, Clayton, Australia
- Department of Psychiatry, University of California San Diego, San Diego
2
Brown GG, Thomas ML, Patt V. Parametric model measurement: reframing traditional measurement ideas in neuropsychological practice and research. Clin Neuropsychol 2017; 31:1047-1072. [PMID: 28617067] [DOI: 10.1080/13854046.2017.1334829]
Abstract
OBJECTIVE: Neuropsychology is an applied measurement field whose psychometric work is built primarily upon classical test theory (CTT). We describe a series of psychometric models to supplement the use of CTT in neuropsychological research and test development. METHOD: We introduce increasingly complex psychometric models as measurement algebras, which include model parameters representing abilities and item properties. Within this framework of parametric model measurement (PMM), neuropsychological assessment involves estimating model parameters, with ability parameter values assuming the role of test 'scores'. The traditional notion of measurement error is replaced by the notion of parameter estimation error, and the definition of reliability becomes linked to notions of item and test information. The more complex PMM approaches incorporate into the assessment of neuropsychological performance formal parametric models of behavior, validated in the experimental psychology literature, along with item parameters. These approaches endorse the use of experimental manipulations of model parameters to assess a test's construct representation. Strengths and weaknesses of the models are evaluated in terms of measurement error conditional upon ability level, sensitivity to sample characteristics, computational challenges to parameter estimation, and construct validity. CONCLUSION: A family of parametric psychometric models can be used to assess latent processes of interest to neuropsychologists. By modeling latent abilities at the item level, psychometric studies in neuropsychology can investigate construct validity and measurement precision within a single framework and contribute to a unification of statistical methods under generalized latent variable modeling.
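The abstract's core move, treating an estimated ability parameter as the test 'score', can be illustrated with a toy two-parameter logistic (2PL) item response model. This is a generic sketch, not the authors' models; the item parameters and response pattern below are invented for illustration.

```python
import math

def p_correct(theta, a, b):
    # 2PL item response model: probability of a correct response
    # given ability theta, item discrimination a, and item difficulty b.
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def estimate_theta(responses, items, step=0.01, lo=-4.0, hi=4.0):
    # Maximum-likelihood ability estimate via a simple grid search.
    # In PMM terms, this estimate plays the role of the test 'score'.
    grid = [lo + k * step for k in range(int((hi - lo) / step) + 1)]
    def loglik(theta):
        ll = 0.0
        for x, (a, b) in zip(responses, items):
            p = p_correct(theta, a, b)
            ll += math.log(p) if x else math.log(1.0 - p)
        return ll
    return max(grid, key=loglik)

# Hypothetical 4-item test: (discrimination, difficulty) pairs.
items = [(1.2, -1.0), (1.0, 0.0), (0.8, 1.0), (1.5, 0.5)]
theta_hat = estimate_theta([1, 1, 0, 0], items)  # passed easier items, failed harder ones
```

Under this framing, 'measurement error' becomes the estimation error of theta_hat, which shrinks where items carry more information near the examinee's ability level.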
Affiliation(s)
- Gregory G Brown
- Psychology Service (116B), VA San Diego Healthcare System, San Diego, CA, USA
- Michael L Thomas
- Department of Psychiatry, University of California, San Diego, CA, USA
- Virginie Patt
- San Diego State University/University of California San Diego Joint Doctoral Program in Clinical Psychology, San Diego, CA, USA
3
Thomas ML, Patt VM, Bismark A, Sprock J, Tarasenko M, Light GA, Brown GG. Evidence of systematic attenuation in the measurement of cognitive deficits in schizophrenia. J Abnorm Psychol 2017; 126:312-324. [PMID: 28277736] [PMCID: PMC5378601] [DOI: 10.1037/abn0000256]
Abstract
Cognitive tasks that are too hard or too easy produce imprecise measurements of ability, which, in turn, attenuate group differences and can lead to inaccurate conclusions in clinical research. We aimed to illustrate this problem using a popular experimental measure of working memory, the N-back task, and to suggest corrective strategies for measuring working memory and other cognitive deficits in schizophrenia. Samples of undergraduates (n = 42), community controls (n = 25), outpatients with schizophrenia (n = 33), and inpatients with schizophrenia (n = 17) completed the N-back. Predictors of task difficulty, including load, number of word syllables, and presentation time, were experimentally manipulated. Using a methodology that combined techniques from signal detection theory and item response theory, we examined predictors of difficulty and precision on the N-back task. Load and item type were the two strongest predictors of difficulty. Measurement precision was associated with ability, and ability varied by group; as a result, patients were measured more precisely than controls. Although difficulty was well matched to the ability levels of impaired examinees, most task conditions were too easy for nonimpaired participants. In a simulation study, N-back tasks consisting primarily of 1- and 2-back load conditions were unreliable and attenuated effect size (Cohen's d) by as much as 50%. The results suggest that N-back tasks, as commonly designed, may underestimate patients' cognitive deficits as a result of nonoptimized measurement properties. Overall, this cautionary study provides a template for identifying and correcting measurement problems in clinical studies of abnormal cognition.
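The attenuation mechanism the abstract describes can be reproduced in a few lines: adding measurement error to both groups' scores shrinks the observed Cohen's d toward roughly d multiplied by the square root of reliability. This is a generic simulation of the statistical point, not the paper's actual simulation study; all numbers below are invented.

```python
import math
import random
import statistics

def cohens_d(x, y):
    # Standardized mean difference using a pooled standard deviation.
    nx, ny = len(x), len(y)
    pooled = math.sqrt(((nx - 1) * statistics.variance(x) +
                        (ny - 1) * statistics.variance(y)) / (nx + ny - 2))
    return (statistics.mean(x) - statistics.mean(y)) / pooled

random.seed(0)
n = 5000
controls = [random.gauss(1.0, 1.0) for _ in range(n)]  # true-score difference: d = 1.0
patients = [random.gauss(0.0, 1.0) for _ in range(n)]

# Unreliable test: error variance equals true-score variance (reliability = 0.5).
obs_controls = [t + random.gauss(0.0, 1.0) for t in controls]
obs_patients = [t + random.gauss(0.0, 1.0) for t in patients]

d_true = cohens_d(controls, patients)         # near 1.0
d_obs = cohens_d(obs_controls, obs_patients)  # near 1.0 * sqrt(0.5), i.e. ~0.71
```

With reliability 0.5 the observed effect shrinks by about 30%; the paper reports attenuation of up to 50% for poorly targeted N-back designs.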
Affiliation(s)
- Michael L. Thomas
- Department of Psychiatry, University of California San Diego, La Jolla, CA, United States
- VISN-22 Mental Illness, Research, Education and Clinical Center (MIRECC), VA San Diego Healthcare System, San Diego, CA, United States
- Virginie M. Patt
- San Diego State University/University of California, San Diego, Joint Doctoral Program in Clinical Psychology
- Andrew Bismark
- Department of Psychiatry, University of California San Diego, La Jolla, CA, United States
- VISN-22 Mental Illness, Research, Education and Clinical Center (MIRECC), VA San Diego Healthcare System, San Diego, CA, United States
- Joyce Sprock
- Department of Psychiatry, University of California San Diego, La Jolla, CA, United States
- Melissa Tarasenko
- Department of Psychiatry, University of California San Diego, La Jolla, CA, United States
- VISN-22 Mental Illness, Research, Education and Clinical Center (MIRECC), VA San Diego Healthcare System, San Diego, CA, United States
- Gregory A. Light
- Department of Psychiatry, University of California San Diego, La Jolla, CA, United States
- VISN-22 Mental Illness, Research, Education and Clinical Center (MIRECC), VA San Diego Healthcare System, San Diego, CA, United States
- Gregory G. Brown
- Department of Psychiatry, University of California San Diego, La Jolla, CA, United States
- VISN-22 Mental Illness, Research, Education and Clinical Center (MIRECC), VA San Diego Healthcare System, San Diego, CA, United States
5
Thomas ML, Brown GG, Gur RC, Moore TM, Patt VM, Nock MK, Naifeh JA, Heeringa S, Ursano RJ, Stein MB. Measurement of latent cognitive abilities involved in concept identification learning. J Clin Exp Neuropsychol 2015; 37:653-69. [PMID: 26147832] [DOI: 10.1080/13803395.2015.1042358]
Abstract
INTRODUCTION: We used cognitive and psychometric modeling techniques to evaluate the construct validity and measurement precision of latent cognitive abilities measured by a test of concept identification learning: the Penn Conditional Exclusion Test (PCET). METHOD: Item response theory parameters were embedded within classic associative- and hypothesis-based Markov learning models and were fitted to 35,553 Army soldiers' PCET data from the Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS). RESULTS: Data were consistent with a hypothesis-testing model with multiple latent abilities: abstraction and set shifting. Latent abstraction ability was positively correlated with the number of concepts learned, and latent set-shifting ability was negatively correlated with the number of perseverative errors, supporting the construct validity of the two parameters. Abstraction was most precisely assessed for participants with abilities ranging from 1.5 standard deviations below the mean to the mean itself. Measurement of set shifting was acceptably precise only for participants making a high number of perseverative errors. CONCLUSIONS: The PCET precisely measures latent abstraction ability in the Army STARRS sample, especially within the range of mildly impaired to average ability. This precision pattern is ideal for a test developed to measure cognitive impairment as opposed to cognitive strength. The PCET also measures latent set-shifting ability, but reliable assessment is limited to the impaired range of ability, reflecting the fact that perseverative errors are rare among cognitively healthy adults. Integrating cognitive and psychometric models can provide information about construct validity and measurement precision within a single analytical framework.
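For readers unfamiliar with hypothesis-based Markov learning models, a minimal member of that family is the classic all-or-none model: before the concept is acquired, the learner responds correctly only at a guessing rate g; on each trial the concept is acquired with probability c, after which responses are always correct. The sketch below is this textbook model only, not the PCET models fitted in the paper; the parameter values are invented.

```python
import random

def simulate_all_or_none(trials, c, g, seed=0):
    # All-or-none Markov learning model: a two-state chain
    # (unlearned -> learned) with acquisition probability c per trial
    # and guessing rate g while in the unlearned state.
    rng = random.Random(seed)
    learned = False
    responses = []
    for _ in range(trials):
        if learned:
            responses.append(1)  # correct with certainty after acquisition
        else:
            responses.append(1 if rng.random() < g else 0)
            learned = rng.random() < c
    return responses

curve = simulate_all_or_none(trials=50, c=0.2, g=0.5)
```

The paper's approach embeds item response theory parameters inside richer models of this kind, so that quantities such as the acquisition rate become person-level latent abilities rather than fixed constants.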
Affiliation(s)
- Michael L Thomas
- Department of Psychiatry, University of California, San Diego, CA, USA
6
Patt VM, Thomas ML, Minassian A, Geyer MA, Brown GG, Perry W. Disentangling working memory processes during spatial span assessment: a modeling analysis of preferred eye movement strategies. J Clin Exp Neuropsychol 2014; 36:186-204. [PMID: 24499103] [DOI: 10.1080/13803395.2013.877123]
Abstract
The neurocognitive processes involved during classic spatial working memory (SWM) assessment were investigated by examining naturally preferred eye movement strategies. Cognitively healthy adult volunteers were tested in a computerized version of the Corsi Block-Tapping Task, a spatial span task requiring the short-term maintenance of a series of locations presented in a specific order, coupled with eye tracking. A modeling analysis was developed to characterize eye-tracking patterns across all task phases, including encoding, retention, and recall. Results revealed a natural preference for local gaze maintenance during both encoding and retention, with fewer than 40% of targets fixated. This contrasted with the stimulus-retracing pattern expected during recall as a result of task demands, with 80% of targets fixated. Along with participants' self-reported strategies of mentally "making shapes," these results suggest the involvement of covert attention shifts and higher-order cognitive Gestalt processes during spatial span tasks, challenging the validity of the instrument as a pure measure of SWM storage capacity.
Affiliation(s)
- Virginie M Patt
- Department of Psychiatry, University of California, San Diego, CA, USA
7
Thomas ML, Brown GG, Gur RC, Hansen JA, Nock MK, Heeringa S, Ursano RJ, Stein MB. Parallel psychometric and cognitive modeling analyses of the Penn Face Memory Test in the Army Study to Assess Risk and Resilience in Servicemembers. J Clin Exp Neuropsychol 2013; 35:225-45. [PMID: 23383967] [PMCID: PMC3600160] [DOI: 10.1080/13803395.2012.762974]
Abstract
OBJECTIVE: The psychometric properties of the Penn Face Memory Test (PFMT) were investigated in a large sample (4,236 participants) of U.S. Army Soldiers undergoing computerized neurocognitive testing. Data were drawn from the initial phase of the Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS), a large-scale study directed toward identifying risk and resilience factors for suicidal behavior and other stress-related disorders in Army Soldiers. In this paper, we report parallel psychometric and cognitive modeling analyses of the PFMT to determine whether ability estimates derived from the measure are precise and valid indicators of memory in the Army STARRS sample. METHOD: Single-sample cross-validation methodology combined with exploratory factor and multidimensional item response theory techniques was used to explore the latent structure of the PFMT. To help resolve the rotational indeterminacy of the exploratory solution, latent constructs were aligned with parameter estimates derived from an unequal-variance signal detection model. RESULTS: Analyses suggest that the PFMT measures two distinct latent constructs, one associated with memory strength and one associated with response bias, and that test scores are generally precise indicators of ability for the majority of Army STARRS participants. CONCLUSIONS: These findings support the use of the PFMT as a measure of major constructs related to recognition memory and have implications for further cognitive-psychometric model development.
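The signal detection framework referenced in the abstract separates recognition accuracy from the tendency to respond "old". The equal-variance special case below shows the basic decomposition into a strength index (d') and a bias index (criterion c); this is a generic SDT sketch with made-up response counts, not the PFMT model itself.

```python
from statistics import NormalDist

def sdt_indices(hits, misses, false_alarms, correct_rejections):
    # Equal-variance signal detection indices from a recognition test.
    # A log-linear (+0.5) correction keeps rates away from exactly 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)             # memory strength
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # response bias
    return d_prime, criterion

# Hypothetical examinee: 50 old faces, 50 new faces.
d_prime, criterion = sdt_indices(hits=40, misses=10,
                                 false_alarms=10, correct_rejections=40)
```

The unequal-variance version used in the paper adds a variance parameter for the old-item strength distribution; aligning latent constructs with indices like these is what supports interpreting the two factors as memory strength and response bias.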
Affiliation(s)
- Michael L Thomas
- Department of Psychiatry, University of California, San Diego, CA, USA