1. Nichols E, Markot M, Gross AL, Jones RN, Meijer E, Schneider S, Lee J. The added value of metadata on test completion time for the quantification of cognitive functioning in survey research. J Int Neuropsychol Soc 2025:1-10. PMID: 39783174; DOI: 10.1017/s1355617724000742.
Abstract
OBJECTIVE Information on the time spent completing cognitive testing is often collected, but such data are not typically considered when quantifying cognition in large-scale community-based surveys. We sought to evaluate the added value of timing data over and above traditional cognitive scores for the measurement of cognition in older adults. METHOD We used data from the Longitudinal Aging Study in India-Diagnostic Assessment of Dementia (LASI-DAD) study (N = 4,091) to assess the added value of timing data over and above traditional cognitive scores, using item-specific regression models for 36 cognitive test items. Models were adjusted for age, gender, interviewer, and item score. RESULTS Compared to Quintile 3 (median time), taking longer to complete specific items was associated (p < 0.05) with lower cognitive performance for 67% (Quintile 5) and 28% (Quintile 4) of items. Responding quickly (Quintile 1) was associated with higher cognitive performance for 25% of simpler items (e.g., orientation for year), but with lower cognitive functioning for 63% of items requiring higher-order processing (e.g., digit span test). Results were consistent across a range of analyses adjusting for factors including education, hearing impairment, and language of administration, and in models using splines rather than quintiles. CONCLUSIONS Response times from cognitive testing may contain important information on cognition not captured in traditional scoring. Incorporating this information has the potential to improve existing estimates of cognitive functioning.
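The analysis described above fits, for each of the 36 items, a regression of overall cognitive performance on response-time quintiles (with the median quintile as reference) plus covariates. A minimal sketch of one such item-specific model is shown below; the data file, variable names (cog_score, item_time, item_score), and the ordinary-least-squares specification are illustrative assumptions, not the authors' exact code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per respondent for a single LASI-DAD test item.
df = pd.read_csv("lasi_dad_item.csv")  # assumed file name

# Bin this item's completion time into quintiles; Quintile 3 (median time) is the reference.
df["time_q"] = pd.qcut(df["item_time"], 5, labels=["Q1", "Q2", "Q3", "Q4", "Q5"])

# Item-specific model: overall cognitive performance on time quintiles,
# adjusted for age, gender, interviewer, and the item's own score.
model = smf.ols(
    "cog_score ~ C(time_q, Treatment(reference='Q3')) + age + C(gender)"
    " + C(interviewer) + item_score",
    data=df,
).fit()
print(model.summary())  # Q1/Q4/Q5 coefficients correspond to the contrasts reported above
```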
Affiliation(s)
- Emma Nichols: Center for Economic and Social Research, University of Southern California, Los Angeles, CA, USA; Leonard Davis School of Gerontology, University of Southern California, Los Angeles, CA, USA
- Michael Markot: Center for Economic and Social Research, University of Southern California, Los Angeles, CA, USA
- Alden L Gross: Department of Epidemiology, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, USA
- Richard N Jones: Department of Psychiatry and Human Behavior, Warren Alpert Medical School, Brown University, Providence, RI, USA
- Erik Meijer: Center for Economic and Social Research, University of Southern California, Los Angeles, CA, USA
- Stefan Schneider: Center for Economic and Social Research, University of Southern California, Los Angeles, CA, USA; Leonard Davis School of Gerontology, University of Southern California, Los Angeles, CA, USA; Department of Psychiatry, University of Southern California, Los Angeles, CA, USA
- Jinkook Lee: Center for Economic and Social Research, University of Southern California, Los Angeles, CA, USA; Department of Economics, University of Southern California, Los Angeles, CA, USA
2. Gao H, Schneider S, Hernandez R, Harris J, Maupin D, Junghaenel DU, Kapteyn A, Stone A, Zelinski E, Meijer E, Lee PJ, Orriens B, Jin H. Early Identification of Cognitive Impairment in Community Environments Through Modeling Subtle Inconsistencies in Questionnaire Responses: Machine Learning Model Development and Validation. JMIR Form Res 2024;8:e54335. PMID: 39536306; PMCID: PMC11602764; DOI: 10.2196/54335.
Abstract
BACKGROUND The underdiagnosis of cognitive impairment hinders timely intervention for dementia. Health professionals working in the community play a critical role in the early detection of cognitive impairment, yet they face several challenges, including a lack of suitable tools and necessary training, as well as potential stigmatization. OBJECTIVE This study explored a novel application integrating psychometric methods with data science techniques to model subtle inconsistencies in questionnaire response data for early identification of cognitive impairment in community environments. METHODS This study analyzed questionnaire response data from participants aged 50 years and older in the Health and Retirement Study (waves 8-9, n=12,942). Predictors included low-quality response indices generated using the graded response model from four brief questionnaires (optimism, hopelessness, purpose in life, and life satisfaction) that assess aspects of overall well-being, a focus of health professionals in communities. The primary predicted outcome was current cognitive impairment, derived from a validated criterion; the supplemental outcome was dementia or mortality within the next ten years. Seven predictive models were trained, and their performance was evaluated and compared. RESULTS The multilayer perceptron exhibited the best performance in predicting current cognitive impairment. Across the four selected questionnaires, the area under the curve (AUC) values for identifying current cognitive impairment ranged from 0.63 to 0.66 and improved to 0.71 to 0.74 when the low-quality response indices were combined with age and gender for prediction. We set the threshold for assessing cognitive impairment risk in the tool based on the ratio of underdiagnosis costs to overdiagnosis costs, with a ratio of 4 as the default choice. Furthermore, the tool was more efficient than age- or health-based screening strategies for identifying individuals at high risk of cognitive impairment, particularly in the 50- to 59-year and 60- to 69-year age groups. The tool is freely available to the public on a portal website. CONCLUSIONS We developed a novel prediction tool that integrates psychometric methods with data science to facilitate "passive or backend" cognitive impairment assessments in community settings, aiming to promote early detection of cognitive impairment. This tool simplifies the cognitive impairment assessment process, making it more adaptable and reducing burden. Our approach also presents a new perspective for using questionnaire data: leveraging, rather than dismissing, low-quality data.
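A rough sketch of the prediction and thresholding steps is shown below. It assumes the graded-response-model low-quality response indices for the four questionnaires have already been computed and stored as columns; the file and column names, the scikit-learn multilayer perceptron settings, and the way the cost ratio is converted into a probability cutoff are illustrative assumptions rather than the authors' exact pipeline.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

# Hypothetical feature table: GRM-based low-quality response indices per questionnaire,
# plus age and gender; outcome = current cognitive impairment (0/1).
df = pd.read_csv("hrs_features.csv")  # assumed file name
features = ["lqr_optimism", "lqr_hopelessness", "lqr_purpose", "lqr_life_satisfaction",
            "age", "gender"]
X, y = df[features], df["cognitive_impairment"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

# Multilayer perceptron (the best-performing of the seven compared models).
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0))
clf.fit(X_tr, y_tr)

prob = clf.predict_proba(X_te)[:, 1]
print("AUC:", roc_auc_score(y_te, prob))

# Default decision threshold from the cost ratio: treating a missed case as 4x as
# costly as a false alarm implies flagging whenever p > 1 / (1 + 4) = 0.2.
cost_ratio = 4.0
threshold = 1.0 / (1.0 + cost_ratio)
print("Share flagged as high risk:", (prob >= threshold).mean())
```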
Affiliation(s)
- Hongxin Gao: School of Health Sciences, University of Surrey, Guildford, United Kingdom
- Stefan Schneider: Center for Self-Report Science, University of Southern California, Los Angeles, CA, United States; Department of Psychology, University of Southern California, Los Angeles, CA, United States; Leonard Davis School of Gerontology, University of Southern California, Los Angeles, CA, United States; Center for Economic and Social Research, University of Southern California, Los Angeles, CA, United States
- Raymond Hernandez: Center for Self-Report Science, University of Southern California, Los Angeles, CA, United States; Center for Economic and Social Research, University of Southern California, Los Angeles, CA, United States
- Jenny Harris: School of Health Sciences, University of Surrey, Guildford, United Kingdom
- Danny Maupin: School of Health Sciences, University of Surrey, Guildford, United Kingdom
- Doerte U Junghaenel: Center for Self-Report Science, University of Southern California, Los Angeles, CA, United States; Department of Psychology, University of Southern California, Los Angeles, CA, United States; Leonard Davis School of Gerontology, University of Southern California, Los Angeles, CA, United States; Center for Economic and Social Research, University of Southern California, Los Angeles, CA, United States
- Arie Kapteyn: Center for Economic and Social Research, University of Southern California, Los Angeles, CA, United States
- Arthur Stone: Center for Self-Report Science, University of Southern California, Los Angeles, CA, United States; Department of Psychology, University of Southern California, Los Angeles, CA, United States; Center for Economic and Social Research, University of Southern California, Los Angeles, CA, United States
- Elizabeth Zelinski: Leonard Davis School of Gerontology, University of Southern California, Los Angeles, CA, United States
- Erik Meijer: Center for Economic and Social Research, University of Southern California, Los Angeles, CA, United States
- Pey-Jiuan Lee: Center for Self-Report Science, University of Southern California, Los Angeles, CA, United States; Center for Economic and Social Research, University of Southern California, Los Angeles, CA, United States
- Bart Orriens: Center for Economic and Social Research, University of Southern California, Los Angeles, CA, United States
- Haomiao Jin: School of Health Sciences, University of Surrey, Guildford, United Kingdom
3. Giaquinto F, Assecondi S, Leccese G, Romano DL, Angelelli P. Normative study of SATURN: a digital, self-administered, open-source cognitive assessment tool for Italians aged 50-80. Front Psychol 2024;15:1456619. PMID: 39539307; PMCID: PMC11557479; DOI: 10.3389/fpsyg.2024.1456619.
Abstract
Introduction This study aimed to establish normative data for the Self-Administered Tasks Uncovering Risk of Neurodegeneration (SATURN), a brief computer-based test for global cognitive assessment through accuracy and response times on tasks related to memory, attention, temporal orientation, visuo-constructional abilities, math (calculation), executive functions, and reading speed. Methods A sample of 323 Italian individuals with a Montreal Cognitive Assessment (MoCA) equivalent score ≥1 (180 females; average age: 61.33 years; average education: 11.32 years), stratified by age, education, and sex, completed SATURN (administered in PsychoPy) as well as a paper-and-pencil protocol consisting of the Mini-Mental State Examination (MMSE) and the MoCA. Data analyses included: (i) correlations between the total accuracy scores of SATURN and those of the MMSE and MoCA; (ii) multiple regressions to determine the impact of sex, age, and education, along with the computation of adjusted scores; and (iii) the calculation of inner and outer tolerance limits and equivalent scores, and the development of correction grids. Results The mean total time on tasks was 6.72 ± 3.24 min. Age and education significantly influenced SATURN total accuracy, while sex influenced total time on tasks. Specific sociodemographic characteristics influenced subdomain accuracies and times on task differently. For the adjusted SATURN total score, the outer limit corresponds to 16.56 out of 29.00 (cut-off), while the inner limit is 18.57. SATURN correlates significantly with the MMSE and MoCA. Discussion In conclusion, SATURN is the first open-source digital tool for initial cognitive assessment in Italy, showing potential for self-administration in primary care and for remote administration. Future studies need to assess its sensitivity and specificity in detecting pathological cognitive decline.
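The demographic-adjustment step (regressing raw scores on age, education, and sex and correcting each score back to a reference profile) is the core of the normative procedure. A minimal sketch is shown below; the file name, variable names, and reference profile are assumptions for illustration, not the exact SATURN procedure.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical normative sample: SATURN total accuracy plus demographics.
norm = pd.read_csv("saturn_normative.csv")  # assumed file name

# Estimate the effects of age, education, and sex on raw accuracy.
fit = smf.ols("accuracy ~ age + education + C(sex)", data=norm).fit()

# Adjusted score = raw score minus the deviation predicted for the individual
# relative to a reference profile (sample-mean age/education, modal sex).
reference = norm.assign(age=norm["age"].mean(),
                        education=norm["education"].mean(),
                        sex=norm["sex"].mode()[0])
norm["adjusted"] = norm["accuracy"] - (fit.predict(norm) - fit.predict(reference))
print(norm[["accuracy", "adjusted"]].head())
```

The inner and outer tolerance limits and equivalent scores reported above are then derived from the distribution of these adjusted scores.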
Affiliation(s)
- Francesco Giaquinto: Laboratory of Applied Psychology and Intervention, Department of Human and Social Sciences, University of Salento, Lecce, Italy
- Sara Assecondi: Center for Mind/Brain Sciences – CIMeC, University of Trento, Rovereto, Italy
- Giuliana Leccese: Laboratory of Applied Psychology and Intervention, Department of Medicine, University of Salento, Lecce, Italy
- Daniele Luigi Romano: Department of Psychology and Milan Center for Neuroscience (NeuroMi), University of Milano-Bicocca, Milan, Italy
- Paola Angelelli: Laboratory of Applied Psychology and Intervention, Department of Medicine, University of Salento, Lecce, Italy
4. Kapteyn A, Angrisani M, Darling J, Gutsche T. The Understanding America Study (UAS). BMJ Open 2024;14:e088183. PMID: 39448221; PMCID: PMC11499792; DOI: 10.1136/bmjopen-2024-088183.
Abstract
PURPOSE The Understanding America Study (UAS) is a probability-based Internet panel housed at the Center for Economic and Social Research at the University of Southern California (USC). The UAS serves as a social and health sciences infrastructure for collecting data on the daily lives of US families and individuals. The collected information includes survey data, DNA from saliva samples, information from wearables, contextual and administrative linkages, ecological momentary assessments, self-recorded narratives, and electronic records of financial transactions. The information collected focuses on a defining challenge of our time: identifying factors explaining racial, ethnic, geographic, and socioeconomic disparities over the life course, including racial discrimination, inequalities in access to education and healthcare, differences in physical, economic, and social environments, and, more generally, the various opportunities and obstacles one encounters over the life course. The UAS infrastructure aims to optimise engagement with the wider research community both in data dissemination and in soliciting input on content and methods. To encourage input from the research community, we have reserved 100 000 min of survey time per year for outside researchers, who can propose to add survey questions four times a year. PARTICIPANTS The UAS currently comprises about 15 000 US residents (including a 3500-person California oversample) recruited by Address-Based Sampling and provided with Internet-enabled tablets if needed. Surveys are conducted in English and Spanish. FINDINGS TO DATE Since the founding of the UAS in 2014, we have conducted more than 600 surveys, including a sequence of surveys collecting biennial information on health and retirement (the complete Health and Retirement Study instrument), 11 cognitive assessments, personality, knowledge and use of information on Social Security programme rules, work disability, and subjective well-being. Several hundred papers have been published based on data collected in the UAS. Studies include documentation of the mental health effects of the COVID-19 pandemic and how these varied across socioeconomic groups; comparisons of physical activity measured with accelerometers and by self-report, showing dramatic biases in the latter; extensive studies demonstrating the power of paradata in gauging cognitive change over time; and several messaging experiments showing the effectiveness of information provision on the quality of decision-making affecting well-being at older ages. FUTURE PLANS The UAS national sample is planned to grow to 20 000 respondents by 2025, with subsamples of about 2500 African American, 2000 Asian, and 3000 Hispanic participants and an oversample of rural areas. An increasing amount of non-interview data (contextual information, data from a suite of wearables, and administrative linkages) is continually being added to the data files.
Affiliation(s)
- Arie Kapteyn: Center for Economic and Social Research, University of Southern California, Los Angeles, California, USA; Department of Economics, University of Southern California, Los Angeles, California, USA
- Marco Angrisani: Center for Economic and Social Research, University of Southern California, Los Angeles, California, USA; Department of Economics, University of Southern California, Los Angeles, California, USA
- Jill Darling: Center for Economic and Social Research, University of Southern California, Los Angeles, California, USA
- Tania Gutsche: Center for Economic and Social Research, University of Southern California, Los Angeles, California, USA
5. Schneider S, Hernandez R, Junghaenel DU, Jin H, Lee PJ, Gao H, Maupin D, Orriens B, Meijer E, Stone AA. Can you tell people's cognitive ability level from their response patterns in questionnaires? Behav Res Methods 2024;56:6741-6758. PMID: 38528247; PMCID: PMC11362444; DOI: 10.3758/s13428-024-02388-2.
Abstract
Questionnaires are ever-present in survey research. In this study, we examined whether an indirect indicator of general cognitive ability could be developed based on response patterns in questionnaires. We drew on two established phenomena characterizing connections between cognitive ability and people's performance on basic cognitive tasks, and examined whether they apply to questionnaire responses. (1) The worst performance rule (WPR) states that people's worst performance on multiple sequential tasks is more indicative of their cognitive ability than their average or best performance. (2) The task complexity hypothesis (TCH) suggests that relationships between cognitive ability and performance increase with task complexity. We conceptualized the items of a questionnaire as a series of cognitively demanding tasks. A graded response model was used to estimate respondents' performance for each item based on the difference between the observed and model-predicted response ("response error" scores). Analyzing data from 102 items (21 questionnaires) collected from a large-scale, nationally representative sample of people aged 50+ years, we found robust associations of cognitive ability with a person's largest, but not smallest, response error scores (supporting the WPR), and stronger associations of cognitive ability with response errors for more complex than for less complex questions (supporting the TCH). Results replicated across two independent samples and six assessment waves. A latent variable of response errors estimated for the most complex items correlated .50 with a latent cognitive ability factor, suggesting that response patterns can be utilized to extract a rough indicator of general cognitive ability in survey research.
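A minimal sketch of the response-error and worst-performance-rule logic is shown below. It assumes the model-predicted responses from a fitted graded response model have already been computed and saved alongside the observed responses and an external cognitive ability score; the file names, the use of absolute discrepancies, and the choice to average the five largest and five smallest errors per person are illustrative assumptions, not the authors' exact estimation procedure.

```python
import numpy as np
import pandas as pd

# Hypothetical inputs: observed item responses (persons x items, ordinal codes),
# graded-response-model predicted responses, and a cognitive ability score per person.
observed = pd.read_csv("item_responses.csv")         # assumed file
predicted = pd.read_csv("grm_predicted.csv")         # assumed file: E[response | theta]
ability = pd.read_csv("cognition.csv")["cog_score"]  # assumed file and column

# Per-person, per-item "response error": discrepancy between observed and
# model-predicted responses (larger = more surprising, lower-quality response).
errors = np.abs(observed.to_numpy() - predicted.to_numpy())

# Worst performance rule: the largest response errors should relate more strongly
# to cognitive ability than the smallest ones.
sorted_errors = np.sort(errors, axis=1)
largest = sorted_errors[:, -5:].mean(axis=1)   # mean of each person's 5 largest errors
smallest = sorted_errors[:, :5].mean(axis=1)   # mean of each person's 5 smallest errors

print("r(ability, largest errors): ", np.corrcoef(ability, largest)[0, 1])
print("r(ability, smallest errors):", np.corrcoef(ability, smallest)[0, 1])
```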
Affiliation(s)
- Stefan Schneider: Dornsife Center for Self-Report Science, and Center for Economic & Social Research, University of Southern California, 635 Downey Way, Los Angeles, CA, 90089-3332, USA; Department of Psychology, University of Southern California, Los Angeles, CA, USA; Leonard Davis School of Gerontology, University of Southern California, Los Angeles, CA, USA
- Raymond Hernandez: Dornsife Center for Self-Report Science, and Center for Economic & Social Research, University of Southern California, 635 Downey Way, Los Angeles, CA, 90089-3332, USA
- Doerte U Junghaenel: Dornsife Center for Self-Report Science, and Center for Economic & Social Research, University of Southern California, 635 Downey Way, Los Angeles, CA, 90089-3332, USA; Department of Psychology, University of Southern California, Los Angeles, CA, USA; Leonard Davis School of Gerontology, University of Southern California, Los Angeles, CA, USA
- Haomiao Jin: School of Health Sciences, Faculty of Health and Medical Sciences, University of Surrey, Guildford, UK
- Pey-Jiuan Lee: Dornsife Center for Self-Report Science, and Center for Economic & Social Research, University of Southern California, 635 Downey Way, Los Angeles, CA, 90089-3332, USA
- Hongxin Gao: School of Health Sciences, Faculty of Health and Medical Sciences, University of Surrey, Guildford, UK
- Danny Maupin: School of Health Sciences, Faculty of Health and Medical Sciences, University of Surrey, Guildford, UK
- Bart Orriens: Center for Economic and Social Research, University of Southern California, Los Angeles, CA, USA
- Erik Meijer: Center for Economic and Social Research, University of Southern California, Los Angeles, CA, USA
- Arthur A Stone: Dornsife Center for Self-Report Science, and Center for Economic & Social Research, University of Southern California, 635 Downey Way, Los Angeles, CA, 90089-3332, USA; Department of Psychology, University of Southern California, Los Angeles, CA, USA