1. Does deferral for high-risk behaviors improve the safety of the blood supply? Transfusion 2019; 59:2334-2343. [PMID: 30964551] [DOI: 10.1111/trf.15286]
Abstract
BACKGROUND Predonation donor deferral is used to select donors with presumed lower risk for transfusion-transmitted infections. The contribution to blood safety from this practice has not been reported previously for Brazil.
STUDY DESIGN AND METHODS At four large Brazilian blood centers from September 2010 to March 2011, donors who were deferred due to responses on eligibility questions were invited to provide a blood sample to test for HIV, hepatitis C virus, hepatitis B virus, human T-lymphotropic virus, syphilis, and Trypanosoma cruzi and to complete an audio computer-assisted structured interview on risk behaviors.
RESULTS Of 299,848 potential donors during the study period, 66,870 were deferred, 10,453 (15.6%) of them for high-risk behaviors. Of those, 4860 (46.5%) were consecutively approached and 4013 (82.5%) participated. Disclosed risk behaviors by audio computer-assisted structured interview included 4 or more sexual partners in the past 12 months (15.0% of females [F] and 34.5% of males [M]), unprotected sex (62.0% F and 44.0% M), other high-risk sexual exposure (85.0% F and 73.0% M), being a person who injects drugs (3.0% F and 10.0% M), and test-seeking (17.0% F and 22.0% M). Eleven percent of deferred males reported male-to-male sex. Individuals who reported other high-risk sexual exposure, sexual partner risk, or male-to-male sex had the highest frequency of confirmed HIV: 1.2, 0.7, and 0.7%, respectively. Individuals who reported male-to-male sex, sexual partner risk, test seeking, and unprotected sex had the highest frequency of confirmed syphilis: 3.8, 3.3, 2.4, and 2.0%, respectively.
CONCLUSION Donor deferral deters donation by individuals with risk behaviors and elevated rates of infectious disease markers.
2. Design for success: Identifying a process for transitioning to an intensive online course delivery model in health professions education. Medical Education Online 2018; 23:1415617. [PMID: 29277143] [PMCID: PMC5757231] [DOI: 10.1080/10872981.2017.1415617]
Abstract
Intensive courses (ICs), or accelerated courses, are gaining popularity in medical and health professions education, particularly as programs adopt e-learning models to negotiate challenges of flexibility, space, cost, and time. In 2014, the Department of Clinical Research and Leadership (CRL) at the George Washington University School of Medicine and Health Sciences began the process of transitioning two online 15-week graduate programs to an IC model. Within a year, a third program also transitioned to this model. A literature review yielded little guidance on the process of transitioning from 15-week, traditional models of delivery to IC models, particularly in online learning environments. Correspondingly, this paper describes the process by which CRL transitioned three online graduate programs to an IC model and details best practices for course design and facilitation resulting from our iterative redesign process. Finally, we present lessons learned for the benefit of other medical and health professions' programs contemplating similar transitions.
ABBREVIATIONS CRL: Department of Clinical Research and Leadership; HSCI: Health Sciences; IC: Intensive course; PD: Program director; QM: Quality Matters.
3. Comparing student outcomes in traditional vs intensive, online graduate programs in health professional education. BMC Medical Education 2018; 18:240. [PMID: 30342525] [PMCID: PMC6196001] [DOI: 10.1186/s12909-018-1343-7]
Abstract
BACKGROUND Health professions' education programs are undergoing enormous changes, including increasing use of online and intensive, or time-reduced, courses. Although evidence is mounting for online and intensive course formats as separate designs, literature investigating online and intensive formats in health professional education is lacking. The purpose of the study was to compare student outcomes (final grades and course evaluation ratings) for equivalent courses in semester-long (15-week) versus intensive (7-week) online formats in graduate health sciences courses.
METHODS This retrospective, observational study compared satisfaction and performance scores of students enrolled in three graduate health sciences programs in a large, urban US university. Descriptive statistics, chi-square analysis, and independent t-tests were used to describe student samples and determine differences in student satisfaction and performance.
RESULTS The results demonstrated no significant differences for four applicable items on the final student course evaluations (p values range from 0.127 to 1.00) between semester-long and intensive course formats. Similarly, student performance scores for final assignment and final grades showed no significant differences (p = 0.35 and 0.690, respectively) between semester-long and intensive course formats.
CONCLUSION Findings from this study suggest that 7-week and 15-week online courses can be equally effective with regard to student satisfaction and performance outcomes. While further study is recommended, academic programs should consider intensive online course formats as an alternative to semester-long online course formats.
4. Combining practice and theory to assess strategic thinking. Journal of Strategy and Management 2017. [DOI: 10.1108/jsma-02-2017-0012]
Abstract
Purpose
The purpose of this paper is to describe the process used to develop and test the Individual Behavioral Assessment Tool for Strategic Thinking.
Design/methodology/approach
The instrument was developed using literature that identifies practices in use in organizations to assess strategic thinking competency and recommendations of scholars and practitioners to define strategic thinking and suggest how it could be assessed. Processes defined in the literature to develop competency measurements, both generally and for leadership and strategic management concepts specifically, were applied. A Delphi panel of experts reviewed the initial draft of the instrument which, with their refinements, was administered to participants in an executive leadership program.
Findings
Cronbach’s α and principal component analysis indicated that the instrument is internally consistent and unidimensional. Rasch analysis suggested a possible reduction in items that maintains good overall instrument performance.
Research limitations/implications
The study provides a methodology for developing a measurement tool that fuses practice and theory. Further applications of the instrument across organizational levels and in single sectors would enhance its generalizability.
Practical implications
The instrument provides a consistent tool for use by practitioners to identify gaps in their own or another’s strategic thinking behaviors, specify a job-specific competency model, and direct professional development.
Originality/value
The instrument fills a gap in the theoretical literature by extending the descriptions of strategic thinking to include a comprehensive set of required individual behaviors. As such, it is the first theoretically based instrument to detail the specific competencies required to think strategically.
5. The Strategies to Reduce Iron Deficiency in blood donors randomized trial: design, enrolment and early retention. Vox Sang 2014; 108:178-185. [PMID: 25469720] [DOI: 10.1111/vox.12210]
Abstract
BACKGROUND AND OBJECTIVES Repeated blood donation produces iron deficiency. Changes in dietary iron intake do not prevent donation-induced iron deficiency. Prolonging the interdonation interval or using oral iron supplements can mitigate donation-induced iron deficiency. The most effective operational methods for reducing iron deficiency in donors are unknown.
MATERIALS AND METHODS 'Strategies To Reduce Iron Deficiency' (STRIDE) was a two-year, randomized, placebo-controlled study in blood donors. 692 donors were randomized into one of two educational groups or one of three interventional groups. Donors randomized to educational groups received letters either thanking them for donating or suggesting iron supplements or delayed donation if they had low ferritin. Donors randomized to interventional groups received placebo, 19-mg iron, or 38-mg iron pills.
RESULTS Iron-deficient erythropoiesis was present in 52·7% of males and 74·6% of females at enrolment. Adverse events within 60 days of enrolment were primarily mild gastrointestinal symptoms (64%). De-enrolment within 60 days was more common in the interventional groups than in the educational groups (P = 0·002), but not more common in those receiving iron than placebo (P = 0·68).
CONCLUSION The prevalence of iron-deficient erythropoiesis in donors enrolled in the STRIDE study is comparable to previously described cohorts of regular blood donors. De-enrolment within 60 days was higher for donors receiving tablets, although no more common in donors receiving iron than placebo.
6. Methodological Guidelines for Reducing the Complexity of Data Warehouse Development for Transactional Blood Bank Systems. Decision Support Systems 2013; 55:728-739. [PMID: 23729945] [PMCID: PMC3665424] [DOI: 10.1016/j.dss.2013.02.008]
Abstract
Over time, data warehouse (DW) systems have become more difficult to develop because of the growing heterogeneity of data sources. Despite advances in research and technology, DW projects are still too slow for pragmatic results to be generated. Here, we address the following question: how can the complexity of DW development for integration of heterogeneous transactional information systems be reduced? To answer this, we proposed methodological guidelines based on cycles of conceptual modeling and data analysis, to drive construction of a modular DW system. These guidelines were applied to the blood donation domain, successfully reducing the complexity of DW development.
7. Hepcidin level predicts hemoglobin concentration in individuals undergoing repeated phlebotomy. Haematologica 2013; 98:1324-1330. [PMID: 23445875] [DOI: 10.3324/haematol.2012.070979]
Abstract
Dietary iron absorption is regulated by hepcidin, an iron regulatory protein produced by the liver. Hepcidin production is regulated by iron stores, erythropoiesis and inflammation, but its physiology when repeated blood loss occurs has not been characterized. Hepcidin was assayed in plasma samples obtained from 114 first-time/reactivated (no blood donations in preceding 2 years) female donors and 34 frequent (≥3 red blood cell donations in preceding 12 months) male donors as they were phlebotomized ≥4 times over 18-24 months. Hepcidin levels were compared to ferritin and hemoglobin levels using multivariable repeated measures regression models. Hepcidin, ferritin and hemoglobin levels declined with increasing frequency of donation in the first-time/reactivated females. Hepcidin and ferritin levels correlated well with each other (Spearman's correlation of 0.74), but on average hepcidin varied more between donations for a given donor relative to ferritin. In a multivariable repeated measures regression model the predicted inter-donation decline in hemoglobin varied as a function of hepcidin and ferritin; hemoglobin was 0.51 g/dL lower for subjects with low (<45.7 ng/mL) or decreasing hepcidin and low ferritin (<26 ng/mL), and was essentially zero for other subjects, including those with high (>45.7 ng/mL) or increasing hepcidin and low ferritin (<26 ng/mL) levels (P<0.001). In conclusion, hepcidin levels change rapidly in response to dietary iron needed for erythropoiesis. The dynamic regulation of hepcidin in the presence of low levels of ferritin suggests that plasma hepcidin concentration may provide clinically useful information about an individual's iron status (and hence capacity to tolerate repeated blood donations) beyond that of ferritin alone. Clinicaltrials.gov identifier: NCT00097006.
8. Does Rh immune globulin suppress HLA sensitization in pregnancy? Transfusion 2012; 53:2069-2077. [PMID: 23252646] [DOI: 10.1111/trf.12049]
Abstract
BACKGROUND How Rh immune globulin (RhIG) prevents sensitization to D antigen is unclear. If RhIG Fc delivers a nonspecific immunosuppressive signal, then RhIG may inhibit sensitization to antigens other than D. HLA antibody prevalence was compared in previously pregnant D- versus D+ women to investigate whether RhIG suppresses HLA sensitization.
STUDY DESIGN AND METHODS In the Leukocyte Antibody Prevalence Study (LAPS), 7920 volunteer blood donors were screened for anti-HLA and surveyed about prior pregnancies and transfusions. A secondary analysis of the LAPS database was performed.
RESULTS D- women not more than 40 years old (presumed to have received antenatal RhIG, with or without postpartum RhIG, in all pregnancies) had a significantly lower HLA sensitization rate than D+ women (relative risk, 0.58; 95% confidence interval [CI], 0.40-0.83). When stratified by deliveries (one, two, three, or four or more), D- women not older than 40 were HLA sensitized less often than D+ women in every case. In contrast, a clear relationship between D type and HLA sensitization was not seen in older previously pregnant women, whose childbearing years are presumed to have preceded the use of routine RhIG prophylaxis. In a multivariable logistic regression model, D- women not more than 40 years old remained significantly less likely to be HLA sensitized compared with D+ women after adjusting for parity, time from last pregnancy, lost pregnancies, and transfusions (odds ratio [OR], 0.55; 95% CI, 0.34-0.88).
CONCLUSION Consistent with a nonspecific immunosuppressive effect of RhIG, younger previously pregnant D- women were less likely than previously pregnant D+ women to be HLA sensitized.
9.
Abstract
BACKGROUND The safety of the blood supply is ensured through several procedures, from donor selection to testing of donated units. Examination of the donor deferrals at different centers provides insights into the role that deferrals play in transfusion safety.
STUDY DESIGN AND METHODS A cross-sectional descriptive study of prospective allogeneic blood donors at three large blood centers located in São Paulo, Belo Horizonte, and Recife, Brazil, from August 2007 to December 2009 was conducted. Deferrals were grouped into similar categories across the centers, and within each center frequencies out of all presentations were determined.
RESULTS Of 963,519 prospective blood donors at the three centers, 746,653 (77.5%) were accepted and 216,866 (22.5%) were deferred. Belo Horizonte had the highest overall deferral proportion of 27%, followed by Recife (23%) and São Paulo (19%). Females were more likely to be deferred than males (30% vs. 18%, respectively). The three most common deferral reasons were low hematocrit or hemoglobin, medical diagnoses, and higher-risk behavior.
CONCLUSION The types and frequencies of deferral vary substantially among the three blood centers. Factors that may explain the differences include demographic characteristics, the order in which health history and vital signs are taken, staff training, and the way deferrals are coded by the centers, among other policies. The results indicate that blood donor deferral in Brazil has regional aspects that should be considered when national policies are developed.
10. The impact of HFE mutations on haemoglobin and iron status in individuals experiencing repeated iron loss through blood donation. Br J Haematol 2011; 156:388-401. [PMID: 22118647] [DOI: 10.1111/j.1365-2141.2011.08952.x]
Abstract
Frequent blood donors become iron deficient. HFE mutations are present in over 30% of donors. A 24-month study of 888 first-time/reactivated donors and 1537 frequent donors measured haemoglobin and iron status to assess how HFE mutations impact the development of iron-deficient erythropoiesis. Donors with two HFE mutations had increased baseline haemoglobin and iron stores, as did those with one mutation, albeit to a lesser extent. Over multiple donations, haemoglobin and iron status of donors with HFE mutations paralleled those lacking mutations. The prevalence of HFE mutations was not increased in higher-intensity donors. Thus, in general, HFE mutations do not temper donation-induced changes in haemoglobin and iron status. However, in Black donors there was an increase of H63D carriers at baseline, from 3·7% in first-time/reactivated donors to 15·8% in frequent donors, suggesting that the relative effects of HFE mutations on iron absorption may vary between racial/ethnic groups. In secondary analyses, venous haemoglobin decreased more slowly in donors with ferritin ≥12 μg/l, and haemoglobin recovery time was shorter in donors with reticulocyte haemoglobin (CHr) ≥32·6 pg, indicating that these biochemical measures are better indicators of a donor's response to phlebotomy than their HFE mutation status.
11.
Abstract
BACKGROUND In Brazil, little is known about adverse reactions during donation and the donor characteristics that may be associated with such events. Donors are offered snacks and fluids before donating and are required to consume a light meal after donation. For these reasons the frequency of reactions may differ from that observed in other countries.
STUDY DESIGN AND METHODS A cross-sectional study was conducted of eligible whole blood donors at three large blood centers located in Brazil between July 2007 and December 2009. Vasovagal reactions (VVRs) along with donor demographic and biometric data were collected. Reactions were defined as any presyncopal or syncopal event during the donation process. Multivariable logistic regression was performed to identify predictors of VVRs.
RESULTS Of 724,861 donor presentations, 16,129 (2.2%) VVRs were recorded. Rates varied substantially between the three centers: 53, 290, and 381 per 10,000 donations in Recife, São Paulo, and Belo Horizonte, respectively. Although the reaction rates varied, the donor characteristics associated with VVRs were similar (younger age [18-29 years], replacement donors, first-time donors, low estimated blood volume [EBV]). In multivariable analysis controlling for differences between the donor populations in each city, younger age, first-time donor status, and lower EBV were the factors most associated with reactions.
CONCLUSION Factors associated with VVRs in other locations are also evident in Brazil. The difference in VVR rates between the three centers might be due to different procedures for identifying and reporting the reactions. Potential interventions to reduce the risk of reactions in Brazil should be considered.
12.
Abstract
BACKGROUND This study investigated the effect of blood donation environment, fixed or mobile with differing sponsor types, on donation return time.
STUDY DESIGN AND METHODS Data from 2006 through 2009 at six US blood centers participating in the Retrovirus Epidemiology Donor Study-II (REDS-II) were used for analysis. Descriptive statistics stratified by whole blood (WB), plateletpheresis (PP), and double red blood cell (R2) donations were obtained for fixed and mobile locations, including median number of donations and median interdonation interval. A survival analysis estimated median return time at fixed and mobile sites, while controlling for censored return times, demographics, blood center, and mandatory recovery times.
RESULTS Two-thirds (67.9%) of WB donations were made at mobile sites, 97.4% of PP donations were made at fixed sites, and R2 donations were equally distributed between fixed and mobile locations. For donations at fixed sites only or alternating between fixed and mobile sites, the highest median numbers of donations were nine and eight, respectively, and the shortest model-adjusted median return times (controlling for mandatory eligibility times of 56 and 112 days) were 36 and 30 days for WB and R2 donations, respectively. For PP donations, the shortest model-adjusted median return time was 23 days at a fixed location and the longest was 693 days at community locations.
CONCLUSION WB, PP, and R2 donors with the shortest time between donations were associated with fixed locations and those alternating between fixed and mobile locations, even after controlling for differing mandatory recovery times for the different blood donation procedures.
13.
Abstract
BACKGROUND Blood centers are interested in understanding determinants of frequent blood donation. We hypothesized that participation in uncompensated research could result in higher donation rates.
STUDY DESIGN AND METHODS Donation rates for 2425 subjects from six US blood centers enrolled in the Retrovirus Epidemiology Donor Study-II Donor Iron Status Evaluation Study were compared to those of nonenrolled donors (n = 202,383). Over 15 months, we compared mean donation rates and adjusted rate ratios (RRs) between enrolled and nonenrolled donors for three subgroups (first-time, reactivated, and frequent donors), and compared donation rates before and after the study enrollment period for frequent donors only.
RESULTS Enrolled donors had higher 15-month mean donation rates than nonenrolled donors (first-time, 1.21 [RR = 1.91]; reactivated, 1.68 [RR = 1.83]; frequent, 3.40 [RR = 1.12]). However, frequent donors donated at approximately the same rate after enrollment as they did before enrollment in the study (3.62 per 15 months [RR = 1.12]).
CONCLUSION Donors enrolled in the study donated at a higher rate than nonenrolled donors, but frequent donors remained consistent in their donation frequency both before and after enrollment. Although increased donation rates could have been causally related to study enrollment, we cannot rule out an enrollment bias whereby more committed donors were more likely to enroll in the study.
14. Blood donations from previously transfused or pregnant donors: a multicenter study to determine the frequency of alloexposure. Transfusion 2010; 51:1197-1206. [PMID: 21182532] [DOI: 10.1111/j.1537-2995.2010.02991.x]
Abstract
BACKGROUND Transfusion-related acute lung injury (TRALI) mitigation strategies include the deferral of female donors from apheresis platelet (PLT) donations and the distribution of plasma for transfusion from male donors only. We studied the implications of these policies in terms of component loss at six blood centers in the United States.
STUDY DESIGN AND METHODS We collected data from allogeneic blood donors making whole blood and blood component donations during calendar years 2006 through 2008. We analyzed the distribution of donations in terms of sex, transfusion and pregnancy histories, and blood type.
RESULTS A TRALI mitigation policy that would not allow plasma from female whole blood donors to be prepared into transfusable plasma components would result in nearly a 50% reduction in the units of whole blood available for plasma manufacturing and would decrease the number of type AB plasma units that could be made from whole blood donations by the same amount. Deferral of all female apheresis PLT donors, all female apheresis PLT donors with histories of prior pregnancies, or all female apheresis PLT donors with histories of prior pregnancies and positive screening test results for antibodies to human leukocyte antigens (HLAs) would result in a loss of 37.1, 22.5, and 5.4% of all apheresis PLT donations, respectively.
CONCLUSION A TRALI mitigation policy that only defers female apheresis PLT donors with previous pregnancies and HLA antibodies would result in an approximately 5% decrease in the inventory of apheresis PLTs, but would eliminate a large proportion of components that are associated with TRALI.
15.
Abstract
BACKGROUND The consequences of temporary predonation deferral are poorly understood. Studies have found that deferral negatively impacts future donor return. However, the applicability of these findings across centers has not been established.
STUDY DESIGN AND METHODS Using a cohort design, donors presenting in 2006 to 2008 who received a temporary deferral in one of six categories (low hematocrit [Hct], blood pressure or pulse, feeling unwell, malaria travel, tattoos or piercing and related exposures, or could not wait or second thoughts) were passively followed for up to 3 years for the time to first return after deferral expiration at six US blood centers. Time-to-event methods were used to assess return. We also analyzed which donor characteristics were associated with return using multivariable logistic regression.
RESULTS Of 3.9 million donor presentations, 505,623 resulted in deferral in the six categories. Low Hct was the most common deferral, had the shortest median time to return (time in days when 50% of deferred donors had returned), and had the largest cumulative proportion of donors returning. Deferrals of shorter duration had better return. Longer-term deferrals (up to 1 year in length) had the lowest cumulative return proportion, which did not exceed 50%. Return was associated with previously identified factors such as repeat donor status, older age, and higher educational attainment, regardless of the type of deferral. In addition, return was associated with having been born in the United States and donation at fixed sites.
CONCLUSION The category of temporary deferral influences the likelihood of future return, but the demographic and donation factors associated with return are largely consistent regardless of the deferral.
16.
Abstract
BACKGROUND Sponsored by the National Heart, Lung, and Blood Institute, the Retrovirus Epidemiology Donor Studies (REDS-I/-II) have conducted epidemiologic, laboratory, and survey research on volunteer blood donors. Some studies request additional permission to store biospecimens for future studies. The representativeness and applicability of studies performed using repositories may be reduced by low participation rates.
STUDY DESIGN AND METHODS Demographics of subjects consenting to participate in the 2007 REDS-II Leukocyte Antibodies Prevalence Study (LAPS) repository were compared to those of "study-only" subjects. Data from the 1998 REDS-I survey of donor opinion regarding storage and use of biospecimens were also explored.
RESULTS Overall, 91% of LAPS subjects agreed to participate in the repository. Odds of repository participation were lower among African American and Hispanic donors, 35- to 44-year-olds, donors who had not completed high school, and donors from one geographic location, regardless of other variables. Survey data from 1998 revealed that 97% of respondents approved of long-term storage of biospecimens, although only 87% indicated that they would personally participate. Many respondents would require notification or that their permission be obtained before participation. Minority respondents would require permission or notification more often and were less certain they would personally participate in a repository.
CONCLUSION Blood donors are quite willing to participate in biospecimen repositories. Regional differences and lower odds of participation among minority blood donors may result in a reduced number of biospecimens available for study and a decreased ability to definitively answer specific research questions in these populations.
17.
Abstract
BACKGROUND Approximately 10% of attempted blood donations are not allowed because of low hemoglobin (Hb) deferral.
STUDY DESIGN AND METHODS Low Hb deferrals were tracked in more than 715,000 whole blood donors at six blood centers across the United States. A multivariable logistic regression model was developed to comprehensively assess demographic correlates of low Hb deferral.
RESULTS Demographic factors significantly associated with low Hb deferral include female sex (11 times greater odds than males), increasing age in men (men over 80 have 29 times greater odds than men under 20), African American race (2-2.5 times greater odds than Caucasians), Hispanic ethnicity in women (1.29 times greater odds than Caucasian women), and weight in men (men under 124 pounds have 2.5 times greater odds than men over 200 pounds). Interestingly, increasing donation frequency is associated with decreased odds of low Hb deferral (women with one donation in the previous 12 months have two times greater odds than those with six donations).
CONCLUSIONS Low Hb deferral is associated with female sex, older age, African American race/ethnicity, and lower body weight in men. An inverse association with donation frequency suggests a selection bias in favor of donors able to give more frequently. These data provide useful information that can be utilized to manage blood donors to limit low Hb deferrals and assist in policy decisions, such as changing the Hb cutoff or permissible frequency of donation. They also generate hypotheses for new research on the causes of anemia in defined groups of donors.
18.
Abstract
BACKGROUND To predict future blood donation behavior and improve donor retention, it is important to understand the determinants of donor return.
STUDY DESIGN AND METHODS A self-administered questionnaire was completed in 2003 by 7905 current donors. With data mining methods, all factors measured by the survey were ranked as possible predictors of actual return within 12 months. Significant factors were analyzed with logistic regression to determine predictors of intention and of actual return.
RESULTS Younger and minority donors were less likely to return in 12 months. Predictors of donor return were higher prior donation frequency, higher intention to return, a convenient place to donate, and having a good donation experience. Most factors associated with actual donor return were also associated with a high intention to return. Although not significant for actual return, feeling a responsibility to help others, higher empathetic concern, and a feeling that being a blood donor means more than just donating blood were related to high intention to return.
CONCLUSION Prior donation frequency, intention to return, donation experience, and having a convenient location appear to significantly predict donor return. Clearly, donor behavior is dependent on more than one factor alone. Altruistic behavior, empathy, and social responsibility items did not enter our model to predict actual return. A donor's stated intention to give again is positively related to actual return and, while not a perfect measure, might be a useful proxy when donor return cannot be determined.
The role of altruistic behavior, empathetic concern, and social responsibility motivation in blood donation behavior. Transfusion 2007; 48:43-54. [PMID: 17894795 DOI: 10.1111/j.1537-2995.2007.01481.x] [Citation(s) in RCA: 53] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
Abstract
BACKGROUND Blood donation can be described as a prosocial behavior, and donors often cite prosocial reasons such as altruism, empathy, or social responsibility for their willingness to donate. Previous studies have not quantitatively evaluated these characteristics in donors or examined how they relate to donation frequency. STUDY DESIGN AND METHODS As part of a donor motivation study, 12,064 current and lapsed donors answered questions used to create an altruistic behavior, empathetic concern, and social responsibility motivation score for each donor. Analysis of variance was used to compare mean scores by demographics and donor status and to determine the influence of each variable on the mean number of donations in the past 5 years. RESULTS The mean score for each prosocial characteristic appeared high, with lower scores in male and younger donors. Higher altruistic behavior and social responsibility motivation scores were associated with increased past donation frequency, but the effects were minor. Empathetic concern was not associated with prior donation. The largest differences in prior donations were by age and donor status, with older and current donors having given more frequently. CONCLUSION Most blood donors appear to have high levels of the primary prosocial characteristics (altruism, empathy, and social responsibility) commonly thought to be the main motivators for donation, but these factors do not appear to be the ones most strongly related to donation frequency. Traditional donor appeals based on these characteristics may need to be supplemented by approaches that address practical concerns like convenience, community safety, or personal benefit.
Abstract
BACKGROUND To prevent donor loss and improve retention, it is important to understand the major deterrents to blood donation and to identify factors that can be effectively addressed by blood centers. STUDY DESIGN AND METHODS A 30-item self-administered questionnaire was completed in 2003 by 1705 first-time and 2437 repeat US donors who had not donated in 2 to 3 years. Asian, Hispanic, black, and white first-time and repeat donors rated the importance of deterrents to donation in their decision to not return on a 1-to-5 scale. Categorical analysis of variance methods were used to compare the importance of deterrents between first-time and repeat donors of different race or ethnicity. RESULTS Not having a convenient place to donate was most commonly cited as an important or very important reason for not returning, by 32 to 42 percent of first-time and 26 to 43 percent of repeat respondents. Although bad treatment and poor staff skills were less of a barrier than convenience, they were more important for minority donors. Other factors such as physical side effects, foreign travel, or length of the process appeared less important. CONCLUSION Inconvenience is a major barrier to donating, suggesting that mobile collections and increased hours of operation might help recapture lapsed donors. The finding that lapsed minority donors were more likely to cite bad treatment and poor staff skills as important reasons not to donate is disconcerting in light of the changing donor demographics and increased efforts to recruit these donors.
Genetic linkage analysis of familial amyotrophic lateral sclerosis using human chromosome 21 microsatellite DNA markers. American Journal of Medical Genetics 1994; 51:61-9. [PMID: 7913294 DOI: 10.1002/ajmg.1320510114] [Citation(s) in RCA: 43] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/27/2023]
Abstract
Amyotrophic lateral sclerosis (ALS: Lou Gehrig's disease) is a lethal neurodegenerative disease of upper and lower motor neurons in the brain and spinal cord. We previously reported linkage of a gene for familial ALS (FALS) to human chromosome 21 using 4 restriction fragment length polymorphism DNA markers [Siddique et al.: N Engl J Med 324:1381-1384, 1991] and identified disease-associated mutations in the superoxide dismutase (SOD)-1 gene in some ALS families [Rosen et al.: Nature 362:59-62, 1993]. We report here the genetic linkage data that led us to examine the SOD-1 gene for mutations. We also report a new microsatellite DNA marker for D21S63, derived from the cosmid PW517 [VanKeuren et al.: Am J Hum Genet 38:793-804, 1986]. Ten microsatellite DNA markers, including the new marker D21S63, were used to reinvestigate linkage of FALS to chromosome 21. Genetic linkage analysis performed with 13 ALS families for these 10 DNA markers confirmed the presence of a FALS gene on chromosome 21. The highest total 2-point LOD score for all families was 4.33, obtained at a distance of 10 cM from the marker D21S223. For 5 ALS families linked to chromosome 21, a peak 2-point LOD score of 5.94 was obtained at the DNA marker D21S223. A multipoint LOD score of 6.50 was obtained with the markers D21S213, D21S223, D21S167, and FALS for 5 chromosome 21-linked ALS families. The haplotypes of these families for the 10 DNA markers revealed recombination events that further refined the location of the FALS gene to a segment of approximately 5 megabases (Mb) between D21S213 and D21S219.(ABSTRACT TRUNCATED AT 250 WORDS)