1. Breast ductal carcinoma in situ with micro-invasion versus ductal carcinoma in situ: a comparative analysis of clinicopathological and mammographic findings. Clin Radiol 2021;76:787.e1-787.e7. [PMID: 34052010] [DOI: 10.1016/j.crad.2021.04.011]
Abstract
AIM To determine the differences in clinicopathological and mammographic findings between ductal carcinoma in situ (DCIS) and ductal carcinoma in situ with micro-invasion (DCIS-MI) and explore clinicopathological and mammographic factors associated with DCIS-MI. MATERIALS AND METHODS All DCIS patients with or without micro-invasion who underwent preoperative mammography at The Affiliated Hospital of Qingdao University from January 2016 through June 2020 were identified retrospectively. The correlations of clinicopathological findings with DCIS-MI were evaluated using univariate and multivariate binary logistic regression analyses. Imaging findings were compared between the groups by using the Pearson chi-square test. RESULTS A total of 445 DCIS lesions and 151 DCIS-MI lesions were included in the final analysis. Large extent (≥2.7 cm), high nuclear grade, comedo-type, negative progesterone receptor (PR), negative oestrogen receptor (ER), high Ki-67 and axillary lymph node metastasis were more frequently found in DCIS-MI than in DCIS (all p<0.05), and the first four of these were found to be independent predictors of DCIS-MI in the multivariate analysis (all p<0.05). Regarding imaging findings, compared to DCIS, DCIS-MI showed fewer occult lesions and more lesions with calcifications in mass, asymmetry, and architectural distortion (p=0.004). Grouped calcifications were usually associated with DCIS, while regional calcifications were commonly found in DCIS-MI (p<0.05). CONCLUSION Large extent, high nuclear grade, comedo-type and negative PR were found to be independent predictors of DCIS-MI. Lesions with calcifications and regional calcifications were more likely associated with DCIS-MI on mammography.
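The between-group comparisons of imaging findings described above rely on the Pearson chi-square test applied to contingency tables. A minimal Python sketch of that computation for a 2×2 table, using illustrative counts rather than the study's data:

```python
def pearson_chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Illustrative (hypothetical) counts: rows = DCIS-MI / DCIS,
# columns = imaging feature present / absent.
stat = pearson_chi2_2x2(30, 70, 10, 90)
print(round(stat, 2))  # 12.5; exceeds the 3.84 critical value (df=1, p<0.05)
```

The same statistic generalizes to larger tables via the usual observed-vs-expected sum; the closed form above holds only for the 2×2 case.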
2. Fabrication of epidermal growth factor imprinted and demethylcantharidin loaded dendritic mesoporous silica nanoparticle: An integrated drug vehicle for chemo-/antibody synergistic cancer therapy. J Drug Deliv Sci Technol 2021. [DOI: 10.1016/j.jddst.2021.102387]
3. [Effect of autologous skin paste on repairing wound of donor site of medium-thickness skin graft]. Zhonghua Shao Shang Za Zhi (Chinese Journal of Burns) 2021;37:1-5. [PMID: 33706433] [DOI: 10.3760/cma.j.cn501120-20200304-00121]
Abstract
Objective: To explore the effect of autologous skin paste on repairing the donor-site wound of medium-thickness skin grafts. Methods: A prospective randomized controlled design was applied. From October 2018 to December 2019, 18 patients with flame burns or scalds who met the inclusion criteria were admitted to Jinhua Hospital Affiliated to Zhejiang University School of Medicine, including 15 males and 3 females, aged (45±6) years; their wounds were repaired with medium-thickness skin grafts. The wound area after medium-thickness skin grafting was (121±33) cm². The donor-site wound of each patient was divided into two wounds of equal area and allocated by random number table to an autologous skin paste group and a conventional treatment group, with 18 wounds in each group. Wounds in the autologous skin paste group were repaired with skin paste prepared from the skin fragments remaining after autologous medium-thickness skin grafting, and wounds in the conventional treatment group were covered with petroleum jelly gauze and sterile gauze. On days 3, 7, 14, and 21 after operation, wound healing in the two groups was observed and the wound healing rate was calculated. Wound healing time in the two groups was recorded. Subcutaneous effusion and infection on days 3, 7, 14, and 21 after operation, and wound rupture within 3 months after operation, were observed. At 6 months after operation, the Vancouver Scar Scale (VSS) was used to evaluate scar formation in the two groups. Data were analyzed with repeated-measures analysis of variance, the chi-square test, and the independent-samples t test. Results: Wounds in neither group had healed on days 3 and 7 after operation. The wound healing rate in the autologous skin paste group was (29.8±2.5)% and (95.6±4.7)% on days 14 and 21 after operation, significantly higher than (25.8±2.9)% and (82.6±8.9)% in the conventional treatment group (t=4.3, 5.6, P<0.01). The wound healing time in the autologous skin paste group was (21.8±1.4) d, significantly shorter than (25.6±2.0) d in the conventional treatment group (t=6.24, P<0.01). On days 3, 7, 14, and 21 after operation, no complications such as subcutaneous effusion or infection occurred in the wounds of either group. Within 3 months after operation, ulceration occurred in the wounds of 2 patients in the autologous skin paste group, significantly fewer than the 12 patients in the conventional treatment group (χ²=11.688, P<0.01); the ulcerated wounds healed after dressing changes. At 6 months after operation, the VSS score in the autologous skin paste group was (9.1±1.1) points, significantly lower than (11.3±1.2) points in the conventional treatment group (t=-5.75, P<0.01). Conclusion: Skin paste prepared from the fragments remaining after autologous medium-thickness skin grafting can shorten the healing time of the donor-site wound, improve healing quality, and reduce scar hyperplasia, with good clinical effect.
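The healing-time comparison above is a two-sample t test. A pooled-variance t statistic can be recomputed from the reported group summaries (mean ± SD, n = 18 per group); a pure-Python sketch follows, where the small difference from the published t = 6.24 plausibly reflects rounding of the reported summaries:

```python
import math

def pooled_t(m1, sd1, n1, m2, sd2, n2):
    """Two-sample pooled-variance t statistic from summary statistics."""
    sp2 = ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))

# Healing time: paste group (21.8 +/- 1.4) d vs conventional (25.6 +/- 2.0) d
t = pooled_t(21.8, 1.4, 18, 25.6, 2.0, 18)
print(round(abs(t), 2))  # 6.6
```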
4. Unconventional Transverse Transport above and below the Magnetic Transition Temperature in Weyl Semimetal EuCd2As2. Phys Rev Lett 2021;126:076602. [PMID: 33666464] [DOI: 10.1103/physrevlett.126.076602]
Abstract
As exemplified by the growing interest in the quantum anomalous Hall effect, research on topology as an organizing principle of quantum matter is greatly enriched by its interplay with magnetism. In this vein, we present a combined electrical and thermoelectrical transport study of the magnetic Weyl semimetal EuCd2As2. Unconventional contributions to the anomalous Hall and anomalous Nernst effects were observed both above and below the magnetic transition temperature of EuCd2As2, indicating the existence of significant Berry curvature. EuCd2As2 represents a rare case in which this unconventional transverse transport emerges both above and below the magnetic transition temperature in the same material. The transport properties evolve with temperature and field in the antiferromagnetic phase in a different manner than in the paramagnetic phase, suggesting different mechanisms for their origin. Our results indicate that EuCd2As2 is a fertile playground for investigating the interplay between magnetism and topology, and potentially hosts a plethora of topologically nontrivial phases rooted in this interplay.
5.
Abstract
Canine kobuvirus (CaKoV) is a newly emerging virus in dogs that is associated with canine diarrhea. To investigate CaKoV infection in the dog population, fecal samples were collected from three provinces of China in 2015. Genetic analysis based on the complete VP1 gene showed that the six CaKoV isolates in this study were closely related to the Chinese canine isolate CH1 (90.6%-91.9% nucleotide identity). Phylogenetic analysis demonstrated that the Chinese isolates clustered into a unique branch relative to isolates from other countries. The present study suggests that CaKoV has established infection in the Chinese dog population. Systematic epidemiological investigation should be carried out to evaluate the prevalence of CaKoV infection in China.
6. Spin fluctuation induced Weyl semimetal state in the paramagnetic phase of EuCd2As2. Sci Adv 2019;5:eaaw4718. [PMID: 31309151] [PMCID: PMC6625818] [DOI: 10.1126/sciadv.aaw4718]
Abstract
Weyl fermions as emergent quasiparticles can arise in Weyl semimetals (WSMs) in which the energy bands are nondegenerate, resulting from inversion or time-reversal symmetry breaking. Nevertheless, experimental evidence for magnetically induced WSMs is scarce. Here, using photoemission spectroscopy, we observe that the degeneracy of Bloch bands is already lifted in the paramagnetic phase of EuCd2As2. We attribute this effect to the itinerant electrons experiencing quasi-static and quasi-long-range ferromagnetic fluctuations. Moreover, the spin-nondegenerate band structure harbors a pair of ideal Weyl nodes near the Fermi level. Hence, we show that long-range magnetic order and the spontaneous breaking of time-reversal symmetry are not essential requirements for WSM states in centrosymmetric systems and that WSM states can emerge in a wider range of condensed matter systems than previously thought.
7. Dirac nodal surfaces and nodal lines in ZrSiS. Sci Adv 2019;5:eaau6459. [PMID: 31058219] [PMCID: PMC6499591] [DOI: 10.1126/sciadv.aau6459]
Abstract
Topological semimetals are characterized by symmetry-protected band crossings, which can be preserved in different dimensions in momentum space, forming zero-dimensional nodal points, one-dimensional nodal lines, or even two-dimensional nodal surfaces. Materials harboring nodal points and nodal lines have been experimentally verified, whereas experimental evidence of nodal surfaces is still lacking. Here, using angle-resolved photoemission spectroscopy (ARPES), we reveal the coexistence of Dirac nodal surfaces and nodal lines in the bulk electronic structures of ZrSiS. As compared with previous ARPES studies on ZrSiS, we obtained pure bulk states, which enable us to extract unambiguously intrinsic information of the bulk nodal surfaces and nodal lines. Our results show that the nodal lines are the only feature near the Fermi level and constitute the whole Fermi surfaces. We not only prove that the low-energy quasiparticles in ZrSiS are contributed entirely by Dirac fermions but also experimentally realize the nodal surface in topological semimetals.
8. Development of an indirect ELISA with epitope on nonstructural protein of Muscovy duck parvovirus for differentiating between infected and vaccinated Muscovy ducks. Lett Appl Microbiol 2014;59:631-5. [DOI: 10.1111/lam.12323]
9. Screening and analyzing genes associated with Amur tiger placental development. Genet Mol Res 2014;13:7869-78. [DOI: 10.4238/2014.september.26.25]
10. Diversity of nitrogenase (nifH) genes pool in soybean field soil after continuous and rotational cropping. J Basic Microbiol 2010;50:373-9. [PMID: 20473958] [DOI: 10.1002/jobm.200900317]
Abstract
Diazotroph diversity in soybean fields requires thorough investigation, since previous research has focused mainly on rice, forest, grassland, and aquatic environments. In this study, the iron-only nitrogenase nifH gene was used as a genetic marker. PCR-RFLP was used to investigate differences in diazotroph community diversity between continuous cropping (CC; 5 years of soybean tilling) and rotational cropping (RC; soybean-corn) soils in northeast China. A total of 36 isolates were genetically characterized; most were closely related to Azospirillum and Azotobacter. Eighty-six unique nifH gene sequences were obtained by cloning the respective PCR products from the two soil samples. The diversity of nifH genes in CC soil changed markedly compared with RC soil. Phylogenetic analysis indicated that most of the clones clustered together with high homology to sequences retrieved from environmental representatives. The sequence diversity of nifH genes was high, and members of the Alphaproteobacteria were predominant in both samples. The study also revealed two non-proteobacterial diazotroph groups, Firmicutes and Euryarchaeota. These results suggest that different tillage practices may affect the diversity of the nifH gene-containing population.
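Clone-library diversity of the kind compared here is commonly summarized with the Shannon index over RFLP pattern (OTU) counts. A minimal sketch with hypothetical clone counts, not the study's data:

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over clone counts."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

# Hypothetical clone counts for four nifH RFLP patterns
h = shannon_index([40, 30, 20, 10])
print(round(h, 3))  # 1.28
```

H' is maximized (ln of the number of patterns) when clones are spread evenly across patterns, so comparing H' between the CC and RC libraries gives a single-number view of the diversity shift.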
11. Establishment and characterization of a fibroblast cell line derived from Jining Black Grey goat for genetic conservation. Small Rumin Res 2009. [DOI: 10.1016/j.smallrumres.2009.09.028]
12. Left ventricular function in physiologic and pathologic hypertrophy in Sprague–Dawley rats. Sci Sports 2008. [DOI: 10.1016/j.scispo.2008.04.004]
13. Bitter taste receptor gene polymorphisms are an important factor in the development of nicotine dependence in African Americans. J Med Genet 2008;45:578-82. [PMID: 18524836] [DOI: 10.1136/jmg.2008.057844]
Abstract
CONTEXT Bitter sensitivity varies among individuals and ethnic groups partly due to polymorphisms in taste receptor genes (TAS2Rs). Although previous psychophysical studies suggest that taste status plays a role in nicotine dependence (ND), genetic evidence is lacking. OBJECTIVES To determine whether single nucleotide polymorphisms (SNPs) in TAS2R16 and TAS2R38 are associated with ND and if the effects differ by sex and ethnicity. DESIGN, SETTING, AND PARTICIPANTS 2037 individuals from 602 nuclear families of African American (AA) or European American (EA) origin were recruited from the US mid-south states during 1999-2004. MAIN OUTCOME MEASURES ND was assessed by three measures: indexed Smoking Quantity (SQ), Heaviness of Smoking Index (HSI), and the Fagerström Test for Nicotine Dependence (FTND). Peripheral blood samples were obtained for DNA extraction and genotyping. RESULTS The TAS2R38 taster haplotype PAV was inversely associated (p = 0.0165), and the non-taster haplotype AVI was positively associated (p = 0.0120), with SQ in AA smokers. The non-taster haplotype was positively associated with all ND measures in AA female smokers (p = 0.01-0.003). No significant associations were observed in the EA sample. CONCLUSIONS TAS2R38 polymorphisms are an important factor in determining ND in AAs. Heightened oral sensitivity confers protection against ND. Conversely, decreased sensitivity represents a risk factor for ND, especially in AA females. Together, our findings suggest that taster status plays a role in governing the development of ND and may represent a way to identify individuals at risk for developing ND, particularly in AA smokers.
14.
Abstract
On the basis of our previously identified linkage regions for nicotine dependence (ND), we selected seven and four single nucleotide polymorphisms (SNPs) in the beta-arrestin 1 (ARRB1) and 2 (ARRB2) genes, respectively, to determine the associations of the two genes with ND in a total of 2037 subjects from 602 nuclear families of European American (EA) and African American (AA) origin. ND was assessed by Smoking Quantity (SQ), the Heaviness of Smoking Index (HSI) and the Fagerström Test for ND (FTND) score. Individual SNP analysis indicated that SNPs rs472112 within ARRB1 and rs4790694 within ARRB2 were significantly associated with HSI and FTND score in the EA sample, and the association of rs4790694 within ARRB2 remained significant after correction for multiple testing. Haplotype analysis revealed that haplotype C-G-C-G-G-T within ARRB1, at a frequency of 20% and formed by SNPs rs528833, rs1320709, rs480174, rs5786130, rs611908 and rs472112, was positively associated with HSI and FTND in EAs. We also found a haplotype within ARRB2, C-C-A-T, at a frequency of 10.7% and formed by SNPs rs3786047, rs4522461, rs1045280 and rs4790694, that showed a significant positive association with HSI and FTND in the EA sample. No significant associations for either individual SNPs or major haplotypes of ARRB1 and ARRB2 were found in the AA sample. Further, the strength of these associations increased after removing the SQ component from the HSI and FTND scores in both the EA and AA samples, suggesting that ARRB1 and ARRB2 play an important role in biological processes involved in the regulation of smoking urgency (that is, time to first cigarette). In summary, our results provide the first evidence of a significant association of ARRB1 and ARRB2 variants with ND in an EA sample.
15. Genome-wide linkage scan for nicotine dependence in European Americans and its converging results with African Americans in the Mid-South Tobacco Family sample. Mol Psychiatry 2008;13:407-16. [PMID: 17579606] [DOI: 10.1038/sj.mp.4002038]
Abstract
Previously, we reported a genome-wide scan for nicotine dependence (ND) in the African American (AA) sample of the Mid-South Tobacco Family (MSTF) cohort. In this study, we conducted a genome-wide scan in 629 individuals representing 200 nuclear families of European American (EA) origin of the MSTF cohort with the goals of identifying vulnerability loci for ND in the EAs and determining converging regions across the ethnic groups. We examined 385 autosomal microsatellite markers for ND, which was assessed by smoking quantity (SQ), the Heaviness of Smoking Index (HSI) and the Fagerström test for ND (FTND). After performing linkage analyses using various methods implemented in the GENEHUNTER and SAGE programs, we found eight regions on chromosomes 2, 4, 9-12, 17 and 18 that met the criteria for suggestive linkage to at least one ND measure in the EA sample. Of these, the region on chromosome 4 at 43 cM showed suggestive linkage to indexed SQ, the HSI and the FTND, and the region on chromosome 9 at 24 cM showed suggestive linkage to the HSI and the FTND. To increase detection power, we analyzed a combined AA and EA sample using age, gender and ethnicity as covariates and found that the region on chromosome 12 near marker D12S372 showed significant linkage to SQ. Additionally, we found six regions on chromosomes 9-11, 13 and 18 that showed suggestive linkage to at least one ND measure in the combined sample. When we compared the linkage peaks detected for ND among the two samples and a combined sample, we found that four regions on chromosomes 9 (two regions), 11 and 18 overlapped. On the other hand, we identified five regions on chromosomes 2, 4, 10, 12 and 17 that showed linkage to ND only in the EA sample, and two regions on chromosomes 10 and 13 that showed linkage to ND only in the AA sample. For those linkages identified in only one sample, we found that the combined analysis of AA plus EA samples actually decreased the linkage signal. 
This indicates that some chromosomal regions may be more homogeneous than others across the ethnic samples. All regions except the one on chromosome 12 have been detected at nominally significant levels in other studies, providing independent replication of ND loci in different populations.
16. Antibiotic cycling to decrease bacterial antibiotic resistance: a 5-year experience on a bone marrow transplant unit. Bone Marrow Transplant 2007;40:151-5. [PMID: 17530005] [DOI: 10.1038/sj.bmt.1705704]
Abstract
Multidrug-resistant pathogens have important effects on clinical outcomes. Antibiotic cycling is one approach to control anti-microbial resistance, but few studies have examined cycling in hematology-oncology units. Antibiotic cycling was implemented in January 1999 at our hematology-oncology unit, alternating piperacillin-tazobactam (pip-tazo) and cefepime in 3 months periods, until June 2004. Clinical isolates were compared in post- and pre-intervention periods and with the susceptibility among the solid organ transplant intensive care unit (TICU) isolates. The rate of Gram-negative isolates remained stable. Among Gram-negatives, susceptibility to cefepime and pip-tazo remained stable. There was an increase in Enterococcus spp. (P=0.007), and susceptibility to ampicillin and vancomycin decreased (odds ratio (OR): 0.04, 95% confidence interval (CI): 0.17-0.89 and OR: 0.23, 95% CI: 0.09-0.58). Compared with the TICU, there was increased susceptibility to pip-tazo and cefepime among enterics (OR: 7.32, 95% CI: 4.44-12.07 and OR: 8.82, 95% CI: 2.1-37.13) and Pseudomonas aeruginosa (OR: 4.27, 95% CI: 1.47-12.4 and OR: 4.61, 95% CI: 1.75-12.1) and decreased susceptibility to ampicillin and vancomycin among enterococci (OR: 0.44, 95% CI: 0.30-0.63 and OR: 0.38, 95% CI: 0.26-0.56). Cycling was associated with preserved antibiotic susceptibility among Gram-negatives, but with an increase in Enterococcus spp. and vancomycin and ampicillin resistance among enterococci.
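The odds ratios and 95% confidence intervals reported above are the standard way to compare susceptibility proportions between periods or units; they can be derived from 2×2 counts with the Woolf (log) method. A sketch with illustrative counts, not the unit's isolate data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI for a 2x2 table [[a, b], [c, d]]."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative (hypothetical) counts: susceptible/non-susceptible isolates
# in the post- vs pre-intervention periods.
or_, lo, hi = odds_ratio_ci(10, 40, 30, 20)
print(round(or_, 3), round(lo, 3), round(hi, 3))
```

An interval that excludes 1.0, as in several of the comparisons above, indicates a statistically significant shift in susceptibility.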
17. Strain- and region-specific gene expression profiles in mouse brain in response to chronic nicotine treatment. Genes Brain Behav 2007;7:78-87. [PMID: 17504244] [DOI: 10.1111/j.1601-183x.2007.00328.x]
Abstract
A pathway-focused complementary DNA microarray and gene ontology analysis were used to investigate gene expression profiles in the amygdala, hippocampus, nucleus accumbens, prefrontal cortex (PFC) and ventral tegmental area of C3H/HeJ and C57BL/6J mice receiving nicotine in drinking water (100 μg/ml in 2% saccharin for 2 weeks). A balanced experimental design and rigorous statistical analysis have led to the identification of 3.5-22.1% and 4.1-14.3% of the 638 sequence-verified genes as significantly modulated in the aforementioned brain regions of the C3H/HeJ and C57BL/6J strains, respectively. Comparisons of differential expression among brain tissues showed that only a small number of genes were altered in multiple brain regions, suggesting presence of a brain region-specific transcriptional response to nicotine. Subsequent principal component analysis and Expression Analysis Systematic Explorer analysis showed significant enrichment of biological processes both in C3H/HeJ and C57BL/6J mice, i.e. cell cycle/proliferation, organogenesis and transmission of nerve impulse. Finally, we verified the observed changes in expression using real-time reverse transcriptase polymerase chain reaction for six representative genes in the PFC region, providing an independent replication of our microarray results. Together, this report represents the first comprehensive gene expression profiling investigation of the changes caused by nicotine in brain tissues of the two mouse strains known to exhibit differential behavioral and physiological responses to nicotine.
18. Linkage and association studies in African- and Caucasian-American populations demonstrate that SHC3 is a novel susceptibility locus for nicotine dependence. Mol Psychiatry 2007;12:462-73. [PMID: 17179996] [DOI: 10.1038/sj.mp.4001933]
Abstract
Our previous linkage study demonstrated that the 9q22-q23 chromosome region showed a 'suggestive' linkage to nicotine dependence (ND) in the Framingham Heart Study population. In this study, we provide further evidence for the linkage of this region to ND in an independent sample. Within this region, the gene encoding Src homology 2 domain-containing transforming protein C3 (SHC3) represents a plausible candidate for association with ND, assessed by smoking quantity (SQ), the Heaviness of Smoking Index (HSI) and the Fagerström Test for ND (FTND). We utilized 11 single-nucleotide polymorphisms within SHC3 to examine the association with ND in 602 nuclear families of either African-American (AA) or European-American (EA) origin. Individual SNP-based analysis indicated that three SNPs for AAs and one for EAs were significantly associated with at least one ND measure. Haplotype analysis revealed that the haplotypes A-C-T-A-T-A of rs12519-rs3750399-rs4877042-rs2297313-rs1547696-rs1331188, with a frequency of 27.8 and 17.6%, and C-T-A-G-T of rs3750399-rs4877042-rs2297313-rs3818668-rs1547696, at a frequency of 44.7 and 30.6% in the AA and Combined samples, respectively, were significantly inversely associated with the ND measures. In the EA sample, another haplotype with a frequency of 10.6%, A-G-T-G of rs1331188-rs1556384-rs4534195-rs1411836, showed a significant inverse association with ND measures. These associations remained significant after Bonferroni correction. We further demonstrated that SHC3 contributed 40.1-59.2% (depending on the ND measures) of the linkage signals detected on chromosome 9. As further support, we found that nicotine administered through infusion increased the Shc3 mRNA level by 60% in the rat striatum, and decreased it by 22% in the nucleus accumbens (NA). At the protein level, Shc3 was decreased by 38.0% in the NA and showed no change in the striatum.
Together, these findings strongly implicate SHC3 in the etiology of ND and mark it as an important biological candidate for further investigation.
19. Effects of alendronate on bone mineral density in men with prostate cancer treated with androgen deprivation therapy. J Clin Densitom 2006;9:431-7. [PMID: 17097529] [DOI: 10.1016/j.jocd.2006.07.005]
Abstract
Bone mineral density (BMD) is low in men with prostate cancer treated with androgen deprivation therapy (ADT). Intravenous bisphosphonates have been shown to prevent this bone loss; however, the effectiveness of oral bisphosphonates has not been studied in this population. In this retrospective cohort study, we examined the effect of alendronate on BMD in men with prostate cancer receiving ADT. We reviewed the charts of patients receiving ADT referred from the VA Urology Clinic for BMD measurements. Forty-seven patients had follow-up BMD measurements (17.6±8.3 months). Twenty-two men (47%) were also receiving alendronate 70 mg every week. There was a statistically significant difference (p<0.05) in the percent change of BMD per year at the spine (-1.29±0.7% vs +1.41±0.7%), total hip (-0.94±0.6% vs +0.97±0.5%), femoral neck (-2.17±0.7% vs +0.32±0.6%) and trochanter (-2.01±0.7% vs +0.79±0.8%) in the patients not treated compared with those treated with alendronate. At the four other measured sites at the radius (proximal, mid, ultradistal and total), there were no statistically significant differences (p>0.05). These findings confirm that bone loss occurs at all measured sites in men receiving ADT. Alendronate prevents bone loss at the spine and hip, but does not seem to have the same protective effect at the radius.
20. State-level adjusted ESRD incident rates: use of observed vs model-predicted category-specific rates. Kidney Int 2006;69:1459-63. [PMID: 16531980] [DOI: 10.1038/sj.ki.5000299]
Abstract
Because of differences in case-mix across states, state-level case-mix-adjusted end-stage renal disease (ESRD) incident rates are reported in each United States Renal Data System Annual Data Report to make the across-state comparisons valid. The adjusted rates were estimated by the direct adjustment method, a widely used method for adjusted event rate calculation, based on observed category-specific ESRD incident rates in each state (called the observation-based method). However, when some adjusting categories in a state are small, the adjusted rate and the standard error for this state as estimated by this method may be inaccurate. This report proposes a model-based method that can overcome the disadvantages of the observation-based method and can be extended to continuous adjusting variables. National ESRD incident data and national population data from 1990 to 1999 were used. State-level adjusted ESRD incident rates were estimated by both the observation- and the model-based methods. For the model-based method, a Poisson regression model was used to estimate category-specific ESRD incident rates. For large-population states, both observation- and model-based methods produced similar estimates for adjusted ESRD incident rates. For small-population states, however, the observation-based method produced year-to-year estimates of adjusted ESRD incident rates that varied considerably and also had very large standard errors. In contrast, the model-based method produced stable estimates. The model-based method can overcome the disadvantages of the observation-based method for estimating state-level adjusted ESRD incident rates, especially for small states.
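Direct adjustment, as described above, weights each state's category-specific rates by a standard population's category shares; the model-based variant simply replaces the observed category rates with Poisson-model predictions before the same weighting step. A minimal sketch of the direct-adjustment step with hypothetical numbers:

```python
def direct_adjusted_rate(category_rates, std_population):
    """Directly standardized rate: weight each category-specific rate by the
    standard population's share of that category."""
    total = sum(std_population)
    return sum(r * w for r, w in zip(category_rates, std_population)) / total

# Hypothetical ESRD incident rates per million in two age categories,
# and a standard population split 70/30 across those categories.
rate = direct_adjusted_rate([50.0, 200.0], [700_000, 300_000])
print(rate)  # 95.0
```

The instability the report describes arises when a state's `category_rates` entry is estimated from very few people: a single event can swing that rate, and hence the weighted sum, dramatically, which the smoothed model-based rates avoid.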
21. Mapping and verification of susceptibility loci for smoking quantity using permutation linkage analysis. Pharmacogenomics J 2005;5:166-72. [PMID: 15724146] [DOI: 10.1038/sj.tpj.6500304]
Abstract
Nicotine dependence is the most prevalent form of drug addiction in the US and throughout the world. Epidemiological studies demonstrate that genetics accounts for at least 50% of the liability to nicotine dependence. However, few linkage studies have provided convincing evidence of susceptibility genomic loci for this disorder. In this study, we conducted genome-wide permutation linkage analyses on the smoking data collected between 1970 and 1972 in the Framingham Heart Study (FHS) to account for the non-normal distribution of smoking quantity (defined as the number of cigarettes smoked per day). We used empirical thresholds obtained from permutation tests to determine the significance of each genomic region. The variance-component method implemented in SOLAR was used for the analysis. Under the empirical genome-wide thresholds determined specifically for the FHS smoking data, we found two highly or near-highly significant linkages for nicotine dependence on chromosomes 1 and 4 (P=0.001) and eight significant linkages on chromosomes 3, 7, 8, 9, 11, 16, 17, and 20 (P<0.05). These findings strongly indicate that some of these regions may harbor susceptibility loci for nicotine dependence. Further analysis of these positive regions by fine mapping and/or association analysis is warranted. To our knowledge, this study presents the most convincing linkage evidence for nicotine dependence in the field.
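The permutation approach above can be sketched as follows: shuffle the phenotype to break its link to the genotypes, recompute the linkage statistic across all markers, and take the (1 − α) quantile of the per-scan maximum as the empirical genome-wide threshold. The statistic here is a toy absolute-covariance score, not the SOLAR variance-component LOD, and all data are simulated.

```python
import random

# Empirical genome-wide threshold from permutation: the 95th percentile
# of the maximum statistic over permuted scans controls the genome-wide
# error rate without distributional assumptions.

def score(pheno, geno):
    n = len(pheno)
    mp = sum(pheno) / n
    mg = sum(geno) / n
    return abs(sum((p - mp) * (g - mg) for p, g in zip(pheno, geno)) / n)

def genome_wide_threshold(pheno, markers, n_perm=200, alpha=0.05, seed=1):
    rng = random.Random(seed)
    null_max = []
    for _ in range(n_perm):
        perm = list(pheno)
        rng.shuffle(perm)          # break the genotype-phenotype link
        null_max.append(max(score(perm, g) for g in markers))
    null_max.sort()
    return null_max[min(int((1 - alpha) * n_perm), n_perm - 1)]

rng = random.Random(0)
pheno = [rng.gauss(0, 1) for _ in range(100)]
markers = [[rng.choice((0, 1, 2)) for _ in range(100)] for _ in range(20)]
threshold = genome_wide_threshold(pheno, markers)
```

An observed score exceeding `threshold` would then be declared genome-wide significant at the chosen α.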
|
22
|
Abstract
Smoking behaviour is influenced by both genetic and environmental factors. Many years of twin and adoption studies have demonstrated that genetic factors account for at least 50% of the liability to both smoking initiation and smoking persistence. Furthermore, the extent to which genetic and environmental factors contribute to smoking behaviour differs significantly between men and women. Linkage analyses from several independent studies provide evidence for suggestive linkage of smoking behaviour to chromosomes 1, 2, 4, 5, 6, 9, 10, 11, 14, 17, 18 and 21. However, almost none of these loci has yet been replicated. Furthermore, numerous population-based association studies have examined the effects of a number of candidate genes, such as cytochrome P450, dopamine receptor (DR) and transporter, serotonin transporter and nicotinic acetylcholine receptor genes, on smoking behaviour; however, many of these reports have not yet received independent confirmation. Of these candidate genes, the D2 dopamine receptor (DRD2) gene has been studied most extensively. Meta-analysis of 12 reported studies showed a significantly higher prevalence of the DRD2 TaqI A1 allele in smokers than in non-smokers (p < 0.0001; pooled OR = 1.50; 95% CI = 1.33-1.70). For other candidate genes, too few published studies are available to allow a meta-analysis, or meta-analysis showed no significant difference between smokers and non-smokers. More studies are necessary to determine whether these genes play a significant role in smoking behaviour.
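The pooled OR quoted above comes from standard inverse-variance meta-analysis machinery: each study's odds ratio is converted to a log OR, weighted by the reciprocal of its variance, and the weighted mean is exponentiated. A fixed-effect sketch with invented 2x2 counts (not the 12 DRD2 studies):

```python
import math

# Fixed-effect (inverse-variance) pooling of odds ratios on the log scale.

def log_or(a, b, c, d):
    """a, b = exposed cases/controls; c, d = unexposed cases/controls.
    Returns (log OR, Woolf variance 1/a + 1/b + 1/c + 1/d)."""
    return math.log((a * d) / (b * c)), 1 / a + 1 / b + 1 / c + 1 / d

def pooled_or(tables):
    wsum = wlor = 0.0
    for t in tables:
        lor, var = log_or(*t)
        wsum += 1.0 / var
        wlor += lor / var
    est = wlor / wsum
    se = math.sqrt(1.0 / wsum)
    return math.exp(est), (math.exp(est - 1.96 * se),
                           math.exp(est + 1.96 * se))

# Three hypothetical case-control studies (allele carriers vs not).
studies = [(40, 60, 25, 75), (55, 45, 40, 60), (30, 70, 20, 80)]
or_hat, (lo, hi) = pooled_or(studies)
```

A random-effects model (e.g. DerSimonian-Laird) would add a between-study variance component to the weights; the fixed-effect version above is the simplest case.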
|
23
|
|
24
|
Efficient decoding strategies for conversational speech recognition using a constrained nonlinear state-space model. IEEE Trans Speech Audio Process 2003. [DOI: 10.1109/tsa.2003.818075] [Citation(s) in RCA: 15] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/05/2022]
|
25
|
Comparison of marker intervals and number of sib pairs used for linkage analysis on simulated nuclear family data. Genet Epidemiol 2002; 21 Suppl 1:S748-53. [PMID: 11793772 DOI: 10.1002/gepi.2001.21.s1.s748] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022]
Abstract
Using a two-stage global scan design, we analyzed general population replicates 1 and 42 of the Genetic Analysis Workshop (GAW) 12 simulated data set using three methods: revisited Haseman-Elston (HER), maximum likelihood variance estimation (ML), and variance components (VC). Three marker densities, 5-, 10-, and 15-cM intervals, were examined in the first-stage scan. We found that the 10-cM interval appears to be the most cost-effective approach in genotyping without sacrificing power when using a first stage significance level of 0.01. Subsequently, we performed the second-stage scan at 1-cM intervals for those putative positive regions identified in the first-stage scan at a significance level of 0.01. We also compared the power to detect linkage using different numbers of sib pairs for a genome-wide scan at a 10-cM interval and found that power decreases nonlinearly as the number of sib pairs decreases.
|
26
|
Bone or cartilage invasion by advanced head and neck cancer: intra-arterial supradose cisplatin chemotherapy and concomitant radiotherapy for organ preservation. ARCHIVES OF OTOLARYNGOLOGY--HEAD & NECK SURGERY 2001; 127:1451-6. [PMID: 11735813 DOI: 10.1001/archotol.127.12.1451] [Citation(s) in RCA: 35] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/14/2022]
Abstract
BACKGROUND Invasion of bony or cartilaginous structures by advanced upper aerodigestive tract cancer has been considered an indication for surgery on the basis of historic experience of poor responsiveness to radiation therapy. At University of Tennessee-Memphis, patients with advanced head and neck cancer have been treated on a protocol of concomitant intra-arterial (targeted) cisplatin and conventional radiation therapy. OBJECTIVE To compare the efficacy, in terms of disease control and survival, of this protocol in patients with T4 squamous cell cancers and invasion of bony or cartilaginous structures (group 1; n = 45) vs those with T4 disease but no bone or cartilage involvement (group 2; n = 90). DESIGN Subset analysis of protocol database and retrospective chart review. METHODS Treatment consisted of 4 weekly intra-arterial infusions of cisplatin (150 mg/m(2) per week), with simultaneous systemic neutralization by intravenous sodium thiosulfate (9 mg/m(2)), and concurrent radiation therapy at 180 rad (1.8 Gy) or 200 rad (2 Gy) per fraction to a planned total of 6600 to 7400 rad (66-74 Gy) to the primary site or overt nodal disease. Presence of bone or cartilage invasion was established by review of tumor diagrams of clinical findings and computed tomography or magnetic resonance imaging reports. RESULTS Of 135 patients who had T4 disease and a minimum follow-up of 9 months (median, 40 months), 45 had clinical or radiologic evidence of bone (n = 29: mandible, 12; maxilla, 9; sphenoid, 3; hyoid, 6) and/or cartilage (n = 18: thyroid, 16; cricoid, 4) invasion (some patients had involvement of more than 1 site). The rate of complete response in group 1 (66.7%) was not significantly different from that in group 2 (71.1%) (chi(2) test, P = .79). 
The 2-year overall actuarial survival for group 1 (46.3%; 95% confidence interval, 30.3%-62.3%) was not significantly different (generalized Wilcoxon test, P = .36) from that of group 2 (36.9%; 95% confidence interval, 25.5%-48.4%). A marked trend was noted for higher response rates in cases of cartilage invasion (81.2%) than in those with bone invasion (58.6%) (P = .15). CONCLUSION Equivalent efficacy of treatment in the 2 groups suggests that targeted chemoradiation can be a definitive therapeutic option in patients with advanced head and neck cancer invading bony or cartilaginous structures.
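The complete-response comparison above can be checked with a Pearson chi-square on the 2x2 table, with counts reconstructed from the reported percentages (66.7% of 45 ≈ 30/45; 71.1% of 90 ≈ 64/90). The paper's P = .79 may reflect a continuity correction or an exact test, so only the qualitative conclusion (no significant difference) is expected to match this sketch.

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square (df = 1) for the 2x2 table [[a, b], [c, d]],
    with the p-value obtained from the normal tail via erfc."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2))  # P(chi2_1 > x) = erfc(sqrt(x/2))
    return chi2, p

# Group 1: 30 responders / 15 non-responders; group 2: 64 / 26.
chi2, p = chi2_2x2(30, 15, 64, 26)
```

Either way the statistic is far below the df = 1 critical value of 3.84, consistent with the reported lack of a significant difference.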
|
27
|
Abstract
This study evaluates risk factor monitoring in end-stage renal disease (ESRD) patients with cardiovascular disease. Death rates from cardiovascular disease in ESRD patients are 20 to 40 times higher than in the general population, and 72% of ESRD patients with an acute myocardial infarction (AMI) are dead within 2 years of follow-up. Patients who have sustained an AMI rarely receive definitive testing to assess coronary circulation, and cardiac catheterization rates and revascularization rates are low, even after the high-risk event of an AMI. Risk factor intervention to treat lipid disorders in the ESRD population has received little attention, with the USRDS reporting that in 1998, 58% of dialysis and 64% of transplant patients had no lipid monitoring performed within a year. Of those tested, only 33% of dialysis and 27% of transplant patients had two or more tests within 1 year. Glycemic control monitoring in the form of HbA1c, recommended for diabetes management, is also underutilized in ESRD patients, with fewer than half receiving a single test within 1 year and only 10% receiving three or more tests, raising concerns that diabetic glycemic control may be suboptimal in the ESRD population. The use of diabetic eye examinations and diabetic glucose monitoring is also low, as are influenza vaccination rates. These data suggest that the clinical care of cardiovascular disease in ESRD patients needs more attention.
|
28
|
|
29
|
Hospitalization risks between Renagel phosphate binder treated and non-Renagel treated patients. Clin Nephrol 2000; 54:334-41. [PMID: 11076110] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/18/2023] Open
Abstract
AIM We evaluated 152 sevelamer hydrochloride treated Medicare patients on hemodialysis in a case-controlled study matching 152 randomly selected non-sevelamer hydrochloride treated Medicare patients from the same dialysis facilities and time period. The main outcomes evaluated were the risk of all-cause hospitalization and per-member per-month (PMPM) Medicare expenditures in the follow-up period. PATIENTS AND METHODS Medicare patients were identified from a total of 195 patients who were included in a long-term safety and efficacy clinical trial evaluating sevelamer hydrochloride [Chertow et al. 1999a]. The average serum calcium-phosphorus product as well as lipid profiles improved in the sevelamer hydrochloride treated group during the trial. Sevelamer treated patients were matched with randomly selected Medicare patients for age, gender, race, diabetic status, and geographic location. Comorbid conditions were characterized and sequential Cox regression models were applied with the outcome being risk of first hospitalization in a 17-month follow-up period. RESULTS Across all four models, the relative risk of hospitalization was 46% to 54% less in the sevelamer hydrochloride treated group, as compared to the case control group (significant at the p = 0.03 level). Overall, Medicare expenditures for the control patients per-member per-month were US-$4,745, compared to US-$3,368 in the sevelamer hydrochloride treated patients. CONCLUSION Sevelamer hydrochloride treated patients had a 50% lower likelihood of hospitalization in the follow-up period after adjustments for the differences in the population. Potential bias may exist between groups because of differences in baseline characteristics that could not be adjusted for within the study design. We feel that to further advance this area, a randomized clinical trial should be performed.
|
30
|
[Comparative research on mixed and 488 nm argon laser PDT for port wine stain]. SHANGHAI KOU QIANG YI XUE = SHANGHAI JOURNAL OF STOMATOLOGY 2000; 9:173-4. [PMID: 15014797] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Subscribe] [Scholar Register] [Indexed: 04/29/2023]
Abstract
OBJECTIVE To identify a safer argon laser photodynamic method, this study compared 488 nm wavelength and common mixed-spectrum argon laser PDT for treating port wine stain (PWS), evaluating both therapeutic and side effects. METHODS Fifty-two cases of PWS were randomly divided into two groups, and argon laser PDT was delivered with the two modes of laser irradiation. RESULTS The cure rate was 19.23% in the 488 nm PDT group versus 11.54% in the mixed-wavelength group; the difference was not significant. The effective rates were 92.31% in the 488 nm group and 88.46% in the mixed group, again not statistically different. Side effects occurred more often in the mixed-spectrum group (57.69% vs 38.46% for the 488 nm group). CONCLUSION Argon laser PDT at 488 nm achieved therapeutic results similar to mixed-spectrum laser PDT but was safer.
|
31
|
Abstract
Clinical studies and the National Kidney Foundation-Dialysis Outcomes Quality Initiative guidelines suggest that a target hematocrit of 33% to less than 36% is appropriate for patient benefit. Previous studies have shown an association of lower risks for death and hospitalization when hematocrits were 33% to less than 36%. In this study, we assessed the relationship between hematocrit value and associated Medicare expenditures, analyzing incident Medicare hemodialysis patients from January 1, 1991, through June 30, 1995. All patients survived at least 90 days to normalize eligibility and an additional 6-month entry period to assess comorbidity and hematocrit values. All patients were followed up from July 1, 1991, through December 31, 1996. We assessed the association between hematocrit values in the 6-month entry period and the Medicare-allowable Part A and Part B per-member-per-month (PMPM) expenditures in the follow-up period, controlling for other variables, including patient demographic characteristics, comorbid conditions, and severity of disease. We found that hematocrits of 33% to less than 36% and 36% and higher were associated with lower Medicare-allowable payments in the follow-up period. Compared with reference patients with hematocrits of 30% to less than 33%, the Medicare-allowable PMPM expenditures were significantly greater for patients with hematocrits less than 27% and 27% to less than 30% (12.7% and 5.3%, respectively), and the Medicare-allowable PMPMs were significantly less for patients with hematocrits of 33% to less than 36% and 36% and higher (6.0% and 8.2%, respectively). Although these findings suggest that the treatment of anemia may be associated with significant savings in total patient Medicare expenditures, caution should be considered because these findings are associations and should not be deemed as showing causality.
|
32
|
Long-term survival of renal transplant recipients in the United States after acute myocardial infarction. Am J Kidney Dis 2000; 36:145-52. [PMID: 10873884 DOI: 10.1053/ajkd.2000.8287] [Citation(s) in RCA: 45] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022]
Abstract
Cardiac disease is a major cause of death in renal transplant recipients. One third of the cardiac deaths are attributed to acute myocardial infarction (AMI). Few data exist on predictors of long-term survival of renal transplant recipients after AMI. The purpose of this study is to determine predictors of survival (including treatment era) for renal transplant recipients in the United States after AMI. The US Renal Data System database of 783,171 patients was used to retrospectively examine outcomes of renal transplant recipients hospitalized during 1977 to 1996 for a first AMI after initiation of renal replacement therapy. Long-term survival was estimated by life-table method, and independent predictors of survival were examined in a comorbidity-adjusted Cox model. There were 4,250 renal transplant recipients with AMI. The in-hospital death rate was 12.8%. Overall 2-year cardiac and all-cause mortality rates were 11.8% +/- 0.6% (SE) and 33.6% +/- 0.8%, respectively. The poorest survival after AMI occurred in patients with diabetic end-stage renal disease (ESRD), with 2-year cardiac and all-cause mortality rates of 14.9% +/- 1.1% and 40.5% +/- 1.4%, respectively. In the Cox model, the risks for cardiac and all-cause death from AMI were 51% (P = 0.0003) and 45% less (P < 0.0001) in 1990 to 1996 compared with 1977 to 1984, respectively. The long-term survival of renal transplant recipients in the United States after AMI has markedly improved in the modern treatment era. Patients with diabetic ESRD experience the worst outcome.
|
33
|
Impact of hematocrit on morbidity and mortality. Semin Nephrol 2000; 20:345-9. [PMID: 10928336] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/17/2023]
Abstract
It has been 10 years since epoetin-alpha was approved by the federal Food and Drug Administration for use in end-stage renal disease patients. Over this period of time, clinical studies have shown a relationship between the correction of anemia and improved cardiac function, cognitive ability, sexual function, and exercise capacity. Recent large epidemiological studies have shown that mortality and morbidity are reduced when the hematocrit (Hct) level is in the range 33% to 36%, and the National Kidney Foundation's Dialysis Outcomes Quality Initiative (NKF-DOQI) guidelines recommend a target Hct of 33% to 36% to enhance patient outcomes. The most recent mortality studies show that Hcts less than 30% (or hemoglobins less than 110 gm/L) are associated with an 18% to 40% increased associated risk of death and hospitalizations. Higher Hcts in the 33% to 36% range appear to be associated with a 7% reduced risk of death and hospitalizations compared with patients with Hcts of 30% to less than 33%. Patients with sustained Hcts of 33% to 36% over 1 year appear to have the best outcome compared with patients with Hcts that fall. These studies suggest that the factors that may influence patients' ability to move into higher Hct ranges need to be determined to enhance patient outcomes. Dramatic improvement in hemodialysis patient Hct levels has occurred since 1989. Mortality and hospitalization studies support the NKF-DOQI target Hct range of 33% to 36% as providing the best associated outcomes.
|
34
|
Abstract
Prior studies on reuse-associated mortality have presented conflicting results and included few adjustments for disease severity or hematocrit levels. To evaluate the impact of patient and provider characteristics on reuse-associated mortality, we developed a period-prevalent model with a 6-month entry period. Five cohorts of Medicare hemodialysis patients surviving from July 1 through December 31 of the entry year (1991, 60,985 patients; 1992, 63,081 patients; 1993, 76,018 patients; 1994, 82,899 patients; 1995, 91,761 patients) were followed up for the next year. Using a basic Cox regression survival model (M-1) including age, sex, race, renal diagnosis, prior end-stage renal disease time, unit age, unit size, water treatment, dialysate, and germicide, results were compared with those using a more inclusive model (M-4) adding dialyzer type (conventional or high efficiency/high flux), unit designation (hospital based or freestanding), unit profit status, comorbidity, disease severity, and hematocrit. The previous association of for-profit units with increased mortality was not present after 1994. Whereas the M-1 analysis showed better survival in reuse units after 1991, the more complete M-4 analysis showed no difference in the risk for mortality between reuse and no-reuse units. We conclude that mortality rates in the United States from 1991 to 1995, when adjusted comprehensively for patient and unit characteristics, were not different in units that practiced reuse and those that did not.
|
35
|
Abstract
Studies of outcomes associated with dialysis therapies have yielded conflicting results. Bloembergen et al showed that prevalent patients on continuous ambulatory peritoneal dialysis (CAPD) or continuous cycling peritoneal dialysis (CCPD) had a 19% higher mortality risk than hemodialysis patients, and Fenton et al, analyzing Canadian incident patients, found a 27% lower risk. Attempting to reconcile these differences, we evaluated incident Medicare patients (99,048 on hemodialysis, 18,110 on CAPD/CCPD) from 1994 through 1996, following up to June 30, 1997. Patients were followed to transplantation, death, loss to follow-up, 60 days after modality change, or end of the study period. For each 3-month survival period, we used an interval Poisson regression to compare death rates, adjusting for age, gender, race, and primary renal diagnosis. A Cox regression was used to evaluate cause-specific mortality, and proportionality was addressed in both regressions by separating diabetic and nondiabetic patients. The Poisson regressions showed CAPD/CCPD to have outcomes comparable with or significantly better than hemodialysis, although results varied over time. The Cox regression found a lower mortality risk in nondiabetic CAPD/CCPD patients (women younger than 55 years: risk ratio [RR] = 0.61; CI, 0.59 to 0.66; women age 55 years or older: RR = 0.87; CI, 0.84 to 0.91; men younger than 55 years: RR = 0.72; CI, 0.67 to 0.77; men age 55 years or older: RR = 0.87; CI, 0.83 to 0.92) and in diabetic CAPD/CCPD patients younger than 55 (women: RR = 0.88; CI, 0.82 to 0.94; men: RR = 0.86; CI, 0.81 to 0.92). The risk of all-cause death for female diabetics 55 years of age and older, in contrast, was 1.21 (CI, 1.17 to 1.24) for CAPD/CCPD, and in cause-specific analyses, these patients had a significantly higher risk of infectious death.
We conclude that, overall, within the first 2 years of therapy, short-term CAPD/CCPD appears to be associated with superior outcomes compared with hemodialysis. It also appears that patients on the two therapies have different mortality patterns over time, a nonproportionality that makes survival analyses vulnerable to the length of follow-up. Further investigation is needed to evaluate both the potential explanations for these findings and the use of more advanced statistical methods in the analysis of mortality rates associated with these dialytic therapies.
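The interval death-rate comparisons described above reduce, per interval, to a Poisson rate ratio with a Wald confidence interval on the log scale. A minimal sketch with invented deaths and person-years (not the USRDS figures):

```python
import math

def rate_ratio(d1, py1, d0, py0):
    """Rate ratio (d1/py1) / (d0/py0) for Poisson event counts,
    with a 95% Wald CI computed on the log scale."""
    rr = (d1 / py1) / (d0 / py0)
    se = math.sqrt(1 / d1 + 1 / d0)      # SE of log(rate ratio)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical 3-month interval: 80 deaths over 1000 patient-years on
# CAPD/CCPD vs 100 deaths over 1000 patient-years on hemodialysis.
rr, lo, hi = rate_ratio(80, 1000.0, 100, 1000.0)
```

An interval Poisson regression generalizes this by modeling log rates with covariates (age, gender, race, diagnosis) and an offset for log person-time, one interval at a time.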
|
36
|
Abstract
BACKGROUND The optimal method of coronary revascularization in dialysis patients is controversial, as previous small retrospective studies have reported increased cardiac events after percutaneous transluminal coronary angioplasty (PTCA) compared with coronary artery bypass (CAB) surgery. The purpose of this study was to compare the long-term survival of chronic dialysis patients in the United States following PTCA or CAB surgery. METHODS Dialysis patients hospitalized from 1978 to 1995 for first coronary revascularization procedure after initiation of renal replacement therapy were retrospectively identified from the United States Renal Data System database. Survival for the endpoints of all-cause death, cardiac death, myocardial infarction, and cardiac death or myocardial infarction was estimated by the life-table method and was compared by the log-rank test. The impact of independent predictors on survival was examined in a Cox regression model with comorbidity adjustment. RESULTS The in-hospital mortality was 5.4% for 6887 PTCA patients and 12.5% for 7419 CAB patients. The two-year event-free survival (+/-SE) of PTCA patients was 52.9 +/- 0.7% for all-cause death, 72.5 +/- 0.7% for cardiac death, and 62.0 +/- 0.7% for cardiac death or myocardial infarction. In CAB patients, the comparable survivals were 56.9 +/- 0.6, 75.8 +/- 0.6, and 71.3 +/- 0.6%, respectively (P < 0.02 for PTCA vs. CAB surgery for all endpoints). After comorbidity adjustment, the relative risk of CAB surgery (vs. PTCA) performed 1990 to 1995 for all-cause death was 0.91 (95% CI, 0.86 to 0.97); cardiac death, 0.85 (95% CI, 0.78 to 0.92); myocardial infarction, 0.37 (95% CI, 0.32 to 0.43); and cardiac death or myocardial infarction, 0.69 (95% CI, 0.64 to 0.74).
CONCLUSIONS In this retrospective study, dialysis patients in the United States had better survival after CAB surgery compared with PTCA, but our study does not exclude the possibility of more unfavorable coronary anatomy in the PTCA patients at baseline. Our data support the need for prospective trials of newer percutaneous coronary revascularization procedures in dialysis patients.
|
37
|
Abstract
The association between hematocrit level and future hospitalization risks in hemodialysis patients has not been fully investigated on a national level. A total of 71,717 prevalent Medicare hemodialysis patients who survived a 6-mo entry period from July 1 through December 31, 1993 were studied, and their risk of hospitalizations was evaluated the next year. Five hematocrit groups were defined from Medicare recombinant human erythropoietin-treated patients: <27%, 27 to <30%, 30 to <33%, 33 to <36%, and > or =36%. A Cox regression model was used to investigate the association between hematocrit level and the risk of first hospitalization, and the Andersen-Gill regression model evaluated multiple hospitalizations during the next year, adjusting for patient comorbidity and severity of disease. Compared with the baseline group of 30 to <33%, patients with hematocrit levels <30% had a 14 to 30% increased risk of hospitalization without disease severity adjustment (p = 0.0001) and a 7 to 18% increased risk with disease severity adjustment (p = 0.0001). Patients in the 33 to <36% group had the lowest risk at 0.93 and 0.88 (p = 0.0001), with and without adjustment for disease severity. It is concluded that patients with hematocrits of <30% have an increased risk of future hospitalization, with hematocrit levels between 33 and 36% having the lowest associated risks.
|
38
|
Abstract
Although a number of clinical studies have shown that increased hematocrits are associated with improved outcomes in terms of cognitive function, reduced left ventricular hypertrophy, increased exercise tolerance, and improved quality of life, the optimal hematocrit level associated with survival has yet to be determined. The association between hematocrit levels and patient mortality was retrospectively studied in a prevalent Medicare hemodialysis cohort on a national scale. All patients survived a 6-mo entry period during which their hematocrit levels were assessed, from July 1 through December 31, 1993, with follow-up from January 1 through December 31, 1994. Patient comorbid conditions relative to clinical events and severity of disease were determined from Medicare claims data and correlated with the entry period hematocrit level. After adjusting for medical diseases, our results showed that patients with hematocrit levels less than 30% had significantly higher risk of all-cause (12 to 33%) and cause-specific death, compared to patients with hematocrits in the 30% to less than 33% range. Without severity of disease adjustment, patients with hematocrit levels of 33% to less than 36% appear to have the lowest risk for all-cause and cardiac mortality. After adjusting for severity of disease, the impact of hematocrit levels of 33% to less than 36% is vulnerable to the patient sample size but also demonstrates a further 4% reduced risk of death. Overall, these findings suggest that sustained increases in hematocrit levels are associated with improved patient survival.
|
39
|
Statistical analysis and the equivalent of a Thouless energy in lattice QCD Dirac spectra. Phys Rev D 1999. [DOI: 10.1103/physrevd.59.054501] [Citation(s) in RCA: 20] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/07/2022]
|
40
|
Abstract
Recombinant erythropoietin, first approved for Medicare reimbursement in June 1989, was prescribed at initial doses for dialysis patients of 2,500 to 2,700 U per administration independent of hematocrit level. By 1997, however, patients with hematocrits less than 30% were administered 6,000 U/dose, compared with 4,500 U administered to patients with hematocrits of 33% to 36%. Since 1990, the percentage of patients with hematocrits less than 30% decreased from 60% to 22% in 1997, whereas the percentage of patients with hematocrits of 33% to 36% increased from 10% to 30%. In 1997, Medicare initiated the Hematocrit Measurement Audit (HMA) policy, which was directed at reducing the percentage of claims for hematocrits greater than 36% and increasing the stability of the hematocrit levels. The policy change achieved the initial effect but resulted in a reduction of the mean hematocrit as well. The policy was changed in 1998 in response to patient and provider concerns. Mortality studies show that hematocrits less than 30% (or hemoglobin levels < 110 g/L) are associated with an 18% to 40% increased associated risk for death. Higher hematocrits of 33% to 36% appear to be associated with a 7% reduced risk for death. The risk for hospitalization parallels that of mortality. Patients with sustained hematocrits of 33% to 36% over 1 year appear to have the best outcome compared with patients with hematocrits that decrease. The latter are at greater risk than those patients in whom the hematocrits increase. In conclusion, dramatic improvements in hemodialysis patient hematocrits have occurred since 1989. Mortality and hospitalization studies support the National Kidney Foundation Dialysis Outcomes Quality Initiative (NKF DOQI) target hematocrit range of 33% to 36% as providing the best associated outcomes.
|
41
|
Dialysis unit and patient characteristics associated with reuse practices and mortality: 1989-1993. J Am Soc Nephrol 1998; 9:2108-17. [PMID: 9808098 DOI: 10.1681/asn.v9112108] [Citation(s) in RCA: 43] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/03/2022] Open
Abstract
The diverse patient and dialysis unit characteristics in the United States pose challenges for assessing the safety and efficacy of reuse practices. A 10% random sample of period-prevalent hemodialysis patients from units practicing conventional dialysis (<25% of patients with high-efficiency/high-flux dialysis) were analyzed. The data included 13,926 patient observations in 1989-1990 and 20,422 in 1991-1993. Centers for Disease Control and Prevention and Health Care Financing Administration facility survey Medicare data were analyzed with a Cox regression model, evaluating the risk of reuse compared with no reuse and adjusting for comorbidity, unit characteristics, and profit status. In 1989-1990, freestanding and hospital-based units that did not reuse dialyzers were not significantly different from each other in mortality rates. In 1991-1993, however, no-reuse, freestanding, for-profit units had higher risks (relative risk [RR] = 1.23, P = 0.003) compared with no-reuse, hospital-based, nonprofit units. No-reuse, hospital-based, for-profit units, in contrast, were associated with a lower mortality risk (RR = 0.70, P = 0.0001). An isolated higher risk associated with peracetic acid manual reuse in freestanding units (1989-1990) was identified in for-profit units only. In the 1991-1993 period, an increased mortality risk was noted in hospital-based, nonprofit units practicing formaldehyde automatic reuse, and in freestanding, for-profit units using glutaraldehyde, which accounted for <5% of all units. All other interactions of reuse germicide and technique were not different from no-reuse. The varying mortality rates identified in both no-reuse and reuse units using conventional dialysis suggest that other factors, such as dialysis therapy and anemia correction (both known predictors of patient survival), have a greater influence on U.S. mortality than reuse germicides and techniques.
|
42
|
Abstract
BACKGROUND Cardiovascular disease is common in patients on long-term dialysis, and it accounts for 44 percent of overall mortality in this group. We undertook a study to assess long-term survival after acute myocardial infarction among patients in the United States who were receiving long-term dialysis. METHODS Patients on dialysis who were hospitalized during the period from 1977 to 1995 for a first myocardial infarction after the initiation of renal-replacement therapy were retrospectively identified from the U.S. Renal Data System database. Overall mortality and mortality from cardiac causes (including all in-hospital deaths) were estimated by the life-table method. The effect of independent predictors on survival was examined in a Cox regression model with adjustment for existing illnesses. RESULTS The overall mortality (+/-SE) after acute myocardial infarction among 34,189 patients on long-term dialysis was 59.3+/-0.3 percent at one year, 73.0+/-0.3 percent at two years, and 89.9+/-0.2 percent at five years. The mortality from cardiac causes was 40.8+/-0.3 percent at one year, 51.8+/-0.3 percent at two years, and 70.2+/-0.4 percent at five years. Patients who were older or had diabetes had higher mortality than patients without these characteristics. Adverse outcomes occurred even in patients who had acute myocardial infarction in 1990 through 1995. Also, the mortality rate after myocardial infarction was considerably higher for patients on long-term dialysis than for renal-transplant recipients. CONCLUSIONS Patients on dialysis who have acute myocardial infarction have high mortality from cardiac causes and poor long-term survival.
|
43
|
A meta-analysis of the effects of dietary protein restriction on the rate of decline in renal function. Am J Kidney Dis 1998; 31:954-61. [PMID: 9631839 DOI: 10.1053/ajkd.1998.v31.pm9631839] [Citation(s) in RCA: 303] [Impact Index Per Article: 11.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/07/2023]
Abstract
Dietary protein restriction has been reported to delay the need for renal replacement therapy in clinical trials and meta-analyses. However, less clear is what effect dietary protein has on the rate of decline in renal function. We pooled the results of 13 randomized controlled trials (n = 1,919 patients) and found that dietary protein restriction reduced the rate of decline in estimated glomerular filtration rate by only 0.53 mL/min/yr (95% confidence interval [CI], 0.08 to 0.98 mL/min/yr). We also used weighted regression analysis to determine the reasons for the differences in the results of these 13 randomized trials along with 11 other nonrandomized controlled trials (n = 2,248 patients). The effect of dietary protein restriction (glomerular filtration rate decline in treatment minus control) was substantially less in randomized versus nonrandomized trials (regression coefficient, -5.2 mL/min/yr; 95% CI, -7.8 to -2.5 mL/min/yr; P < 0.05) and relatively greater among diabetic versus nondiabetic patients (5.4 mL/min/yr; 95% CI, 0.3 to 10.5 mL/min/yr; P < 0.05), while there was a trend toward a greater effect with each additional year of follow-up (2.1 mL/min/yr; 95% CI, -0.05 to 4.2 mL/min/yr; P = NS). However, the number of diabetic patients studied was small and the duration of follow-up was short in most trials. No other patient or study characteristics altered the effect of dietary protein restriction on the rate of decline in renal function. Thus, although dietary protein restriction retards the rate of renal function decline, the relatively weak magnitude of this effect suggests that better therapies are needed to slow the rate of renal disease progression.
|
44
|
Abstract
Earlier studies indicated that the incidence rates for bladder cancers rose rapidly in both the United States and Europe. Tobacco smoking is considered to be the major risk factor for urinary bladder cancer, and recent studies from Connecticut show that several smoking-related cancers have started leveling off or decreasing. The time trend for bladder cancer, however, is not clear in Connecticut. The current study examined the long-term trend of bladder cancer in Connecticut. Our results show that urinary bladder cancer has been increasing, with a marked increase among males. The rate of increase, however, has slowed since the early 1980s. Birth-cohort examination shows that the rates have leveled off for those born after about 1935 in both males and females. Age-period-cohort modeling results also show that the birth-cohort patterns of bladder cancer are somewhat similar to those observed for lung cancer in Connecticut, thus supporting the findings from analytical epidemiologic studies which indicate that cigarette smoking is one of the major risk factors for urinary bladder cancer. Our results also suggest that the difference in environmental and occupational exposures between males and females may be responsible for the large difference in the incidence rate of bladder cancer seen between the sexes.
|
45
|
Abstract
Recent studies from Europe suggest a continuing increase in thyroid cancer, but it is unclear whether this trend also applies to the United States. The current study examined the long-term trend of thyroid cancer in Connecticut. Our results show that the overall age-adjusted incidence rate of thyroid cancer has been increasing in Connecticut, from 1.30/100,000 in 1935-1939 to 5.78/100,000 in 1990-1992 in females, and from 0.30/100,000 in 1935-1939 to 2.77/100,000 in 1990-1992 in males. The increase mainly comes from papillary carcinoma of the thyroid. The birth cohort analyses indicate that the increase in thyroid cancer occurred among cohorts born between 1915 and 1945, which experienced an increase of 31.4% every 5 years in males and 17.3% in females over the period 1960-1979. For those born since the 1945 cohort, the incidence has been decreasing, at rates of 9.3% and 8.3% every 5 years over the period 1975-1992 in males and females, respectively. Age-period-cohort modeling results also suggest a strong birth cohort effect on the observed time trend in both sexes, which closely follows the introduction of radiation treatment of benign childhood conditions in the head and neck between 1920 and the 1950s in the United States. Our results are consistent with the suggested radiation hypothesis, indicating that radiation treatment of benign childhood conditions in the head and neck is largely responsible for the observed increase of thyroid cancer in Connecticut.
|
46
|
Abstract
The pathogenesis of chronic renal allograft rejection is unknown. It is also unclear why cyclosporine has failed to prevent chronic rejection. We examined possible risk factors for graft loss to chronic rejection among 706 renal transplants using the Cox proportional hazards model with fixed and time-dependent covariates. Both the number and the severity of acute rejection episodes were independent risk factors for chronic rejection [relative risk (95% confidence interval) 2.31 (2.04 to 2.60) and 1.53 (1.27 to 1.84), respectively]. Cyclosporine and cyclosporine withdrawal had no effect on chronic rejection. Acute rejections occurring within the first three months after transplantation, when cyclosporine most effectively prevented acute rejection, also had no effect on chronic rejection. Risk factors that were independent of acute rejection and not clearly attributable to immune mechanisms included serum albumin [0.20 (0.10 to 0.38) for each g/dl], proteinuria [1.42 (1.29 to 1.57) for each g/24 hr], and serum triglycerides [1.09 (1.03 to 1.16) for each 100 mg/dl]. These results suggest that the reduction in acute rejection episodes from cyclosporine has failed to reduce graft failure from chronic rejection, possibly because the early (within the first 3 months) and mild acute rejection episodes that are most effectively prevented by cyclosporine do not cause chronic rejection. In addition, the results suggest that there may be a number of nonimmunologic risk factors for chronic rejection.
|
47
|
Abstract
The incidence, causes, and consequences of hypoalbuminemia after renal transplantation are not well defined. We examined clinical correlates of serum albumin measured at 3 months, 6 months, 1 year, and annually thereafter in 706 renal transplant recipients who survived at least 6 months with a functioning allograft. Follow-up was 7.0 +/- 4.2 years. Hypoalbuminemia (< or = 3.5 g/dL) was most common at 3 months (31%, n = 692), least common at 1 year (12%, n = 656), and then became increasingly common among survivors, for example, 14% (n = 466) at 4 years, 20% (n = 204) at 8 years, and 29% (n = 77) at 12 years after transplantation. By multiple linear regression, variables that correlated (P < 0.05) with lower serum albumin at 3, 6, 12, and 24 months included age, diabetes, proteinuria, and cytomegalovirus infection. Other independent correlates on at least one of these occasions included renal function and chronic disease (malignancy, liver disease, and cardiovascular disease). Serum albumin, as a time-averaged and time-dependent covariate, was a strong independent risk factor for death using Cox proportional hazards analysis (relative risk for each g/dL increment, 0.26; 95% confidence interval, 0.16 to 0.44 [1.00 = no risk]). The effects of albumin on mortality were independent of age, diabetes, serum lipids, renal function, chronic liver disease, malignancies, and cardiovascular disease. The effects of albumin on mortality were evident even when the analysis was restricted to patients dying several years after albumin was measured. Thus, hypoalbuminemia is common and serum albumin is a strong independent risk factor for all-cause mortality after renal transplantation.
|
48
|
Abstract
Although cardiovascular disease is a major cause of morbidity and mortality after renal transplantation, its pathogenesis and treatment are poorly understood. We conducted separate analyses of risk factors for ischemic heart disease, cerebral, and peripheral vascular disease after 706 renal transplants, all of which functioned for at least 6 months. We used Cox proportional hazards analysis to examine the effects of multiple pretransplant and posttransplant risk factors and included time-dependent variables measured at 3, 6, and 12 months, and annually to last follow-up at 7.0 +/- 4.2 yr. The independent relative risk (RR) of diabetes was 3.25 for ischemic heart disease, 3.21 for cerebral vascular disease, and 28.18 for peripheral vascular disease (P < 0.05). The RR of each acute rejection episode was 1.40 for ischemic heart disease and 1.24 for cerebral vascular disease. Among serum lipid levels, high-density lipoprotein cholesterol was the best predictor of ischemic heart disease (RR = 0.80 for each 10 mg/dL). Posttransplant ischemic heart disease was strongly predictive of cerebral (5.80) and peripheral vascular disease (5.22), whereas ischemic heart disease was predicted by posttransplant cerebral (8.25) and peripheral vascular disease (4.58). Other risk factors for vascular disease included age, gender, cigarette smoking, pretransplant splenectomy, and serum albumin. Hypertension and low-density lipoprotein cholesterol had no effect, perhaps because of aggressive pharmacologic treatment. Thus, the incidence of cardiovascular disease continues to be high after renal transplantation, and multiple risk factors suggest a number of possible strategies for more effective treatment and prevention.
|
49
|
Chronic renal allograft rejection and clinical trial design. Kidney Int Suppl 1995; 52:S116-9. [PMID: 8587273] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Subscribe] [Scholar Register] [Indexed: 01/31/2023]
Abstract
Both the choice of endpoints and the selection of patients will be critical study design features in randomized controlled trials needed to test the effectiveness of treatments for chronic renal allograft rejection. We examined the feasibility of carrying out clinical trials with different endpoints and patient inclusion criteria by analyzing data from a population of 627 cadaveric kidney transplant recipients who survived with a functioning allograft for at least six months. Among those who lost grafts to chronic rejection, decreases in renal function of 30% and 60% preceded graft loss by a median of only 1.1 and 0.7 years, respectively, suggesting that little would be gained in a clinical trial that used a predetermined reduction in renal function as a surrogate endpoint. Less clear is whether histologic changes could be used as a surrogate endpoint. At present, graft loss to chronic rejection and graft failure from any cause are the most reliable endpoints. Unfortunately, large numbers of patients are needed to demonstrate clinically relevant therapeutic effects on these endpoints. Limiting enrollment to patients who are at high risk for developing chronic rejection, by selecting patients who already have a decline in renal function, for example, may reduce the number of patients needed in a clinical trial. On the other hand, selecting patients with disease that is too advanced may diminish the effectiveness of therapy. In any case, it is impossible to accurately determine the number of patients needed for a definitive clinical trial without preliminary data demonstrating the expected magnitude of the treatment effect. Thus, well-designed pilot studies are needed to measure possible treatment effects before conducting large-scale clinical trials for chronic renal allograft rejection.
|
50
|
Abstract
The long-term risks of kidney donation have not been well defined. We carried out a meta-analysis of investigations that examined the long-term effects of reduced renal mass in humans. We used multiple linear regression to combine studies and adjust for differences in the duration of follow-up, the reason for reduced renal mass, the type of controls, age, and gender. We analyzed 48 studies with 3124 patients and 1703 controls. Unilateral nephrectomy caused a decrement in glomerular filtration rate (-17.1 ml/min; 95% confidence interval -20.2 to -14.0 ml/min) that tended to improve with each 10 years of follow-up (1.4 ml/min/decade; 0.3 to 2.4 ml/min/decade). Patients with single kidneys had small, progressive increases in proteinuria (76 mg/day/decade; 52 to 101 mg/day/decade), but proteinuria was negligible after nephrectomy for trauma or kidney donation. Nephrectomy did not affect the prevalence of hypertension, but there was a small increase in systolic blood pressure (2.4 mm Hg; -0.3 to 5.1 mm Hg, P > 0.05) which rose further with duration of follow-up (1.1 mm Hg/decade; 0.0 to 2.2 mm Hg/decade). Diastolic blood pressure was higher after nephrectomy (3.1 mm Hg; 1.8 to 4.4 mm Hg), but this increment did not change with duration of follow-up. Thus, in normal individuals, unilateral nephrectomy does not cause progressive renal dysfunction, but may be associated with a small increase in blood pressure.