1. Pipecolate and Taurine are Rat Urinary Biomarkers for Lysine and Threonine Deficiencies. J Nutr 2023; 153:2571-2584. [PMID: 37394117] [DOI: 10.1016/j.tjnut.2023.06.039]
Abstract
BACKGROUND The consumption of poor-quality protein increases the risk of essential amino acid (EAA) deficiency, particularly for lysine and threonine. It is therefore necessary to be able to detect EAA deficiency easily. OBJECTIVES The purpose of this study was to develop metabolomic approaches to identify specific biomarkers of an EAA deficiency, such as lysine or threonine deficiency. METHODS Three experiments were performed on growing rats. In experiment 1, rats were fed for 3 weeks with lysine-deficient (L30) or threonine-deficient (T53) gluten diets, or a nondeficient gluten diet (LT100), in comparison with a control diet (milk protein, PLT). In experiments 2a and 2b, rats were fed diets with different levels of lysine (L) or threonine (T) deficiency: L/T15, L/T25, L/T40, L/T60, L/T75, P20, L/T100 and L/T170. Twenty-four-hour urine and blood samples from the portal vein and vena cava were analyzed using LC-MS. Data from experiment 1 were analyzed by untargeted metabolomics and independent component-discriminant analysis (ICDA), and data from experiments 2a and 2b by targeted metabolomics and a quantitative partial least-squares (PLS) regression model. Each metabolite identified as significant by PLS or ICDA was then tested by 1-way ANOVA to evaluate the diet effect. A two-phase linear regression analysis was used to determine lysine and threonine requirements. RESULTS ICDA and PLS identified molecules that discriminated between the different diets. A common metabolite, pipecolate, was identified in experiments 1 and 2a, confirming that it could be specific to lysine deficiency. Another metabolite, taurine, was found in experiments 1 and 2b, suggesting that it is specific to threonine deficiency. The breakpoints obtained from pipecolate or taurine gave values close to those obtained with growth indicators. CONCLUSIONS Our results showed that EAA deficiencies influenced the metabolome. The specific urinary biomarkers identified could easily be applied to detect EAA deficiency and to determine which AA is deficient.
2. Role of liver AMPK and GCN2 kinases in the control of postprandial protein metabolism in response to mid-term high or low protein intake in mice. Eur J Nutr 2023; 62:407-417. [PMID: 36071290] [DOI: 10.1007/s00394-022-02983-z]
Abstract
PURPOSE Protein synthesis and proteolysis are known to be controlled through the mammalian target of rapamycin (mTOR), AMP-activated kinase (AMPK) and general control nonderepressible 2 (GCN2) pathways, depending on the nutritional condition. This study aimed to investigate the contribution of liver AMPK and GCN2 to the adaptation to large variations in protein intake. METHODS To evaluate the response of protein metabolic pathways to high- or low-protein diets, male wild-type mice and genetically modified mice on a C57BL/6 background with liver-specific AMPK or GCN2 knockout were fed, from day 25, diets differing in their protein level as energy: LP (5%), NP (14%) and HP (54%). Two hours after a 1 g test meal, the protein synthesis rate was measured using a 13C-valine flooding dose. The gene expression of key enzymes involved in proteolysis and in the GCN2 signaling pathway was quantified. RESULTS The HP diet, but not the LP diet, was associated with a 29% decrease in fractional synthesis rate in the liver compared with the NP diet. The expression of mRNA encoding ubiquitin and cathepsin D was not sensitive to the protein content. Deletion of AMPK or GCN2 in the liver affected neither protein synthesis rates nor proteolysis markers in the liver or the muscle, whatever the protein intake. In the postprandial state, the protein level altered protein synthesis in the liver but not in the muscle. CONCLUSIONS Taken together, these results suggest that liver AMPK and GCN2 are not involved in the adaptation to high- and low-protein diets observed in the postprandial period.
3. In vitro comparison of whey protein isolate and hydrolysate for their effect on glucose homeostasis markers. Food Funct 2023; 14:4173-4182. [PMID: 37066543] [DOI: 10.1039/d3fo00467h]
Abstract
Research into new strategies to regulate glucose homeostasis in order to prevent or manage type 2 diabetes is a critical challenge. Several studies have shown that protein-rich diets could improve glucose homeostasis....
4. Evaluation of Perceived Stress and Sleep Improvement With the Dairy Bioactive Lactium®. Curr Dev Nutr 2022. [PMCID: PMC9193472] [DOI: 10.1093/cdn/nzac053.059]
Abstract
Objectives To evaluate the perceived effectiveness of Lactium® in reducing symptoms related to stress and anxiety, a consumer study was performed. The satisfaction data were objectivized using scientifically validated questionnaires. Methods This consumer study was performed in three countries: France (n = 122), USA (n = 111) and China (n = 105), with subjects suffering from moderate stress and sleep disorders. Lactium® 300 mg was taken for 1 month, before bedtime. A satisfaction survey and three validated questionnaires were used to evaluate the impact of Lactium® on stress and sleep disorders: the PSS-10 to assess stress levels, the Spiegel questionnaire to estimate the quantity and quality of sleep, and the PSQI to evaluate sleep habits. Results After 1 month of supplementation, the overall satisfaction of the subjects was 75% for France, 70% for China, and almost 90% for the USA. Lactium® significantly improved stress and sleep in France (73% and 69% improvement, respectively), China (75% and 76%) and the USA (85% and 86%). These results of perceived effectiveness were confirmed by the scientifically validated questionnaires (PSS-10, PSQI and Spiegel questionnaire), which showed a significant reduction in stress and sleep disorders. Conclusions After 1 month of treatment, Lactium® significantly improved stress and sleep disorders in subjects suffering from moderate stress and sleep disorders. Four out of five consumers were satisfied with the effectiveness of Lactium®. Funding Sources INGREDIA S.A.
5. Erector spinae plane block vs. peri-articular injection for pain control after arthroscopic shoulder surgery: a randomised controlled trial. Anaesthesia 2021; 77:301-310. [PMID: 34861745] [DOI: 10.1111/anae.15625]
Abstract
Interscalene brachial plexus block is the standard regional analgesic technique for shoulder surgery. Given its adverse effects, alternative techniques have been explored. Reports suggest that the erector spinae plane block may potentially provide effective analgesia following shoulder surgery. However, its analgesic efficacy for shoulder surgery compared with placebo or local anaesthetic infiltration has never been established. We conducted a randomised controlled trial to compare the analgesic efficacy of pre-operative T2 erector spinae plane block with peri-articular infiltration at the end of surgery. Sixty-two patients undergoing arthroscopic shoulder repair were randomly assigned to receive active erector spinae plane block with saline peri-articular injection (n = 31) or active peri-articular injection with saline erector spinae plane block (n = 31) in a blinded double-dummy design. The primary outcome was resting pain score in recovery. Secondary outcomes included pain scores with movement; opioid use; patient satisfaction; adverse effects in hospital; and outcomes at 24 h and 1 month. There was no difference in pain scores in recovery, with a median difference of 0.6 (95%CI -1.9 to 3.1), p = 0.65. Median postoperative oral morphine equivalent utilisation was significantly higher in the erector spinae plane group (21 mg vs. 12 mg; p = 0.028). Itching was observed in 10% of patients who received erector spinae plane block, and there was no difference in the incidence of significant nausea and vomiting. Patient satisfaction scores, and pain scores and opioid use at 24 h, were similar. At 1 month, six (peri-articular injection) and eight (erector spinae plane block) patients reported persistent pain. Erector spinae plane block was not superior to peri-articular injection for arthroscopic shoulder surgery.
6. Severe protein deficiency induces hepatic expression and systemic level of FGF21 but inhibits its hypothalamic expression in growing rats. Sci Rep 2021; 11:12436. [PMID: 34127689] [PMCID: PMC8203610] [DOI: 10.1038/s41598-021-91274-4]
Abstract
The aim was to study, in young growing rats, the consequences of different levels of dietary protein deficiency on food intake, body weight, body composition and energy balance, and to assess the role of FGF21 in the adaptation to a low-protein diet. Thirty-six weanling rats were fed diets containing 3%, 5%, 8%, 12%, 15% or 20% protein for three weeks. Body weight, food intake, energy expenditure and metabolic parameters were followed throughout this period. The very low-protein diets (3% and 5%) induced a large decrease in body weight gain and an increase in energy intake relative to body mass. No gain in fat mass was observed because energy expenditure increased in proportion to energy intake. As expected, Fgf21 expression in the liver and plasma FGF21 increased with the low-protein diets, but Fgf21 expression in the hypothalamus decreased. Under the low-protein diets (3% and 5%), the increase in liver Fgf21 and the decrease in hypothalamic Fgf21 induced, respectively, an increase in energy expenditure and a decrease in the satiety signal responsible for hyperphagia. Our results highlight that when dietary protein decreases below 8%, the liver detects the low protein supply and responds by activating the synthesis and secretion of FGF21, an endocrine signal that induces metabolic adaptation. The hypothalamus, in comparison, responds to protein deficiency only when dietary protein decreases below 5%.
7. Lower Synthesis and Higher Catabolism of Liver and Muscle Protein Compensate for Amino Acid Deficiency in Severely Protein-Restricted Growing Rat. Curr Dev Nutr 2021. [DOI: 10.1093/cdn/nzab041_033]
Abstract
Objectives
Severely low-protein (LP) diets induce a decrease in body weight and an increase in relative food intake (FI) in the rat. In the liver, changes in anabolic and catabolic protein pathways could transiently help compensate for amino acid (AA) deficiency. The present study investigated these liver and muscle protein metabolic pathways in growing rats fed LP diets.
Methods
Growing rats were fed for three weeks diets containing 3, 5, 8, 12, 15 or 20% of energy from milk protein. Body weight and FI were measured daily. At the end of the experiment, rats were injected with 13C-valine, and tissues and biological fluids were collected for gene expression measurement, blood AA analysis by UPLC, and determination of the protein synthesis rate in liver and muscle. Statistical analysis was done by 1- or 2-factor ANOVA, the latter when data were repeated.
Results
The P3, P5 and P8 diets resulted in significant growth retardation and a significant decrease in lean mass. Severe protein deficiency induced a decrease in the rate of protein synthesis in the liver and muscle. In addition, the results showed activation of the GCN2 pathway, via ATF4-CHOP-TRB3, in both the liver and the muscle, which suggests inhibition of translation initiation at the step of Met-tRNAi binding. Liver proteolytic pathways were up-regulated, including the ubiquitin-proteasome system, the caspase system and autophagy. In muscle, the ubiquitin-proteasome pathway and autophagy were increased, as well as the calpain system. In the portal vein, indispensable AAs were lower under the severely protein-deficient diets, whereas in the vena cava no difference was observed.
Conclusions
Severe protein restriction lowered protein synthesis and activated protein catabolism in both liver and muscle whereas no effect was observed for moderate protein restriction. These results confirm that the liver and muscle play a major role in supplying the body with indispensable AA in response to severe protein restriction.
Funding Sources
This study was funded by the doctoral school ABIES and AlimH-INRAE department.
8. Lysine and Threonine Restriction Reproduced the Lower Synthesis but Not the Higher Catabolism of Liver and Muscle Protein of Severely Protein Restricted Growing Rats. Curr Dev Nutr 2021. [DOI: 10.1093/cdn/nzab041_034]
Abstract
Objectives
The availability of indispensable amino acids (IAA) modulates protein turnover. More particularly, IAA deficiency reduces protein synthesis, while the consequences for proteolysis remain unclear. The present study aimed to evaluate the specific response of both protein synthesis and proteolysis to a diet restricted in a single strictly indispensable AA, either lysine or threonine.
Methods
Sixty-four growing rats were divided into 8 groups (n = 8/group). They were fed for 3 weeks isocaloric diets composed with different levels of lysine or threonine (L or T): 15, 25, 40, 60, 75, 100 or 170% of the theoretical lysine/threonine requirement. At the end of the experiment, rats were injected with 13C-valine, and tissues and biological fluids were collected for gene expression measurement and blood amino acid (AA) analysis. Protein synthesis rates (fractional and absolute synthesis rates, i.e., FSR and ASR) were determined in liver and muscle. Statistical analysis was done by 1- or 2-factor ANOVA, the latter when data were repeated.
Results
Severe (L/T15, L/T25) and moderate (T40) lysine or threonine deficiency resulted in a decrease in body weight gain due to a decrease in lean body mass. Severe restriction (L15, T15, T25) decreased the muscle FSR, whereas no effect was observed in the liver. When the rate of protein synthesis was expressed per tissue, the ASR was decreased by severe restriction of lysine and threonine in liver and muscle, and by moderate threonine deficiency (T40, T60, T75) in muscle. In the liver, no effect of lysine or threonine on proteolysis was observed. In muscle, only severe lysine deficiency (L15) increased proteolysis. Dietary lysine deficiency induced a decrease in lysine concentration in the portal vein and in the vena cava, whereas for threonine deficiency, all IAAs except threonine were decreased in the portal vein and vena cava.
Conclusions
These results indicate that decreased protein synthesis is the primary mechanism involved in the decreased lean body mass in response to severe deficiency in a single IAA. Deficiency in a single IAA reproduces the effect of a low-protein diet on protein synthesis. Lysine and threonine deficiencies affect protein turnover partly differently, probably in relation to the tissues where these AAs are metabolized.
Funding Sources
This study was funded by the doctoral school ABIES and AlimH-INRAE department.
9. Plasma and Urinary Amino Acid-Derived Catabolites as Potential Biomarkers of Protein and Amino Acid Deficiency in Rats. Nutrients 2021; 13:1567. [PMID: 34066958] [PMCID: PMC8148556] [DOI: 10.3390/nu13051567]
Abstract
OBJECTIVE Dietary intakes must cover protein and essential amino acid (EAA) requirements. For this purpose, different methods have been developed, such as the nitrogen balance method, the factorial method, or AA tracer studies. However, these methods are either invasive or imprecise, and the Food and Agriculture Organization of the United Nations (FAO, 2013) recommends new methods, in particular metabolomics. The aim of this study was to identify biomarkers of total protein/EAA deficiency in the plasma and urine of growing rats. METHODS 36 weanling rats were fed diets containing 3, 5, 8, 12, 15 or 20% protein for 3 weeks. During the experiment, urine was collected using metabolic cages, and blood from the portal vein and vena cava was taken at the end of the experiment. Metabolomics analyses were performed using LC-MS, and the data were analyzed with a multivariate analysis model, partial least squares (PLS) regression, and independent component-discriminant analysis (ICDA). Each discriminant metabolite identified by PLS or ICDA was tested by one-way ANOVA to evaluate the effect of diet. RESULTS PLS and ICDA allowed us to identify discriminating metabolites between the different diet groups. Protein deficiency led to an increase in the AA catabolism enzyme systems, inducing the production of breakdown metabolites in the plasma and urine. CONCLUSION These results indicate that metabolites are specific for states of EAA deficiency and sufficiency. Some types of biomarkers, such as AA degradation metabolites, appear to be specific candidates for assessing protein/EAA requirements.
10. The Consequences of LP Diet on Food Intake, Energy Expenditure and Hepatic and Hypothalamic FGF21 Are Reproduced by Lysine or Threonine Deficiency in Rats. Curr Dev Nutr 2020. [DOI: 10.1093/cdn/nzaa049_041]
Abstract
Objectives
In mice, low-protein (LP) diets increase food intake (FI) and thereby energy intake, to compensate for protein deficiency, but the mice do not gain fat because total energy expenditure (TEE) is also increased. Fibroblast growth factor 21 (FGF21), which is expressed in the liver and in the brain, appears to be a key player in these effects. The present study hypothesized that not only an LP diet but also lysine or threonine deficiency alone can modulate FGF21 in the liver and in the hypothalamus, which in turn is involved in the control of FI and energy expenditure.
Methods
Growing rats were fed for 3 weeks: i) LP diets containing 3, 5, 8, 12, 15 or 20% of milk protein, or ii) the same diets supplemented with free amino acids to the level of the 20% protein diet, except for lysine or threonine, yielding lysine- or threonine-deficient diets. Body weight and FI were measured daily, and energy expenditure was measured by indirect calorimetry. At the end of the experiment, rats were euthanized, tissues and biological fluids were removed and frozen, and body composition as well as gene expression and plasma FGF21 were analyzed.
Results
Diets with 3 and 5% protein, and diets highly deficient in lysine or threonine (85% and 75% deficient), resulted in significant growth retardation. The 3% and 5% LP diets induced an increase in relative FI. Surprisingly, an increase in TEE was observed under the 5% LP diet, due to an increase in motor activity. Hepatic FGF21 expression was increased at 3 and 5% protein and strongly decreased at 12, 15 and 20% protein. In contrast, in the hypothalamus, FGF21 expression was significantly lower under LP 3% compared with the 20% protein diet, with intermediate values for the other LP diets. Plasma FGF21 was higher with the 3, 5, 8 and 12% protein diets than with the 15 and 20% protein diets. Lysine or threonine deficiency reproduced the effect of the LP diet at 3 and 5%, whereas at 8%, only threonine deficiency reproduced the LP effect.
Conclusions
These results showed that the hepatic and hypothalamic expression of FGF21 are inversely affected by protein deficiency. Such deficiency states induced an up-regulation of hepatic FGF21 expression that increased TEE, and a down-regulation of hypothalamic FGF21 expression that could lead to hyperphagia. Interestingly, the FGF21 levels observed for the 3, 5 and 8% protein diets could be due, at least in part, to lysine and threonine deficiency.
Funding Sources
ABIES, AlimH department of INRAe.
11. Severe Protein Restriction Activates Liver Protein Catabolism and the ATF4-CHOP-TRB3 Pathway to Compensate for Amino Acid Deficiency. Curr Dev Nutr 2020. [DOI: 10.1093/cdn/nzaa049_040]
Abstract
Objectives
Severely low-protein (LP) diets induce behavioral and metabolic changes, including a decrease in body weight, an increase in relative food intake (FI) and alterations in hepatic metabolism. During such protein restriction, changes in hepatic anabolic and catabolic protein pathways could transiently help compensate for amino acid (AA) deficiency. In the present study, the liver expression of genes involved in protein synthesis and proteolysis pathways was related to FI, blood AA levels and body composition in rats fed LP diets.
Methods
Growing rats were fed for three weeks diets containing 3, 5, 8, 12, 15 or 20% of energy as milk protein. Body weight and FI were measured daily. At the end of the experiment, tissues and biological fluids were removed for gene expression measurement and blood AA analysis by UPLC. Statistical analysis was done by 1- or 2-factor ANOVA, the latter when data were repeated.
Results
Despite an increase in relative food intake under the P3 and P5 diets, the P3, P5 and P8 diets resulted in significant growth retardation compared with the other groups. Lean mass was significantly lower in rats under P3, P5 and P8 than under the P12, P15 and P20 diets, while there was no difference in fat mass between groups. The P3, P5 and P8 diets induced a decrease in essential amino acid concentrations in the portal vein, whereas there was no significant difference between groups in the vena cava. The severely protein-restricted P3 and P5 diets induced an increase in the hepatic expression of genes involved in proteolysis, such as calpain 2 and ubiquitin, and an activation of the ATF4-CHOP-TRB3 pathway.
Conclusions
These results suggest that under severe protein restriction, hepatic protein catabolism becomes a source of plasma amino acids that can partially compensate for the AAs not provided by the diet. These observations confirm that the liver plays a major role in the adaptation of the body to dietary protein restriction, and highlight that severe dietary protein restriction induces liver protein catabolism through activation of the ATF4-CHOP-TRB3 pathway, in order to provide amino acids to body tissues.
Funding Sources
ABIES, AlimH-INRAE.
12. Histidine: A Systematic Review on Metabolism and Physiological Effects in Human and Different Animal Species. Nutrients 2020; 12:E1414. [PMID: 32423010] [PMCID: PMC7284872] [DOI: 10.3390/nu12051414]
Abstract
Histidine is an essential amino acid (EAA) in mammals, fish, and poultry. We aim to give an overview of the metabolism and physiological effects of histidine in humans and different animal species through a systematic review following the PRISMA guidelines (Preferred Reporting Items for Systematic Reviews and Meta-Analyses). In humans, dietary histidine may be associated with factors that improve metabolic syndrome and has an effect on ion absorption. In rats, histidine supplementation increases food intake. It also provides neuroprotection at an early stage and could protect against epileptic seizures. In chickens, histidine is particularly important as a limiting factor for carnosine synthesis, which has strong anti-oxidant effects. In fish, dietary histidine may be one of the most important factors in preventing cataracts. In ruminants, histidine is a limiting factor for milk protein synthesis and could be the first limiting AA for growth. In excess, histidine supplementation can cause eating and memory disorders in humans and can induce growth retardation and metabolic dysfunction in most species. To conclude, the requirements for histidine, as for the other EAAs, have been derived from growth and tissue AA composition; histidine also has specific metabolic roles depending on the species and dietary level.
13. Liver GCN2 controls hepatic FGF21 secretion and modulates whole body postprandial oxidation profile under a low-protein diet. Am J Physiol Endocrinol Metab 2019; 317:E1015-E1021. [PMID: 31573843] [DOI: 10.1152/ajpendo.00022.2019]
Abstract
General control nonderepressible 2 (GCN2) is a kinase that detects amino acid deficiency and is involved in the control of protein synthesis and energy metabolism. However, the role of hepatic GCN2 in the metabolic adaptations to modulation of dietary protein has seldom been studied. Wild-type (WT) and liver GCN2-deficient (KO) mice were fed either a normo-protein diet, a low-protein diet, or a high-protein diet for 3 wk. During this period, body weight, food intake, and metabolic parameters were followed. In mice fed the normo- and high-protein diets, the GCN2 pathway in the liver was not activated in WT mice, leading to a metabolic profile similar to that of KO mice. On the contrary, the low-protein diet activated GCN2 in WT mice, inducing FGF21 secretion. In turn, FGF21 maintained a high level of lipid oxidation, leading to a different postprandial oxidation profile compared with KO mice. Thus, hepatic GCN2 controls FGF21 secretion under a low-protein diet and modulates the whole-body postprandial oxidation profile.
14. Molecular Markers of Dietary Essential Amino Acid-deficiency (P08-059-19). Curr Dev Nutr 2019. [DOI: 10.1093/cdn/nzz044.p08-059-19]
Abstract
Objectives
The quality of dietary protein sources has become a particularly sensitive issue in current debates on rebalancing animal and vegetable food sources. The ability of a protein to meet the nutritional requirements for essential amino acids (EAA) is the basis for assessing protein quality. The objective of this study was to characterize the impact of lysine- and threonine-deficient gluten-based diets on the metabolism of growing rats, and to identify molecular markers of these diets.
Methods
Growing rats were fed for 3 weeks with a threonine-supplemented, 70% lysine-deficient gluten diet; a lysine-supplemented, 47% threonine-deficient gluten diet; a gluten diet supplemented with lysine and threonine to meet all AA requirements; or a control diet with milk protein meeting all AA requirements. Body weight and food intake were measured daily. At the end of the experiment, tissues and biological fluids were removed. Body composition was analyzed, the expression of genes involved in protein and lipid metabolism was measured, and the urinary metabolome was analyzed by LC-MS. Statistical analysis was done by analysis of variance, and metabolome analysis by independent component-discriminant analysis.
Results
These EAA deficiencies did not modify food intake. Lysine deficiency induced a decrease in body weight gain and lean body mass, associated with increased proteolysis and decreased protein synthesis, a decrease in bone mineral density, and no effect on lipid metabolism. Threonine deficiency induced a decrease in body weight gain and in liver and skin weight, without changes in protein metabolism, bone mineral density, or lipid metabolism. After validation of the deficiency model, the metabolomic analysis performed on urine samples revealed discriminating molecules specific to the diets and types of proteins.
Conclusions
EAA deficiency has an impact on growth and on the bone and protein metabolism of growing rats. These deficiency states resulted in different metabolome profiles that could lead to the identification of molecular markers specific to protein sources and related EAA deficiencies.
Funding Sources
This study was funded by the UMR Nutrition Physiology and Ingestive Behavior.
15. Impact of Low Protein and Lysine-deficient Diets on Bone Metabolism (P08-072-19). Curr Dev Nutr 2019. [DOI: 10.1093/cdn/nzz044.p08-072-19]
Abstract
Objectives
Low-protein diets and essential amino acid-deficient diets have an impact on body weight and growth, and different studies have also shown an impact of lysine intake on bone metabolism. Lysine has been shown to promote intestinal calcium absorption and to participate in collagen synthesis through its involvement in the cross-linking of tropocollagen bundles. The assembly of tropocollagen bundles into mature collagen fibers is essential for bone formation and remodeling (Civitelli et al., 1992; Fini et al., 2001). The objective of this study was to characterize the impact of a low-protein diet and a lysine-deficient diet on the bone metabolism of growing rats.
Methods
Study 1: 6 groups of growing rats were fed for 3 weeks diets with different milk protein contents: 3%, 5%, 8%, 12%, 15% or 20% of total energy.
Study 2: 7 groups of growing rats were fed for 3 weeks diets with different lysine contents: 15%, 25%, 40%, 60%, 75%, 100% or 170% of the lysine requirement. Body weight was measured daily. At the end of the experiment, body composition was analyzed and tissues were removed for measurement of the expression of genes involved in protein and bone metabolism. Statistical analysis was done by analysis of variance.
Results
Rats fed the low protein diets (3% and 5% milk protein) showed lower growth than controls, with lower body weight and naso-anal length. This impaired growth was associated with lower lean body mass and also affected bone metabolism: bone mineral density, bone mineral content, and femur size decreased, together with markers of bone turnover and formation. The same effects on bone metabolism were observed in rats fed the 85% lysine-deficient diet (15% of the lysine requirement).
Conclusions
Low protein diets and lysine-deficient diets reduce growth and impair bone metabolism. The impact of the low protein diet could be related to lysine deficiency, which affects intestinal calcium absorption and collagen synthesis.
Funding Sources
INRA, AgroParisTech.
Supporting Tables, Images and/or Graphs
|
16
|
Identifying solar access effects on visitors' behavior in outdoor resting areas in a subtropical location: a case study in Japan Square in Curitiba, Brazil. Int J Biometeorol 2019; 63:301-313. PMID: 30680623. DOI: 10.1007/s00484-018-01664-z
Abstract
Changes in microclimate due to urban morphology tend to directly affect outdoor thermal comfort, thereby influencing people's behavior. To investigate this, the study analyzed preferences for specific resting areas within an urban square surrounded by high-rise buildings in a subtropical location. To understand behavioral adaptations with respect to sunlight availability (direct or reflected) or shade (partial or full) in resting areas, observations were conducted during the four seasons of 2016. Two high-definition cameras with a time-lapse function were positioned at vantage points facing distinct benches, shooting at intervals of 1.5 min between scenes. Altogether, 86,561 scenes were analyzed. As a thermal comfort parameter, the physiological equivalent temperature (PET) index was used, computed by post-processing data from the local meteorological station. The analysis considered the availability of each situation (sunlit; partly or fully shaded; reflected sunlight) in each frame and per bench, and visitors' preference for such areas. During winter, shaded situations prevailed, mostly due to adjacent buildings. In summer, the most common condition was partial shade from trees. The choice of a given resting condition was found to be closely related to PET values and thermal comfort/stress classes, and less so to seasonal changes.
|
17
|
[Identification of biomarkers of protein sources deficient in indispensable amino acids: lysine and threonine]. Nutr Clin Metab 2019. DOI: 10.1016/j.nupar.2019.01.334
|
18
|
aVR: the ignored lead in pre-hospital management of STEMI patients. Eur Heart J 2013. DOI: 10.1093/eurheartj/eht307.p451
|
19
|
|
20
|
Faecal 11-ketoetiocholanolone measurement in Iberian red deer (Cervus elaphus hispanicus): validation of methodology using HPLC–MS/MS. Anim Prod Sci 2012. DOI: 10.1071/an12022
Abstract
A cortisol metabolite, 11-ketoetiocholanolone (11-k), is widely used in monitoring stress in several vertebrates, and can be detected by immunoassay. However, these assays have certain limitations with respect to specificity. Also, differences in the excretion of faecal glucocorticoid metabolites (FGM) among species and even between sexes make validation necessary in each case. Therefore, our aims were, first, to develop and validate a high-pressure liquid chromatography–tandem mass spectrometry (HPLC–MS/MS) methodology for monitoring 11-k in faeces of Iberian red deer (Cervus elaphus hispanicus), and second, to investigate the capability of our method to determine variations of this FGM in a longitudinal study. Third, and finally, we assessed the correspondence between faecal 11-k concentrations and plasma cortisol. An adrenocorticotropic hormone (ACTH) test was performed on six red deer stags translocated and kept in captivity for a week and faecal samples were collected twice a day. One single blood and faecal sample from another seven stags was also collected after 2 weeks in captivity. The results of the longitudinal study showed a first peak in 11-k 36 h after the ACTH test and handling, and a second peak at 120 h of being kept indoors. Maximum concentrations of 11-k ranged from 22.71 to 375.68 ng/g. In the second stag group, 11-k concentrations of 25.09 ± 20.53 ng/g had a correlation of r2 = 0.88 with the concentration of plasma cortisol, which was 54.6 ± 55.1 ng/mL. This technique is capable of detecting changes in the concentrations of faecal 11-k. The values determined have a good correlation with the cortisol concentration in blood, and we also detected differences in different individuals’ responses to the same stressors.
|
21
|
Assessing red deer antler density with a hydrostatic method versus a new parametric volume-modelling technique using 3D-CAD. Anim Prod Sci 2012. DOI: 10.1071/an12015
Abstract
Two methods of volume measurement were compared in order to develop a simple and reliable method for estimating whole-antler density. We used 10 cast antlers, previously dried and weighed, from 10 different red deer (Cervus elaphus hispanicus). Volumes were determined by the traditional Archimedes (hydrostatic) method versus a new parametric volume-modelling technique using three-dimensional computer-aided design (3D-CAD), which is now used in the biomedical industry in applications such as medical-implant design, tissue engineering, and the study of anatomical functionality and morphology. The process paths followed in generating CAD models from cast antlers are described. Whole-antler density was estimated from the weight and volume measurements, and a paired-sample comparison was performed to assess differences between volumes as well as densities. Cast-antler weight ranged from 219.93 to 1857.9 g; the volume estimated by the hydrostatic method was 732.45 ± 474.06 cm3 and by the 3D-CAD method 730.65 ± 492.59 cm3. The dry-matter (DM) density of the antler by the hydrostatic method (Density A) was 1.112 ± 0.120 g/cm3, ranging from 0.915 to 1.345 g/cm3 (Shapiro–Wilks, P = 0.449), and by the 3D-CAD method (Density B) 1.112 ± 0.158 g/cm3, ranging from 0.939 to 1.326 g/cm3 (Shapiro–Wilks, P = 0.751). There were no differences in volume (t = 0.95, P = 0.37) or density (t = 0.54, P = 0.60) between the two methods, and the correlation coefficient between Density A and Density B was 0.968. Both methods had similar reliability, although the 3D-CAD system computed antler volume faster than traditional hydrostatic weighing. 3D-CAD also avoided cast damage and the methodological problems that arise with the hydrostatic method for larger or smaller antlers, or for antlers that float because of low density.
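The hydrostatic density estimate described above reduces to two divisions: the Archimedes volume is the apparent weight loss in water divided by the water density, and the DM density is dry weight divided by volume. A minimal sketch with illustrative figures (not the study's raw data):

```python
WATER_DENSITY = 0.9982  # g/cm^3 near 20 degC (assumed value)

def hydrostatic_volume(weight_air_g, weight_submerged_g):
    """Volume (cm^3) from the apparent weight loss of the cast in water."""
    return (weight_air_g - weight_submerged_g) / WATER_DENSITY

def dm_density(dry_weight_g, volume_cm3):
    """Dry-matter density (g/cm^3) = dry weight / volume."""
    return dry_weight_g / volume_cm3

# Hypothetical cast antler: 820 g in air, 84 g apparent weight when submerged.
vol = hydrostatic_volume(820.0, 84.0)  # about 737 cm^3
rho = dm_density(820.0, vol)           # about 1.11 g/cm^3
```

The same density function applies to the 3D-CAD route; only the volume estimate changes.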
|
22
|
Seasonal and specific diet variations in sympatric red and fallow deer of southern Spain: a preliminary approach to feeding behaviour. Anim Prod Sci 2012. DOI: 10.1071/an12016
Abstract
We studied diet composition and diet overlap in sympatric red deer (Cervus elaphus hispanicus) and fallow deer (Dama dama) throughout a whole year in order to determine variation due to season, species, sex, and age class, by analysing rumen-content samples from 81 red and 69 fallow deer shot monthly during 2008–09 in Sierra de Andújar Natural Park, southern Spain. We assessed diet similarity and possible inter- and intra-specific foraging competition. We found different foraging strategies for the two species and sexes during constraint periods, and several theoretical considerations of specific interactions and behaviour are discussed with respect to the Mediterranean environment. In both species an annual diet dominated by grasses was recorded, peaking in spring. Browse was an important food resource at the end of winter and at the end of summer, and fruit more so in autumn and winter. Red deer ingested a higher proportion of browse than fallow deer, which consumed more acorns and for longer, showing a better ability to compensate during nutritional constraint periods. An overall decline in diet similarity in summer and at the end of winter led us to assume that exploitative competition between red and fallow deer, and even between sexes, was probable. Red deer females showed low diet similarity to other deer, while there was great diet overlap between red deer males and fallow deer females at the end of summer. Differences detected between the two species and between sexes do not always support predictions derived from body size and morpho-physiological characteristics, but can probably be explained as a consequence of different metabolic demands. The relationship between plant nutritional attributes and food selection according to reproductive or physiological status and seasonal demands, for both sexes and species, should be researched to better assess deer feeding behaviour.
|
23
|
[Accuracy of the door-to-balloon time for assessing the result of the interventional reperfusion strategy in acute ST-segment elevation myocardial infarction]. Ann Cardiol Angeiol (Paris) 2011; 60:244-251. PMID: 21978820. DOI: 10.1016/j.ancard.2011.07.014
Abstract
BACKGROUND In patients with acute ST-segment elevation myocardial infarction (STEMI), recent clinical guidelines recommend that primary percutaneous coronary intervention (PCI) be performed within 90 min of first medical contact, or within 45 min of admission to the cathlab. The door-to-balloon time (D2B) is widely used to measure the performance of interventional centres. AIM OF THE STUDY To analyze the time to reperfusion in a consecutive series of STEMI patients referred for primary PCI, and to evaluate the clinical accuracy of D2B in primary PCI. METHODS From January 2007 to March 2008, 177 patients were admitted to our institution within 12 hours of a STEMI, and 87 were referred for direct coronary angiography for primary PCI (47 by mobile medical emergency unit, 40 by the institution's emergency department). RESULTS The median time from first medical contact to balloon inflation (M2B) was 135 min [IQR 112-183]. Recommended times were met in a minority of patients (M2B < 90 min: 9%; < 120 min: 34%). Median cathlab D2B was 51 min [IQR 44-65], and was less than 45 min in 34% of patients. No differences in times to reperfusion within the cathlab were found between on- and off-hours. M2B and D2B were unavailable in 23 patients (26%) because of spontaneous TIMI 3 flow reperfusion without indication for immediate PCI in 20 patients, contraindication to PCI in two (distal occlusion; culprit vessel diameter less than 2 mm), and failure to cross the occlusion with the guide-wire in one patient. In contrast, first-medical-contact-to-reperfusion and door-to-reperfusion times, assessed by a TIMI 3 flow without no-reflow in the culprit artery, were available in 95% of patients and were shorter than M2B and D2B, respectively. CONCLUSION Although it is a feasible and reproducible process-performance measure, D2B time is weakly associated with the outcome of the interventional reperfusion strategy in acute STEMI.
This measure should be associated with an outcome performance measure, such as the rate of TIMI 3 flow achieved by primary PCI, or replaced by the Door-to-TIMI 3 flow reperfusion time.
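Times above are reported as median [IQR], the usual summary for skewed delay data. A minimal sketch of that summary, on made-up door-to-balloon times rather than the study's data:

```python
from statistics import median, quantiles

def median_iqr(values):
    """Median plus first and third quartiles (inclusive method)."""
    q1, _q2, q3 = quantiles(values, n=4, method="inclusive")
    return median(values), q1, q3

# Hypothetical cathlab D2B times in minutes.
d2b = [44, 46, 49, 51, 53, 58, 65, 70]
m, q1, q3 = median_iqr(d2b)
# Fraction of patients meeting the 45-min cathlab recommendation.
within_target = sum(t < 45 for t in d2b) / len(d2b)
```

Medians with IQRs resist the long right tail that a few very delayed transfers produce, which is why the abstract does not report means.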
|
24
|
Policies and management of overabundant deer (native or exotic) in protected areas. Anim Prod Sci 2011. DOI: 10.1071/an10288
Abstract
A workshop was convened in Chile in August 2010 as part of the 7th International Deer Biology Congress (IDBC). Its aim was to explore global differences in the policies and management of overabundant deer in protected areas. The main goal of the workshop was to provide South American researchers and managers with a snapshot of some of the approaches to management of deer overabundance used in a diverse array of case studies from North America, Europe, Australia and New Zealand. Various case studies were presented to illustrate the different methodological approaches in implementing deer control measures. Some general recommendations were formulated.
|
25
|
Abstract
INTRODUCTION Acute cellular rejection is a major cause of graft loss in heart transplantation (HT). Endomyocardial biopsy remains the gold standard for its diagnosis, but it is an invasive procedure not without risk. A proinflammatory state exists in rejection that could be assessed by determining plasma levels of inflammatory biomarkers. OBJECTIVE To analyze the utility of various inflammatory markers for diagnosing rejection, to determine which is most important, and to establish which cutoff values best classify patients. MATERIALS AND METHODS A prospective study of 123 consecutive cardiac transplant recipients was conducted from January 2002 to December 2006. Fibrinogen protein (Fgp) and function (Fgf), C-reactive protein (CRP), tumor necrosis factor-alpha (TNF-alpha), interleukin-6 (IL-6), and sialic acid (SA) determinations were performed at one, two, four, six, nine, and 12 months post-HT, at the same time as biopsies. Coronary arteriography and intravascular ultrasound were performed at the first and last follow-up visits. Heart-lung transplants, retransplants, pediatric transplants, patients who died in the first month, and patients who refused consent were excluded. Also excluded were determinations that coincided with renal dysfunction, active infection, hemodynamic instability, or a non-evaluable biopsy. The final analysis included 79 patients and 294 determinations. The correlation between the levels of these biomarkers and the presence of rejection on biopsy (≥ ISHLT grade 3) was studied. RESULTS We did not find significant differences in the values of any of the markers analyzed across the six follow-up visits. Only CRP showed significant and sustained differences between the two groups (with and without rejection) from the second follow-up visit (month 2). The area under the curve showed significant differences for Fgp (0.614, p = 0.013), Fgf (0.585, p = 0.05), TNF-alpha (0.605, p = 0.02), SA (0.637, p = 0.002) and mainly CRP (0.765, p = 0.0001).
CRP levels below 0.87 mg/dL ruled out rejection with a specificity of 90%. CONCLUSIONS Among the inflammatory markers analyzed, CRP was the most useful parameter for non-invasive screening of acute cellular rejection in the first year post-HT.
|
26
|
Correlation between beta-adrenoceptors and G-protein-coupled receptor kinases in pretransplantation heart failure. Transplant Proc 2009; 40:3014-6. PMID: 19010176. DOI: 10.1016/j.transproceed.2008.09.011
Abstract
INTRODUCTION Prolonged catecholamine overstimulation of the myocardium in chronic heart failure causes a reduction in the number and functionality of beta1-adrenoceptors (beta1-AR) in the heart. Desensitization of beta1-AR is mediated by their phosphorylation by a group of cytosolic kinases, the G-protein-coupled receptor kinases (GRKs). In advanced heart failure, an increase in GRK levels associated with the severity of the disease has been observed. OBJECTIVE The objective of this study was to analyze messenger RNA (mRNA) levels of beta1-AR in the myocardium of patients who underwent transplantation for advanced heart failure and their correlation with expression of the major cardiac GRK isoenzymes. MATERIALS AND METHODS Myocardial tissue samples were obtained from the left ventricles of 14 explanted hearts of patients who underwent transplantation for dilated (n = 7) or ischemic (n = 7) cardiomyopathy. RT-PCR was used to analyze mRNA levels of beta1-AR and the isoenzymes GRK2, GRK3, and GRK5. RESULTS We observed a significant correlation between beta1-AR and the 3 GRK subtypes (R2 = 0.668, 0.71, and 0.318, respectively). CONCLUSIONS In patients with advanced heart failure pretransplantation, we observed a significant correlation between beta1-AR and GRK2 and GRK3 levels. GRK5, the subtype predominantly expressed in the myocardium, showed a lesser correlation with beta1-AR levels.
|
27
|
Analysis of Heart Rate Turbulence in Advanced Heart Failure and Heart Transplantation Patients. Transplant Proc 2008; 40:3012-3. DOI: 10.1016/j.transproceed.2008.09.005
|
28
|
126: Development and Validation of a Sedation Scale for Out-of-Hospital Intensive Care. Ann Emerg Med 2008. DOI: 10.1016/j.annemergmed.2008.06.040
|
29
|
Prognostic Value of Brain Natriuretic Peptide in Heart Transplant Patients. J Heart Lung Transplant 2007; 26:986-91. DOI: 10.1016/j.healun.2007.07.023
|
30
|
Abstract
INTRODUCTION Preoperative pulmonary hypertension is an adverse prognostic factor for early morbidity and mortality after heart transplantation (HT). Persistence of pulmonary hypertension is likewise associated with a poorer prognosis. The present study investigated the evolution of right heart pressures in the first year after HT with respect to the background cardiac disease. METHODS This study of 60 consecutive patients who underwent HT analyzed baseline clinical characteristics and mean right atrial and right ventricular systolic and diastolic pressures in a pre-HT study and during biopsies performed in the first 2 weeks as well as at 1, 3, 6, 9, and 12 months after transplantation. The study excluded retransplantations, heart-lung transplantations, and pediatric patients, as well as patients not subjected to biopsy because of early mortality. RESULTS The mean patient age was 50 years (83% male); 31.7% were diabetic, and 33% had hypertension. The background heart disease was of ischemic origin in 35% of cases and dilated cardiomyopathy in 33%, with a mean left ventricular ejection fraction (LVEF) of 23% and a mean pulmonary artery systolic pressure of 50.1 mm Hg. During the postoperative course, an important decrease versus baseline was observed in right heart pressures as early as 2 weeks post-HT, with a drop in right ventricular (RV) systolic pressure from 50.3 +/- 13.7 to 42.5 +/- 10.4 mm Hg (P = .013), and a drop in RV diastolic pressure from 17.4 +/- 5.8 to 14.2 +/- 4.1 mm Hg (P = .007). This downward trend continued more moderately until the third month, after which the pressures stabilized. The same behavior was observed in patients with disease of ischemic origin and in those with dilated cardiomyopathy.
CONCLUSIONS In our series, right heart pressures showed an important decrease in the first days after HT, with stabilization by the third month, though without returning to normal values and without further change during the first year after transplantation. No differences in this trend were seen according to the type of background heart disease.
|
31
|
Abstract
BACKGROUND Mammalian target of rapamycin (mTOR) inhibitors are relatively new drugs in the field of cardiac transplantation (HT), hence the need for further study of their secondary effects. We describe the nature and incidence of secondary effects of these drugs in a group of HT recipients. PATIENTS AND METHODS We studied 23 HT recipients aged 52 +/- 9 years (male: 91%; body mass index: 27 +/- 3.7; ischemic cardiopathy: 52%; dilated cardiomyopathy: 39%) who were started on an mTOR inhibitor (everolimus: 65%, sirolimus: 35%) as part of their treatment. We describe the secondary effects detected during a follow-up period of 10.7 +/- 6 months. RESULTS The reasons for starting the drug were renal impairment (65%), tumors (26%), and others (8%). During follow-up, 17% of patients required a dose reduction and 12% required drug withdrawal (edemas: 4%; recurrent infection: 4%; hemolytic-uremic syndrome: 4%). Drug-attributable edemas presented in 26% of patients. Thirty-nine percent suffered an infection that required hospital admission; 89% of these were pulmonary and all were bacterial (two patients died of the infection). The mean time to first infection was 5 +/- 6 months. Among patients whose treatment was changed because of tumors, 50% experienced improvement. We did not detect alterations in cholesterol, triglycerides, creatinine, or leukocytes. There was a nonsignificant trend toward decreased hemoglobin and platelet levels (P = .07 and P = .056, respectively). CONCLUSIONS Lung infection was the principal complication among our patients treated with mTOR inhibitors. A large percentage required dose reduction (17%) or even drug withdrawal (12%) because of secondary effects.
|
32
|
Abstract
INTRODUCTION Smoking is an important risk factor in any population group. According to previous studies, having been a smoker before heart transplantation (HT) confers a greater likelihood of developing a tumor or other complications after HT. Our objective was to determine the impact of having been a smoker before HT on survival, respiratory complications during the postoperative period, and long-term tumor development. MATERIALS AND METHODS After excluding combined transplantations, pediatric transplantations, and retransplantations, we retrospectively reviewed 288 HTs performed between November 1987 and September 2006. We divided patients into nonsmokers, including those who quit smoking more than 1 year before HT (n = 163); ex-smokers for less than 1 year (n = 76); and those who smoked until HT (n = 49). The statistical tests were chi-square, Student t, analysis of variance (ANOVA), and Kaplan-Meier curves. RESULTS There were more male patients among smokers and ex-smokers than nonsmokers (93.9% vs 96.1% vs 82%, respectively; P = .003). There were no other differences in baseline characteristics between the groups. Ex-smokers remained intubated longer than smokers or nonsmokers (33.4 +/- 44.6 vs 14.2 +/- 7.3 vs 17.9 +/- 19.2, respectively; P = .05). We observed the same trend in recovery unit stay (7.9 +/- 10.5 vs 4.4 +/- 1.88 vs 4.84 +/- 3.49 days, respectively; P = .021). The development of any type of tumor was also more frequent among smokers and ex-smokers, although not significantly. The survival rate was similar in nonsmokers and ex-smokers, and higher than in smokers (89.57% vs 92.11% vs 81.63%, respectively; P = .031). We did not observe differences in the causes of death. CONCLUSIONS Patients who smoked, or had smoked until shortly before HT, showed a poorer prognosis and a longer recovery unit stay. There was also a trend toward increased tumor development.
|
33
|
Abstract
BACKGROUND Renal function deterioration is one of the main problems facing heart transplant recipients. Mammalian target of rapamycin (mTOR) inhibitors, in combination with or replacing calcineurin inhibitors, may help preserve renal function. The aim of this study was to evaluate the progression of renal function after switching the immunosuppressive regimen. PATIENTS AND METHODS We studied 23 heart transplant recipients (5.5 +/- 4.5 years since transplantation). An mTOR inhibitor was introduced to replace cyclosporine (everolimus, 65%; sirolimus, 35%). Patient clinical characteristics and renal function were studied after switching. The statistical analysis used the Student t test for paired data. RESULTS The reason for transplantation was ischemic cardiopathy (52%), dilated cardiomyopathy (39%), or other causes (9%). Mean age at the time of transplantation was 52 +/- 9 years. Comorbidities were as follows: hypertension (43%), insulin-dependent diabetes (22%), hypercholesterolemia (39%), and ex-smoking (70%). The reason for the switch was increased creatinine (65%), appearance of tumors (26%), or others (8%). The creatinine level before the switch was 1.89 +/- 0.6 mg/dL with a clearance of 61.7 +/- 23 mL/min; at the end of follow-up (mean, 11 +/- 6 months) it was 2.0 +/- 1.45 mg/dL with a clearance of 68.3 +/- 35 mL/min, that is, no significant difference (P = .49 and P = .57, respectively). In the subgroup of patients who switched treatment because of renal dysfunction, the initial creatinine level was 2.38 +/- 0.4 mg/dL with a clearance of 42.3 +/- 10 mL/min; at the end of follow-up it was 2.28 +/- 0.2 mg/dL and 43.6 +/- 11 mL/min, respectively (P = .68 for both creatinine and clearance). CONCLUSIONS The introduction of mTOR inhibitors into the immunosuppressant regimen may be useful to delay the renal function deterioration caused by calcineurin inhibitors.
|
34
|
Differences in Clinical Profile and Survival After Heart Transplantation According to Prior Heart Disease. Transplant Proc 2007; 39:2350-2. PMID: 17889185. DOI: 10.1016/j.transproceed.2007.06.068
Abstract
OBJECTIVE The objective of this study was to compare baseline characteristics and long-term survival among patients undergoing heart transplantation (HT) according to the 3 main types of prior heart disease: ischemic, idiopathic dilated cardiomyopathy (IDC), and valvular. MATERIALS AND METHODS Four hundred twenty-three HTs performed between 1989 and 2005 were included. We excluded pediatric transplantations, retransplantations, combined transplantations (lung and kidney), and transplantations due to heart diseases other than ischemic, IDC, and valvular. Baseline characteristics of the recipients were analyzed, as well as short- and long-term survival by group. Analysis of variance (ANOVA) was used for continuous variables and chi-square for categorical variables. Survival was analyzed using Kaplan-Meier curves and the log-rank test, as well as multivariate analysis using logistic regression. RESULTS The ischemic and valvular heart disease groups were older and had a more frequent history of prior heart surgery and of circulatory support at the time of transplantation than the IDC group. The incidence of arterial hypertension and dyslipidemia was higher among ischemic heart disease recipients. Survival rates at 30 days did not differ significantly (ischemic, 88%; IDC, 93%; valvular, 84%; P = .21). Long-term survival was greater in the IDC group than in the valvular or ischemic heart disease groups (75% vs 65% and 62%, respectively; P = .021). Multivariate analysis showed an association between the IDC group and long-term survival (odds ratio [OR], 0.55; 95% confidence interval [CI] 0.35-0.89; P = .015). CONCLUSIONS (1) Patients showed different clinical profiles depending on their pretransplantation heart disease. (2) There were no differences in early mortality between the groups. (3) Long-term survival was significantly greater among IDC transplant recipients, and similar in ischemic and valvular heart disease transplant recipients.
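The survival comparison above uses Kaplan-Meier curves. The estimator itself is simple: at each death time, the survival probability is multiplied by (1 - deaths / patients still at risk), with censored patients leaving the risk set without causing a step. A minimal sketch on made-up (time, event) follow-up data, where event = 1 is death and 0 is censoring:

```python
def kaplan_meier(data):
    """Return [(time, S(t))], stepping down at each death time."""
    data = sorted(data)          # order by follow-up time
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        # Gather all subjects whose follow-up ends at time t.
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed     # censored subjects leave the risk set here
    return curve

# Hypothetical cohort: deaths at t = 1, 3, 4; censored at t = 2, 5.
km = kaplan_meier([(1, 1), (2, 0), (3, 1), (4, 1), (5, 0)])
```

The log-rank test mentioned in the abstract then compares such curves between groups by pooling observed versus expected deaths at each event time.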
|
35
|
Clinical and Hemodynamic Profile of Patients With Advanced Heart Failure Considered for Heart Transplantation. Transplant Proc 2007; 39:2341-3. PMID: 17889182. DOI: 10.1016/j.transproceed.2007.06.049
Abstract
INTRODUCTION The present study evaluated the clinical and hemodynamic situation of patients with advanced heart failure considered for heart transplantation (HT) to examine the possible impact of prior cardiac disease. METHODS We analyzed the pretransplant clinical, echocardiographic, and hemodynamic parameters of 422 consecutive HT patients. Pediatric and heart plus lung transplants were excluded, as were retransplantations. The results were compared by dividing the patients into three groups according to the background heart disease that led to HT: ischemic heart disease (IHD), dilated myocardiopathy (DMC), or valvular disease. RESULTS Differences were observed in the baseline characteristics according to the type of heart disease. Male gender, hypertension, and diabetes were more frequent among IHD, while DMC patients tended to be younger. There were no differences in the clinical parameters such as liver and kidney function, in the functional class, or in the need for inotropic treatment over the days prior to transplantation. Likewise, no differences were recorded in the hemodynamic parameters, such as pulmonary pressure, pulmonary vascular resistance, or transpulmonary pressure gradient. As regards the echocardiographic parameters, the patients with DMC showed greater ventricular diameters and lesser ejection fractions for both ventricles. CONCLUSION No important differences were recorded in the clinical situation or hemodynamic parameters of patients with advanced heart failure accepted for transplantation, according to the background cardiac disease. This observation could be due to the homogenization by strict transplant waiting list inclusion criteria.
|
36
|
Abstract
UNLABELLED Dyslipidemia is a common problem among heart transplant (HT) recipients; it is a frequent risk factor in these patients that is exacerbated by immunosuppressive drugs. Statins are effective drugs to treat dyslipidemia in HT recipients, but control is suboptimal in some patients. Ezetimibe acts through inhibition of the enterohepatic recirculation, a mechanism different from but complementary to statins. Our objective was to assess the effect of the addition of ezetimibe to statin therapy among a population of HT patients. PATIENTS AND METHODS We included 19 stable patients on statin therapy with suboptimal control of cholesterol. Determinations were performed at baseline on statins and at 6 months (statins + ezetimibe). The analyzed variables were total cholesterol and fractions, triglycerides, cyclosporine levels, CPK, SGOT/SGPT, and bilirubin. The statistics were Student's t test for paired samples. RESULTS The overall mean age was 59 +/- 9 years with 95% males and mean BMI 27.5 +/- 3.5. The time since HT was 7 +/- 3 years. The reason for HT included ischemic heart disease in 68%. Pre-HT risk factors included in arterial hypertension in 32% and insulin-dependent diabetes mellitus in 10%, Dyslipidemia occurred in 68%; hypertriglyceridemia in 16% and hyperuricemia in 21%. Immunosuppression was cyclosporine in 100% and steroids in 94%. Type of lipid-lowering agent was simvastatin in 5%; pravastatin, 32%; atorvastatin, 58%; fibrates, 10%. The ezetimibe dose was 10 mg/day in 95% of cases. When ezetimibe was added we observed differences in total cholesterol values (total cholesterol at baseline: 279 +/- 74, total cholesterol with ezetimibe: 198 +/- 47 mg/dL; P = .0001) and LDL-cholesterol values (LDL-cholesterol at baseline: 171 +/- 69, LDL-cholesterol with ezetimibe: 109 +/- 41 mg/dL; P = .001). The remaining variables did not show significant differences. 
CONCLUSION The addition of ezetimibe to statin therapy in heart transplant patients was effective in controlling dyslipidemia and showed an excellent safety profile.
37
Abstract
OBJECTIVE The objective of this study was to describe the clinical course of patients with chronic hepatitis C virus (HCV) infection undergoing heart transplantation (HT). MATERIALS AND METHODS Among 499 patients transplanted in our hospital between January 1989 and September 2006, 11 subjects (2.2%) had chronic HCV infection. We analyzed liver function laboratory parameters before transplantation, at 3, 6, and 12 months, and at the last available determination, as well as pre- and postsurgical hepatobiliary ultrasounds and mortality. The mean time since HT was 32 +/- 23 months. RESULTS No abnormalities in the liver parenchyma were observed on the ultrasound examinations performed before or after transplantation. There were 3 deaths (27%), none of which was related to HCV infection. Liver function laboratory parameters remained stable during the follow-up. CONCLUSIONS The clinical course of patients with chronic HCV infection undergoing HT whose presurgical assessment did not show significant liver damage was favorable. No morphological or laboratory abnormalities were observed that would suggest reactivation of the infection during the follow-up.
38
Mortality After Heart Transplantation in Adults With Congenital Heart Disease: A Single-Center Experience. Transplant Proc 2007; 39:2357-9. [PMID: 17889188 DOI: 10.1016/j.transproceed.2007.06.045] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
Abstract
UNLABELLED The number of congenital heart disease (CHD) patients transplanted to date is small. The results are comparable to those of patients undergoing heart transplantation (HT) for other etiologies. However, advances in pediatric surgery over recent years (eg, the Fontan procedure) have increased the demand for HT by a growing number of children who reach adulthood and who also have a different profile. We analyzed the clinical profile and survival of our CHD patients compared with other etiologies. MATERIALS AND METHODS From July 17, 1991 to December 31, 2006, eight HT were performed in our center for CHD. A descriptive study determined the baseline characteristics and survival of these patients, compared with those of the overall transplant group and other subgroups (dilated cardiomyopathy, ischemic heart disease). RESULTS Mean age was 26 years. Four (50%) CHD patients were diagnosed with single-ventricle anatomy, associated or not with other lesions; none had been operated with the Fontan procedure. Two patients died prematurely. Early, 1-, and 10-year survival was 75% at each time point. Early, 1-, and 10-year survival in the group with other diagnoses was 90%, 78%, and 60%, respectively, and in the dilated cardiomyopathy group it was 94%, 86%, and 72%, respectively. CONCLUSION Our CHD transplant cohort was small and young. The most common etiology was single-ventricle anatomy without a prior Fontan operation. Overall survival was comparable to that of HT for dilated cardiomyopathy.
39
Abstract
UNLABELLED The 2006 International Society for Heart and Lung Transplantation registry reported that there were differences in mortality after heart-lung transplantation (HLT) depending on the etiology for transplantation. Our objective was to conduct an analysis of mortality after HLT at our center. MATERIALS AND METHODS From January 1991 to December 2006, 25 HLT were performed on patients with the following characteristics: mean age of 38 +/- 11 years, 62% males, and 4% with previous surgery. The cohort included 17% urgent transplants. The mean ischemia time was 198 +/- 60 minutes. We divided patients into four etiologic groups: congenital heart disease of the Eisenmenger type; primary pulmonary hypertension; chronic obstructive pulmonary disease/emphysema/fibrosis with right ventricular impact; or pulmonary dysfunction with concomitant left ventricular depression. Three patients were excluded from the analysis because they did not fit in any of the groups. RESULTS The mean follow-up of the sample was 862 +/- 1290 days. The overall hospital survival as well as that at 1 and 5 years was 59%, 50%, and 37%, respectively. In the Eisenmenger syndrome cohort no death occurred during hospitalization and survival at 5 years was 50%. CONCLUSIONS HLT was a therapeutic option with high mortality. Hospital mortality was high in absolute terms. Congenital heart disease of the Eisenmenger type may be a lower risk group.
40
Abstract
INTRODUCTION Many studies have shown a detrimental effect of female donor gender on heart transplantation (HT) outcome. OBJECTIVE We retrospectively evaluated our experience in HT to determine the effect of donor gender on early survival. MATERIALS AND METHODS We divided the sample of 464 primary HT from November 1997 to September 2006 into 4 groups: G1, female donor to a male recipient; G2, male donor to a male recipient; G3, male donor to a female recipient; and G4, female donor to a female recipient. We performed a descriptive study of the baseline characteristics. The chi(2) test was used to determine differences in early mortality (30 days) between groups, and a multivariate analysis was used to identify confounding factors for early mortality. RESULTS Although the univariate study showed that G1 had a significantly lower early survival rate (84%) than G2 (91%), the multivariate study adjusted for donor and recipient weight and size, urgency level, previous surgery, and age only showed urgency level (odds ratio [OR] 2.6; 95% confidence interval [CI] 1.2-5.57; P = .016) and previous surgery (OR 5.8; 95% CI 2.7-12.4; P < .01) to be predictors of early mortality. When baseline characteristics were analyzed, we found that 31% of HT in G1 were urgent versus 18% in G2, and 32% of patients in G1 had previous surgery versus 17% in G2. CONCLUSIONS Donor gender did not appear to negatively affect early survival. In our series, urgent HT in male recipients was more frequent with a female than with a male donor heart. The higher early mortality in male recipients of an urgent HT from a female than from a male donor was attributable to a higher baseline risk profile.
41
Patient radiation dose management in dental facilities according to the X-ray focal distance and the image receptor type. Dentomaxillofac Radiol 2007; 36:282-4. [PMID: 17586855 DOI: 10.1259/dmfr/67494525] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/05/2022] Open
Abstract
OBJECTIVES To determine particular reference values for patient radiation dose management according to the focus-to-skin distance and the image receptor type. METHODS Free-air ionization chambers and solid state detectors built into quality control devices were used to measure X-ray tube output in a sample of 2811 X-ray units. The mean exposure time was estimated by the local person responsible for the quality assurance programme. From these data, mean air kerma values were calculated at the focus-to-skin distance. To obtain the entrance surface air dose, a backscatter factor of 1.2 was used. RESULTS Third-quartile values were 5.3 mGy for old X-ray sets with around 10 cm focus-to-skin distance (FSD); for 20 cm FSD, the values were 4.1 mGy for D type, 3.4 mGy for E/F film type and 1.2 mGy for digital image receptors; and for 30 cm FSD the values were 2.1 mGy for D type, 1.8 mGy for E/F film type and 0.6 mGy for digital image receptors. CONCLUSIONS Evidence based on third-quartile values different from those expected justifies the need for particular reference values taking into account the image system and focal distance, as a way to improve patient radiation dose management.
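The dose chain described above (tube output times exposure time gives air kerma at the skin, multiplied by the backscatter factor of 1.2 to give the entrance surface air dose, with inverse-square rescaling between focal distances) can be sketched as follows. The function names and numeric inputs are illustrative assumptions, not values from the survey:

```python
def entrance_surface_dose(kerma_rate_mgy_per_s, exposure_s, backscatter=1.2):
    """Entrance surface air dose (mGy): air kerma at the skin x backscatter factor."""
    return kerma_rate_mgy_per_s * exposure_s * backscatter

def scale_to_fsd(dose_mgy, fsd_from_cm, fsd_to_cm):
    """Inverse-square rescaling of free-in-air dose between focus-to-skin distances."""
    return dose_mgy * (fsd_from_cm / fsd_to_cm) ** 2

# Hypothetical inputs chosen to land near the reported 4.1 mGy third quartile
# for D-type film at 20 cm FSD: 10 mGy/s output for 0.34 s exposure.
esd_20cm = entrance_surface_dose(kerma_rate_mgy_per_s=10.0, exposure_s=0.34)
esd_30cm = scale_to_fsd(esd_20cm, fsd_from_cm=20.0, fsd_to_cm=30.0)
```

Note that the reported 30 cm third quartile (2.1 mGy) is somewhat higher than a pure inverse-square rescaling would predict, since technique factors also change with the longer cone.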
42
[Perceived quality of resident training in anesthesiology and postoperative recovery care]. REVISTA ESPANOLA DE ANESTESIOLOGIA Y REANIMACION 2007; 54:340-8. [PMID: 17695944] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Subscribe] [Scholar Register] [Indexed: 05/16/2023]
Abstract
OBJECTIVE To carry out an opinion survey on the quality of residency training in the specialty of anesthesiology and postoperative recovery care, and to propose improvements based on the results. METHOD All training programs in the specialty of anesthesiology and postoperative recovery care within the Spanish National Public Health System were provided with an opinion questionnaire designed by the national professional association (Sociedad Española de Anestesia, Reanimación y Terapéutica del Dolor). The aspects assessed were sense of welcome and integration; curriculum; training sessions; external rotations (outside the anesthesiology department); surgical anesthesia; emergency and on-call training; specific practical training objectives; research; other aspects; annual assessment and evaluation of the assigned supervisor and the educational committee; overall opinion of the education received; and the structure of the anesthesiology department. RESULTS Questionnaires were returned between May and November 2005, with a response rate of 30%. Specialized residency training was considered satisfactory overall but there was great interest in measures to improve it. The most highly rated features were the sense of departmental welcome and integration and the work performed during duty assignments in the specialty. The deficiencies that generated the most dissatisfaction were external rotations, training in certain techniques (chest drainage and bronchoscope management), research, and complementary training (computer skills, bioethics, and communication skills). CONCLUSIONS Despite a poor rate of return, the results can help indicate directions to take in making improvements at different levels of the specialty's organization.
43
Abstract
INTRODUCTION Graft vessel disease (GVD) is one of the main long-term complications in heart transplant (HT) patients. At present, the diagnosis of this complication requires invasive procedures. Multislice CT is an emerging technique that allows visualization of the coronary anatomy, including the vascular lumen and wall thickness. Our objective was to establish the value of 16-detector multislice CT in the detection of GVD, compared with angiography and intravascular ultrasound (IVUS). PATIENTS AND METHODS We studied 32 HT patients, who had a mean follow-up of 2016 days. CT was performed 24 hours before angiography; IVUS was added when angiography proved normal. Comparisons were subsequently made using contingency tables to establish the sensitivity, specificity, and predictive values of the CT. RESULTS Angiography was not performed on two patients, and eight were excluded from CT assessment due to serum creatinine values >1.5 mg/dL. Comparison of the CT findings with the invasive techniques yielded a sensitivity of 50%, a specificity of 81%, a negative predictive value of 81%, a positive predictive value of 50%, and a diagnostic accuracy of 72%. CONCLUSIONS Our results suggested good performance of the technique in screening for GVD because a high negative predictive value was recorded. We plan to increase the number of patients and use the 64-detector CT system to achieve greater temporal and spatial resolution.
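All five reported metrics derive from a standard 2x2 contingency table against the invasive reference standard. As a sketch, the counts below are hypothetical (the abstract reports only percentages), but they are one consistent assignment for the 22 patients who had both CT and invasive assessment and they reproduce the published values to within rounding:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic-test metrics against a reference standard."""
    return {
        "sensitivity": tp / (tp + fn),              # true positive rate
        "specificity": tn / (tn + fp),              # true negative rate
        "ppv": tp / (tp + fp),                      # positive predictive value
        "npv": tn / (tn + fn),                      # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical counts consistent with the reported 50/81/50/81/72 percentages:
m = diagnostic_metrics(tp=3, fp=3, fn=3, tn=13)
```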
44
Abstract
UNLABELLED Patients with a heart transplant (HT) may require changes in their immunosuppressive maintenance medication. The basic treatment regimen in our patients consisted of an anticalcineurin agent, an antimetabolite, and a steroid. OBJECTIVE We undertook a descriptive study to quantify the incidence and causes of these changes and determine how they occur. MATERIALS AND METHODS We included the 432 HT performed at our center from November 1987 to October 2005. The baseline treatment was considered to be the treatment given following HT, and the maintenance treatment was that taken at the time of data collection. Kaplan-Meier survival curves were constructed for the analysis. RESULTS The most significant change was the switch from azathioprine to mycophenolate mofetil. The survival rate after 17 years was 66%. CONCLUSIONS As in the international registries, there has been an evident reduction in the use of cyclosporine and more particularly of azathioprine, in favor of tacrolimus and mycophenolate mofetil, respectively. No changes in the use of steroids have been observed. These data reflect an increasingly greater use of immunosuppressive agents with reduced side effect profiles.
45
Variations in the Frequency and Type of Infections in Heart Transplantation According to the Immunosuppression Regimen. Transplant Proc 2006; 38:2558-9. [PMID: 17098001 DOI: 10.1016/j.transproceed.2006.08.191] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
Abstract
OBJECTIVE To evaluate the frequency of infection according to the immunosuppressive regimens used in our center. MATERIALS AND METHODS From 259 consecutive heart transplants we excluded pediatric cases, retransplants, combined transplants (lung and kidney), and immunosuppressive regimens with fewer than 10 cases. The six groups analyzed were: (1) OKT3 (7 days) + cyclosporine (CsA) + mycophenolate mofetil (MMF) + steroids (S); (2) OKT3 (7 days) + CsA + azathioprine (AZA) + S; (3) OKT3 (10 days) + CsA + MMF + S; (4) OKT3 (10 days) + CsA + AZA + S; (5) interleukin-2 (IL-2) antagonists + CsA + MMF + S; (6) IL-2 antagonists + tacrolimus + MMF + S. Infection was considered significant when it caused hospital admission or prolonged hospitalization. RESULTS With a total mean follow-up of 54 +/- 43 months, the total percentage of infection-free patients at the end of follow-up was 45.5%. Infection-free survival was lower among patients administered induction with OKT3 antibodies for 10 days, combined with cyclosporine, either with MMF (10%, group 3) or with azathioprine (27%, group 4), compared to those given IL-2 antagonists (particularly in combination with tacrolimus and MMF: 69.2%, group 6). CONCLUSIONS The results of this study showed that infection was frequent in heart transplantation. Furthermore, induction therapy with OKT3 monoclonal antibodies was associated with an important number of infections (particularly viral infections). Comparison of the treatment groups showed that the regimen associated with fewer infections included an IL-2 receptor antagonist with tacrolimus, MMF, and S.
46
Abstract
The use of amiodarone before transplantation has been linked to an increased number of complications, acute graft failures, and early mortality after a heart graft. We undertook a retrospective, descriptive, case-control study. The 396 consecutive patients included 25 subjects who had been prescribed amiodarone for at least 30 days before transplantation. We excluded retransplantations, pediatric transplantations, and combined transplantations. The endpoints were early mortality and acute graft failure. No significant differences were observed in early mortality or acute graft failure. The multivariate analysis did not reveal any variable that correlated with early mortality. Our study did not support the idea that amiodarone constitutes a negative predictor of early survival or acute graft failure.
47
Abstract
UNLABELLED Since their introduction onto the market, interleukin-2 antagonists have been increasingly used by a growing number of transplant units. Their benefits versus OKT3 appear evident, although the optimal dose remains to be established. Our objective was to establish possible differences related to the use of two versus five doses of daclizumab. MATERIALS AND METHODS This study evaluated 81 consecutive patients treated with two bolus doses of daclizumab (1 mg/kg) on days 1 and 14 posttransplantation. We excluded retransplantations, pediatric transplantations, and combined transplantations. We compared our series to a previous trial involving the administration of a single bolus dose every 14 days (five boluses in total). Study variables included the number of graft rejections, the number of infections, and mortality. Statistical analysis was performed using the chi square and Student's t tests. Significance was set at P < .05. RESULTS There were no differences between groups in the baseline characteristics of the patients. The number of rejection episodes during the first year was significantly lower among the patients in our series treated with two bolus doses of daclizumab than in the series of patients treated with five bolus doses: 24 (30%) vs 17 (61%) episodes (P = .003). No significant differences were observed for mortality (10 deaths [12%] in the group receiving two boluses versus two [7%] in the group receiving five boluses; P = .4) or for infection rate (31 patients [38%] in the group given two bolus doses versus 11 patients [40%] in the group receiving five bolus doses; P = .9). CONCLUSIONS Our results suggested that induction therapy with two doses of daclizumab was at least as effective in preventing rejection as five doses, with no negative effects on patient survival.
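The rejection comparison above (24/81 vs. 17/28) can be checked with a Pearson chi-square test. The stdlib-only sketch below treats the counts as patients with at least one rejection episode, which is an assumption on our part; the abstract also does not state whether a continuity correction was applied, and without one this test reproduces a P value close to the reported .003:

```python
from math import erfc, sqrt

def chi2_two_proportions(a_events, a_total, b_events, b_total):
    """Pearson chi-square (1 df, no continuity correction) comparing two
    proportions. Returns (chi2, two-sided P) using the exact chi-square
    survival function for 1 df: P = erfc(sqrt(chi2 / 2))."""
    obs = [[a_events, a_total - a_events],
           [b_events, b_total - b_events]]
    n = a_total + b_total
    row = [a_total, b_total]
    col = [a_events + b_events, n - (a_events + b_events)]
    chi2 = sum((obs[i][j] - row[i] * col[j] / n) ** 2 / (row[i] * col[j] / n)
               for i in range(2) for j in range(2))
    return chi2, erfc(sqrt(chi2 / 2))

# Two-dose group: 24 of 81 (30%); five-dose group: 17 of 28 (61%).
chi2, p = chi2_two_proportions(24, 81, 17, 28)
```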
48
Abstract
OBJECTIVE To perform an analysis comparing long-term survival in heart transplant (HT) patients depending on the immunosuppressive regimen. MATERIALS AND METHODS The study included 317 consecutive HT patients. We excluded pediatric cases, retransplants, combined transplants (lung and kidney), and immunosuppressive regimens with fewer than 10 cases. The six groups analyzed were: (1) OKT3 7 days + cyclosporine (CsA) + mycophenolate mofetil (MMF) + steroids (S); (2) OKT3 7 days + CsA + azathioprine (AZA) + S; (3) OKT3 10 days + CsA + MMF + S; (4) OKT3 10 days + CsA + AZA + S; (5) interleukin-2 (IL-2) antagonists + CsA + MMF + S; and (6) IL-2 antagonists + tacrolimus + MMF + S. Probability of survival was analyzed by Kaplan-Meier and log-rank methods. RESULTS The groups were heterogeneous regarding the number of patients and follow-up. The baseline characteristics were similar, although there were differences in surgery times. Survival by group at the end of the follow-up period was: group 1, 75.8%; group 2, 51.2%; group 3, 63.6%; group 4, 25.3%; group 5, 91.2%; and group 6, 84.6%. A major reduction in survival was observed in the groups given induction with OKT3 monoclonal antibodies (groups 1, 2, 3, and 4), particularly when AZA was used in the maintenance phase (groups 2 and 4) and when the induction dose was high (10-day therapy in groups 3 and 4). CONCLUSIONS Our study suggested an association between the immunosuppressive regimen and the long-term survival of HT patients. The best results were obtained with an induction regimen based on IL-2 antagonists. On the basis of the survival observed in this study, the maintenance combination we regard as "optimal" at this time is CsA, MMF, and steroids.
49
Abstract
BACKGROUND AND AIMS Immunosuppressive therapy has undergone great changes in recent years as a result of the introduction of new drugs, presumed a priori to be more effective and better tolerated. The greatest advance seems to have been the introduction of interleukin-2 (IL-2) receptor antagonists. The objective of this study was to determine whether the use of IL-2 receptor antagonists in induction therapy has implications for the development of rejection and for survival. MATERIALS AND METHODS Three hundred sixty-five consecutive cardiac transplant patients who received induction therapy were included. Heart-lung transplants and transplants in children under 10 years of age were excluded. Three groups were compared according to the induction therapy (OKT3, 10 days; OKT3, 7 days; and IL-2R antagonists). Each treatment corresponded to a time period: OKT3 10 days from June 1989 to April 1994; OKT3 7 days from May 1994 to October 2002; and IL-2R antagonists from November 2002 to May 2004. Baseline characteristics of recipient and donor, surgical times, postsurgical complications, maintenance immunosuppression, number of rejections, time (days) to first rejection, and probability of survival at 1 year were recorded. We used analysis of variance, chi(2) test, Kaplan-Meier curves, and log-rank test as appropriate. A P-value < .05 was considered significant. RESULTS There were significant differences in the characteristics of the transplanted patients in the various time periods. Thus, recipients in the OKT3 10-day group had worse status but better donors, whereas recipients in the IL-2R antagonists group had better status but older donors with longer duration of ischemia. The incidence of acute graft failure was similar in the three groups.
The number of rejection episodes in the first year was higher among the OKT3 groups (OKT3 10 days, 1.7 +/- 1.3; OKT3 7 days, 1.2 +/- 1.2; IL-2R antagonists, 1.0 +/- 1.2; P = .02) and the probability of survival at 1 year was also lower (OKT3 10 days, 74%; OKT3 7 days, 77%; IL-2R antagonists, 94%; P = .0007). CONCLUSIONS Induction therapy with IL-2 antagonists offers important advantages over treatment with OKT3 in terms of survival, with absolute and relative risk reductions of 20% and 27%. Furthermore, it did not increase the number of rejections, although this may have been due to the greater use of MMF versus azathioprine.
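The 20% and 27% figures quoted in the conclusion above are consistent with the 1-year survival rates of 94% vs. 74%: an absolute gain of 20 percentage points and a relative improvement of 0.20/0.74, which is about 27%. A minimal sketch of that arithmetic, assuming this is how the figures were derived:

```python
def survival_benefit(surv_treated, surv_control):
    """Absolute survival gain (percentage points, as a fraction) and relative
    improvement versus the control group's survival."""
    absolute = surv_treated - surv_control
    relative = absolute / surv_control
    return absolute, relative

# IL-2R antagonists (94%) vs. OKT3 10 days (74%) 1-year survival:
arr, rel = survival_benefit(0.94, 0.74)
```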
50
Abstract
INTRODUCTION It is known that there is a high incidence of diabetes mellitus (DM) among heart transplant (HT) patients, which may be up to 30% at 5 years. The presence of DM has been associated with increased morbidity (infections, renal dysfunction, or graft vascular disease), and its development has been related primarily to immunosuppressive therapy. The objective of this study was to determine, in our experience, the presence of predictive variables for the development of DM following HT. METHODS We studied 315 consecutive non-DM patients (88.6% men, mean age 51.5 years) who underwent HT in our hospital from November 1987 to May 2003, analyzing all variables that could be related to the development of DM during follow-up. Student t-test and chi(2) test were used for univariate statistical analysis and logistic regression for multivariate analysis. RESULTS Of the 315 patients, 64 developed DM (20.3%) during a mean follow-up of 3.3 years. The univariate analysis showed that patients who developed DM were older (54.9 +/- 8.7 versus 50.7 +/- 11.8 years, P = .008), had a higher body mass index (BMI) (27.3 +/- 3.8 versus 25.7 +/- 3.7, P = .003), a higher prevalence of arterial hypertension (37.5% versus 23.5%, P = .023), a lower frequency of urgent HT (9.4% versus 26.2%, P = .004), were more often treated with steroids (85.9% versus 70.1%, P = .011) and tacrolimus (12.5% versus 4.4%, P = .015), and had a higher frequency of rejection episodes (71.2% versus 44.6%, P = .001). Multivariate analysis identified the following as predictive factors for the development of DM: age (OR = 1.04, P = .013), urgent HT (OR = 0.36, P = .031), treatment with tacrolimus (OR = 3.89, P = .012), and number of rejections (OR = 2.34, P = .002). CONCLUSION In our population, age, urgent HT (which had a protective effect), treatment with tacrolimus, and number of rejections were independent predictive variables for the development of DM during follow-up.
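For reading the logistic-regression output above: a per-unit odds ratio compounds multiplicatively, so the reported OR of 1.04 per year of age implies roughly a 1.5-fold increase in the odds of DM per decade of age. A small illustrative sketch (not part of the original analysis):

```python
def cumulative_odds_ratio(or_per_unit, units):
    """Cumulative odds ratio implied by a per-unit OR from logistic regression:
    the per-unit OR is raised to the number of units (multiplicative scale)."""
    return or_per_unit ** units

# OR 1.04 per year of age, compounded over a decade:
decade_or = cumulative_odds_ratio(1.04, 10)
```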