201
Schmitz SK, Hjorth JJJ, Joemai RMS, Wijntjes R, Eijgenraam S, de Bruijn P, Georgiou C, de Jong APH, van Ooyen A, Verhage M, Cornelisse LN, Toonen RF, Veldkamp WJH, Veldkamp W. Automated analysis of neuronal morphology, synapse number and synaptic recruitment. J Neurosci Methods 2011; 195:185-93. [PMID: 21167201] [DOI: 10.1016/j.jneumeth.2010.12.011]
Abstract
The shape, structure and connectivity of nerve cells are important aspects of neuronal function. Genetic and epigenetic factors that alter neuronal morphology or the synaptic localization of pre- and post-synaptic proteins contribute significantly to neuronal output and may underlie clinical states. To assess the impact of individual genes and disease-causing mutations on neuronal morphology, reliable methods are needed. Unfortunately, manual analysis of immuno-fluorescence images of neurons to quantify neuronal shape and synapse number, size and distribution is labor-intensive, time-consuming and subject to human bias and error. We have developed an automated image analysis routine using steerable filters and deconvolutions to automatically analyze dendrite and synapse characteristics in immuno-fluorescence images. Our approach reports not only dendrite morphology and synapse size and number but also synaptic vesicle density and the synaptic accumulation of proteins as a function of distance from the soma, as consistently as expert observers, while reducing analysis time considerably. In addition, the routine can detect and quantify a wide range of neuronal organelles and is capable of batch analysis of a large number of images, enabling high-throughput analysis.
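The steering principle behind such filters can be sketched with a minimal first-order example (a simplification of the paper's routine: a toy ramp image and central differences stand in for Gaussian-derivative basis filters): the response at any orientation is an exact linear combination of two basis responses.

```python
import math

def basis_responses(image, x, y):
    """Central-difference x- and y-derivatives at pixel (x, y) --
    stand-ins for the Gaussian-derivative basis filters."""
    gx = (image[y][x + 1] - image[y][x - 1]) / 2.0
    gy = (image[y + 1][x] - image[y - 1][x]) / 2.0
    return gx, gy

def steered_response(gx, gy, theta):
    """First-order steering property: the response at orientation theta
    is an exact combination of the two basis responses,
    R(theta) = cos(theta) * Gx + sin(theta) * Gy."""
    return math.cos(theta) * gx + math.sin(theta) * gy

# Toy image: intensity ramps along x, so the gradient points along x.
img = [[float(x) for x in range(5)] for _ in range(5)]
gx, gy = basis_responses(img, 2, 2)
print(steered_response(gx, gy, 0.0))                 # 1.0: maximal along the ramp
print(round(steered_response(gx, gy, math.pi / 2), 12))  # 0.0: no response across it
```

A full routine would steer second- or higher-order filters over every pixel to trace oriented dendritic structures; the interpolation idea is the same.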
Affiliation(s)
- Sabine K Schmitz
- Functional Genomics, Center for Neurogenomics and Cognitive Research (CNCR), VU University Amsterdam, De Boelelaan 1085, 1081 HV Amsterdam, The Netherlands
202
Stone B. Profitability makeover spawns new billing company. MGMA Connex 2011; 11:21-22. [PMID: 21409875]
203
Affiliation(s)
- Max Hirshkowitz
- Department of Medicine & Menninger Department of Psychiatry, Baylor College of Medicine and Michael E. DeBakey VAMC Sleep Center, Houston, TX 77030, USA.
204
Abstract
This chapter describes using the Protein Inference Engine (PIE) to integrate various types of data--especially top down and bottom up mass spectrometer (MS) data--to describe a protein's posttranslational modifications (PTMs). PTMs include cleavage events such as the n-terminal loss of methionine and residue modifications like phosphorylation. Modifications are key elements in many biological processes, but are difficult to study as no single, general method adequately characterizes a protein's PTMs; manually integrating data from several MS experiments is usually required. The PIE is designed to automate this process using a guess-and-refine approach similar to how an expert manually integrates data. The PIE repeatedly "imagines" a possible modification set, evaluates it using available data, and then tries to improve on it. After many rounds of refinement, the resulting modification set is proposed as a candidate answer. Multiple candidate answers are generated to obtain both best and near-best answers. Near-best answers are crucial in allowing for proteins with more than one supported modification pattern (isoforms) and obtaining robust results given incomplete and inconsistent data. The goal of this chapter is to walk the reader through installing and using the downloadable version of PIE, both in general and by means of a specific, detailed example. The example integrates several types of experimental and background (prior) data. It is not a "perfect-world" scenario, but has been designed to illustrate several real-world difficulties that may be encountered when trying to analyze imperfect data.
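The guess-and-refine loop can be illustrated with a toy sketch (hypothetical modification masses and a plain hill-climbing search; the real PIE evaluates candidate sets against several kinds of MS data, not a single mass shift):

```python
import random

# Hypothetical modification mass deltas in daltons (illustration only).
MODS = {"acetyl": 42.011, "phospho": 79.966, "methyl": 14.016,
        "met-loss": -131.040}

def error(mod_set, target_shift):
    """Disagreement between a candidate modification set and the
    observed intact-mass shift."""
    return abs(sum(MODS[m] for m in mod_set) - target_shift)

def refine(target_shift, rounds=2000, seed=0):
    """Guess-and-refine: propose a modification set, then repeatedly
    toggle single modifications, keeping changes that reduce
    disagreement with the observed mass shift."""
    rng = random.Random(seed)
    current = set()
    best, best_err = set(current), error(current, target_shift)
    for _ in range(rounds):
        m = rng.choice(list(MODS))
        candidate = current ^ {m}  # toggle one modification on/off
        if error(candidate, target_shift) <= error(current, target_shift):
            current = candidate
        if error(current, target_shift) < best_err:
            best, best_err = set(current), error(current, target_shift)
    return best, best_err

# Observed shift consistent with acetylation plus phosphorylation.
best, err = refine(42.011 + 79.966)
print(sorted(best), round(err, 3))
```

Keeping several near-best candidate sets instead of only the winner is what lets the real engine report isoforms and stay robust to inconsistent data.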
Affiliation(s)
- Stuart R Jefferys
- Department of Bioinformatics & Computational Biology, The University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
205
Abstract
Ginsengs (Panax, Araliaceae) are among the plants best known for their medicinal properties. Many ginseng species are endangered due to over-exploitation of natural resources - a situation difficult to remedy while there are no reliable, practical methods for species identification. We screened eleven candidate DNA barcoding loci to establish an accurate and effective Panax species identification system, both for commercial and conservation purposes. We used 95 ginseng samples, representing all the species in the genus. We found considerable differences in the performance of the potential barcoding regions. The sequencing of atpF-atpH was unsuccessful due to poly-N structures. The rbcL, rpoB, and rpoC1 regions were found to be mostly invariable, with only four to eight variable sites. Using matK, psbK-psbI, psbM-trnD, rps16 and nad1, we could identify four to six out of eight considerably divergent species but only one to five out of nineteen clusters within the P. bipinnatifidus species group. psbA-trnH and ITS were the most variable loci, working very well both in species and cluster identification. We demonstrated that the combination of psbA-trnH and ITS is sufficient for identifying all the species and clusters in the genus.
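The screening logic rests on comparing intraspecific with interspecific divergence; a minimal sketch with short hypothetical aligned fragments shows the "barcoding gap" test:

```python
from itertools import combinations

def p_distance(a, b):
    """Proportion of differing sites between two aligned sequences."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

# Hypothetical aligned barcode fragments (real loci span hundreds of bp).
samples = {
    "ginseng_A1": ("sp_A", "ACGTACGTAC"),
    "ginseng_A2": ("sp_A", "ACGTACGTAT"),
    "ginseng_B1": ("sp_B", "ACGAACCTAC"),
    "ginseng_B2": ("sp_B", "ACGAACCTAT"),
}

intra, inter = [], []
for (n1, (s1, q1)), (n2, (s2, q2)) in combinations(samples.items(), 2):
    (intra if s1 == s2 else inter).append(p_distance(q1, q2))

# A usable barcode shows a gap: the largest within-species divergence
# falls below the smallest between-species divergence.
print(max(intra), min(inter))
print(max(intra) < min(inter))  # True: the locus discriminates the species
```

Loci like rbcL fail this test because nearly invariant sequences push interspecific divergence down into the intraspecific range.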
Affiliation(s)
- Yunjuan Zuo
- State Key Laboratory of Systematic and Evolutionary Botany, Institute of Botany, Chinese Academy of Sciences, Beijing, PR China
206
Ha M, Nehm RH, Urban-Lurain M, Merrill JE. Applying computerized-scoring models of written biological explanations across courses and colleges: prospects and limitations. CBE Life Sci Educ 2011; 10:379-93. [PMID: 22135372] [PMCID: PMC3228656] [DOI: 10.1187/cbe.11-08-0081]
Abstract
Our study explored the prospects and limitations of using machine-learning software to score introductory biology students' written explanations of evolutionary change. We investigated three research questions: 1) Do scoring models built using student responses at one university function effectively at another university? 2) How many human-scored student responses are needed to build scoring models suitable for cross-institutional application? 3) What factors limit computer-scoring efficacy, and how can these factors be mitigated? To answer these questions, two biology experts scored a corpus of 2556 short-answer explanations (from biology majors and nonmajors) at two universities for the presence or absence of five key concepts of evolution. Human- and computer-generated scores were compared using kappa agreement statistics. We found that machine-learning software was capable in most cases of accurately evaluating the degree of scientific sophistication in undergraduate majors' and nonmajors' written explanations of evolutionary change. In cases in which the software did not perform at the benchmark of "near-perfect" agreement (kappa > 0.80), we located the causes of poor performance and identified a series of strategies for their mitigation. Machine-learning software holds promise as an assessment tool for use in undergraduate biology education, but like most assessment tools, it is also characterized by limitations.
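The kappa statistic used for the human-computer comparison can be computed directly; a small sketch with hypothetical presence/absence scores for one key concept:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: observed agreement between two raters corrected
    for the agreement expected by chance."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical human and machine scores (1 = concept present):
human   = [1, 1, 0, 1, 0, 0, 1, 1, 0, 0]
machine = [1, 1, 0, 1, 0, 1, 1, 1, 0, 0]
print(round(cohens_kappa(human, machine), 3))  # 0.8
```

A kappa of 0.8 sits exactly at the "near-perfect" benchmark the authors use, even though raw agreement here is 90%; the chance correction is what makes kappa the stricter criterion.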
Affiliation(s)
- Minsu Ha
- The Ohio State University, School of Teaching and Learning, Columbus, USA.
207

208
Kim S, Eo HS, Koo H, Choi JK, Kim W. DNA barcode-based molecular identification system for fish species. Mol Cells 2010; 30:507-12. [PMID: 21110132] [DOI: 10.1007/s10059-010-0148-2]
Abstract
In this study, we applied DNA barcoding to identify species using short DNA sequence analysis. We examined the utility of DNA barcoding by identifying 53 Korean freshwater fish species, 233 other freshwater fish species, and 1339 saltwater fish species. We successfully developed a web-based molecular identification system for fish (MISF) using a profile hidden Markov model. MISF facilitates efficient and reliable species identification, overcoming the limitations of conventional taxonomic approaches. MISF is freely accessible at http://bioinfosys.snu.ac.kr:8080/MISF/misf.jsp .
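A profile-based identifier can be sketched in miniature (hypothetical reference barcodes; match-state scoring only, without the insert/delete transitions of a full profile hidden Markov model):

```python
import math

def build_profile(aligned_seqs, alphabet="ACGT", pseudocount=1.0):
    """Position-specific log-odds profile from aligned reference
    barcodes: per-column residue frequencies (with pseudocounts)
    scored against a uniform background of 0.25."""
    length = len(aligned_seqs[0])
    profile = []
    for i in range(length):
        col = [s[i] for s in aligned_seqs]
        total = len(col) + pseudocount * len(alphabet)
        probs = {a: (col.count(a) + pseudocount) / total for a in alphabet}
        profile.append({a: math.log(p / 0.25) for a, p in probs.items()})
    return profile

def score(profile, query):
    """Sum of per-position log-odds for a query of the same length."""
    return sum(col[base] for col, base in zip(profile, query))

# Hypothetical reference barcodes for two species:
profiles = {
    "sp_A": build_profile(["ACGTACGT", "ACGTACGA"]),
    "sp_B": build_profile(["TTGTACCA", "TTGTACCT"]),
}
query = "ACGTACGT"
print(max(profiles, key=lambda sp: score(profiles[sp], query)))  # sp_A
```

A real profile HMM adds insert and delete states so queries of different lengths can be aligned and scored; the classification step (pick the highest-scoring species model) is the same.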
Affiliation(s)
- Sungmin Kim
- Interdisciplinary Program in Bioinformatics, Seoul National University, Seoul 151-742, Korea
209
Binkiewicz P, Michaluk K, Demiańczyk A. [Calculation of the Pearl Index of Lady-Comp, Baby-Comp and Pearly cycle computers used as a contraceptive method]. Ginekol Pol 2010; 81:834-839. [PMID: 21365899]
Abstract
INTRODUCTION Lady-Comp, Baby-Comp and Pearly cycle computers are medical devices that use sophisticated statistical methods, together with a comprehensive database, to determine the fertile and infertile phases of the menstrual cycle precisely on the basis of daily basal body temperature measurements. They have been produced and distributed worldwide by Valley Electronics GmbH (Eschenlohe, Bavaria, Germany) for over 25 years. OBJECTIVES The aim of the study was to calculate the Pearl Index of cycle computers in order to determine their contraceptive effectiveness. MATERIAL AND METHODS 510 Polish women, randomly chosen from the distributor's database, who had been using the device for over one year or for 13 menstrual cycles, received the questionnaire. The Pearl Index was calculated as the quotient of the number of unplanned pregnancies and the total number of cycles during which cycle computers were used, multiplied by 1300. Statistical methods were applied to analyze the questionnaire data and to calculate the Pearl Index. The unplanned pregnancy odds ratio for women additionally using condoms during the fertile phase of the cycle was also calculated. RESULTS 139 properly completed questionnaires provided data on 3332 cycles. After the initial analysis, 290 cycles were excluded because the respondents had not complied with the computer indications, and 1021 cycles were excluded because the respondents had been using other contraceptive methods at the same time--no unplanned pregnancy was noted in that group. In the investigated group of 2040 cycles of correct cycle computer use, one unplanned pregnancy was observed. The calculated Pearl Index for this group amounted to 0.64, meaning that fewer than 7 out of 1000 users of a cycle computer as a contraceptive method may become pregnant within one year. The odds of pregnancy in women using a cycle computer and condoms on fertile days amounted to 1.035%, meaning that about 1 out of 100 users of the combined methods may become pregnant within one year. CONCLUSIONS The Pearl Index value of cycle computers is comparable with the Pearl Index of hormonal contraceptives. Cycle computers offer an effective and drug-free method of contraception to all women who wish to limit interventions in their bodily functions and do not want or cannot use other contraceptive methods.
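The Pearl Index computation described above is straightforward to reproduce:

```python
def pearl_index(unplanned_pregnancies, cycles):
    """Pearl Index: pregnancies per 100 woman-years of exposure,
    with 13 cycles counted as one year (hence the factor 1300)."""
    return unplanned_pregnancies / cycles * 1300

# One unplanned pregnancy over 2040 correctly used cycles, as reported.
pi = pearl_index(1, 2040)
print(round(pi, 2))   # 0.64, matching the reported value
print(round(pi * 10)) # 6: roughly 6 pregnancies per 1000 users per year
```

The factor 1300 is a convention (100 woman-years times 13 cycles per year); studies that count exposure in months use 1200 instead.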
210
Lee KM, Armstrong PR, Thomasson JA, Sui R, Casada M, Herrman TJ. Development and characterization of food-grade tracers for the global grain tracing and recall system. J Agric Food Chem 2010; 58:10945-10957. [PMID: 20883029] [DOI: 10.1021/jf101370k]
Abstract
Tracing grain from the farm to its final processing destination as it moves through multiple grain-handling systems, storage bins, and bulk carriers presents numerous challenges to existing record-keeping systems. This study examines the suitability of coded caplets for tracing grain, in particular evaluating methodology to test the tracers' ability to withstand the rigors of commercial grain handling and storage systems, as defined by physical properties measured with technology commonly applied to assess grain hardness and end-use properties. Three types of tracers to be dispensed into bulk grains for tracing the grain back to its field of origin were developed using three food-grade substances [processed sugar, pregelatinized starch, and silicified microcrystalline cellulose (SMCC)] as the major component of the formulations. Because the formulations differed in functionality, the manufacturing process conditions varied for each tracer type, resulting in unique variations in surface roughness, weight, dimensions, and physical and spectroscopic properties before and after coating. The two types of coating applied [pregelatinized starch and hydroxypropylmethylcellulose (HPMC)], using an aqueous coating system containing appropriate plasticizers, showed uniform coverage and a clear coat. Coating appeared to act as a barrier against moisture penetration, to protect the surface of the tracers against mechanical damage, and to improve the mechanical strength of the tracers. Analysis of variance (ANOVA) tests showed that the type of tracer, coating material, conditioning time, and theoretical weight gain significantly influenced the morphological and physical properties of the tracers. Optimization of these factors needs to be pursued to produce tracers with consistent quality and performance as they flow with bulk grains through the grain marketing channels.
Affiliation(s)
- Kyung-Min Lee
- Office of the Texas State Chemist, Texas Agricultural Experiment Station, College Station, Texas 77841
211
Li J, Luo S, Jin JS. Sensor data fusion for accurate cloud presence prediction using Dempster-Shafer evidence theory. Sensors (Basel) 2010; 10:9384-96. [PMID: 22163414] [PMCID: PMC3230941] [DOI: 10.3390/s101009384]
Abstract
Sensor data fusion technology can be used to best extract useful information from multiple sensor observations. It has been widely applied in areas such as target tracking, surveillance, robot navigation, and signal and image processing. This paper introduces a novel data fusion approach for a multiple radiation sensor environment using Dempster-Shafer evidence theory. The methodology is used to predict cloud presence based on the inputs of radiation sensors, and different radiation data have been used for the prediction. Potential application areas of the algorithm include renewable power generation for virtual power stations, where predicting cloud presence is the most challenging issue for forecasting photovoltaic output. The algorithm is validated by comparing the predicted cloud presence with corresponding sunshine occurrence data recorded as the benchmark. Our experiments indicate that, compared to approaches using individual sensors, the proposed data fusion approach can increase the correct rate of cloud prediction by ten percent and decrease the unknown rate of cloud prediction by twenty-three percent.
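Dempster's rule of combination, the core of such an approach, can be sketched for two hypothetical radiation-sensor mass functions over the frame {cloud, clear}:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose
    focal elements are frozensets over the frame of discernment:
    multiply masses, keep non-empty intersections, and renormalize
    by (1 - conflict)."""
    combined, conflict = {}, 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

CLOUD, CLEAR = frozenset({"cloud"}), frozenset({"clear"})
EITHER = CLOUD | CLEAR  # mass on the whole frame expresses ignorance

# Hypothetical evidence from two radiation sensors:
m_direct  = {CLOUD: 0.7, EITHER: 0.3}
m_diffuse = {CLOUD: 0.6, CLEAR: 0.1, EITHER: 0.3}
fused = dempster_combine(m_direct, m_diffuse)
print({tuple(sorted(k)): round(v, 3) for k, v in fused.items()})
```

Note how the fused belief in "cloud" (about 0.87) exceeds either sensor's alone, while residual mass on the whole frame preserves an explicit "unknown" category — the quantity the paper reports being reduced by fusion.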
Affiliation(s)
- Jiaming Li
- CSIRO ICT Centre, Corner of Vimiera and Pembroke Roads, Marsfield, NSW 2122, Australia
- Author to whom correspondence should be addressed; E-Mail: ; Tel.: +61-2-93724707; Fax: +61-2-93724161
- Suhuai Luo
- The University of Newcastle, University Drive, Callaghan, NSW 2308, Australia; E-Mails; (S.H.L.); (J.S.J.)
- Jesse S. Jin
- The University of Newcastle, University Drive, Callaghan, NSW 2308, Australia; E-Mails; (S.H.L.); (J.S.J.)
212
Mishchenko Y, Hu T, Spacek J, Mendenhall J, Harris KM, Chklovskii DB. Ultrastructural analysis of hippocampal neuropil from the connectomics perspective. Neuron 2010; 67:1009-20. [PMID: 20869597] [DOI: 10.1016/j.neuron.2010.08.014]
Abstract
Complete reconstructions of vertebrate neuronal circuits on the synaptic level require new approaches. Here, serial section transmission electron microscopy was automated to densely reconstruct four volumes, totaling 670 μm³, from the rat hippocampus as proving grounds to determine when axo-dendritic proximities predict synapses. First, in contrast with Peters' rule, the density of axons within reach of dendritic spines did not predict synaptic density along dendrites because the fraction of axons making synapses was variable. Second, an axo-dendritic touch did not predict a synapse; nevertheless, the density of synapses along a hippocampal dendrite appeared to be a universal fraction, 0.2, of the density of touches. Finally, the largest touch between an axonal bouton and spine indicated the site of actual synapses with about 80% precision but would miss about half of all synapses. Thus, it will be difficult to predict synaptic connectivity using data sets missing ultrastructural details that distinguish between axo-dendritic touches and bona fide synapses.
Affiliation(s)
- Yuriy Mishchenko
- Janelia Farm Research Campus, Howard Hughes Medical Institute, Ashburn, VA 20147, USA
213
Zhang GQ, Wang M, Zhang DM, Liu Y. [Metabonomics and its perspective on forensic medicine]. Fa Yi Xue Za Zhi 2010; 26:374-380. [PMID: 21287744]
Abstract
Metabolomics is a new discipline in which chromatography, mass spectrometry, nuclear magnetic resonance (NMR) and capillary electrophoresis (CE) techniques are used to isolate, purify and measure the metabolites in samples of cells, organs and body fluids, after which bioinformatics tools are applied to the resulting data to obtain one biomarker or a set of biomarkers. Based on an analysis of the literature of recent years, metabolomics is summarized here in terms of its history, concepts, advantages, methods, applications, difficulties and challenges, journals and books, and websites, and its application in forensic medicine is forecast. As a new branch of global systems biology, metabonomics has developed rapidly, and its prospects in forensic medicine appear feasible and very promising.
Affiliation(s)
- Gao-Qin Zhang
- Chinese People's Public Security University, Beijing 100038, China.
214
Geng X, Jiang K. [Ajax-based child growth monitoring chart automatic drawing]. Sheng Wu Yi Xue Gong Cheng Xue Za Zhi 2010; 27:1016-1050. [PMID: 21089661]
Abstract
In the application of electronic health record systems for community residents, how to draw child growth monitoring charts quickly and efficiently is an open question, and enhancing residents' experience in using the system is the focus of this research. The system combines the emerging Ajax and GDI+ technologies. The client uses a pre-designed Ajax manager to handle residents' requests and sends XMLHTTP requests to the server; the server responds to each request and uses GDI+ programming to render the graphics and return the result. The system thus realizes child growth monitoring charts published on the Web.
Affiliation(s)
- Xingyun Geng
- Department of Medical Information, School of Medicine, Nantong University, Nantong 226001, China
215
Liu Z, Chen K, Luo K, Pan H, Chen S. [DNA barcoding in medicinal plants Caprifoliaceae]. Zhongguo Zhong Yao Za Zhi 2010; 35:2527-2532. [PMID: 21174758]
Abstract
OBJECTIVE To determine candidate sequences that can be used as DNA barcodes to identify species in the family Caprifoliaceae, by screening four different DNA fragments. METHOD PCR amplification and sequencing efficiency, intra- and interspecific divergence, the DNA barcoding gap, and identification efficiency were used to evaluate the loci. RESULT ITS2 was selected as the candidate DNA barcode for identifying species in Caprifoliaceae, with an identification success rate of 100% at the genus level and 96.6% at the species level; psbA-trnH serves as a complementary barcode to ITS2 for Caprifoliaceae.
Affiliation(s)
- Zhen Liu
- Key Laboratory of Traditional Chinese Medicine Resource and Compound Prescription, Ministry of Education, Hubei College of Traditional Chinese Medicine, Wuhan 430065, China
216
Park DH, Hong YK, Cho EH, Kim MS, Kim DC, Bang J, Kim J, Joo J. Light-emitting color barcode nanowires using polymers: nanoscale optical characteristics. ACS Nano 2010; 4:5155-5162. [PMID: 20707343] [DOI: 10.1021/nn101096m]
Abstract
We report on the light-emitting color barcode nanowires (LECB-NWs), which were fabricated by alternating the electrochemical polymerization of light-emitting polymers with various luminescence colors and efficiencies. The nanoscale photoluminescence characteristics of LECB-NWs were investigated using a laser confocal microscope with a high spatial resolution. The alternating light emissions of the LECB-NWs showed orange-yellow, red, and green colors due to the serial combination of poly(3-butylthiophene), poly(3-methylthiophene), and poly(3,4-ethylenedioxythiophene), respectively, with distinct luminescence intensities. The optical detection sensitivity and stability of LECB-NWs have been enhanced through a nanoscale Cu metal coating onto the NWs, based on surface plasmon resonance coupling and protection against oxidation. The flexibility of the LECB-NWs has been investigated through the folding and unfolding of the NWs by an applied nanotip impetus. The flexible LECB-NWs can be used as highly sensitive optical identification nanosystems for nanoscale or microscale products with complex physical shapes.
Affiliation(s)
- Dong Hyuk Park
- Department of Physics, Korea University, Seoul 136-713, Korea
217
Yang Q, Reisman CA, Wang Z, Fukuma Y, Hangai M, Yoshimura N, Tomidokoro A, Araie M, Raza AS, Hood DC, Chan K. Automated layer segmentation of macular OCT images using dual-scale gradient information. Opt Express 2010; 18:21293-307. [PMID: 20941025] [PMCID: PMC3101081] [DOI: 10.1364/oe.18.021293]
Abstract
A novel automated boundary segmentation algorithm is proposed for fast and reliable quantification of nine intra-retinal boundaries in optical coherence tomography (OCT) images. The algorithm employs a two-step segmentation schema based on gradient information in dual scales, utilizing local and complementary global gradient information simultaneously. A shortest path search is applied to optimize the edge selection. The segmentation algorithm was validated with independent manual segmentation and a reproducibility study. It demonstrates high accuracy and reproducibility in segmenting normal 3D OCT volumes. The execution time is about 16 seconds per volume (480x512x128 voxels). The algorithm shows potential for quantifying images from diseased retinas as well.
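The shortest-path idea can be sketched with a toy dynamic program (assumed cost values; the paper derives its costs from dual-scale gradient information and optimizes across B-scans):

```python
def segment_boundary(cost, max_jump=1):
    """Shortest-path boundary tracking by dynamic programming: choose
    one row per image column so the summed cost is minimal, with the
    row allowed to shift by at most max_jump between adjacent columns.
    A simplified stand-in for shortest-path edge selection; low cost
    would correspond to strong gradient (boundary) evidence."""
    rows, cols = len(cost), len(cost[0])
    INF = float("inf")
    dp = [[INF] * rows for _ in range(cols)]
    back = [[0] * rows for _ in range(cols)]
    for r in range(rows):
        dp[0][r] = cost[r][0]
    for c in range(1, cols):
        for r in range(rows):
            lo, hi = max(0, r - max_jump), min(rows, r + max_jump + 1)
            for pr in range(lo, hi):
                cand = dp[c - 1][pr] + cost[r][c]
                if cand < dp[c][r]:
                    dp[c][r] = cand
                    back[c][r] = pr
    # Trace back the cheapest path from the last column.
    r = min(range(rows), key=lambda row: dp[cols - 1][row])
    path = [r]
    for c in range(cols - 1, 0, -1):
        r = back[c][r]
        path.append(r)
    return path[::-1]

# Toy cost map: cheap cells run along row 1, then shift to row 2.
cost = [
    [9, 9, 9, 9, 9],
    [1, 1, 9, 9, 9],
    [9, 9, 1, 1, 1],
    [9, 9, 9, 9, 9],
]
print(segment_boundary(cost))  # [1, 1, 2, 2, 2]
```

The smoothness constraint (max_jump) is what keeps the recovered boundary from hopping between unrelated edges, mirroring the anatomical continuity of retinal layers.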
Affiliation(s)
- Qi Yang
- Topcon Advanced Biomedical Imaging Laboratory, Oakland, NJ, 07436, USA
- Zhenguo Wang
- Topcon Advanced Biomedical Imaging Laboratory, Oakland, NJ, 07436, USA
- Yasufumi Fukuma
- Topcon Advanced Biomedical Imaging Laboratory, Oakland, NJ, 07436, USA
- Masanori Hangai
- Department of Ophthalmology and Visual Sciences, Kyoto University, Kyoto, Japan
- Nagahisa Yoshimura
- Department of Ophthalmology and Visual Sciences, Kyoto University, Kyoto, Japan
- Atsuo Tomidokoro
- Department of Ophthalmology, University of Tokyo School of Medicine, Tokyo, Japan
- Makoto Araie
- Department of Ophthalmology, University of Tokyo School of Medicine, Tokyo, Japan
- Ali S. Raza
- Department of Psychology, Columbia University, New York, NY, 10027, USA
- Donald C. Hood
- Department of Psychology, Columbia University, New York, NY, 10027, USA
- Department of Ophthalmology, Columbia University, New York, NY, 10027, USA
- Kinpui Chan
- Topcon Advanced Biomedical Imaging Laboratory, Oakland, NJ, 07436, USA
218
Institute for Safe Medication Practices. Misidentification of alphanumeric symbols in both handwritten and computer-generated information. Alta RN 2010; 66:22-3. [PMID: 21213960]
219
Abstract
We report a novel technique for generating polymer fluorescent barcode nanorods by reactive ion etching of polymer multilayer films using nonclose-packed (ncp) colloidal microsphere arrays as masks. The fluorescent polymer multilayer films were spin-coated on a substrate, and ncp microsphere arrays were transferred onto these films. The exposed polymers were then etched away selectively, leaving color-encoded nanorods with well-preserved fluorescent properties. By modifying the spin-coating procedure, the amount of polymer in each layer could be tuned freely, which determined the relative fluorescence intensity of the barcode nanorods. These nanorod arrays can be detached from the substrate to form dispersions of coding materials. Moreover, the shape of the nanorods is controllable according to the different etching speeds of various materials, which also endows the nanorods with shape-encoded characters. This method offers opportunities for the fabrication of novel fluorescent barcodes which can be used for detecting and tracking applications.
Affiliation(s)
- Xiao Li
- State Key Laboratory of Supramolecular Structure and Materials, College of Chemistry, Jilin University, Changchun 130012, PR China
220
Abstract
Hybrid entangled states exhibit entanglement between different degrees of freedom of a particle pair and thus could be useful for asymmetric optical quantum networks, where the communication channels are characterized by different properties. We report the first experimental realization of hybrid polarization-orbital angular momentum (OAM) entangled states, adopting a spontaneous parametric down-conversion source of polarization-entangled states and a polarization-OAM transferrer. The generated quantum states have been characterized through quantum state tomography. Finally, the violation of Bell's inequalities with the hybrid two-photon system has been observed.
Affiliation(s)
- Eleonora Nagali
- Dipartimento di Fisica, Sapienza Università di Roma, Roma 00185, Italy
221
Chuntonov L, Amitay Z. Optical periodic code matching by single-shot broadband frequency-domain cross-correlation. Opt Express 2010; 18:17756-17763. [PMID: 20721163] [DOI: 10.1364/oe.18.017756]
Abstract
We introduce and experimentally demonstrate a simple and reliable optical technique for matching two periodic numerical sequences, based on a single-shot optical measurement of their broadband cross-correlation function in the frequency domain. Each sequence is optically encoded into the shape of a different broadband femtosecond pulse using pulse-shaping techniques. The two shaped pulses are mixed in a nonlinear medium together with an additional (amplitude-shaped) narrowband pulse. The spectrum of the resulting four-wave mixing signal is measured to provide the cross-correlation function of the two encoded sequences. For identical sequences it is the auto-correlation function that is measured, which also allows identification of the sequence period. The high contrast achieved here between the cross-correlation and auto-correlation functions makes it possible to determine with very high reliability whether the two encoded sequences are identical. The demonstrated technique might be employed in an optical implementation of a CDMA communication protocol.
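The underlying matching criterion can be sketched numerically (computed directly here rather than optically, for ±1-valued periodic codes):

```python
def circular_xcorr(a, b):
    """Circular cross-correlation of two equal-length periodic codes:
    c[k] = sum_i a[i] * b[(i + k) % n]."""
    n = len(a)
    return [sum(a[i] * b[(i + k) % n] for i in range(n)) for k in range(n)]

def matches(a, b):
    """Two +/-1 periodic codes match (up to a cyclic shift) exactly
    when the cross-correlation peak reaches the autocorrelation
    energy sum(a[i]^2); any mismatch lowers every correlation value."""
    energy = sum(x * x for x in a)
    return max(circular_xcorr(a, b)) == energy

code = [1, -1, 1, 1, -1, -1, 1, -1]
shifted = code[3:] + code[:3]          # same code, cyclically shifted
other = [1, 1, 1, 1, -1, -1, -1, -1]   # a different code
print(matches(code, shifted))  # True
print(matches(code, other))    # False
```

In the optical scheme this whole comparison happens in one shot: the four-wave-mixing spectrum directly carries the cross-correlation, and the contrast between its identical-code (autocorrelation) peak and mismatched-code values is what makes the decision reliable.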
Affiliation(s)
- Lev Chuntonov
- The Shirlee Jacobs Femtosecond Laser Research Laboratory, Schulich Faculty of Chemistry, Technion-Israel Institute of Technology, Haifa 32000, Israel.
222
Jain N, Huisman SR, Bimbard E, Lvovsky AI. A bridge between the single-photon and squeezed-vacuum states. Opt Express 2010; 18:18254-18259. [PMID: 20721217] [DOI: 10.1364/oe.18.018254]
Abstract
The two modes of the Einstein-Podolsky-Rosen quadrature entangled state generated by parametric down-conversion interfere on a beam splitter of variable splitting ratio. Detection of a photon in one of the beam splitter output channels heralds preparation of a signal state in the other, which is characterized using homodyne tomography. By controlling the beam splitting ratio, the signal state can be chosen anywhere between the single-photon and squeezed state.
Affiliation(s)
- Nitin Jain
- Institute for Quantum Information Science, University of Calgary, Calgary, Alberta T2N 1N4, Canada
223
Fawns T, McKenzie K. How to ensure e-portfolios are a valuable resource to students' learning. Nurs Times 2010; 106:21-23. [PMID: 20836478] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Subscribe] [Scholar Register] [Indexed: 05/29/2023]
Abstract
Online technology has become an increasingly important part of all types of education, including that of student nurses. This article outlines some factors that need to be considered when introducing e-portfolios into nurse education and looks at issues such as ease of use, ownership, privacy, supervision and assessment.
Affiliation(s)
- Tim Fawns
- School of Health in Social Science, University of Edinburgh
224
Diessl S, Verburg FA, Luster M, Reiners C. [E-learning in medicine with nuclear medicine as an example]. Nuklearmedizin 2010; 49:125-127. [PMID: 20683547] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/16/2010] [Accepted: 06/16/2010] [Indexed: 05/29/2023]
225
Cheng J, Wang W. [Development of a medical equipment support information system based on PDF portable document]. Zhongguo Yi Liao Qi Xie Za Zhi 2010; 34:273-275. [PMID: 21033114] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Subscribe] [Scholar Register] [Indexed: 05/30/2023]
Abstract
OBJECTIVE To integrate the medical engineering support workflow, in accordance with the hospital's organizational structure and management system, so that medical engineering data are collected effectively, accurately and comprehensively and kept in electronic archives. METHODS The workflow of medical equipment support work was analyzed, and all work processes were recorded in portable electronic documents. Using XML middleware technology and an SQL Server database, process management, data calculation, submission, storage and other functions were implemented. RESULTS Practical application shows that the medical equipment support information system optimizes the existing work process, making it standardized, digital, automatic, efficient, orderly and controllable. CONCLUSIONS A medical equipment support information system based on portable electronic documents can effectively optimize and improve hospital medical engineering support work, improve performance, reduce costs, and provide complete and accurate digital data.
Affiliation(s)
- Jiangbo Cheng
- Center of Medical Engineering, General Hospital of PLA, Beijing 100853.
226
Lan M, Su J. Empirical investigations into full-text protein interaction Article Categorization Task (ACT) in the BioCreative II.5 Challenge. IEEE/ACM Trans Comput Biol Bioinform 2010; 7:421-427. [PMID: 20671314 DOI: 10.1109/tcbb.2010.49] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/29/2023]
Abstract
The selection of protein interaction documents is an important application for biology research and has a direct impact on the quality of downstream BioNLP applications, e.g., information extraction and retrieval, summarization, and QA. The BioCreative II.5 Challenge Article Categorization Task (ACT) involves binary text classification to determine whether a given structured full-text article contains protein interaction information. This may be the first attempt at classifying full-text protein interaction documents in the wider community. In this paper, we compare and evaluate the effectiveness of different section types in full-text articles for text classification. Moreover, in practice, the small number of true-positive samples results in unstable performance and an unreliable classifier. Previous research on learning with skewed class distributions has altered the class distribution using oversampling and downsampling. We also investigate the skewed protein interaction classification problem and analyze the effect of various issues related to the choice of external sources, oversampling of training sets, classifiers, etc. We report on these factors to show that 1) a full-text biomedical article contains a wealth of scientific information, important to users, that may not be completely represented by abstracts and/or keywords, and that using it improves classification accuracy; and 2) reinforcing true-positive samples significantly increases the accuracy and stability of classification.
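The oversampling idea mentioned in this abstract, duplicating true-positive samples until the class distribution balances, can be sketched in a few lines. The function name, label convention, and assumption that positives are the minority class are illustrative, not taken from the paper.

```python
import random

def oversample_positives(samples, labels, seed=42):
    """Random oversampling: duplicate positive (label 1) samples,
    assumed to be the minority class, until classes are balanced."""
    rng = random.Random(seed)
    pos = [s for s, y in zip(samples, labels) if y == 1]
    neg = [s for s, y in zip(samples, labels) if y == 0]
    # draw duplicates uniformly at random from the minority class
    extra = [rng.choice(pos) for _ in range(len(neg) - len(pos))]
    balanced = pos + extra + neg
    new_labels = [1] * (len(pos) + len(extra)) + [0] * len(neg)
    return balanced, new_labels
```

A classifier trained on the balanced set sees positives as often as negatives, which is the stabilizing effect the abstract reports.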
Affiliation(s)
- Man Lan
- Institute for Infocomm Research, Connexis, Singapore.
227
Abstract
Currently, no official DNA barcode region is defined for the Fungi, and the COX1 gene DNA barcode is difficult to apply. The internal transcribed spacer (ITS) region has been suggested as a primary barcode candidate, but for arbuscular mycorrhizal fungi (AMF; Glomeromycota) the region is exceptionally variable and does not resolve closely related species. DNA barcoding analyses were performed with datasets from several phylogenetic lineages of the Glomeromycota. We tested a c. 1500 bp fragment spanning the small subunit (SSU), ITS region, and large subunit (LSU) nuclear ribosomal DNA for species-resolving power. Subfragments covering the complete ITS region, c. 800 bp of the LSU rDNA, and three c. 400 bp fragments spanning the ITS2, LSU-D1 or LSU-D2 domains were also analysed. Barcode gap analyses did not resolve all species, but neighbour-joining analyses, using Kimura two-parameter (K2P) distances, resolved all species when based on the 1500 bp fragment. The shorter fragments failed to separate closely related species. We recommend the complete 1500 bp fragment as a basis for AMF DNA barcoding. This will also allow future identification of AMF at species level based on 400 or 1000 bp amplicons in deep sequencing approaches.
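The Kimura two-parameter (K2P) distance used in the neighbour-joining analyses above has a standard closed form: with P the proportion of transition differences and Q the proportion of transversion differences between two aligned sequences, d = -0.5 ln(1 - 2P - Q) - 0.25 ln(1 - 2Q). A minimal sketch (function name is illustrative; gaps and ambiguous bases are simply skipped):

```python
import math

def k2p_distance(seq1, seq2):
    """Kimura two-parameter distance between two aligned DNA sequences.
    P = transition proportion (A<->G, C<->T); Q = transversion proportion.
    d = -0.5*ln(1 - 2P - Q) - 0.25*ln(1 - 2Q)
    """
    transitions = {frozenset("AG"), frozenset("CT")}
    n = p = q = 0
    for a, b in zip(seq1.upper(), seq2.upper()):
        if a not in "ACGT" or b not in "ACGT":
            continue  # skip gaps and ambiguous sites
        n += 1
        if a != b:
            if frozenset((a, b)) in transitions:
                p += 1
            else:
                q += 1
    P, Q = p / n, q / n
    return -0.5 * math.log(1 - 2 * P - Q) - 0.25 * math.log(1 - 2 * Q)
```

A matrix of such pairwise distances is the input to neighbour-joining tree construction.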
Affiliation(s)
- Herbert Stockinger
- LMU Munich, Department of Biology, Genetics, Grosshaderner Strasse 4, D-82152 Martinsried, Germany
- Manuela Krüger
- LMU Munich, Department of Biology, Genetics, Grosshaderner Strasse 4, D-82152 Martinsried, Germany
- Arthur Schüßler
- LMU Munich, Department of Biology, Genetics, Grosshaderner Strasse 4, D-82152 Martinsried, Germany
228
Hintermüller C, Marone F, Isenegger A, Stampanoni M. Image processing pipeline for synchrotron-radiation-based tomographic microscopy. J Synchrotron Radiat 2010; 17:550-559. [PMID: 20567088 DOI: 10.1107/s0909049510011830] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/25/2009] [Accepted: 03/29/2010] [Indexed: 05/29/2023]
Abstract
With synchrotron-radiation-based tomographic microscopy, three-dimensional structures down to the micrometer level can be visualized. Tomographic data sets typically consist of 1000 to 1500 projections of 1024 x 1024 to 2048 x 2048 pixels and are acquired in 5-15 min. A processing pipeline has been developed to handle this large amount of data efficiently and to reconstruct the tomographic volume within a few minutes after the end of a scan. Just a few seconds after the raw data have been acquired, a selection of reconstructed slices is accessible through a web interface for preview and for fine-tuning the reconstruction parameters. The same interface allows initiation and control of the reconstruction process on the computer cluster. By integrating all programs and tools required for tomographic reconstruction into the pipeline, the necessary user interaction is reduced to a minimum. The modularity of the pipeline allows functionality to be added for new scan protocols, such as an extended field of view, or for new physical signals such as phase-contrast or dark-field imaging.
229
Affiliation(s)
- Ursula Eberhardt
- CBS-KNAW Fungal Biodiversity Centre, PO Box 85167, NL-3508 AD Utrecht, the Netherlands (tel +31 (0)30 2 122 636)
230
Guan S, Burlingame AL. Data processing algorithms for analysis of high resolution MSMS spectra of peptides with complex patterns of posttranslational modifications. Mol Cell Proteomics 2010; 9:804-10. [PMID: 19955076 PMCID: PMC2871415 DOI: 10.1074/mcp.m900431-mcp200] [Citation(s) in RCA: 30] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/16/2009] [Revised: 11/18/2009] [Indexed: 11/06/2022] Open
Abstract
The emergence of efficient fragmentation methods such as electron capture dissociation (ECD) and electron transfer dissociation (ETD) provides the opportunity for detailed structural characterization of heavily covalently modified large peptides and small proteins such as intact histones. Even with effective gas phase ion isolation so that a single molecular precursor ion is selected, the MSMS spectrum of a heavily modified peptide may reveal the presence of a mixture of peptides with the same amino acid sequence and the same total number of posttranslational modification (PTM) moieties (same PTM composition) but with different PTM configurations or site-specific occupancy isoforms. Currently available data analysis methods depend on a deisotoping procedure, which becomes less effective when spectra (fragmentation patterns) contain many overlapping isotopic distributions. Peptide database search engines can only identify the most abundant PTM configuration (PTM arrangement on different residues) in such mixtures. To identify all the PTM configurations present in these mixtures and to estimate their relative abundances, we extended our fragment assignment by visual assistance program to search for ions representing all possible configurations, subjected to the total PTM composition constraint. This resulted in the identification of PTM configurations supported by unique fragment ions, and their relative abundances were estimated by use of a non-negative least squares procedure.
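The final step described above, estimating relative abundances of PTM configurations by non-negative least squares, fits observed fragment intensities as a non-negative mixture of per-configuration fragment patterns. The sketch below substitutes a simple projected-gradient loop for a full NNLS solver; the function name, matrix shapes, and solver choice are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def estimate_abundances(F, s, iters=5000):
    """Estimate non-negative configuration abundances x from F @ x ≈ s
    via projected gradient descent (a simple stand-in for NNLS).
    F: (n_fragments, n_configs) predicted fragment-ion matrix
    s: (n_fragments,) observed fragment intensities
    """
    F = np.asarray(F, float)
    s = np.asarray(s, float)
    lr = 1.0 / np.linalg.norm(F.T @ F, 2)  # step size from spectral norm
    x = np.zeros(F.shape[1])
    for _ in range(iters):
        # gradient step on ||F x - s||^2, then project onto x >= 0
        x = np.maximum(0.0, x - lr * (F.T @ (F @ x - s)))
    return x
```

Columns of F would hold the fragment ions unique to (or shared by) each candidate PTM configuration, so the fitted x gives their relative abundances.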
Affiliation(s)
- Shenheng Guan
- Department of Pharmaceutical Chemistry and Mass Spectrometry Facility, University of California, San Francisco, California 94143-0446, USA.
231
Rodríguez-Carreño I, Gila-Useros L, Malanda-Trigueros A, Gurtubay IG, Navallas-Irujo J, Rodríguez-Falces J. Application of a novel automatic duration method measurement based on the wavelet transform on pathological motor unit action potentials. Clin Neurophysiol 2010; 121:1574-1583. [PMID: 20427231 DOI: 10.1016/j.clinph.2010.03.028] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/27/2009] [Revised: 03/10/2010] [Accepted: 03/23/2010] [Indexed: 11/19/2022]
Abstract
OBJECTIVE To evaluate a recently published automatic duration method based on the wavelet transform applied on normal and pathological motor unit action potentials (MUAPs). METHODS We analyzed 313 EMG recordings from normal and pathological muscles during slight contractions. After the extraction procedure, 339 potentials were accepted for analysis: 68 from normal muscles, 124 from myopathic muscles, 20 from chronic neurogenic muscles, 83 from subacute neurogenic muscles and also 44 fibrillation potentials, as an example of very low duration muscular potentials. A "gold standard" of the duration positions (GSP) was obtained for each MUAP from the manual measurements of two senior electromyographists. The results of the novel method were compared to five well-known conventional automatic methods (CAMs). To compare the six methods, the differences between the automatic marker positions and the GSP for the start and end markers were calculated. Then, for the different groups of normal and pathological MUAPs, we applied: a one-factor ANOVA to compare their relative mean differences, the estimated mean square error (EMSE) and a Chi-square test about the rate of automatic marker placements with differences to the GSP greater than 5 ms, taken as gross errors. RESULTS The mean and the standard deviation of the differences, the EMSE and the gross errors for the novel method were smaller than those observed with the CAMs in the five different MUAP groups and significantly different in most of the cases. CONCLUSIONS The novel automatic duration method is more accurate than other available algorithms in normal and pathological MUAPs. SIGNIFICANCE Accurate MUAP duration automatic measurement is an important issue in daily clinical practice.
Affiliation(s)
- Luis Gila-Useros
- Hospital Virgen del Camino, Department of Clinical Neurophysiology, Pamplona, Spain
- Armando Malanda-Trigueros
- Universidad Pública de Navarra, Department of Electrical and Electronic Engineering, Pamplona, Spain
- I G Gurtubay
- Hospital Virgen del Camino, Department of Clinical Neurophysiology, Pamplona, Spain
- Javier Navallas-Irujo
- Universidad Pública de Navarra, Department of Electrical and Electronic Engineering, Pamplona, Spain
- Javier Rodríguez-Falces
- Universidad Pública de Navarra, Department of Electrical and Electronic Engineering, Pamplona, Spain
232
Kravchenko AM, Anokhin AM. [Automated biomedical complexes for rapid temperature diagnosis]. Med Tekh 2010:21-26. [PMID: 20458982] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [MESH Headings] [Subscribe] [Scholar Register] [Indexed: 05/29/2023]
233
Abstract
OBJECTIVES Good Medical Practice, produced by the General Medical Council (GMC), represents the professional code of practice for doctors in the UK. It is regularly updated to reflect the changing relationships of the profession with both patients and society in general. The discourse within this guidance appears to have shifted over time to a stance that aligns itself more closely with the protection of patients, rather than with the traditional professional provision of guidance and support to doctors. METHODS Tag clouds are a feature of the latest applications of the World Wide Web, commonly known as Web 2.0. They can be used to rapidly analyse textual data, revealing textual messages in a pictorial form. Tag cloud-generating software was used to produce tag clouds of four texts illustrating GMC guidance produced between 1963 and 2006 to aid textual analysis and to determine whether this methodology could pick up this change in tenor. RESULTS This analysis supports the view that there has been a shift from a doctor-centred regulatory discourse to a patient-centred health improvement agenda over the period of time examined. DISCUSSION Whether this documentation voices a deprofessionalisation agenda or simply mirrors a period of societal change in general is discussed. The changing discourse around professionalism highlights the need to avoid adopting 'nostalgic' notions of professionalism in educating doctors and reinforces where the GMC sees we should focus our education efforts: in patient-centred care and team-working. The demands on, and responsibilities of students and juniors entering the profession have been fundamentally altered and our teaching, particularly in the domain of professionalism, needs to reflect this. Tag clouds provide an interesting and innovative way of analysing text and revealing obscured discourses, and their potential in education and research is worthy of further exploration.
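The tag-cloud method used above amounts to weighting terms by frequency and displaying the heaviest ones. A minimal sketch of that weighting step (the function name and the tiny stop-word list are illustrative, not the software the authors used):

```python
import re
from collections import Counter

def tag_cloud_weights(text, top_n=5):
    """Weight terms for a tag cloud by simple frequency counting,
    after dropping a small illustrative stop-word list."""
    stop = {"the", "of", "and", "to", "a", "in", "is", "that", "for", "with"}
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in stop)
    return counts.most_common(top_n)
```

Rendering each term at a font size proportional to its weight yields the pictorial view that made the doctor-centred vs patient-centred shift visible across the four guidance texts.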
Affiliation(s)
- Deborah Gill
- Academic Centre for Medical Education, UCL Division of Medical Education, UCL Medical School, London N19 5LW, UK.
234
Cross DJ, Michaluk R, Gilmore R. Biological algorithm for data reconstruction. Phys Rev E Stat Nonlin Soft Matter Phys 2010; 81:036217. [PMID: 20365842 DOI: 10.1103/physreve.81.036217] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/05/2009] [Revised: 01/08/2010] [Indexed: 05/29/2023]
Abstract
An algorithm inspired by genome sequencing is proposed which "reconstructs" a single long trajectory of a dynamical system from many short trajectories. This procedure is useful when many data sets are available but each is too short for meaningful analysis on its own. The algorithm is applied to the Rössler and Lorenz dynamical systems as well as to experimental data from the Belousov-Zhabotinskii chemical reaction. Topological information was reliably extracted from each system, and geometrical and dynamical measures were computed.
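The assembly-style idea can be illustrated with a toy overlap-merging rule: chain segments whose leading points exactly match the tail of the growing trajectory. Real state-space data would need tolerance-based matching; this exact-match sketch (names and the overlap length k are illustrative) only conveys the principle.

```python
def stitch_trajectories(segments, k=2):
    """Greedily chain short trajectories: append a segment whose first k
    points exactly match the last k points of the growing trajectory
    (an overlap rule loosely analogous to genome-assembly merging)."""
    remaining = list(segments)
    traj = list(remaining.pop(0))
    merged = True
    while remaining and merged:
        merged = False
        for i, seg in enumerate(remaining):
            if list(seg[:k]) == traj[-k:]:
                traj.extend(seg[k:])  # append the non-overlapping tail
                remaining.pop(i)
                merged = True
                break
    return traj
```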
Affiliation(s)
- Daniel J Cross
- Physics Department, Drexel University, Philadelphia, Pennsylvania 19104, USA
235
D'Onofrio DJ, An G. A comparative approach for the investigation of biological information processing: an examination of the structure and function of computer hard drives and DNA. Theor Biol Med Model 2010; 7:3. [PMID: 20092652 PMCID: PMC2829000 DOI: 10.1186/1742-4682-7-3] [Citation(s) in RCA: 13] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/13/2008] [Accepted: 01/21/2010] [Indexed: 11/14/2022] Open
Abstract
BACKGROUND The robust storage, updating and utilization of information are necessary for the maintenance and perpetuation of dynamic systems. These systems can exist as constructs of metal-oxide semiconductors and silicon, as in a digital computer, or in the "wetware" of organic compounds, proteins and nucleic acids that make up biological organisms. We propose that there are essential functional properties of centralized information-processing systems; for digital computers these properties reside in the computer's hard drive, and for eukaryotic cells they are manifest in the DNA and associated structures. METHODS Presented herein is a descriptive framework that compares DNA and its associated proteins and sub-nuclear structure with the structure and function of the computer hard drive. We identify four essential properties of information for a centralized storage and processing system: (1) orthogonal uniqueness, (2) low level formatting, (3) high level formatting and (4) translation of stored to usable form. The corresponding aspects of the DNA complex and a computer hard drive are categorized using this classification. This is intended to demonstrate a functional equivalence between the components of the two systems, and thus the systems themselves. RESULTS Both the DNA complex and the computer hard drive contain components that fulfill the essential properties of a centralized information storage and processing system. The functional equivalence of these components provides insight into both the design process of engineered systems and the evolved solutions addressing similar system requirements. However, there are points where the comparison breaks down, particularly when there are externally imposed information-organizing structures on the computer hard drive. A specific example of this is the imposition of the File Allocation Table (FAT) during high level formatting of the computer hard drive and the subsequent loading of an operating system (OS). 
Biological systems do not have an external source for a map of their stored information or for an operational instruction set; rather, they must contain an organizational template conserved within their intra-nuclear architecture that "manipulates" the laws of chemistry and physics into a highly robust instruction set. We propose that the epigenetic structure of the intra-nuclear environment and the non-coding RNA may play the roles of a Biological File Allocation Table (BFAT) and biological operating system (Bio-OS) in eukaryotic cells. CONCLUSIONS The comparison of functional and structural characteristics of the DNA complex and the computer hard drive leads to a new descriptive paradigm that identifies the DNA as a dynamic storage system of biological information. This system is embodied in an autonomous operating system that inductively follows organizational structures, data hierarchy and executable operations that are well understood in the computer science industry. Characterizing the "DNA hard drive" in this fashion can lead to insights arising from discrepancies in the descriptive framework, particularly with respect to positing the role of epigenetic processes in an information-processing context. Further expansions arising from this comparison include the view of cells as parallel computing machines and a new approach towards characterizing cellular control systems.
Affiliation(s)
- David J D'Onofrio
- College of Arts and Science, Math Department, University of Phoenix, 5480 Corporate Drive, Suite 240, Troy, Michigan, 48098, USA
- Control Systems Modeling and Simulation, General Dynamics, 38500 Mound Rd, Sterling Heights, MI,48310, USA
- Gary An
- Department of Surgery, Northwestern University Feinberg School of Medicine, 676 North St Clair, Suite 650, Chicago, IL 60611, USA
236
Berger D, Borgelt C, Louis S, Morrison A, Grün S. Efficient identification of assembly neurons within massively parallel spike trains. Comput Intell Neurosci 2010; 2010:439648. [PMID: 19809521 PMCID: PMC2754663 DOI: 10.1155/2010/439648] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 03/09/2009] [Accepted: 06/24/2009] [Indexed: 11/26/2022]
Abstract
The chance of detecting assembly activity is expected to increase if the spiking activities of large numbers of neurons are recorded simultaneously. Although such massively parallel recordings are now becoming available, methods able to analyze such data for spike correlation are still rare, as a combinatorial explosion often makes it infeasible to extend methods developed for smaller data sets. By evaluating pattern complexity distributions the existence of correlated groups can be detected, but their member neurons cannot be identified. In this contribution, we present approaches to actually identify the individual neurons involved in assemblies. Our results may complement other methods and also provide a way to reduce data sets to the "relevant" neurons, thus allowing us to carry out a refined analysis of the detailed correlation structure due to reduced computation time.
Affiliation(s)
- Denise Berger
- Bernstein Center for Computational Neuroscience, Humboldt-Universität zu Berlin, 10115 Berlin, Germany
- Neuroinformatics, Institute of Biology, Department of Biology, Chemistry, and Pharmacy, Freie Universität, 14195 Berlin, Germany
- Christian Borgelt
- European Centre for Soft Computing, 33600 Mieres, Asturias, Spain
- RIKEN Brain Science Institute, Wako-shi 351-0198, Japan
- Sonja Grün
- Bernstein Center for Computational Neuroscience, Humboldt-Universität zu Berlin, 10115 Berlin, Germany
- RIKEN Brain Science Institute, Wako-shi 351-0198, Japan
237
Spidlen J, Moore W, Parks D, Goldberg M, Bray C, Bierre P, Gorombey P, Hyun B, Hubbard M, Lange S, Lefebvre R, Leif R, Novo D, Ostruszka L, Treister A, Wood J, Murphy RF, Roederer M, Sudar D, Zigon R, Brinkman RR. Data File Standard for Flow Cytometry, version FCS 3.1. Cytometry A 2010; 77:97-100. [PMID: 19937951 PMCID: PMC2892967 DOI: 10.1002/cyto.a.20825] [Citation(s) in RCA: 30] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/07/2022]
Abstract
The flow cytometry data file standard provides the specifications needed to completely describe flow cytometry data sets within the confines of the file containing the experimental data. In 1984, the first Flow Cytometry Standard format for data files was adopted as FCS 1.0. This standard was modified in 1990 as FCS 2.0 and again in 1997 as FCS 3.0. We report here on the next generation flow cytometry standard data file format. FCS 3.1 is a minor revision based on suggested improvements from the community. The unchanged goal of the standard is to provide a uniform file format that allows files created by one type of acquisition hardware and software to be analyzed by any other type. The FCS 3.1 standard retains the basic FCS file structure and most features of previous versions of the standard. Changes included in FCS 3.1 address potential ambiguities in the previous versions and provide a more robust standard. The major changes include simplified support for international characters and improved support for storing compensation. The major additions are support for preferred display scale, a standardized way of capturing the sample volume, information about originality of the data file, and support for plate and well identification in high-throughput, plate-based experiments. Please see the normative version of the FCS 3.1 specification in Supporting Information for this manuscript (or at http://www.isac-net.org/ in the Current standards section) for a complete list of changes.
Affiliation(s)
- Josef Spidlen
- Terry Fox Laboratory, BC Cancer Agency, Vancouver, BC, Canada
- Wayne Moore
- Genetics Department, Stanford University School of Medicine, Stanford, CA, USA
- David Parks
- Stanford Shared FACS Facility, Stanford University, Stanford, CA, USA
- Bill Hyun
- Laboratory for Cell Analysis, Helen Diller Family Comprehensive Cancer Center, University of California, San Francisco, CA, USA
- James Wood
- Department of Cancer Biology, Wake Forest University School of Medicine, Winston-Salem, NC, USA
- Damir Sudar
- Lawrence Berkeley Laboratory, Berkeley, CA, USA
238
Shao H, Zhang L, Lv HF, Zhou JY, Hu ZB, Li WK. [The utility of trnH-psbA gene for phylogenetic analysis of Huperziaceae and plant barcoding]. Zhong Yao Cai 2010; 33:18-21. [PMID: 20518297] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Subscribe] [Scholar Register] [Indexed: 05/29/2023]
Abstract
OBJECTIVE To investigate the phylogenetic relationships and DNA barcoding of 17 species from Huperziaceae. METHODS A phylogenetic tree of the chloroplast trnH-psbA gene of the 17 Huperziaceae species was constructed by software. RESULTS Huperziaceae could be divided into the two genera Huperzia and Phlegmariurus, with a bootstrap value of 91%. CONCLUSIONS Holub and Qing's taxonomy is supported, the 17 Huperziaceae species form monophyletic groups, and trnH-psbA could be used as a DNA barcode to identify these plants.
Affiliation(s)
- Hao Shao
- Institute of Chinese Materia Medica, Shanghai University of Traditional Chinese Medicine, Shanghai 201203, China
239
Wang J, Kim JU, Shu L, Niu Y, Lee S. A Distance-based Energy Aware Routing algorithm for wireless sensor networks. Sensors (Basel) 2010; 10:9493-511. [PMID: 22163422 PMCID: PMC3230965 DOI: 10.3390/s101009493] [Citation(s) in RCA: 30] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 08/04/2010] [Revised: 09/20/2010] [Accepted: 10/19/2010] [Indexed: 11/16/2022]
Abstract
Energy efficiency and energy balancing are primary challenges for wireless sensor networks (WSNs), since the tiny sensor nodes cannot easily be recharged once they are deployed. Many energy-efficient routing algorithms and protocols have been proposed, using techniques such as clustering, data aggregation and location tracking. However, many of them aim to minimize parameters such as total energy consumption or latency, which causes hotspot nodes and a partitioned network due to the overuse of certain nodes. In this paper, a Distance-based Energy Aware Routing (DEAR) algorithm is proposed to ensure energy efficiency and energy balancing, based on theoretical analysis of different energy and traffic models. During the routing process, we consider individual distance as the primary parameter in order to adjust and equalize the energy consumption among the sensors involved. Residual energy is also considered as a secondary factor. In this way, all intermediate nodes consume their energy at a similar rate, which maximizes network lifetime. Simulation results show that the DEAR algorithm can reduce and balance the energy consumption of all sensor nodes, so that network lifetime is greatly prolonged compared with other routing algorithms.
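The distance-primary, residual-energy-secondary idea can be sketched as a next-hop selection rule. The cost form, the path-loss exponent alpha, and the weighting w below are illustrative assumptions for a sketch, not the paper's exact formulation.

```python
import math

def pick_next_hop(current, candidates, sink, residual, alpha=2.0, w=0.5):
    """Choose a forwarding node: transmission distance is the primary
    cost, and a candidate's residual energy (in [0, 1]) discounts it,
    steering traffic away from nearly depleted nodes."""
    def d(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def cost(n):
        # distance-dependent radio energy to reach n, plus remaining
        # progress toward the sink, divided down for well-charged nodes
        return (d(current, n) ** alpha + d(n, sink) ** alpha) / (w + (1 - w) * residual[n])

    return min(candidates, key=cost)
```

Applied hop by hop, a rule like this spreads load so that intermediate nodes drain at similar rates, which is the balancing effect the abstract reports.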
Affiliation(s)
- Jin Wang
- Department of Computer Engineering, Kyung Hee University, Suwon, Korea
- Jeong-Uk Kim
- Department of Energy Grid, Sang Myung University, Seoul, Korea
- Lei Shu
- Department of Multimedia Engineering, Osaka University, Japan
- Yu Niu
- Department of Computer Engineering, Kyung Hee University, Suwon, Korea
- Sungyoung Lee
- Department of Computer Engineering, Kyung Hee University, Suwon, Korea
- Author to whom correspondence should be addressed; Tel.: +82-31-201-2514; Fax: +82-31-202-2520
240
Abstract
Dielectrophoresis (DEP) is a technique which offers label-free measurement of cell electrophysiology by monitoring its movement in non-uniform electric fields. In this chapter, the theory underlying DEP is explored, as are the implications of the development of equipment for taking such measurements. Practical considerations such as the selection of a suspending medium are also discussed.
Affiliation(s)
- Kai F Hoettges
- Centre for Biomedical Engineering, University of Surrey, Guilford, Surrey, UK
241
Fairley J, Johnson AN, Georgoulas G, Vachtsevanos G. Automated polysomnogram artifact compensation using the generalized singular value decomposition algorithm. Annu Int Conf IEEE Eng Med Biol Soc 2010; 2010:5097-5100. [PMID: 21096035 DOI: 10.1109/iembs.2010.5626213] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/30/2023]
Abstract
Manual/visual polysomnogram (psg) analysis is a standard and commonly implemented procedure utilized in the diagnosis and treatment of sleep related human pathologies. Current technological trends in psg analysis focus upon translating manual psg analysis into automated/computerized approaches. A necessary first step in establishing efficient automated human sleep analysis systems is the development of reliable pre-processing tools to discriminate between outlier/artifact instances and data of interest. This paper investigates the application of an automated approach, using the generalized singular value decomposition algorithm, to compensate for specific psg artifacts.
Affiliation(s)
- Jacqueline Fairley
- NINDS postdoctoral fellow at Emory University School of Medicine Department of Neurology, Atlanta, GA 30322, USA.
242
Abstract
BACKGROUND: The acceptance of closed-loop blood glucose (BG) control using continuous glucose monitoring systems (CGMS) is likely to improve with enhanced performance of their integral hypoglycemia alarms. This article presents an in silico analysis (based on clinical data) of a modeled CGMS alarm system with trained thresholds on type 1 diabetes mellitus (T1DM) patients that is augmented by sensor fusion from a prototype hypoglycemia alarm system (HypoMon). This prototype alarm system is based on largely independent autonomic nervous system (ANS) response features.
METHODS: Alarm performance was modeled using overnight BG profiles recorded previously on 98 T1DM volunteers. These data included the corresponding ANS response features detected by HypoMon (AiMedics Pty. Ltd.) systems. CGMS data and alarms were simulated by applying a probabilistic model to these overnight BG profiles. The probabilistic model used a mean response delay of 7.1 minutes, measurement error offsets on each sample of +/- standard deviation (SD) = 4.5 mg/dl (0.25 mmol/liter), and vertical shifts (calibration offsets) of +/- SD = 19.8 mg/dl (1.1 mmol/liter). Modeling produced 90 to 100 simulated measurements per patient. Alarm systems for all analyses were optimized on a training set of 46 patients and evaluated on the test set of 56 patients. The split between the sets was based on enrollment dates. Optimization was based on detection accuracy but not time to detection for these analyses. The contribution of this form of data fusion to hypoglycemia alarm performance was evaluated by comparing the performance of the trained CGMS and fused data algorithms on the test set under the same evaluation conditions.
RESULTS: The simulated addition of HypoMon data produced an improvement in CGMS hypoglycemia alarm performance of 10% at equal specificity. Sensitivity improved from 87% (CGMS as stand-alone measurement) to 97% for the enhanced alarm system. Specificity was maintained constant at 85%. Positive predictive values on the test set improved from 61 to 66%, with negative predictive values improving from 96 to 99%. These enhancements were stable within sensitivity analyses. Sensitivity analyses also suggested larger performance increases at lower CGMS alarm performance levels.
CONCLUSION: Autonomic nervous system response features provide complementary information suitable for fusion with CGMS data to enhance nocturnal hypoglycemia alarms.
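The four performance figures compared above follow from a standard confusion-matrix computation; a minimal sketch (the counts below are made up for illustration, not the study's data):

```python
def alarm_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity, PPV and NPV from confusion-matrix
    counts: the four figures used to compare the stand-alone CGMS
    alarm with the fused CGMS + ANS alarm."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# hypothetical nights: 10 with hypoglycemia, 100 without
m = alarm_metrics(tp=9, fn=1, tn=85, fp=15)
```

Note how a modest PPV can coexist with high sensitivity and NPV when events are rare, which is the regime nocturnal hypoglycemia alarms operate in.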
243
Lillo-Le Louët A, Toussaint Y, Villerd J. A qualitative approach to signal mining in pharmacovigilance using formal concept analysis. Stud Health Technol Inform 2010; 160:969-973. [PMID: 20841828]
Abstract
"Pharmacovigilance is the process and science of monitoring the safety of medicines, consisting in (i) collecting and managing data on the safety of medicines (ii) looking at the data to detect 'signals' (any new or changing safety issue)" [1]. Pharmacovigilance is mainly based on spontaneous reports: when suspecting an adverse drug reaction, health care practitioners send a report to a spontaneous reporting system (SRS). This produces huge databases containing numerous reports and their manual exploration is both cost and time prohibitive. Existing techniques that automatically extract relevant signals rely on statistics or Bayesian models but do not provide information to the experts about possible biases lying in the data, nor about the specificity of a signal to a particular patient profile. Our extraction method combines numerical methods from the state of the art with a qualitative approach that helps interpretation. We build a synthetic representation of the database that is used to (i) identify unexpected patterns and biases (ii) extract potentially relevant signals w.r.t. patient profiles (iii) provide traceability facilities between extracted signals and raw data.
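As a toy illustration of the formal-concept machinery behind this approach (the report data below are invented), each formal concept pairs a maximal set of reports (extent) with exactly the attributes they all share (intent):

```python
from itertools import combinations

def formal_concepts(context):
    """Enumerate all formal concepts (extent, intent) of a small binary
    context {object: set_of_attributes} by closing every attribute
    subset; fine for toy contexts, exponential in general."""
    attrs = sorted(set().union(*context.values()))
    found = set()
    for r in range(len(attrs) + 1):
        for subset in combinations(attrs, r):
            extent = frozenset(o for o, a in context.items()
                               if set(subset) <= a)
            intent = set(attrs)
            for o in extent:
                intent &= context[o]
            found.add((extent, frozenset(intent)))
    return found

# toy spontaneous reports: suspected drugs (D:) and reactions (R:)
reports = {
    "r1": {"D:amoxicillin", "R:rash"},
    "r2": {"D:amoxicillin", "R:rash", "R:fever"},
    "r3": {"D:ibuprofen", "R:rash"},
}
concepts = formal_concepts(reports)
```

A concept such as ({r1, r2}, {D:amoxicillin, R:rash}) is the kind of drug-reaction grouping, traceable back to its raw reports, that the qualitative layer presents to experts.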
Affiliation(s)
- Agnès Lillo-Le Louët
- Centre régional de pharmacovigilance, Hôpital européen Georges Pompidou, Paris, France
244
Abstract
BACKGROUND: In 2001, the blood collection agencies of Zhejiang Province adopted a unified information system, but because unit numbers were not unique across facilities, numbers could collide when a blood product was transferred from the collection and processing facility's inventory to another facility's inventory. Exchanging blood units between agencies therefore required relabeling, which was cumbersome. The SARS outbreak in 2003 made this problem more prominent. Therefore, we decided to develop a new information system based on ISBT 128 to solve this problem.
STUDY DESIGN AND METHODS: We registered with ICCBBA and obtained the technical documentation and database. We requested the new product description code from ICCBBA and properly adjusted the incompatibility between ISBT 128 and the national standards. We redesigned the information system and labels and validated related equipment and the laboratory information system.
RESULTS: In May 2005, we completed the development of the new information system based on ISBT 128, and by the end of 2005, all the blood collection agencies of Zhejiang Province had begun to use ISBT 128. As a result, product information can now be shared among the blood collection agencies of Zhejiang Province, and the exchange of blood products between agencies became safe and rapid. The system was highly praised by the nation's Ministry of Health.
CONCLUSIONS: ISBT 128 was successfully implemented in Zhejiang Province.
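The global uniqueness ISBT 128 provides rests on registered facility codes plus a check character: the Donation Identification Number carries an ISO/IEC 7064 MOD 37-2 check character. A sketch of that scheme follows; the DIN used here is made up, and its facility code is not a registered one.

```python
ALPHABET = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ*"

def _mod37_2(s):
    """Running ISO/IEC 7064 MOD 37-2 checksum over a string."""
    c = 0
    for ch in s:
        c = (2 * c + ALPHABET.index(ch)) % 37
    return c

def check_char(din):
    """Check character appended to an ISBT 128 Donation Identification
    Number so that the full string validates under MOD 37-2."""
    return ALPHABET[(1 - 2 * _mod37_2(din)) % 37]

def is_valid(din_with_check):
    # a correctly labeled DIN checksums to exactly 1
    return _mod37_2(din_with_check) == 1

# illustrative 13-character DIN: facility code + 2-digit year + serial
din = "W123409123456"
labeled = din + check_char(din)
```

Because any single-character substitution changes the checksum, a scanner can reject a mislabeled or misread unit before it enters another facility's inventory.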
Affiliation(s)
- Changhong Kong
- Blood Center of Zhejiang Province, Key Laboratory of Blood Safety Research, Ministry of Health, Hangzhou, Zhejiang, China.
245
Bell MC, Robuck PR, Wright EC, Mihova MS, Hofmann C, De Santo JL, Milstein SL, Richtmyer PA, Shelton JL, Cormier M, King DL, Park CJ, Molchen WA, Park Y, Kelley M. Automated summaries of serious adverse events in the hepatitis C antiviral long-term treatment against cirrhosis trial. Clin Trials 2009; 6:618-27. [PMID: 19889888] [PMCID: PMC3753781] [DOI: 10.1177/1740774509348525]
Abstract
BACKGROUND: Even though adverse event (AE) collection and official accounting are mandatory for clinical trials, there are limited detailed guidelines specifying how to summarize an event for reporting in a timely and expeditious manner. This article details the AE and serious adverse event (SAE) reporting summary developed for a large multi-center National Institutes of Health (NIH)-sponsored clinical trial.
PURPOSE: To review and analyze the large volume of AE data reported by 10 sites (806 SAEs and 19,034 AEs from August 2000 to May 2007), the automated SAE summary was developed. It was designed to ensure timeliness and clarity in the complex process of AE review and reporting.
METHODS: The AE and SAE case report forms (CRFs), as well as the automated SAE summary, were developed within a database management system built by the Data Coordinating Center (DCC), which allowed web-based data entry at the DCC and the 10 sites and offered immediate overall and site-specific reports accessible to the DCC, site, and NIH project staff.
RESULTS: The automated SAE summary pulled data from multiple CRFs to create a succinct and informative summary and allowed prompt and easy reporting to the regulatory agencies. The summary was adaptable to the needs of reviewers because of the availability of multiple search options.
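The pull-and-summarize step described above can be pictured as a join over several CRF records keyed to one event. This sketch uses invented field names, not the trial's actual forms:

```python
def sae_summary(event, meds, labs):
    """Assemble one narrative SAE summary from fields spread over
    several CRF records (event form, concomitant-medication log,
    laboratory results)."""
    lines = [
        f"SAE {event['id']}: {event['term']} (onset {event['onset']})",
        f"Severity: {event['severity']}; drug-related: {event['related']}",
        "Concomitant meds: " + ", ".join(m["name"] for m in meds),
        "Relevant labs: " + "; ".join(f"{x['test']}={x['value']}" for x in labs),
    ]
    return "\n".join(lines)

report = sae_summary(
    event={"id": "0042", "term": "anemia", "onset": "2004-03-01",
           "severity": "grade 3", "related": "possibly"},
    meds=[{"name": "ribavirin"}, {"name": "peginterferon"}],
    labs=[{"test": "Hgb", "value": "7.9 g/dL"}],
)
```

Generating the narrative from the CRF database, rather than retyping it, is what keeps the summary consistent with the source data and fast enough for expedited reporting.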
246
Nkoy FL, Wolfe D, Hales JW, Lattin G, Rackham M, Maloney CG. Enhancing an existing clinical information system to improve study recruitment and census gathering efficiency. AMIA Annu Symp Proc 2009; 2009:476-480. [PMID: 20351902] [PMCID: PMC2815472]
Abstract
Information technology can improve healthcare efficiency. We developed and implemented a simple and inexpensive tool, the "Automated Case Finding and Alerting System" (ACAS), using data from an existing clinical information system to facilitate identification of potentially eligible patients for clinical trials and of patient encounters for billing purposes. We validated the ACAS by calculating the level of agreement in patient identification with data generated from manual identification methods. There was substantial agreement between the two methods for both clinical trial (kappa: 0.84) and billing (kappa: 0.97) identification. Automated identification occurred instantaneously vs. about 2 hours/day for clinical trial and 1 hour 10 minutes/day for billing, and was inexpensive ($98.95, one-time fee) compared to manual identification ($1,200/month for clinical trial and $670/month for billing). Automated identification was more efficient and cost-effective than manual identification methods. Repurposing clinical information beyond its traditional use has the potential to improve efficiency and decrease healthcare costs.
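The agreement statistic quoted (kappa) corrects raw percent agreement for agreement expected by chance; a minimal sketch for two binary raters:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two binary raters, e.g. ACAS vs manual
    identification of eligible patients (1 = flagged, 0 = not)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    pa, pb = sum(rater_a) / n, sum(rater_b) / n
    expected = pa * pb + (1 - pa) * (1 - pb)   # chance agreement
    return (observed - expected) / (1 - expected)

perfect = cohens_kappa([1, 1, 0, 0], [1, 1, 0, 0])   # identical ratings
chancey = cohens_kappa([1, 1, 0, 0], [1, 0, 1, 0])   # no better than chance
```

Kappa of 1.0 means perfect agreement and 0.0 means chance-level agreement, which puts the reported 0.84 and 0.97 in context.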
Affiliation(s)
- Flory L Nkoy
- University of Utah School of Medicine, Salt Lake City, UT and
247
Abstract
As one of the most pervasive methods of individual identification and document authentication, signatures present convincing evidence and provide an important form of indexing for effective document image processing and retrieval in a broad range of applications. However, detection and segmentation of free-form objects such as signatures from a cluttered background is currently an open document analysis problem. In this paper, we focus on two fundamental problems in signature-based document image retrieval. First, we propose a novel multiscale approach to jointly detecting and segmenting signatures from document images. Rather than focusing on local features that typically have large variations, our approach captures the structural saliency using a signature production model and computes the dynamic curvature of 2D contour fragments over multiple scales. This detection framework is general and computationally tractable. Second, we treat the problem of signature retrieval in the unconstrained setting of translation, scale, and rotation invariant nonrigid shape matching. We propose two novel measures of shape dissimilarity based on anisotropic scaling and registration residual error and present a supervised learning framework for combining complementary shape information from different dissimilarity metrics using LDA. We quantitatively study state-of-the-art shape representations, shape matching algorithms, measures of dissimilarity, and the use of multiple instances as query in document image retrieval. We further demonstrate our matching techniques in offline signature verification. Extensive experiments using large real-world collections of English and Arabic machine-printed and handwritten documents demonstrate the excellent performance of our approaches.
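The multiscale curvature idea can be sketched generically (this is a generic version, not the authors' exact formulation): smooth the contour with a Gaussian at several scales, then evaluate the parametric curvature at each scale.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def multiscale_curvature(x, y, scales=(2.0, 4.0, 8.0)):
    """Curvature k = (x'y'' - y'x'') / (x'^2 + y'^2)^1.5 of a closed
    2D contour after Gaussian smoothing at each scale (sigma in
    samples); returns one curvature row per scale."""
    rows = []
    for s in scales:
        xs = gaussian_filter1d(x, s, mode="wrap")   # wrap: closed contour
        ys = gaussian_filter1d(y, s, mode="wrap")
        dx, dy = np.gradient(xs), np.gradient(ys)
        ddx, ddy = np.gradient(dx), np.gradient(dy)
        rows.append((dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5)
    return np.array(rows)

# sanity contour: a unit circle has curvature ~1 at every point
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
K = multiscale_curvature(np.cos(t), np.sin(t))
```

Curvature that stays large across coarse scales signals structurally salient strokes, which is the intuition behind separating signatures from printed text and background clutter.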
Affiliation(s)
- Guangyu Zhu
- Language and Media Processing Laboratory, Institute for Advanced Computer Studies, University of Maryland, College Park, MD 20742, USA.
248
Barron D, Blumenthal L, Bourque S, Brovarny N, Childress J, Clark JS, Criswell DL, Dillard J, Dougherty M, Gardenier M, Gryzbowski D, Hall T, Hardison M, Hecht J, Hjort B, Jackson K, Johnson M, Lerch DM, Maxim DW, Mozie DI, Osi I, Panzarella D, Pavlick JL, Qazen U, Ray S, Spurrell L, Stephens D, Sugg S, Vernon K, Waugh TE, Wiedemann LA. Electronic signature, attestation, and authorship (updated). J AHIMA 2009; 80:62-69. [PMID: 19953798]
249
Djioua M, Plamondon R. A new algorithm and system for the characterization of handwriting strokes with delta-lognormal parameters. IEEE Trans Pattern Anal Mach Intell 2009; 31:2060-2072. [PMID: 19762931] [DOI: 10.1109/tpami.2008.264]
Abstract
In this paper, we present a new analytical method for estimating the parameters of Delta-Lognormal functions and characterizing handwriting strokes. According to the Kinematic Theory of rapid human movements, these parameters contain information on both the motor commands and the timing properties of a neuromuscular system. The new algorithm, called XZERO, exploits relationships between the zero crossings of the first and second time derivatives of a lognormal function and its four basic parameters. The methodology is described and then evaluated under various testing conditions. The new tool allows a greater variety of stroke patterns to be processed automatically. Furthermore, for the first time, the extraction accuracy is quantified empirically, taking advantage of the exponential relationships that link the dispersion of the extraction errors with its signal-to-noise ratio. A new extraction system which combines this algorithm with two other previously published methods is also described and evaluated. This system provides researchers involved in various domains of pattern analysis and artificial intelligence with new tools for the basic study of single strokes as primitives for understanding rapid human movements.
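The Delta-Lognormal function the extractor fits is the difference of two time-shifted lognormal components (agonist minus antagonist), each characterized by a time offset t0, log-time mean mu, log-time spread sigma, and amplitude D. A sketch with made-up parameter values:

```python
import numpy as np

def delta_lognormal(t, t0, D1, mu1, s1, D2, mu2, s2):
    """Delta-Lognormal speed profile of the Kinematic Theory:
    v(t) = D1*L(t; t0, mu1, s1) - D2*L(t; t0, mu2, s2), where L is a
    lognormal density in (t - t0)."""
    v = np.zeros_like(t, dtype=float)
    m = t > t0
    tt = t[m] - t0
    def L(mu, s):
        return np.exp(-((np.log(tt) - mu) ** 2) / (2 * s * s)) / (
            s * np.sqrt(2 * np.pi) * tt)
    v[m] = D1 * L(mu1, s1) - D2 * L(mu2, s2)
    return v

t = np.linspace(0.0, 1.5, 1500)                        # time in seconds
v = delta_lognormal(t, t0=0.05, D1=30.0, mu1=-1.7, s1=0.3,
                    D2=10.0, mu2=-1.4, s2=0.3)
# each lognormal density integrates to 1, so v integrates to D1 - D2
```

Because each density integrates to one, the area under v(t) equals D1 - D2, the net displacement of the stroke; the zero crossings of v' and v'' are the landmarks an analytical extractor like XZERO exploits.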
Affiliation(s)
- Moussa Djioua
- Laboratoire Scribens, Département de Génie Electrique, Ecole Polytechnique de Montréal, Montréal, QC H3C 3A7, Canada.
250
Bergmann T, Hadrys H, Breves G, Schierwater B. Character-based DNA barcoding: a superior tool for species classification. Berl Munch Tierarztl Wochenschr 2009; 122:446-450. [PMID: 19999380]
Abstract
In zoonosis research, only correctly assigned host-agent-vector associations can lead to success. If most biological species on Earth, from agents to hosts and from prokaryotes to vertebrates, are still undetected, the development of a reliable and universal diversity detection tool becomes a conditio sine qua non. In this context, modern molecular-genetic techniques have at breathtaking speed become acknowledged tools for the classification of life forms at all taxonomic levels. While previous DNA-barcoding techniques were criticised for several reasons (Moritz and Cicero, 2004; Rubinoff et al., 2006a, b; Rubinoff, 2006; Rubinoff and Haines, 2006), a new approach, so-called CAOS-barcoding (Character Attribute Organisation System), avoids most of the weak points. Traditional DNA-barcoding approaches are distance-based, i.e. they use genetic distances and tree-construction algorithms for the classification of species or lineages. They force the definition of threshold values and thus preclude a discrete, unambiguous assignment. In contrast, the new character-based barcoding (CAOS-barcoding; DeSalle et al., 2005; DeSalle, 2006; Rach et al., 2008) works with discrete single characters and character combinations, which permits a clear, unambiguous classification. In Hannover (Germany) we are optimising this system and developing a semiautomatic high-throughput procedure for the hosts, agents and vectors studied within the Zoonosis Centre of the "Stiftung Tierärztliche Hochschule Hannover". Our primary research concentrates on insects, the most successful and species-rich animal group on Earth (every fourth animal species is a beetle). One subgroup, the winged insects (Pterygota), comprises the vast majority of all zoonosis-relevant animal vectors.
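In its simplest form, the character-based idea reduces to finding diagnostic alignment columns: positions where every member of one group shares a state that never occurs in the other group. A toy sketch (the sequences are invented, not real barcodes):

```python
def diagnostic_positions(group_a, group_b):
    """Simple pure diagnostic characters in the CAOS sense: aligned
    columns where group A is fixed for one state that is absent from
    group B. Sequences must be pre-aligned and of equal length."""
    length = len(group_a[0])
    diags = []
    for i in range(length):
        a_states = {s[i] for s in group_a}
        b_states = {s[i] for s in group_b}
        if len(a_states) == 1 and not (a_states & b_states):
            diags.append((i, next(iter(a_states))))
    return diags

# toy alignments for two putative species
sp_a = ["ACGTACGT", "ACGTACGA"]
sp_b = ["TCGTACGT", "TCGTTCGA"]
diags = diagnostic_positions(sp_a, sp_b)
```

Unlike a distance threshold, such a character either is or is not present in a query sequence, which is why the assignment is discrete and unambiguous.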
Affiliation(s)
- Tjard Bergmann
- Institut für Tierökologie und Zellbiologie der Stiftung Tierärztliche Hochschule Hannover.