1. Carusi A. Chemicals regulation and non-animal methods: displacing the gold standard. Wellcome Open Res 2024; 9:167. PMID: 39267989; PMCID: PMC11391182; DOI: 10.12688/wellcomeopenres.20581.2
Abstract
Regulating industrial chemicals in foodstuffs and consumer products is a major aspect of protecting populations against health risks. Non-animal testing methods are an essential part of the radical change to the framework for toxicity testing that is long overdue in global economies. This paper discusses reasons why the drive to reduce animal use in chemical safety testing is so difficult to achieve, as perceived by those who are closely involved in chemicals regulation in different capacities. Progress is slow, despite the fact that the ethico-legal conditions for a move away from animal testing are largely in place, and despite scientific arguments for a radical change in the paradigm of toxicity testing, away from reliance on animal studies. I present empirical data drawn from two studies in a European Commission context promoting non-animal methods. The aim of the paper is modest. It is to foreground the voices of those who deal with the science and regulation of chemicals on a day-to-day basis, rather than to offer a theoretical framework for what I heard from them. I offer a synthesis of the main challenges faced by non-animal alternatives, as these are perceived by people in different stakeholder groups dealing with chemicals regulation. I show where there are pockets of agreement between different stakeholders, and where the main disagreements lie. In particular, there is dispute and disagreement over what counts as validation of these alternative tests, and by implication of the traditional 'gold standard' of animal testing. Finally, I suggest that the shift to non-animal methods in chemicals regulation demonstrates the need for the concept of validation to be broadened beyond a purely techno-scientific definition and to be more explicitly understood as a demand for trust and acceptance, with more attention given to the complex social, institutional and economic settings in which it operates.
2. Williamson J. Bayesianism from a philosophical perspective and its application to medicine. Int J Biostat 2023; 19:295-307. PMID: 36490222; DOI: 10.1515/ijb-2022-0043
Abstract
Bayesian philosophy and Bayesian statistics have diverged in recent years, because Bayesian philosophers have become more interested in philosophical problems other than the foundations of statistics and Bayesian statisticians have become less concerned with philosophical foundations. One way in which this divergence manifests itself is through the use of direct inference principles: Bayesian philosophers routinely advocate principles that require calibration of degrees of belief to available non-epistemic probabilities, while Bayesian statisticians rarely invoke such principles. As I explain, however, the standard Bayesian framework cannot coherently employ direct inference principles. Direct inference requires a shift towards a non-standard Bayesian framework, which further increases the gap between Bayesian philosophy and Bayesian statistics. This divergence does not preclude the application of Bayesian philosophical methods to real-world problems. Data consolidation is a key challenge for present-day systems medicine and other systems sciences. I show that data consolidation requires direct inference and that the non-standard Bayesian methods outlined here are well suited to this task.
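For concreteness, the display below gives a generic statement of the kind of direct inference (calibration) principle at issue; this is a standard textbook-style gloss, not necessarily the precise formulation Williamson defends.

```latex
% A generic direct inference (calibration) principle: an agent's degree of
% belief should match a known non-epistemic probability (chance/frequency).
\[
  P_E(A) = x \quad \text{whenever the evidence } E \text{ entails that } \mathrm{ch}(A) = x ,
\]
% where P_E is the degree-of-belief function given total evidence E and
% ch(A) is the chance (or frequency-based probability) of A. The abstract's
% claim is that principles of this form cannot be coherently employed within
% the standard Bayesian framework of priors and conditionalization.
```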
Affiliation(s)
- Jon Williamson
- Department of Philosophy and Centre for Reasoning, University of Kent, Canterbury, UK
3. Carusi A, Winter PD, Armstrong I, Ciravegna F, Kiely DG, Lawrie A, Lu H, Sabroe I, Swift A. Medical artificial intelligence is as much social as it is technological. Nat Mach Intell 2023. DOI: 10.1038/s42256-022-00603-3
4. Fast-Fed Variability: Insights into Drug Delivery, Molecular Manifestations, and Regulatory Aspects. Pharmaceutics 2022; 14:1807. PMID: 36145555; PMCID: PMC9505616; DOI: 10.3390/pharmaceutics14091807
Abstract
Among the various drug administration routes, oral drug delivery is preferred and is considered patient-friendly; hence, most marketed drugs are available as conventional tablets or capsules. In such cases, whether a drug is administered with or without food has a tremendous influence on its bioavailability. The presence of food may increase (positive effect) or decrease (negative effect) the bioavailability of the drug. Either effect is undesirable, since it makes dosage estimation difficult in several diseases. A positive food effect may increase the propensity for adverse drug effects, whereas a negative food effect may lead to therapeutic insufficiency in patients suffering from life-threatening disorders. This review emphasizes the causes of food effects, formulation strategies to overcome fast-fed variability, and the regulatory aspects of drugs with food effects, which may open new avenues for researchers to design products that help to eliminate fast-fed variability.
5. Cool A. Impossible, unknowable, accountable: Dramas and dilemmas of data law. Soc Stud Sci 2019; 49:503-530. PMID: 31057059; DOI: 10.1177/0306312719846557
Abstract
On May 25, 2018, the European Union's General Data Protection Regulation (GDPR) came into force. EU citizens are granted more control over personal data while companies and organizations are charged with increased responsibility enshrined in broad principles like transparency and accountability. Given the scope of the regulation, which aims to harmonize data practices across 28 member states with different concerns about data collection, the GDPR has significant consequences for individuals in the EU and globally. While the GDPR is primarily intended to regulate tech companies, it also has important implications for data use in scientific research. Drawing on ethnographic fieldwork with researchers, lawyers and legal scholars in Sweden, I argue that the GDPR's flexible accountability principle effectively encourages researchers to reflect on their ethical responsibility but can also become a source of anxiety and produce unexpected results. Many researchers I spoke with expressed profound uncertainty about 'impossible' legal requirements for research data use. Despite the availability of legal texts and interpretations, I suggest we should take researchers' concerns about 'unknowable' data law seriously. Many researchers' sense of legal ambiguity led them to rethink their data practices and themselves as ethical subjects through an orientation to what they imagined as the 'real people behind the data', variously formulated as a Swedish population desiring data use for social benefit or a transnational public eager for research results. The intentions attributed to people, populations and publics - whom researchers only encountered in the abstract form of data - lent ethical weight to various and sometimes conflicting decisions about data security and sharing. Ultimately, researchers' anxieties about their inability to discern the desires of the 'real people' lent new appeal to solutions, however flawed, that promised to alleviate the ethical burden of personal data.
Affiliation(s)
- Alison Cool
- Department of Anthropology, University of Colorado Boulder, USA
6. Parmar JH, Mendes P. A computational model to understand mouse iron physiology and disease. PLoS Comput Biol 2019; 15:e1006680. PMID: 30608934; PMCID: PMC6334977; DOI: 10.1371/journal.pcbi.1006680
Abstract
It is well known that iron is an essential element for life but is toxic when in excess or in certain forms. Accordingly, there are many diseases that result directly from either lack or excess of iron. Yet many molecular and physiological aspects of iron regulation have only been discovered recently, and others are still elusive. There is still no good quantitative and dynamic description of iron absorption, distribution, storage and mobilization that agrees with the wide array of phenotypes presented in several iron-related diseases. The present work addresses this issue by developing a mathematical model of iron distribution in mice, calibrated with ferrokinetic data and subsequently validated against data from mouse models of iron disorders, such as hemochromatosis, β-thalassemia, atransferrinemia and anemia of inflammation. Adequately fitting the ferrokinetic data required inclusion of the following mechanisms: a) transferrin-mediated iron delivery to tissues, b) induction of hepcidin by transferrin-bound iron, c) ferroportin-dependent iron export regulated by hepcidin, d) erythropoietin regulation of erythropoiesis, and e) liver uptake of non-transferrin-bound iron (NTBI). The utility of the model for simulating disease interventions was demonstrated by using it to investigate the outcome of different schedules of transferrin treatment in β-thalassemia.

Iron is an essential nutrient in almost all life forms. In humans and animals, iron is used for respiration and for transporting oxygen inside red blood cells. But in excess, iron can be toxic, and the body therefore regulates its distribution and absorption through the action of hormones, a process that is not yet completely understood. Here we created a computational model of the regulation of iron distribution in the body of a mouse, based on experimental data. The model can accurately simulate many iron diseases, such as anemia, hemochromatosis, and thalassemia. This computational model is helpful for understanding the basis of these diseases and for planning therapies to address them.
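As a rough illustration of the kind of compartmental ferrokinetic model the abstract describes, the sketch below sets up a toy two-compartment ODE system (plasma transferrin-bound iron and liver iron) with a hepcidin-like feedback on export. The compartments, rate laws, parameter names and values are invented for illustration and are far simpler than the published mouse model.

```python
# Toy sketch of a compartmental iron-distribution ODE model.
# NOT the published Parmar & Mendes model: compartments, rate laws and
# parameter values are invented purely to illustrate the approach.
import numpy as np
from scipy.integrate import solve_ivp

def iron_ode(t, y, k_absorb, k_uptake, k_export, k_hepcidin):
    tf_fe, liver_fe = y                       # plasma Tf-bound iron, liver iron (a.u.)
    hepcidin = k_hepcidin * tf_fe             # hepcidin induction by Tf-bound iron
    export = k_export * liver_fe / (1.0 + hepcidin)  # ferroportin export, inhibited by hepcidin
    d_tf_fe = k_absorb + export - k_uptake * tf_fe   # dietary absorption + liver export - liver uptake
    d_liver_fe = k_uptake * tf_fe - export           # liver uptake - export
    return [d_tf_fe, d_liver_fe]

params = dict(k_absorb=0.05, k_uptake=0.2, k_export=0.1, k_hepcidin=2.0)
sol = solve_ivp(iron_ode, (0.0, 200.0), y0=[1.0, 5.0],
                args=tuple(params.values()), dense_output=True)
print("final compartment values:", sol.y[:, -1])
```

Fitting the (here invented) rate constants to tracer time courses, e.g. with scipy.optimize.least_squares, would then play the role of the ferrokinetic calibration step described in the abstract.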
Affiliation(s)
- Jignesh H. Parmar
- Center for Quantitative Medicine and Department of Cell Biology, University of Connecticut School of Medicine, Farmington, Connecticut, United States of America
- Pedro Mendes
- Center for Quantitative Medicine and Department of Cell Biology, University of Connecticut School of Medicine, Farmington, Connecticut, United States of America
7. Lawson BAJ, Drovandi CC, Cusimano N, Burrage P, Rodriguez B, Burrage K. Unlocking data sets by calibrating populations of models to data density: A study in atrial electrophysiology. Sci Adv 2018; 4:e1701676. PMID: 29349296; PMCID: PMC5770172; DOI: 10.1126/sciadv.1701676
Abstract
The understanding of complex physical or biological systems nearly always requires a characterization of the variability that underpins these processes. In addition, the data used to calibrate these models may also often exhibit considerable variability. A recent approach to deal with these issues has been to calibrate populations of models (POMs), multiple copies of a single mathematical model but with different parameter values, in response to experimental data. To date, this calibration has been largely limited to selecting models that produce outputs that fall within the ranges of the data set, ignoring any trends that might be present in the data. We present here a novel and general methodology for calibrating POMs to the distributions of a set of measured values in a data set. We demonstrate our technique using a data set from a cardiac electrophysiology study based on the differences in atrial action potential readings between patients exhibiting sinus rhythm (SR) or chronic atrial fibrillation (cAF) and the Courtemanche-Ramirez-Nattel model for human atrial action potentials. Not only does our approach accurately capture the variability inherent in the experimental population, but we also demonstrate how the POMs that it produces may be used to extract additional information from the data used for calibration, including improved identification of the differences underlying stratified data. We also show how our approach allows different hypotheses regarding the variability in complex systems to be quantitatively compared.
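One simple way to realise "calibration to data density" (as opposed to range-based calibration) is to accept candidate parameter sets with probability proportional to the estimated density of the data at the model's output. The sketch below does this with a one-dimensional stand-in model and a Gaussian kernel density estimate; it is an illustrative simplification, not the authors' actual algorithm or the Courtemanche-Ramirez-Nattel model.

```python
# Toy illustration of density-based calibration of a population of models.
# Not the paper's algorithm: a 1-D stand-in "model" plus a KDE acceptance step.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

def model_output(theta):
    """Stand-in for a biomarker computed from a simulated action potential."""
    return 300.0 + 50.0 * theta[0] - 20.0 * theta[1]

# Experimental measurements of the biomarker (synthetic here).
data = rng.normal(loc=320.0, scale=15.0, size=200)
kde = gaussian_kde(data)

# Draw candidate parameter sets, weight them by the data density at their output,
# and accept each candidate with probability proportional to that weight.
candidates = rng.uniform(0.0, 2.0, size=(5000, 2))
outputs = np.array([model_output(th) for th in candidates])
weights = kde(outputs)
accepted = candidates[rng.uniform(size=len(candidates)) < weights / weights.max()]

print(f"population size: {len(accepted)}")
print(f"population biomarker mean: "
      f"{np.mean([model_output(th) for th in accepted]):.1f}")
```

The accepted population then reproduces not only the range but approximately the shape of the measured distribution, which is the property the paper exploits when probing stratified (e.g. SR versus cAF) data.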
Affiliation(s)
- Brodie A. J. Lawson
- Australian Research Council Centre of Excellence for Mathematical and Statistical Frontiers, School of Mathematical Sciences, Queensland University of Technology, Brisbane, Queensland, Australia
- Corresponding author.
- Christopher C. Drovandi
- Australian Research Council Centre of Excellence for Mathematical and Statistical Frontiers, School of Mathematical Sciences, Queensland University of Technology, Brisbane, Queensland, Australia
- Pamela Burrage
- Australian Research Council Centre of Excellence for Mathematical and Statistical Frontiers, School of Mathematical Sciences, Queensland University of Technology, Brisbane, Queensland, Australia
- Blanca Rodriguez
- Department of Computer Science, University of Oxford, Oxford, UK
- Kevin Burrage
- Australian Research Council Centre of Excellence for Mathematical and Statistical Frontiers, School of Mathematical Sciences, Queensland University of Technology, Brisbane, Queensland, Australia
- Department of Computer Science, University of Oxford, Oxford, UK
8. Patterson EA, Whelan MP. A framework to establish credibility of computational models in biology. Prog Biophys Mol Biol 2017; 129:13-19. DOI: 10.1016/j.pbiomolbio.2016.08.007
9.
Abstract
Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed in neuroscience. I distinguish between two types of applications of the simulation methodology in neuroscientific research. Model-oriented applications aim to use the simulation outputs to derive new hypotheses about brain organization and functioning and thus to extend current theoretical knowledge and understanding in the field. Data-oriented applications of the simulation methodology target the collection and analysis of data relevant for neuroscientific research that is inaccessible via more traditional experimental methods. I argue for a two-stage evaluation schema which helps clarify the differences and similarities between three current large-scale simulation projects pursued in neuroscience.
10. Boon M. An engineering paradigm in the biomedical sciences: Knowledge as epistemic tool. Prog Biophys Mol Biol 2017; 129:25-39. PMID: 28389261; DOI: 10.1016/j.pbiomolbio.2017.04.001
Abstract
In order to deal with the complexity of biological systems and to generate applicable results, the current biomedical sciences are adopting concepts and methods from the engineering sciences. Philosophers of science have interpreted this as the emergence of an engineering paradigm, in particular in systems biology and synthetic biology. This article aims to articulate the supposed engineering paradigm by contrasting it with the physics paradigm that supported the rise of biochemistry and molecular biology. The articulation starts from Kuhn's notion of a disciplinary matrix, which indicates what constitutes a paradigm. It is argued that the core of the physics paradigm is its metaphysical and ontological presuppositions, whereas the core of the engineering paradigm is the epistemic aim of producing knowledge that is useful for solving problems external to the scientific practice. The two paradigms therefore involve distinct notions of knowledge: whereas the physics paradigm entails a representational notion of knowledge, the engineering paradigm involves the notion of 'knowledge as epistemic tool'.
Affiliation(s)
- Mieke Boon
- Department of Philosophy, University of Twente, PO Box 217, 7500 AE Enschede, The Netherlands.
11. Gemmell PM. Establishing the structures within populations of models. Prog Biophys Mol Biol 2017; 129:20-24. PMID: 28341288; DOI: 10.1016/j.pbiomolbio.2017.03.004
Abstract
As computational biology matures as a field, increasing attention is being paid to the relation between computational models and their targets. One aspect of this is how computational models can appropriately reproduce the variation seen in experimental data; one solution is to use populations of models united by a common set of equations (the framework), with each individual member of the population (each model) possessing its own unique set of equation parameters. These model populations are then calibrated and validated against experimental data, and as a whole they reproduce the experimentally observed variation. The primary focus of validation thus becomes the population, with the validation of individual models seemingly deriving from their membership of this population. The role of individual models within the population is therefore unclear, as is the relationship between individual models and the population they make up. This work examines the role of models within the population, how they relate to the population they constitute, and how both can be said to be validated in this context.
Affiliation(s)
- Philip M Gemmell
- Clyde Biosciences Limited, BioCity Scotland, Bo'Ness Road, Newhouse, Lanarkshire, Scotland, ML1 5UH, United Kingdom.
12. Gross F, MacLeod M. Prospects and problems for standardizing model validation in systems biology. Prog Biophys Mol Biol 2017; 129:3-12. PMID: 28089814; DOI: 10.1016/j.pbiomolbio.2017.01.003
Abstract
There are currently no widely shared criteria by which to assess the validity of computational models in systems biology. Here we discuss the feasibility and desirability of implementing validation standards for modeling. Having such a standard would facilitate journal review, interdisciplinary collaboration and model exchange, and would be especially relevant for applications close to medical practice. However, even though the production of predictively valid models is considered a central goal, in practice modeling in systems biology employs a variety of model structures and model-building practices. These serve a variety of purposes, many of which are heuristic; they do not seem to require strict validation criteria and may even be restricted by them. Moreover, given the current situation in systems biology, implementing a validation standard would face serious technical obstacles, mostly due to the quality of available empirical data. We advocate a cautious approach to standardization. Even though rigorous standardization seems premature at this point, raising the issue helps us develop better insights into the practices of systems biology and the technical problems modelers face in validating models. Further, it allows us to identify certain technical validation issues that hold regardless of modeling context and purpose. Informal guidelines could in fact play a role in the field by helping modelers handle these issues.
Affiliation(s)
- Fridolin Gross
- Institute for Philosophy, University of Kassel, Nora-Platiel-Strasse 1, 34127 Kassel, Germany.
- Miles MacLeod
- Department of Philosophy, University of Twente, Drienerlolaan 5, 7522DN Enschede, The Netherlands.
13. Pârvu O, Gilbert D. A Novel Method to Verify Multilevel Computational Models of Biological Systems Using Multiscale Spatio-Temporal Meta Model Checking. PLoS One 2016; 11:e0154847. PMID: 27187178; PMCID: PMC4871515; DOI: 10.1371/journal.pone.0154847
Abstract
Insights gained from multilevel computational models of biological systems can be translated into real-life applications only if the correctness of the model has been verified first. One of the most frequently employed in silico techniques for computational model verification is model checking. Traditional model checking approaches only consider the evolution of numeric values, such as concentrations, over time and are appropriate for computational models of small-scale systems (e.g. intracellular networks). However, to gain a systems-level understanding of how biological organisms function, it is essential to consider more complex, large-scale biological systems (e.g. organs). Verifying computational models of such systems requires capturing how both numeric values and properties of (emergent) spatial structures (e.g. the area of a multicellular population) change over time and across multiple levels of organization, which is not considered by existing model checking approaches. To address this limitation we have developed a novel approximate probabilistic multiscale spatio-temporal meta model checking methodology for verifying multilevel computational models against specifications describing the desired/expected system behaviour. The methodology is generic and supports computational models encoded using various high-level modelling formalisms, because it is defined relative to time series data rather than the models used to generate them. In addition, the methodology can be automatically adapted to case-study-specific types of spatial structures and properties using the spatio-temporal meta model checking concept. To automate the computational model verification process we have implemented the model checking approach in the software tool Mule (http://mule.modelchecking.org). Its applicability is illustrated against four systems biology computational models previously published in the literature, encoding the rat cardiovascular system dynamics, the uterine contractions of labour, the Xenopus laevis cell cycle, and the acute inflammation of the gut and lung. Our methodology and software will enable computational biologists to efficiently develop reliable multilevel computational models of biological systems.
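As a tiny illustration of the approximate probabilistic checking idea (evaluating a property over simulated time series rather than over the model itself), the sketch below estimates the probability that a property holds across repeated stochastic simulation traces and compares it to a bound. The property, traces, and threshold are invented and have nothing to do with Mule's actual specification language.

```python
# Minimal illustration of approximate probabilistic checking over time series:
# estimate P(property) from repeated simulated traces, then compare to a bound.
# Invented toy example; not Mule or its multiscale spatio-temporal logic.
import numpy as np

rng = np.random.default_rng(1)

def simulate_trace(n_steps=100):
    """Stand-in stochastic simulation: a noisy concentration time series."""
    return np.cumsum(rng.normal(0.05, 0.3, size=n_steps)) + 1.0

def property_holds(trace, threshold=2.0, horizon=50):
    """'Eventually, within the horizon, the concentration exceeds the
    threshold and stays above it for the rest of the trace.'"""
    above = trace >= threshold
    for t in range(min(horizon, len(trace))):
        if above[t:].all():
            return True
    return False

n_runs = 2000
estimate = np.mean([property_holds(simulate_trace()) for _ in range(n_runs)])
print(f"estimated probability the property holds: {estimate:.3f}")
print("specification satisfied" if estimate >= 0.9 else "specification violated")
```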
Affiliation(s)
- Ovidiu Pârvu
- Department of Computer Science, College of Engineering, Design and Physical Sciences, Brunel University London, London, United Kingdom
- David Gilbert
- Department of Computer Science, College of Engineering, Design and Physical Sciences, Brunel University London, London, United Kingdom
14. Muszkiewicz A, Britton OJ, Gemmell P, Passini E, Sánchez C, Zhou X, Carusi A, Quinn TA, Burrage K, Bueno-Orovio A, Rodriguez B. Variability in cardiac electrophysiology: Using experimentally-calibrated populations of models to move beyond the single virtual physiological human paradigm. Prog Biophys Mol Biol 2015; 120:115-127. PMID: 26701222; PMCID: PMC4821179; DOI: 10.1016/j.pbiomolbio.2015.12.002
Abstract
Physiological variability manifests itself as differences in physiological function between individuals of the same species, and has crucial implications for disease progression and treatment. Despite its importance, physiological variability has traditionally been ignored in experimental and computational investigations due to averaging over samples from multiple individuals. Recently, modelling frameworks have been devised for studying the mechanisms underlying physiological variability in cardiac electrophysiology and pro-arrhythmic risk under a variety of conditions and for several animal species as well as human. One such methodology exploits populations of cardiac cell models constrained with experimental data, or experimentally-calibrated populations of models. In this review, we outline the considerations behind constructing an experimentally-calibrated population of models and review the studies that have employed this approach to investigate variability in cardiac electrophysiology in physiological and pathological conditions, as well as under drug action. We also describe the methodology and compare it with alternative approaches for studying variability in cardiac electrophysiology, including cell-specific modelling approaches, sensitivity-analysis-based methods, and populations-of-models frameworks that do not consider the experimental calibration step. We conclude with an outlook for the future, predicting the potential of new methodologies for patient-specific modelling extending beyond the single virtual physiological human paradigm.
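The experimental-calibration step the review describes can be summarised as: sample candidate parameter sets, simulate each candidate, compute biomarkers, and keep only the candidates whose biomarkers fall within the experimentally observed ranges. The sketch below shows that filter with a placeholder model and made-up biomarker ranges; it is a schematic of the workflow, not any of the published cardiac models.

```python
# Schematic of building an experimentally-calibrated population of models:
# sample parameters, simulate, compute biomarkers, keep in-range candidates.
# Placeholder "model" and invented biomarker ranges; not a real cardiac cell model.
import numpy as np

rng = np.random.default_rng(42)

def simulate_biomarkers(scalings):
    """Stand-in for running a cell model with scaled ion-channel conductances
    and measuring biomarkers (e.g. APD90 and resting potential)."""
    g_na, g_k = scalings
    apd90 = 280.0 / g_k + 10.0 * g_na      # ms, toy relationship
    v_rest = -85.0 + 3.0 * (g_k - 1.0)     # mV, toy relationship
    return apd90, v_rest

# Experimentally observed biomarker ranges (invented numbers).
ranges = {"apd90": (250.0, 350.0), "v_rest": (-88.0, -80.0)}

candidates = rng.uniform(0.5, 1.5, size=(10000, 2))   # conductance scaling factors
population = [
    theta for theta in candidates
    if ranges["apd90"][0] <= simulate_biomarkers(theta)[0] <= ranges["apd90"][1]
    and ranges["v_rest"][0] <= simulate_biomarkers(theta)[1] <= ranges["v_rest"][1]
]
print(f"calibrated population: {len(population)} of {len(candidates)} candidates")
```

The surviving parameter sets form the "population of models" whose spread of behaviours is then taken to represent the experimentally observed variability.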
Affiliation(s)
- Anna Muszkiewicz
- Department of Computer Science, University of Oxford, Parks Road, Oxford OX1 3QD, United Kingdom
- Oliver J Britton
- Department of Computer Science, University of Oxford, Parks Road, Oxford OX1 3QD, United Kingdom
- Philip Gemmell
- Clyde Biosciences Ltd, West Medical Building, University of Glasgow, Glasgow G12 8QQ, United Kingdom
- Elisa Passini
- Department of Computer Science, University of Oxford, Parks Road, Oxford OX1 3QD, United Kingdom
- Carlos Sánchez
- Center for Computational Medicine in Cardiology (CCMC), Institute of Computational Science, Università della Svizzera italiana, Lugano, Switzerland
- Xin Zhou
- Department of Computer Science, University of Oxford, Parks Road, Oxford OX1 3QD, United Kingdom
- T Alexander Quinn
- Department of Physiology and Biophysics, Dalhousie University, Halifax, Nova Scotia, Canada
- Kevin Burrage
- Department of Computer Science, University of Oxford, Parks Road, Oxford OX1 3QD, United Kingdom; Mathematical Sciences, Queensland University of Technology, Queensland 4072, Australia; ACEMS, ARC Centre of Excellence for Mathematical and Statistical Frontiers, Queensland University of Technology, Queensland 4072, Australia
- Alfonso Bueno-Orovio
- Department of Computer Science, University of Oxford, Parks Road, Oxford OX1 3QD, United Kingdom
- Blanca Rodriguez
- Department of Computer Science, University of Oxford, Parks Road, Oxford OX1 3QD, United Kingdom.