1. Data analytics in the football industry: a survey investigating operational frameworks and practices in professional clubs and national federations from around the world. Sci Med Football 2024:1-10. PMID: 38745403. DOI: 10.1080/24733938.2024.2341837.
Abstract
The use of data and analytics in professional football organisations has grown steadily over the last decade. Nevertheless, how and whether these advances in sports analytics address the needs of professional football remains unexplored. Practitioners from national federations qualified for the FIFA World Cup Qatar 2022™ and from professional football clubs in an international community of practitioners took part in a survey exploring the characteristics of their data analytics infrastructure, their role, and their value for analysing player monitoring and positional data. Respondents from 29 national federations and 32 professional clubs completed the survey, with response rates of 90.6% and 77.1%, respectively. Summary information highlighted the underemployment of staff with expertise in applied data analytics across organisations. Perceptions regarding analytical capabilities and data governance frameworks were heterogeneous, particularly in the case of national federations. Only about a third of national federation respondents (~30%) perceived information on positional data from international sports data analytics providers to be sufficiently clear. The general resourcing limitations, the overall lack of expertise in data analytics methods, and the absence of operational taxonomies for reference performance metrics constrain meaningful knowledge translation from raw data in professional football organisations.
2. Vaccine process technology - A decade of progress. Biotechnol Bioeng 2024. PMID: 38711222. DOI: 10.1002/bit.28703.
Abstract
In the past decade, new approaches to the discovery and development of vaccines have transformed the field. Advances during the COVID-19 pandemic allowed the production of billions of vaccine doses per year using novel platforms such as messenger RNA and viral vectors. Improvements in the analytical toolbox, equipment, and bioprocess technology have made it possible to achieve both unprecedented speed in vaccine development and scale of vaccine manufacturing. Macromolecular structure-function characterization technologies, combined with improved modeling and data analysis, enable quantitative evaluation of vaccine formulations at single-particle resolution and guided design of vaccine drug substances and drug products. These advances play a major role in precise assessment of critical quality attributes of vaccines delivered by newer platforms. Innovations in label-free and immunoassay technologies aid in the characterization of antigenic sites and the development of robust in vitro potency assays. These methods, along with molecular techniques such as next-generation sequencing, will accelerate characterization and release of vaccines delivered by all platforms. Process analytical technologies for real-time monitoring and optimization of process steps enable the implementation of quality-by-design principles and faster release of vaccine products. In the next decade, the field of vaccine discovery and development will continue to advance, bringing together new technologies, methods, and platforms to improve human health.
3. Using Healthcare Big Data Analytics to Improve Women's Health: Benefits, Challenges, and Perspectives. China CDC Wkly 2024; 6:173-174. PMID: 38523815. PMCID: PMC10960518. DOI: 10.46234/ccdcw2024.035.
4. Automized inline monitoring in perfused mammalian cell culture by MIR spectroscopy without calibration model building. Eng Life Sci 2024; 24:e2300237. PMID: 38444619. PMCID: PMC10910268. DOI: 10.1002/elsc.202300237.
Abstract
Process Analytical Technologies (PATs) play a key role in the push for automation in the biopharmaceutical industry. Spectroscopic methods such as Raman spectroscopy or mid-infrared (MIR) spectroscopy have gained recognition in recent years for inline monitoring of bioprocesses due to their ability to measure various molecules simultaneously. However, their dependency on laborious model calibration makes them a challenge to implement. In this study, a novel one-point calibration that requires a single reference point prior to the inline monitoring of glucose and lactate in bioprocesses with MIR spectroscopy is assessed in 22 mammalian cell perfusion processes at two different scales and with four different products. Concentrations are predicted over all perfusion runs with a root mean square error (RMSE) of 0.29 g/L for glucose and 0.24 g/L for lactate, respectively. For comparison, conventional partial least squares regression (PLSR) models were trained with spectroscopic data from six bioreactor runs at two different scales with three products. The general accuracy of those models (RMSE of 0.41 g/L for glucose and 0.16 g/L for lactate) is in the range of the accuracy of the one-point calibration. This shows the potential of the one-point calibration as an approach that makes spectroscopy more accessible for bioprocess development.
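As a hedged illustration of the arithmetic behind this abstract: a one-point calibration derives a single scale factor from one offline reference measurement and applies it to subsequent inline signals, with prediction quality summarized as RMSE. The linear signal-to-concentration response and all numbers below are invented for illustration; the study's actual spectral processing is not described here.

```python
import math

def rmse(predicted, reference):
    """Root mean square error, the accuracy metric reported for glucose
    and lactate predictions."""
    return math.sqrt(sum((p - r) ** 2 for p, r in zip(predicted, reference))
                     / len(reference))

# Simulated MIR band signals and offline glucose references (g/L); a linear
# signal-to-concentration response is our illustrative assumption
signal = [0.52, 0.48, 0.40, 0.35]
reference = [5.2, 4.9, 4.1, 3.4]

k = reference[0] / signal[0]                 # one-point calibration factor
predicted = [k * s for s in signal]
print(round(rmse(predicted, reference), 3))
```

The single division is the entire calibration step, which is what makes the approach attractive compared with training a PLSR model on several full bioreactor runs.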
5. HistoriView: Implementation and Evaluation of a Novel Approach to Review a Patient Using a Scalable Space-Efficient Timeline without Zoom Interactions. Appl Clin Inform 2024; 15:250-264. PMID: 38359876. PMCID: PMC10990596. DOI: 10.1055/a-2269-0995.
Abstract
BACKGROUND Timelines have been used for patient review. While maintaining a compact overview is important, merged event representations caused by intricate and voluminous patient data introduce problems with event recognition, access ambiguity, and inefficient interaction. Handling large patient data efficiently is another challenge. OBJECTIVE This study aims to develop a scalable, efficient timeline to enhance patient review for research purposes. The focus is on addressing the challenges presented by intricate and voluminous patient data. METHODS We propose a high-throughput, space-efficient HistoriView timeline for an individual patient. For a compact overview, it uses a nonstacking event representation. An overlay detection algorithm, y-shift visualization, and popup-based interaction facilitate comprehensive analysis of overlapping datasets. An i2b2 HistoriView plugin was deployed, using split-query and event-reduction approaches, delivering the entire history efficiently without losing information. For evaluation, 11 participants completed a usability survey and a preference survey, followed by qualitative feedback. To evaluate scalability, 100 randomly selected patients over 60 years old were tested on the plugin and compared with a baseline visualization. RESULTS Most participants found that HistoriView was easy to use and learn and delivered information clearly without zooming. All preferred HistoriView over a stacked timeline. They expressed satisfaction with the display, ease of learning and use, and efficiency. However, challenges and suggestions for improvement were also identified. In the performance test, the largest patient had 32,630 records, which exceeds the baseline limit. HistoriView reduced it to 2,019 visual artifacts. All patients were pulled and visualized within 45.40 seconds, and visualization took less than 3 seconds for each.
DISCUSSION AND CONCLUSION HistoriView allows complete data exploration without exhaustive interactions in a compact overview. It is useful for dense data or iterative comparisons. However, issues in exploring subconcept records were reported. HistoriView handles large patient data in a reasonable time while preserving the original information.
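The overlay-detection and y-shift ideas in the METHODS section can be sketched as a greedy lane assignment over time-sorted events: an event that would visually overlap everything already drawn receives a small additional vertical offset rather than starting a new stacked row. This is a hypothetical reconstruction of the idea, not HistoriView's implementation.

```python
def assign_y_shifts(events, min_gap):
    """Assign vertical offsets to timestamped events so that events closer
    than min_gap on the shared row are drawn at different y-shift levels
    (a sketch of the overlay-detection idea, not HistoriView's code)."""
    shifts, lanes = [], []           # lanes[i] = last event time drawn at shift i
    for t in sorted(events):
        for i, last in enumerate(lanes):
            if t - last >= min_gap:  # far enough from the last event in this lane
                lanes[i] = t
                shifts.append((t, i))
                break
        else:                        # every lane would overlap: open a new level
            lanes.append(t)
            shifts.append((t, len(lanes) - 1))
    return shifts

print(assign_y_shifts([0, 1, 2, 10, 11], min_gap=5))
```

Dense clusters spread vertically while sparse regions stay on the base row, which is what keeps the overview compact without hiding merged events.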
6. Edge IoT Prototyping Using Model-Driven Representations: A Use Case for Smart Agriculture. Sensors (Basel) 2024; 24:495. PMID: 38257588. PMCID: PMC10818290. DOI: 10.3390/s24020495.
Abstract
Industry 4.0 is positioned at the junction of different disciplines, aiming to re-engineer processes and improve effectiveness and efficiency. It is taking over many industries whose traditional practices are being disrupted by advances in technology and inter-connectivity. In this context, enhanced agriculture systems incorporate new components that enable better decision making (humidity/temperature/soil sensors, drones for plague detection, smart irrigation, etc.) and also include novel processes for crop control (reproducible environmental conditions, proven strategies for water stress, etc.). At the same time, advances in model-driven development (MDD) simplify software development by introducing domain-specific abstractions of the code that make application development feasible for domain experts who cannot code. XMDD (eXtreme MDD) makes this way of assembling software even more user-friendly and enables application domain experts who are not programmers to create complex solutions in a straightforward way. Key to this approach is the introduction of high-level representations of domain-specific functionalities (called SIBs, service-independent building blocks) that encapsulate the programming code, are organised in reusable libraries, and are made available in the application development environment. This way, new domain-specific abstractions of the code become easily comprehensible and composable by domain experts. In this paper, we apply these concepts to a smart agriculture solution, producing a proof of concept for the new methodology in this application domain to be used as a portable demonstrator for MDD in IoT and agriculture in the Confirm Research Centre for Smart Manufacturing. Together with model-driven development tools, we leverage the capabilities of the Nordic Thingy:53 as a multi-protocol IoT prototyping platform. It is an advanced sensing device that handles data collection and distribution for decision making in the context of the agricultural system and supports edge computing. We demonstrate the importance of high-level abstraction when adopting a complex software development cycle within a multilayered heterogeneous IT ecosystem.
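A minimal sketch of the SIB idea in code: each building block wraps low-level logic behind a named, reusable interface that a domain expert composes into a flow. All names (read_soil_moisture, decide_irrigation) and the moisture threshold are illustrative assumptions; actual SIB libraries are composed graphically in a modelling environment, not written in Python.

```python
def make_sib(name, fn):
    """Wrap a function as a named, reusable building block that reads from
    and writes to a shared flow context."""
    def sib(context):
        context[name] = fn(context)
        return context
    return sib

# Two illustrative blocks: convert a raw ADC reading to percent moisture,
# then decide whether to irrigate (threshold is a made-up example value)
read_soil_moisture = make_sib("moisture", lambda ctx: ctx["raw_sensor"] / 1023 * 100)
decide_irrigation = make_sib("irrigate", lambda ctx: ctx["moisture"] < 30.0)

def run_flow(sibs, context):
    """Execute the blocks in order, threading the context through the flow."""
    for sib in sibs:
        context = sib(context)
    return context

result = run_flow([read_soil_moisture, decide_irrigation], {"raw_sensor": 250})
print(round(result["moisture"], 1), result["irrigate"])
```

The point of the abstraction is that the flow (the list passed to run_flow) is the only artifact the domain expert edits; the encapsulated code never changes.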
7. Quantitative evaluation of ChatGPT versus Bard responses to anaesthesia-related queries. Br J Anaesth 2024; 132:169-171. PMID: 37945414. DOI: 10.1016/j.bja.2023.09.030.
8. Unveiling the secrets of adeno-associated virus: novel high-throughput approaches for the quantification of multiple serotypes. Mol Ther Methods Clin Dev 2023; 31:101118. PMID: 37822717. PMCID: PMC10562196. DOI: 10.1016/j.omtm.2023.101118.
Abstract
Adeno-associated virus (AAV) vectors are among the most prominent viral vectors for in vivo gene therapy, and their investigation and development using high-throughput techniques have gained increasing interest. However, sample throughput remains a bottleneck in most analytical assays. In this study, we compared commonly used analytical methods for AAV genome titer, capsid titer, and transducing titer determination with advanced methods using AAV2, AAV5, and AAV8 as representative examples. For the determination of genomic titers, we evaluated the suitability of qPCR and four different digital PCR methods and assessed the respective advantages and limitations of each method. We found that both ELISA and bio-layer interferometry provide comparable capsid titers, with bio-layer interferometry reducing the workload and having a 2.8-fold higher linear measurement range. Determination of the transducing titer demonstrated that live-cell analysis required less manual effort compared with flow cytometry. Both techniques had a similar linear range of detection, and no statistically significant differences in transducing titers were observed. This study demonstrated that the use of advanced analytical methods provides faster and more robust results while simultaneously increasing sample throughput and reducing active bench work time.
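As a hedged aside on the genome-titer methods compared above: digital PCR back-calculates a titer from the fraction of negative partitions via the standard Poisson correction, which is one reason it is attractive against qPCR standard curves. The partition count, partition volume, and dilution factor below are illustrative, not values from the study.

```python
import math

def dpcr_titer(n_partitions, n_negative, partition_volume_ul, dilution_factor):
    """Genome titer (copies/µL of the undiluted sample) from digital PCR
    partition counts using the standard Poisson correction."""
    mean_copies = -math.log(n_negative / n_partitions)   # copies per partition
    return mean_copies / partition_volume_ul * dilution_factor

# 20,000 partitions of 0.85 nL (8.5e-4 µL), 14,000 negative, 1e6-fold dilution
print(f"{dpcr_titer(20000, 14000, 8.5e-4, 1e6):.3e}")
```

The Poisson step corrects for partitions that received more than one genome copy, so the estimate stays linear even when most partitions are positive.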
9. A Data-Driven Paradigm for a Resilient and Sustainable Integrated Health Information Systems for Health Care Applications. J Multidiscip Healthc 2023; 16:4015-4025. PMID: 38107085. PMCID: PMC10725635. DOI: 10.2147/jmdh.s433299.
Abstract
Introduction Many transformations and uncertainties, such as the fourth industrial revolution and pandemics, have propelled healthcare acceptance and deployment of health information systems (HIS). External and internal determinants aligning with the global course influence their deployment. At the epicenter is digitalization, which generates endless data that has permeated healthcare. The continuous proliferation of complex and dynamic healthcare data is the digitalization frontier in healthcare that necessitates attention. Objective This study explores the existing body of information on HIS for healthcare through the data lens to present a data-driven paradigm for healthcare augmentation paramount to attaining a sustainable and resilient HIS. Method A PRISMA-compliant (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) in-depth literature review was conducted systematically to synthesize and analyze the literature content and ascertain the value disposition of HIS data in healthcare delivery. Results This study details the aspects of a data-driven paradigm for a robust and sustainable HIS for health care applications. Data sources, data actions and decisions, data science techniques, serialization of data science techniques in the HIS, and data insight implementation and application are the data-driven features expounded. These are essential building blocks of the data-driven paradigm that need iteration to succeed. Discussion Existing literature considers the surge of data in healthcare challenging, disruptive, and potentially revolutionary. This view echoes the current healthcare quandary of good and bad data availability. Thus, data-driven insights are essential for building a resilient and sustainable HIS. People, technology, and tasks dominated prior HIS frameworks, with few data-centric facets. Improving healthcare and the HIS requires identifying and integrating crucial data elements. Conclusion The paper presented a data-driven paradigm for a resilient and sustainable HIS. The findings show that a data-driven track and its components are essential to improve healthcare using data analytics insights. It provides an integrated footing for data analytics to support and effectively assist health care delivery.
10. Investigations into mRNA Lipid Nanoparticles Shelf-Life Stability under Nonfrozen Conditions. Mol Pharm 2023; 20:6492-6503. PMID: 37975733. DOI: 10.1021/acs.molpharmaceut.3c00956.
Abstract
mRNA lipid nanoparticles (LNPs) can experience a decline in activity over short periods (ranging from weeks to months). As a result, they require frozen storage and transportation conditions to maintain their full functionality when utilized. Currently approved commercially available mRNA LNP vaccines also necessitate frozen storage and supply chain management. Overcoming this significant inconvenience in the future is crucial to reducing unnecessary costs and challenges associated with storage and transport. In this study, our objective was to illuminate the potential time frame for nonfrozen storage and transportation conditions of mRNA LNPs without compromising their activity. To achieve this goal, we conducted a stability assessment and an in vitro cell culture delivery study involving five mRNA LNPs. These LNPs were constructed using a standard formulation similar to that employed in the three commercially available LNP formulations. Among these formulations, we selected five structurally diverse ionizable lipids (C12-200, CKK-E12, MC3, SM-102, and lipid 23) from the existing literature. We incorporated these lipids into a standard LNP formulation, keeping all other components identical. The LNPs, carrying mRNA payloads, were synthesized using microfluidic mixing technology. We evaluated the shelf life stability of these LNPs over a span of 9 weeks at temperatures of 2-8, 25, and 40 °C, utilizing an array of analytical techniques. Our findings indicated minimal impact on the hydrodynamic diameter, zeta potential, encapsulation efficiency, and polydispersity of all LNPs across the various temperatures over the studied period. The RiboGreen assay analysis of LNPs showed consistent mRNA contents over several weeks at various nonfrozen storage temperatures, which could lead to the incorrect assumption that the LNPs remained intact and functional.
This misunderstanding was rectified by the significant differences observed in EGFP protein expression in an in vitro cell culture (using HEK293 cells) across the five LNPs. Specifically, only LNP 1 (C12-200) and LNP 4 (SM-102) exhibited high levels of EGFP expression at the start (T0), with over 90% of HEK293 cells transfected and mean fluorescence intensity (MFI) levels exceeding 1. Interestingly, LNP 1 (C12-200) maintained largely unchanged levels of in vitro activity over 11 weeks when stored at both 2-8 and 25 °C. In contrast, LNP 4 (SM-102) retained its functionality when stored at 2-8 °C over 11 weeks but experienced a gradual decline of in vitro activity when stored at room temperature over the same period. Importantly, we observed distinct LNP architectures for the five formulations through cryo-EM imaging. This highlights the necessity for a deeper comprehension of structure-activity relationships within these complex nanoparticle structures. Enhancing our understanding in this regard is vital for overcoming storage and stability limitations, ultimately facilitating the broader application of this technology beyond vaccines.
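The RiboGreen observation rests on simple assay arithmetic: fluorescence from intact LNPs reports free (unencapsulated) mRNA, fluorescence after detergent lysis reports total mRNA, and encapsulation efficiency is the difference. A stable value over time says nothing about whether the encapsulated mRNA still drives protein expression, which is the misunderstanding the abstract describes. The readings below are illustrative, not data from the study.

```python
def encapsulation_efficiency(fluor_intact, fluor_lysed):
    """Percent of mRNA encapsulated, from RiboGreen fluorescence of intact
    LNPs (dye reaches only free mRNA) versus detergent-lysed LNPs (dye
    reaches all mRNA)."""
    return (1.0 - fluor_intact / fluor_lysed) * 100.0

# A 'stable' EE% over storage does not imply the cargo is still active
print(round(encapsulation_efficiency(fluor_intact=120.0, fluor_lysed=2400.0), 1))
```

This is why the study pairs the assay with an in vitro expression readout: the two measurements answer different questions.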
11. Performance Analysis of Lambda Architecture-Based Big-Data Systems on Air/Ground Surveillance Application with ADS-B Data. Sensors (Basel) 2023; 23:7580. PMID: 37688034. PMCID: PMC10490665. DOI: 10.3390/s23177580.
Abstract
This study introduces a novel methodology designed to assess the accuracy of data processing in the Lambda Architecture (LA), an advanced big-data framework designed to process both streaming (data in motion) and batch (data at rest) data. Distinct from prior studies that have focused on hardware performance and scalability evaluations, our research uniquely targets the intricate aspects of data-processing accuracy within the various layers of LA. The salient contribution of this study lies in its empirical approach. For the first time, we provide empirical evidence that validates previously theoretical assertions about LA, which have remained largely unexamined due to LA's intricate design. Our methodology encompasses the evaluation of prospective technologies across all levels of LA, the examination of layer-specific design limitations, and the implementation of a uniform software development framework across multiple layers. Specifically, our methodology employs a unique set of metrics, including data latency and processing accuracy under various conditions, which serve as critical indicators of LA's accurate data-processing performance. Our findings compellingly illustrate LA's "eventual consistency". Despite potential transient inconsistencies during real-time processing in the Speed Layer (SL), the system ultimately converges to deliver precise and reliable results, as informed by the comprehensive computations of the Batch Layer (BL). This empirical validation not only confirms but also quantifies the claims posited by previous theoretical discourse, with our results indicating a 100% accuracy rate under various severe data-ingestion scenarios. We applied this methodology in a practical case study involving air/ground surveillance, a domain where data accuracy is paramount. This application demonstrates the effectiveness of the methodology using real-world data-intake scenarios, therefore distinguishing this study from hardware-centric evaluations.
This study not only contributes to the existing body of knowledge on LA but also addresses a significant literature gap. By offering a novel, empirically supported methodology for testing LA, a methodology with potential applicability to other big-data architectures, this study sets a precedent for future research in this area, advancing beyond previous work that lacked empirical validation.
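The batch/speed split and the "eventual consistency" claim can be sketched in a few lines: the serving layer merges an authoritative but stale batch view with fresh Speed Layer increments, and a later batch recomputation over the full immutable master dataset converges to the same answer. This is a toy model of LA for illustration, not the study's implementation; the ADS-B-style aircraft addresses are made up.

```python
def rebuild_batch_view(events):
    """Batch Layer: recompute per-aircraft message counts from the full
    immutable master dataset."""
    view = {}
    for _, icao in events:
        view[icao] = view.get(icao, 0) + 1
    return view

def merge_views(batch_view, speed_view):
    """Serving layer: combine the authoritative-but-stale batch view with
    fresh Speed Layer increments."""
    merged = dict(batch_view)
    for icao, count in speed_view.items():
        merged[icao] = merged.get(icao, 0) + count
    return merged

# ADS-B-style events: (timestamp, aircraft address); addresses illustrative
events = [(1, "4CA2B6"), (2, "4CA2B6"), (3, "AB1234")]
batch = rebuild_batch_view(events[:2])   # batch run happened before event 3
speed = {"AB1234": 1}                    # Speed Layer has seen only event 3
print(merge_views(batch, speed))

# Eventual consistency: a full batch recomputation gives the same answer
assert rebuild_batch_view(events) == merge_views(batch, speed)
```

Any transient inconsistency lives entirely in the speed view, so it is bounded by the interval between batch recomputations.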
12. Approach strategies and application of metabolomics to biotechnology in plants. Front Plant Sci 2023; 14:1192235. PMID: 37636096. PMCID: PMC10451086. DOI: 10.3389/fpls.2023.1192235.
Abstract
Metabolomics refers to the technology for the comprehensive analysis of metabolites and low-molecular-weight compounds in a biological system, such as cells or tissues. Metabolites play an important role in biological phenomena through their direct involvement in the regulation of physiological mechanisms, such as maintaining cell homeostasis or signal transmission through protein-protein interactions. The current review aims to provide a framework for how the integrated analysis of metabolites, their functional actions, and their inherent biological information can be used to understand biological phenomena related to the regulation of metabolites, and how this information can be applied to safety assessments of crops created using biotechnology. Advancements in technology and analytical instrumentation have led to new ways to examine the convergence between biology and chemistry, which has yielded a deeper understanding of complex biological phenomena. Metabolomics can be applied to safety assessments of biotechnology products through a systematic approach using metabolite-level data processing algorithms, statistical techniques, and database development. The integration of metabolomics data with sequencing data is a key step towards providing additional phenotypic evidence to elucidate the degree of environmental effects on variants found in genomes associated with metabolic processes. Moreover, information analysis technologies such as big data, machine learning, and IT investment must be introduced to establish a system for data extraction, selection, and metabolomic data analysis for the interpretation of the biological implications of biotechnology innovations. This review outlines the role of metabolomics assessments in determining the consequences of genetic engineering and biotechnology in plants.
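As a hedged illustration of the "metabolite-level data processing algorithms" the review refers to: one common preprocessing pipeline is total-sum normalization followed by a log transform before any statistical analysis. This is one of many possible pipelines, not one prescribed by the review; the peak intensities below are invented.

```python
import math

def normalize_and_log(intensities, scale=1e4):
    """Total-sum normalization followed by a log2 transform, a common
    metabolomics preprocessing step that makes samples with different
    overall signal levels comparable."""
    total = sum(intensities)
    return [math.log2(x / total * scale) for x in intensities]

# Raw peak intensities for three hypothetical metabolites in one sample
print([round(v, 2) for v in normalize_and_log([100.0, 300.0, 600.0])])
```

After this step, fold changes between metabolites become additive differences, which suits the downstream statistical techniques the review mentions.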
13. GD-OES Investigations of Lithium-Ion Battery Graphite Anodes: Impact of Plasma Parameters and Electrode Properties. ACS Appl Mater Interfaces 2023. PMID: 37409783. DOI: 10.1021/acsami.3c04485.
Abstract
Due to the demand for lithium-ion battery investigations with glow discharge optical emission spectroscopy (GD-OES), a fundamental study of the influence of essential GD-OES parameters on graphite anodes in an argon plasma was conducted and compared to previous investigations of massive materials. It is shown that an increased applied voltage (500-700 V) enhances the sputtering rate by up to 100% per 100 V while keeping the crater shape unaffected. In contrast, gas pressure variation seems to be the main tool for crater shape adjustment. Increasing the gas pressure (160-300 Pa) pushes the crater profile from a concave to a flat shape and back to concave. Known plasma effects are discussed and correlated with the observations. A set of measuring parameters providing a good balance between crater shape and sputtering rate is proposed. Additionally, an increase of the duty cycle in the pulsed glow discharge mode leads to a linear increase of the sputtering rate, while a rise in pulse duration enhances the sputtering rate in a nonlinear fashion. Thus, different pulsing conditions offer instruments for enhancing the sputtering rate without significantly affecting the crater shape. Our investigation of different electrode densities shows that lower densities lead to a larger sputtered volume as well as a larger concavity of the resulting crater.
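The reported voltage dependence (up to a 100% rate increase per 100 V over 500-700 V) can be expressed as a toy scaling model. Treating the increase as a compounding doubling per 100 V is our modelling assumption made purely for illustration, not the paper's fitted relationship.

```python
def relative_sputter_rate(voltage, base_voltage=500.0, doubling_per_v=100.0):
    """Sputtering rate relative to the rate at base_voltage, assuming the
    reported ~100% increase per 100 V compounds exponentially (an
    illustrative assumption, not the paper's fit)."""
    return 2.0 ** ((voltage - base_voltage) / doubling_per_v)

for v in (500, 600, 700):
    print(v, relative_sputter_rate(v))
```

Under this assumption the 500-to-700 V sweep quadruples the rate, which conveys why voltage, rather than pressure, is the throughput knob in the study.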
14. Signaling and meaning in organizational analytics: coping with Goodhart's Law in an era of digitization and datafication. J Comput Mediat Commun 2023; 28:zmad023. PMID: 37520858. PMCID: PMC10376445. DOI: 10.1093/jcmc/zmad023.
Abstract
The future of work will be measured. The increasing and widespread adoption of analytics, the use of digital inputs and outputs to inform organizational decision making, makes the communication of data central to organizing. This article applies and extends signaling theory to provide a framework for the study of analytics as communication. We report three cases that offer examples of dubious, selective, and ambiguous signaling in the activities of workers seeking to shape the meaning of data within the practice of analytics. The analysis casts the future of work as a game of strategic moves between organizations, seeking to measure behaviors and quantify the performance of work, and workers, altering their behavioral signaling to meet situated goals. The framework developed offers a guide for future examinations of the asymmetric relationship between management and workers as organizations adopt metrics to monitor and evaluate work.
15. Testing Multiple Methods to Effectively Promote Use of a Knowledge Portal to Health Policy Makers: Quasi-Experimental Evaluation. J Med Internet Res 2023; 25:e41997. PMID: 37379073. PMCID: PMC10365604. DOI: 10.2196/41997.
Abstract
BACKGROUND Health policy makers and advocates increasingly utilize online resources for policy-relevant knowledge. Knowledge brokering is one potential mechanism to encourage the use of research evidence in policy making, but the mechanisms of knowledge brokerage in online spaces are understudied. This work looks at knowledge brokerage through the launch of Project ASPEN, an online knowledge portal developed in response to a New Jersey legislative act that established a pilot program for adolescent depression screening for young adults in grades 7-12. OBJECTIVE This study compares the ability to drive policy brief downloads by policy makers and advocates from the Project ASPEN knowledge portal using a variety of online methods to promote the knowledge portal. METHODS The knowledge portal was launched on February 1, 2022, and a Google Ad campaign was run between February 27, 2022, and March 26, 2022. Subsequently, a targeted social media campaign, an email campaign, and tailored research presentations were used to promote the website. Promotional activities ended on May 31, 2022. Website analytics were used to track a variety of actions including new users coming to the website, page views, and policy brief downloads. Statistical analysis was used to assess the efficacy of different approaches. RESULTS The campaign generated 2837 unique user visits to the knowledge portal and 4713 page views. In addition, the campaign generated 6.5 policy web page views/day and 0.7 policy brief downloads/day compared with 1.8 views/day and 0.5 downloads/day in the month following the campaign. The rate of policy brief page view conversions was significantly higher for Google Ads compared with other channels such as email (16.0 vs 5.4; P<.001) and tailored research presentations (16.0 vs 0.8; P<.001). The download conversion rate for Google Ads was significantly higher compared with social media (1.2 vs 0.1; P<.001) and knowledge brokering activities (1.2 vs 0.2; P<.001). 
By contrast, the download conversion rate for the email campaign was significantly higher than that for social media (1.0 vs 0.1; P<.001) and tailored research presentations (1.0 vs 0.2; P<.001). While Google Ads for this campaign cost an average of US $2.09 per click, the cost per conversion was US $11 per conversion to drive targeted policy web page views and US $147 per conversion to drive policy brief downloads. While other approaches drove less traffic, those approaches were more targeted and cost-effective. CONCLUSIONS Four tactics were tested to drive user engagement with policy briefs on the Project ASPEN knowledge portal. Google Ads was shown to be effective in driving a high volume of policy web page views but was ineffective in terms of relative costs. More targeted approaches such as email campaigns and tailored research presentations given to policy makers and advocates to promote the use of research evidence on the knowledge portal website are likely to be more effective when balancing goals and cost-effectiveness.
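The channel comparison above boils down to two numbers per channel: the conversion rate (conversions per 100 clicks) and the cost per conversion. A quick sketch with illustrative raw counts (the study reports rates and costs, not the underlying click counts, so the numbers below are assumptions):

```python
def campaign_metrics(clicks, conversions, total_cost):
    """Conversion rate (%) and cost per conversion, the two figures used
    to compare ad channels in the study."""
    rate = 100.0 * conversions / clicks
    cost_per_conversion = total_cost / conversions
    return rate, cost_per_conversion

# 1,000 clicks at the reported US $2.09/click, with 12 brief downloads
rate, cpc = campaign_metrics(clicks=1000, conversions=12, total_cost=1000 * 2.09)
print(round(rate, 1), round(cpc, 2))
```

The same arithmetic explains the study's conclusion: a channel with cheap clicks but a low conversion rate can still be the most expensive per conversion.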
|
16
|
Rational Design of Topical Semi-Solid Dosage Forms-How Far Are We? Pharmaceutics 2023; 15:1822. [PMID: 37514009 PMCID: PMC10386014 DOI: 10.3390/pharmaceutics15071822] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/11/2023] [Revised: 06/14/2023] [Accepted: 06/20/2023] [Indexed: 07/30/2023] Open
Abstract
Specific aspects of semi-solid dosage forms for topical application include the nature of the barrier to be overcome, aspects of susceptibility to physical and chemical instability, and a greater influence of sensory perception. Advances in understanding the driving forces of skin penetration as well as the design principles and inner structure of formulations provide a good basis for the more rational design of such dosage forms, which still often follow more traditional design approaches. This review analyses the opportunities and constraints of rational formulation design approaches in the industrial development of new topical drugs. As the selection of drug candidates with favorable physicochemical properties increases the speed and probability of success, models for drug selection based on theoretical and experimental approaches are discussed. This paper reviews how progress in the scientific understanding of the mechanisms of skin penetration and the influence of the vehicle can be used for rational formulation design. The characterization of semi-solid formulations is discussed with a special focus on modern rheological approaches and analytical methods for investigating and optimizing the chemical stability of active ingredients in consideration of applicable guidelines. In conclusion, a good understanding of scientific principles, combined with early consideration of regulatory requirements for product quality, enables the successful development of innovative and robust semi-solid formulations for topical application.
|
17
|
Analytics, Properties and Applications of Biologically Active Stilbene Derivatives. Molecules 2023; 28:molecules28114482. [PMID: 37298957 DOI: 10.3390/molecules28114482] [Citation(s) in RCA: 2] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/16/2023] [Revised: 05/18/2023] [Accepted: 05/30/2023] [Indexed: 06/12/2023] Open
Abstract
Stilbene and its derivatives belong to the group of biologically active compounds. Some derivatives occur naturally in various plant species, while others are obtained by synthesis. Resveratrol is one of the best-known stilbene derivatives. Many stilbene derivatives exhibit antimicrobial, antifungal or anticancer properties. A thorough understanding of the properties of this group of biologically active compounds, and the development of their analytics from various matrices, will allow for a wider range of applications. This information is particularly important in an era of increasing incidence of hitherto unknown diseases, including COVID-19, which is still present in our population. The purpose of this study was to summarize information on the qualitative and quantitative analysis of stilbene derivatives, their biological activity, potential applications as preservatives, antiseptics and disinfectants, and stability analysis in various matrices. Optimal conditions for the analysis of the stilbene derivatives in question were developed using the isotachophoresis technique.
|
18
|
Precision quality control: a dynamic model for risk-based analysis of analytical quality. Clin Chem Lab Med 2023; 61:679-687. [PMID: 36617955 DOI: 10.1515/cclm-2022-1094] [Citation(s) in RCA: 3] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/29/2022] [Accepted: 12/20/2022] [Indexed: 01/10/2023]
Abstract
OBJECTIVES There is continuing pressure to improve the cost-effectiveness of quality control (QC) for clinical laboratory testing. Risk-based approaches are promising, but recent research has uncovered problems in some common methods. There is a need for improvements in risk-based methods for quality control. METHODS We provide an overview of a dynamic model for assay behavior. We demonstrate the practical application of the model using simulation and compare the performance of simple Shewhart QC monitoring against Westgard rules. We also demonstrate the utility of trade-off curves for analysis of QC performance. RESULTS Westgard rules outperform simple Shewhart control over a narrow range of the trade-off curve of false-positive and false-negative risk. The risk trade-off can be visualized in terms of risk, risk vs. cost, or cost. Risk trade-off curves can be "smoothed" by log transformation. CONCLUSIONS Dynamic risk models may provide advantages relative to static models for risk-based QC analysis.
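The Shewhart-versus-Westgard comparison can be reproduced in outline by simulation. A minimal sketch, assuming standardized in-control QC results and only the 1-3s and 2-2s rules (the full Westgard scheme includes further rules), estimating each scheme's false-positive rate per run:

```python
import random

random.seed(0)

def shewhart_flag(run):
    # 1-3s rule: any point beyond +/-3 SD triggers rejection
    return any(abs(z) > 3 for z in run)

def westgard_flag(run):
    # Minimal multirule sketch: 1-3s plus 2-2s (two consecutive
    # points beyond the same +/-2 SD limit)
    if any(abs(z) > 3 for z in run):
        return True
    return any((a > 2 and b > 2) or (a < -2 and b < -2)
               for a, b in zip(run, run[1:]))

# False-positive rates on simulated in-control runs of 10 standardized results
runs = [[random.gauss(0, 1) for _ in range(10)] for _ in range(20000)]
fp_shewhart = sum(map(shewhart_flag, runs)) / len(runs)
fp_westgard = sum(map(westgard_flag, runs)) / len(runs)
```

Adding rules raises error-detection power but also the false-rejection rate, which is exactly the trade-off curve the abstract describes.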
|
19
|
The laboratory journey to become a decision engine: a roadmap for diagnostic transformation. Clin Chem Lab Med 2023; 61:576-579. [PMID: 36739524 DOI: 10.1515/cclm-2022-0889] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/08/2022] [Accepted: 01/16/2023] [Indexed: 02/06/2023]
Abstract
Laboratories and diagnostic departments preside over a massive amount of data yet fail to fully leverage it. Data is the new black gold of healthcare organizations, and by extracting insights from it, laboratories could become true decision engines, able to drive action across healthcare. This opinion paper addresses three fundamental questions: (1) Where are we (diagnostic parties)? A look at the most significant trends and challenges in healthcare, shedding some light upon the status of diagnostics. (2) Where do we want to be? A review of the opportunities for digital health, its role in the healthcare of the future, and inspiration about what success looks like. (3) What do we need to do? An explanation of what Digital Health Solutions (DHS) from Abbott is doing in this regard, including how DHS can impact the Diagnosis Cycle and how to set a roadmap for laboratories and diagnostic organizations. The Diagnosis Cycle comprises the steps in the diagnosis process, from when a patient is seen by a clinician and tests are ordered until the results are reviewed by the clinician and treatment, follow-up, or discharge is decided.
|
20
|
Increasing yield of in vitro transcription reaction with at-line high pressure liquid chromatography monitoring. Biotechnol Bioeng 2023; 120:737-747. [PMID: 36471904 DOI: 10.1002/bit.28299] [Citation(s) in RCA: 10] [Impact Index Per Article: 10.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/10/2022] [Revised: 10/27/2022] [Accepted: 12/01/2022] [Indexed: 12/12/2022]
Abstract
The COVID-19 pandemic triggered an unprecedented rate of development of messenger ribonucleic acid (mRNA) vaccines, which are produced by in vitro transcription reactions. The latter has been the focus of intense development to increase productivity and decrease cost. Optimization of in vitro transcription (IVT) depends on understanding the impact of individual reagents on the kinetics of mRNA production and the consumption of building blocks, which is hampered by slow, low-throughput, end-point analytics. We implemented a workflow based on rapid at-line high pressure liquid chromatography (HPLC) monitoring of consumption of nucleoside triphosphates (NTPs) with concomitant production of mRNA, with a sub-3 min read-out, allowing for adjustment of IVT reaction parameters with minimal time lag. IVT was converted to fed-batch mode, doubling the reaction yield compared to the batch IVT protocol and reaching 10 mg/ml for multiple constructs. When coupled with exonuclease digestion, HPLC analytics for quantification of mRNA was extended to monitoring capping efficiency of produced mRNA. When HPLC monitoring was applied to production of an anti-reverse cap analog (ARCA)-capped mRNA construct, which requires an approximate 4:1 ARCA:guanosine triphosphate ratio, the optimized fed-batch approach achieved productivity of 9 mg/ml with 79% capping. The study provides a methodological platform for optimizing factors that influence IVT reactions, converting the reaction from batch to fed-batch mode, and determining reaction kinetics, which are critical for optimizing the continuous addition of reagents, thus in principle enabling continuous manufacturing of mRNA.
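The at-line monitoring workflow amounts to a measure-then-feed loop: read NTP levels every HPLC cycle and top up when the pool runs low. A hypothetical sketch (the threshold, bolus size, and consumption factor are invented for illustration, not taken from the study):

```python
def feed_decision(ntp_mM: float, threshold_mM: float = 2.0,
                  bolus_mM: float = 4.0) -> float:
    """Return the NTP bolus (mM) to add after an at-line HPLC read-out.

    Hypothetical fed-batch rule: replenish NTPs once the pool drops
    below a threshold, instead of charging everything at t = 0 (batch).
    """
    return bolus_mM if ntp_mM < threshold_mM else 0.0

# Simulated reaction trajectory: each ~3 min HPLC cycle consumes NTPs
# as mRNA is produced (fixed fractional consumption, illustrative only).
ntp = 8.0
feeds = []
for cycle in range(10):
    ntp *= 0.6                 # consumption between read-outs
    bolus = feed_decision(ntp)
    feeds.append(bolus)
    ntp += bolus
```

The loop keeps the NTP pool within a working band rather than letting it deplete, which is the rationale for fed-batch over batch operation.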
|
21
|
Editorial: Advanced analytic techniques in developmental neuroscience. Front Neurosci 2023; 16:1070337. [PMID: 36685248 PMCID: PMC9851148 DOI: 10.3389/fnins.2022.1070337] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/14/2022] [Accepted: 12/12/2022] [Indexed: 01/07/2023] Open
|
22
|
Risk Analysis for Quality Control Part 2: Theoretical Foundations for Risk Analysis. J Appl Lab Med 2023; 8:23-33. [PMID: 36610426 DOI: 10.1093/jalm/jfac106] [Citation(s) in RCA: 2] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/31/2022] [Accepted: 09/28/2022] [Indexed: 01/09/2023]
Abstract
BACKGROUND Risk analysis can be used to determine control limits for quality control (QC). The Parvin model is the most commonly used method for risk analysis; however, the Parvin model rests on assumptions that have been shown to produce paradoxical results and to underestimate risk. There is a need for an improved framework for risk analysis. METHODS We developed a dynamic model (Markov Reward Model) to analyze the long-term behavior of an assay under the influence of a QC monitoring system. The model is flexible and accounts for different patterns of assay behavior (shift frequency, shift distribution) and the impact of error on patient outcomes. The model determines the distribution of undetected reported errors and the frequency of false-positive laboratory results as a function of QC settings. The model accounts for the competing risks (false detections, shifts in the mean) that cause an assay to move from an in-control state to an out-of-control state. RESULTS The model provides a tradeoff curve that expresses the cost to prevent an unacceptable reported result in terms of laboratory cost (false-positive QC). The model can be used to optimize settings of a particular QC method or to compare the performance of different methods. CONCLUSIONS We developed an evaluation method that determines the cost to reduce the risk to patients (reported results with unacceptable errors) in terms of laboratory costs (false-positive QC).
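The long-run behavior described here can be illustrated with the smallest possible Markov model: two states, in control and undetected shift. All probabilities below are invented for illustration; the paper's model is far richer (shift distributions, patient impact, reward structure):

```python
# Two-state sketch of a Markov model for QC risk analysis.
# States: "in" (in control) and "out" (undetected shift).
p_shift = 0.01    # chance a shift starts during a run
p_detect = 0.90   # chance QC flags (and fixes) an existing shift
p_false = 0.03    # chance QC falsely rejects an in-control run

# Per-run transition probabilities between the two states
p_in_to_out = p_shift * (1 - p_detect)   # shift starts and goes undetected
p_out_to_in = p_detect                   # shift is caught and corrected

# Stationary distribution of the two-state chain (closed form)
pi_out = p_in_to_out / (p_in_to_out + p_out_to_in)
pi_in = 1 - pi_out

# Long-run rates that trade off against each other
false_reject_rate = pi_in * p_false   # laboratory cost
error_report_rate = pi_out            # patient risk (time spent out of control)
```

Tightening QC limits raises `p_detect` and `p_false` together, which traces out exactly the laboratory-cost-versus-patient-risk tradeoff curve the abstract describes.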
|
23
|
Using web analytics data to identify platforms and content that best engage high-priority HIV populations in online and social media marketing advertisements. Digit Health 2023; 9:20552076231216547. [PMID: 38025100 PMCID: PMC10668575 DOI: 10.1177/20552076231216547] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/01/2023] [Accepted: 11/08/2023] [Indexed: 12/01/2023] Open
Abstract
Background Online advertisements on social media platforms are an important tool for engaging relevant populations in public health research. However, little is known about what platforms and ad characteristics are most effective in engaging high-priority HIV populations, including racial/ethnic and sexual minority individuals. Methods Data from this study were drawn from advertising campaigns conducted on popular websites and social media platforms that recruited for several nationwide randomized controlled trials of various HIV prevention and testing strategies among sexual minority men (SMM) from December 2019 until March 2022. Descriptive statistics and LASSO regression models were used to determine which platforms and ad characteristics were associated with significantly higher odds of engagement. Results Ads on Google search, Facebook, and Instagram yielded the most cost-effective engagement, while gay-oriented dating platforms and TrafficJunky yielded the highest percentage of users who appeared to meet basic eligibility criteria. The highest percentages of Black users were screened through ads on Jack'd, TrafficJunky, and Google search; for Hispanic/Latino users, these were Google search, Grindr, Facebook, and Instagram. Analyzing ad characteristics, we found that ads that used suggestive content, animation, and included study or institution logos were associated with greater engagement. Ads that emphasized convenience of the research (e.g. mentioned participating "from home") and that depicted people of similar races/ethnicities were also associated with greater engagement among Black and Hispanic/Latino sexual minority men. Conclusions We found that advertisements on mainstream social media sites are the most cost-effective. Although gay-oriented dating platforms were much more effective at reaching the target population, they were considerably more expensive. We also identified ad characteristics that were particularly effective in engaging users.
These results could inform the design of online public health outreach campaigns for similar populations to improve their engagement and reach. Findings also demonstrated the value of conducting focused research on the effectiveness of various online marketing strategies.
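Comparisons like the ones above rest on tests of differences between conversion rates across channels. One standard choice is a pooled two-proportion z-test; a minimal sketch (the counts are hypothetical, chosen only to be consistent with a 1.2% vs 0.1% rate; the study's own analysis used LASSO regression, not this test):

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Pooled two-proportion z-test comparing conversion rates of
    two ad channels; returns the z statistic and two-sided p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Hypothetical counts: 36 conversions from 3000 clicks (1.2%)
# vs 5 conversions from 5000 clicks (0.1%)
z, p = two_proportion_z(conv_a=36, n_a=3000, conv_b=5, n_b=5000)
```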
|
24
|
From Project-Based Health Literacy Data and Measurement to an Integrated System of Analytics and Insights: Enhancing Data-Driven Value Creation in Health-Literate Organizations. INTERNATIONAL JOURNAL OF ENVIRONMENTAL RESEARCH AND PUBLIC HEALTH 2022; 19:13210. [PMID: 36293791 PMCID: PMC9603602 DOI: 10.3390/ijerph192013210] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 09/14/2022] [Revised: 10/11/2022] [Accepted: 10/12/2022] [Indexed: 06/16/2023]
Abstract
Health literacy measurement is important to improve equity, health and well-being as part of health system transformation. However, health literacy data of good quality are often lacking or difficult to access for decision-makers. To better inform policy, research and practice, this paper discusses how to move from project-based health literacy data and measurement to an integrated system of analytics and insights enhancing data-driven value creation in health-literate organizations. There is a need for the development of health literacy data pipelines, data dashboards, and data governance mechanisms which are timely and trustworthy. Investing in health literacy data analytics and data governance can pave the way for the integration of health literacy as an acknowledged global health indicator in large-scale surveys, ventures, and daily business. Leadership and management buy-in are needed to steer the process. Lessons learned from decades of measurement research combined with strategic implementation of systematic use of health literacy monitoring may accelerate the progress.
|
25
|
Accuracy of Physician Electronic Health Record Usage Analytics using Clinical Test Cases. Appl Clin Inform 2022; 13:928-934. [PMID: 36198309 PMCID: PMC9534596 DOI: 10.1055/s-0042-1756424] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/06/2022] [Accepted: 07/25/2022] [Indexed: 11/02/2022] Open
Abstract
Usage log data are an important data source for characterizing the potential burden related to use of the electronic health record (EHR) system. However, the utility of this data source has been hindered by concerns related to the real-world validity and accuracy of the data. While time-motion studies have historically been used to address this concern, the restrictions caused by the pandemic have made it difficult to carry out these studies in person. In this regard, we introduce a practical approach for conducting validation studies for usage log data in a controlled environment. Developing test runs based on clinical workflows and conducting them within a test EHR environment allows for both comparison of the recorded timings and retrospective investigation of any discrepancies. In this case report, we describe the utility of this approach for validating our physician EHR usage logs at a large academic teaching mental health hospital in Canada. A total of 10 test runs were conducted across 3 days to validate 8 EHR usage log metrics, finding differences between the recorded measurements and the usage analytics platform ranging from 9% to 60%.
|
26
|
Data Analytics, Self-Organization, and Security Provisioning for Smart Monitoring Systems. SENSORS (BASEL, SWITZERLAND) 2022; 22:s22197201. [PMID: 36236298 PMCID: PMC9571973 DOI: 10.3390/s22197201] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/07/2022] [Revised: 09/16/2022] [Accepted: 09/17/2022] [Indexed: 05/08/2023]
Abstract
Internet availability and its integration with smart technologies have reached everyday objects and things and opened new areas, such as the Internet of Things (IoT). IoT refers to a concept where smart devices or things are connected and create a network. This new area has suffered from big-data handling and security issues. There is a need to design a data analytics model using new 5G technologies, architecture, and a security model. Reliable data communication in the presence of malicious nodes is always one of the challenges in these networks: malicious nodes generate inaccurate information and breach users' security. In this paper, a data analytics model and self-organizing architecture for IoT networks are proposed to understand the different layers of technologies and processes. The proposed model is designed for smart environmental monitoring systems. This paper also proposes a security model based on an authentication, detection, and prediction mechanism for IoT networks. The proposed model enhances security and protects the network from DoS and DDoS attacks. It is evaluated in terms of accuracy, sensitivity, and specificity using machine learning algorithms.
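The three evaluation criteria named here are standard confusion-matrix quantities. A minimal sketch with hypothetical detection counts for an attack-traffic classifier:

```python
def classification_metrics(tp: int, fp: int, tn: int, fn: int):
    """Accuracy, sensitivity and specificity from a confusion matrix --
    the three criteria used to evaluate the security model."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)   # attack traffic correctly flagged
    specificity = tn / (tn + fp)   # legitimate traffic correctly passed
    return accuracy, sensitivity, specificity

# Hypothetical counts for a DoS/DDoS detector over 1000 flows
acc, sens, spec = classification_metrics(tp=180, fp=15, tn=785, fn=20)
```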
|
27
|
Observed heterogeneity in players' football performance analysis using PLS-PM. J Appl Stat 2022; 50:3088-3107. [PMID: 37969543 PMCID: PMC10631390 DOI: 10.1080/02664763.2022.2101044] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/02/2021] [Accepted: 07/08/2022] [Indexed: 10/17/2022]
Abstract
Nowadays, data science is applied in several areas of daily life, and there have been many applications to sports. In this context, attention is focused on football (i.e. 'soccer' for Americans), where strategic choices, whether by the scouting department of the football club, the technical staff, or the management, are crucial. Football players' performance in the 2018/2019 season was measured and monitored for the top five European leagues, using data provided by Electronic Arts (EA) experts and available on the Kaggle data science platform. For this purpose, with the help of football experts, a third-order partial least-squares path model (PLS-PM) approach was applied to the sofifa key performance indices in order to compute a composite indicator differentiated by role and compare it with the well-known overall indicator from EA Sports. Players' observed heterogeneity (i.e. roles and leagues) was taken into account, since experts often refer to differences in these features, and the objective was to verify their importance scientifically. The results are consistent with this, underlining that some sub-areas of performance carry different weights depending on the role.
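The role-differentiated composite indicator amounts to a weighted aggregation of sub-area performance indices, with weights that differ by role. A toy sketch (the weights and scores are invented for illustration; the paper estimates the weights via PLS-PM):

```python
# Hypothetical role-specific weights over three performance sub-areas
weights = {
    "defender": {"defending": 0.5, "passing": 0.3, "attacking": 0.2},
    "forward":  {"defending": 0.1, "passing": 0.3, "attacking": 0.6},
}

def composite(player_scores: dict, role: str) -> float:
    """Weighted aggregation of sub-area indices into one composite."""
    w = weights[role]
    return sum(w[area] * player_scores[area] for area in w)

player = {"defending": 80, "passing": 70, "attacking": 60}
# The same indices aggregate differently depending on the role:
as_defender = composite(player, "defender")
as_forward = composite(player, "forward")
```

This is why a single overall rating can mis-rank players whose strengths sit in sub-areas their role weights heavily.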
|
28
|
Abstract
Big Data has proved to be vast and complex, and not efficiently manageable through traditional architectures, while data analysis is considered crucial for both technical and non-technical stakeholders. Current analytics platforms are siloed for specific domains, whereas the requirements to enhance their use and lower their technicalities are continuously increasing. This paper describes a domain-agnostic, single-access, autoscaling Big Data analytics platform, namely Diastema, as a collection of efficient and scalable components, offering user-friendly analytics through graph data modelling and supporting technical and non-technical stakeholders. Diastema's applicability is evaluated in healthcare through a predictive classifier for a COVID-19 dataset, considering real-world constraints.
|
29
|
ASAS-NANP symposium: mathematical modeling in animal nutrition: the progression of data analytics and artificial intelligence in support of sustainable development in animal science. J Anim Sci 2022; 100:6567454. [PMID: 35412610 PMCID: PMC9171329 DOI: 10.1093/jas/skac111] [Citation(s) in RCA: 6] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/27/2022] [Accepted: 04/09/2022] [Indexed: 12/01/2022] Open
Abstract
A renewed interest in data analytics and decision support systems in developing automated computer systems is facilitating the emergence of hybrid intelligent systems by combining artificial intelligence (AI) algorithms with classical modeling paradigms such as mechanistic modeling (HIMM) and agent-based models (iABM). Data analytics have evolved remarkably, and the scientific community may not yet fully grasp the power and limitations of some tools. Existing statistical assumptions might need to be re-assessed to provide a more thorough competitive advantage in animal production systems towards sustainability. This paper discussed the evolution of data analytics from a competitive advantage perspective within academia and illustrated the combination of different advanced technological systems in developing HIMM. The progress of analytical tools was divided into three stages: collect and respond, predict and prescribe, and smart learning and policy making, depending on the level of their sophistication (simple to complicated analysis). The collect and respond stage is responsible for ensuring the data is correct and free of influential data points, and it represents the data and information phases for which data are cataloged and organized. The predict and prescribe stage results in gained knowledge from the data and comprises most predictive modeling paradigms, and optimization and risk assessment tools are used to prescribe future decision-making opportunities. The third stage aims to apply the information obtained in the previous stages to foment knowledge and use it for rational decisions. This stage represents the pinnacle of acquired knowledge that leads to wisdom, and AI technology is intrinsic. Although still incipient, HIMM and iABM form the forthcoming stage of competitive advantage. 
HIMM may not increase our ability to understand the underlying mechanisms controlling the outcomes of a system, but it may increase the predictive ability of existing models by helping the analyst explain more of the data variation. The scientific community still has some issues to be resolved, including the lack of transparency and reporting of AI that might limit code reproducibility. It might be prudent for the scientific community to avoid the shiny object syndrome (i.e., AI) and look beyond the current knowledge to understand the mechanisms that might improve productivity and efficiency to lead agriculture towards sustainable and responsible achievements.
|
30
|
ASHP Statement on the Pharmacy Technician's Role in Pharmacy Informatics. Am J Health Syst Pharm 2022; 79:1449-1452. [PMID: 35640562 DOI: 10.1093/ajhp/zxac136] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/14/2022] Open
|
31
|
The effects of travel on performance: a 13-year analysis of the National Rugby League (NRL) competition. SCI MED FOOTBALL 2022; 6:60-65. [PMID: 35236226 DOI: 10.1080/24733938.2021.1876243] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/22/2022]
Abstract
The purpose was to investigate the effects of travel on performance in the National Rugby League (NRL). A total of 4,704 observations from 2,352 NRL matches (2007-2019) were analysed. The effect of travel on match outcome (i.e., win/loss) was analysed using a generalized linear mixed model, and the points difference using a linear mixed model. For every 1,000 km travelled in the NRL, the estimated probability of winning a match changed by -2.7% [-5.7 to 0.3%] and the estimated points difference by -1.1 [-2.0 to -0.2] points. In relation to every 1,000 km travelled, the 2007-2009 seasons had the greatest reduction in the likelihood of winning a match (-2.7% [-4.7 to -0.6%]), with the 2018-2019 seasons having the greatest likelihood (1.1% [-1.2 to 3.3%]). Regarding inter-state travel, teams from the state of Queensland had the greatest reduction in the likelihood of winning a match while the team from the state of Victoria had the greatest likelihood, although there were no clear differences between states. These data suggest that travel has impacted performance in NRL matches, although this effect has diminished over time. These findings are useful for practitioners who prepare athletes in sports where frequent short-haul travel is required.
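The headline estimate can be turned into a back-of-envelope predictor. A sketch assuming the per-1,000 km effect applies linearly on the probability scale (an illustrative simplification around a 50% baseline; the paper itself fits a generalized linear mixed model):

```python
def win_probability(distance_km: float,
                    base_prob: float = 0.5,
                    effect_per_1000km: float = -0.027) -> float:
    """Win probability after travelling distance_km, using the abstract's
    point estimate of -2.7 percentage points per 1,000 km travelled.
    Linear extrapolation and the 0.5 baseline are our assumptions."""
    p = base_prob + effect_per_1000km * (distance_km / 1000)
    return min(max(p, 0.0), 1.0)   # clamp to a valid probability

p_home = win_probability(0)      # no travel
p_away = win_probability(1600)   # a hypothetical 1,600 km trip
```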
|
32
|
Leveraging predictive analytics to reduce influenza and COVID-19-related adverse events. Nursing 2022; 52:35-37. [PMID: 35196281 PMCID: PMC8862671 DOI: 10.1097/01.nurse.0000806160.64587.92] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/05/2022]
Abstract
Optimizing technology can help healthcare facilities better manage the risks from a potential "twindemic" of COVID-19 and influenza. This article explores the use of predictive analytics and artificial intelligence to address these risks and improve overall patient care and safety amid a simultaneous pandemic and flu season.
|
33
|
Innovative Platform for the Advanced Online Monitoring of Three-Dimensional Cells and Tissue Cultures. Cells 2022; 11:cells11030412. [PMID: 35159222 PMCID: PMC8834321 DOI: 10.3390/cells11030412] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/15/2021] [Revised: 01/21/2022] [Accepted: 01/22/2022] [Indexed: 12/12/2022] Open
Abstract
The use of 3D cell cultures has gained increasing importance in medical and pharmaceutical research. However, the analysis of the culture medium is hardly representative of the culture conditions within a 3D model, which hinders the standardization of 3D cultures and translation of results. Therefore, we developed a modular monitoring platform combining a perfusion bioreactor with an integrated minimally invasive sampling system and implemented sensors, enabling the online monitoring of culture parameters and medium compounds within 3D cultures. As a proof-of-concept, primary cells as well as cell lines were cultured on a collagen or gelatin methacryloyl (GelMA) hydrogel matrix, while monitoring relevant culture parameters and analytes. Comparing the interstitial fluid of the 3D models versus the corresponding culture medium, we found considerable differences in the concentrations of several analytes. These results clearly demonstrate that analyses of the culture medium only are not relevant for the development of standardized 3D culture processes. The presented bioreactor with an integrated sampling and sensor platform opens new horizons for the development, optimization, and standardization of 3D cultures. Furthermore, this technology holds the potential to reduce animal studies and improve the transferability of pharmaceutical in vitro studies by gaining more relevant results, bridging the gap towards clinical translation.
|
34
|
Multi-criteria decision-making leveraged by text analytics and interviews with strategists. JOURNAL OF MARKETING ANALYTICS 2022. [PMCID: PMC8317142 DOI: 10.1057/s41270-021-00125-8] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/15/2023]
Abstract
Strategic decision-making in organisations is a complex process affected by preferences, experiences, perspectives, and knowledge, which, in most cases, are ambiguous, contradictory, and represented in unstructured data. This paper develops a methodological framework to address strategic decision-making processes from a multi-criteria perspective, assisted by text analytics and interviews. The framework comprises five stages and 12 steps, and is empirically tested in a decision scenario involving a strategic focus for future analytics initiatives in order to stimulate value generation from analytics. The proposed framework enables the discovery, validation, and prioritisation of strategic patterns from relevant interview data. Among six decision alternatives discovered in the validation scenario, customer analytics was the strategic focus most relevant to future analytics initiatives. This article contributes to understanding and addressing complex decision-making processes and mixed research in organisations, through a multi-criteria perspective leveraged by a text-driven computational approach.
|
35
|
Need We Say More? Measuring Redundancy in Nursing Progress Notes. Stud Health Technol Inform 2021; 284:65-67. [PMID: 34920473 DOI: 10.3233/shti210667] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
|
36
|
Integration of Web Analytics Into Graduate Medical Education: Usability Study. JMIR Form Res 2021; 5:e29748. [PMID: 34898459 PMCID: PMC8713092 DOI: 10.2196/29748] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/19/2021] [Revised: 08/17/2021] [Accepted: 09/18/2021] [Indexed: 11/13/2022] Open
Abstract
Background Web analytics is the measurement, collection, analysis, and reporting of website and web application usage data. While common in the e-commerce arena, web analytics is underutilized in graduate medical education (GME). Objective The University of Arkansas for Medical Sciences Department of Surgery website was revamped with input from in-house surgeons in August 2017. This study investigated the use of web analytics to gauge the impact of our department’s website redesign project. Methods Google Analytics software was used to measure website performance before and after implementation of the new website. Eight-month matched periods were compared. Factors tracked included total users, new users, total sessions, sessions per user, pages per session, average session duration, total page views, and bounce rate (the percentage of visitors who visit a site and then leave [ie, bounce] without continuing to another page on the same site). Results Analysis using a nonpaired Student t test demonstrated a statistically significant increase for total page views (before vs after: 33,065 vs 81,852; P<.001) and decrease for bounce rate (before vs after: 50.70% vs 0.23%; P<.001). Total users, new users, total sessions, sessions per user, and pages per session showed improvement. The average session duration was unchanged. Subgroup analysis showed that after the main page, the next 3 most frequently visited pages were related to GME programs in our department. Conclusions Web analytics is a practical measure of a website’s efficacy. Our data suggest that a modern website significantly improves user engagement. An up-to-date website is essential for contemporary GME recruitment, will likely enhance engagement of residency applicants with GME programs, and warrants further investigation.
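Bounce rate, as defined parenthetically above, is simply the share of sessions that leave without viewing a second page. A minimal sketch with hypothetical session counts (the figures are ours, chosen to resemble the pre-redesign value reported):

```python
def bounce_rate(single_page_sessions: int, total_sessions: int) -> float:
    """Bounce rate as defined in the abstract: the percentage of
    sessions that leave the site without viewing another page."""
    return 100 * single_page_sessions / total_sessions

# Hypothetical session counts before and after a redesign
before = bounce_rate(507, 1000)   # ~50.7%, like the pre-redesign figure
after = bounce_rate(2, 870)       # well under 1%
```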
|
37
|
Creating a Live and Flexible Normative Dataset for Netball. Front Sports Act Living 2021; 3:743612. [PMID: 34746778 PMCID: PMC8570700 DOI: 10.3389/fspor.2021.743612] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 07/19/2021] [Accepted: 09/17/2021] [Indexed: 11/24/2022] Open
Abstract
Previous research has identified the large data and information sources that exist about netball performance and align with coaches' discussions during games. Normative data provides context to measures across many disciplines, such as fitness testing, physical conditioning, and body composition. These data are normally presented in tables as representations of a population, categorized for benchmarking. Normative data does not yet exist for benchmarking or contextualization in netball, even though coaches and players routinely use performance statistics. A systems design methodology was adopted for this study, in which a process for automating the organization, normalization, and contextualization of netball performance data was developed. To maintain good ecological validity, a case study utilized expert coach feedback on the understandability and usability of visual representations of netball performance population data. This paper provides coaches with benchmarks for assessing player performances across competition levels and player positions for a set of performance indicators. It also provides insights for performance analysts into how to present these benchmarks in an automated "real-time" reporting tool.
|
38
|
Managing demand volatility during unplanned events with sentiment analysis: a case study of the COVID-19 pandemic. IFAC-PAPERSONLINE 2021; 54:1017-1022. [PMID: 38620115 PMCID: PMC10226410 DOI: 10.1016/j.ifacol.2021.08.200] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 04/17/2024]
Abstract
Unplanned events such as natural disasters or epidemic outbreaks are usually accompanied by supply chain disruption and highly volatile markets. Moreover, the recent COVID-19 crisis has shown that existing artificial intelligence systems and data analytics models, which normally provide valuable support in demand forecasting, have not been able to manage demand volatility. This study contributes to addressing this issue and aims to determine whether sentiments conveyed by news media influence consumer behavior. It presents a case study conducted in three steps: (1) data were collected and prepared; (2) a sentiment analysis model was developed; and (3) a statistical analysis was performed to measure the correlation between sentiments in news and drug consumption during the COVID-19 crisis. Findings highlighted a strong positive correlation between sentiments in news and consumption variability. They therefore suggest that sentiments in news have strong predictive power for demand forecasting in unplanned situations.
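The correlation step (3) reduces to computing a coefficient between two weekly series. A pure-Python sketch; the negativity scores and demand changes below are hypothetical stand-ins for the study's news-sentiment and drug-consumption data:

```python
# Sketch of the study's step (3): correlating a news-sentiment signal
# with drug-demand changes. All numbers below are hypothetical.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Weekly negativity of COVID-19 news coverage (0 = neutral, 1 = very
# negative) and the same week's % change in drug consumption.
negativity = [0.8, 0.6, 0.7, 0.3, 0.1, 0.2]
demand_change = [35.0, 22.0, 28.0, 9.0, 4.0, -5.0]

r = pearson(negativity, demand_change)
print(f"Pearson r = {r:.2f}")  # strongly positive for this toy series
```

A real analysis would also test significance and check for lag effects between news coverage and purchasing behavior.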
|
39
|
An active human role is essential in big data-led decisions and data-intensive science. F1000Res 2021; 10:1127. [PMID: 38435673 PMCID: PMC10905148 DOI: 10.12688/f1000research.73876.1] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Accepted: 10/27/2021] [Indexed: 03/05/2024] Open
Abstract
Big data is transforming many sectors, with far-reaching consequences for how decisions are made and how knowledge is produced and shared. In the current move toward more data-led decisions and data-intensive science, we aim here to examine three issues that are changing the way data are read and used. First, there is a shift toward paradigms that involve a large amount of data. In such paradigms, the creation of complex data-led models becomes tractable and appealing for generating predictions and explanations. This necessitates, for instance, a rethinking of Occam's razor principle in the context of knowledge discovery. Second, there is a growing erosion of the human role in decision making and knowledge discovery processes. Human users' involvement is decreasing at an alarming rate, with no say in how data are read, processed, and summarized. This makes legal responsibility and accountability hard to define. Third, thanks to its increasing popularity, big data is gaining a seductive allure, where the volume and complexity of big data can de facto confer more persuasion and significance on knowledge or decisions that result from big-data-based processes. These issues call for an active human role: creating opportunities to incorporate, in the most unbiased way, human expertise and prior knowledge in decision making and knowledge production. This also requires putting in place robust monitoring and appraisal mechanisms to ensure that relevant data are answering the right questions. As the proliferation of data continues to grow, we need to rethink the way we interact with data to serve human needs.
|
40
|
Analytics and Lean Health Care to Address Nurse Care Management Challenges for Inpatients in Emerging Economies. J Nurs Scholarsh 2021; 53:803-814. [PMID: 34668285 PMCID: PMC9297932 DOI: 10.1111/jnu.12711] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Received: 08/31/2020] [Revised: 05/25/2021] [Accepted: 08/17/2021] [Indexed: 11/30/2022]
Abstract
Purpose Prescriptive and predictive analytics and artificial intelligence (AI) provide tools to analyze data with objectivity. In this paper, we provide an overview of how these techniques can improve nursing care, and we detail a quantitative model that affords managerial insights about care management in a hospital in Colombia. Our main purpose is to provide tools to improve key performance indicators for the care management of inpatients, including nurse workload. Methods The optimal nurse-to-patient assignment problem is addressed using analytics, lean health care, and AI. We also propose a new mathematical model to optimize nurse-to-patient assignment decisions, considering several variables describing the patient's state, such as the Barthel index, risks, the complexity of care, and mental state. Findings Our results show that several processes inherent to compassionate nursing care can be improved using technology. By using data analytics, we can also provide insights into the high variability of care requirements and, by using models, find nurse-to-patient assignments that are nearly perfectly balanced. Conclusions We illustrated this improvement with a pilot test demonstrating that equitable distribution of nursing workload is a central functionality of this strategy. The findings can be useful in highly complex hospitals in Latin America. Clinical Relevance The proposed model presents an opportunity to make nearly perfectly balanced nurse-to-patient assignments according to the number of patients and their health conditions using technology.
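The balancing idea behind the paper's assignment model can be sketched with a simple greedy heuristic: always hand the next-most-acute patient to the least-loaded nurse. The real model is a mathematical program over the Barthel index, risks, care complexity, and mental state; the single-number acuity scores and names here are hypothetical simplifications.

```python
import heapq

# Greedy workload-balancing sketch in the spirit of the paper's
# nurse-to-patient assignment model. Acuity scores are hypothetical.

def assign(patient_acuity, n_nurses):
    heap = [(0, nurse) for nurse in range(n_nurses)]  # (load, nurse id)
    heapq.heapify(heap)
    assignment = {nurse: [] for nurse in range(n_nurses)}
    # Longest-processing-time rule: place heavy patients first, each
    # time with the currently least-loaded nurse.
    for patient, acuity in sorted(patient_acuity.items(),
                                  key=lambda kv: -kv[1]):
        load, nurse = heapq.heappop(heap)
        assignment[nurse].append(patient)
        heapq.heappush(heap, (load + acuity, nurse))
    return assignment

acuity = {"P1": 9, "P2": 7, "P3": 6, "P4": 5, "P5": 4, "P6": 3}
teams = assign(acuity, 3)
print(teams)  # resulting workloads end up within one point of each other
```

An exact optimization model can additionally encode continuity-of-care and skill constraints that a greedy rule cannot.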
|
41
|
Analytical methods to characterize recombinant adeno-associated virus vectors and the benefit of standardization and reference materials. Curr Opin Biotechnol 2021; 71:65-76. [PMID: 34273809 PMCID: PMC8530916 DOI: 10.1016/j.copbio.2021.06.025] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Received: 01/31/2021] [Revised: 05/26/2021] [Accepted: 06/28/2021] [Indexed: 12/18/2022]
Abstract
Recombinant adeno-associated virus (rAAV) is an increasingly important gene therapy vector, but its properties present unique challenges to critical quality attribute (CQA) identification and analytics development. Advances in, and ongoing hurdles to, characterizing rAAV proteins, nucleic acids, and vector potency are discussed in this review. For nucleic acids and vector potency, current analytical techniques for defined CQAs would benefit from further optimization, while for proteins, more complete characterization and mapping of properties to safety and efficacy is needed to finalize CQAs. The benefits of leveraging reference vectors to validate analytics and CQA ranges are also proposed. Once defined, CQA specifications can be used to establish target parameters for and inform the development of next generation rAAV processes.
|
42
|
Assessment of quality attributes for adeno-associated viral vectors. Biotechnol Bioeng 2021; 118:4186-4203. [PMID: 34309017 DOI: 10.1002/bit.27905] [Citation(s) in RCA: 17] [Impact Index Per Article: 5.7] [Received: 05/28/2021] [Revised: 07/21/2021] [Accepted: 07/22/2021] [Indexed: 12/24/2022]
Abstract
There is a strong and growing interest in the development and production of gene therapy products, including those utilizing adeno-associated virus (AAV) particles. This is evident with the increase in the number of clinical trials and agency approvals for AAV therapeutics. As bioproduction of AAV viral vectors matures, a quality by design (QbD) approach to process development can aid in process robustness and product quality. Furthermore, it may become a regulatory expectation. The first step in any QbD approach is to determine what physical, chemical, biological, or microbiological property or characteristic product attributes should be controlled within an appropriate limit, range, or distribution to ensure the desired product quality. Then predefined goals are set to allow proactive process development to design in quality. This review lists typical quality attributes used for release testing of AAV viral vectors and discusses these and selected attributes important to extended characterization studies in terms of safety, efficacy, and impact upon the patient immune response.
|
43
|
Human Behavior Analysis Using Intelligent Big Data Analytics. Front Psychol 2021; 12:686610. [PMID: 34295289 PMCID: PMC8290162 DOI: 10.3389/fpsyg.2021.686610] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Received: 03/27/2021] [Accepted: 06/09/2021] [Indexed: 11/25/2022] Open
Abstract
Intelligent big data analysis is an evolving pattern in the age of big data science and artificial intelligence (AI). Analysis of organized data has been very successful, but analyzing human behavior using social media data remains challenging. Social media data comprise a vast and unstructured mix of sources, including likes, comments, tweets, shares, and views. Analytics of such data has become a challenging task for companies such as Dailymotion, which have billions of daily users and vast numbers of comments, likes, and views. Social media data are created in significant amounts and at a tremendous pace, and the resulting volume must be stored, sorted, processed, and carefully studied to support decision making. This article proposes an architecture that uses a big data analytics mechanism to efficiently and logically process huge social media datasets. The proposed architecture is composed of three layers. The main objective of the project is to demonstrate Apache Spark parallel processing and distributed-framework technologies together with other storage and processing mechanisms. Social media data generated from Dailymotion are used in this article to demonstrate the benefits of this architecture. The project utilized the Dailymotion application programming interface (API), allowing it to incorporate functions for fetching and viewing information. An API key is generated to fetch public channel data in the form of text files. The Hive storage mechanism is utilized with Apache Spark for efficient data processing. The effectiveness of the proposed architecture is also highlighted.
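At its core, the processing layer performs grouped aggregations over fetched channel records. A plain-Python sketch of that logic, run sequentially rather than on Spark; the field names and values are hypothetical, and the real pipeline pulls text files via the Dailymotion API and stores them through Hive:

```python
from collections import defaultdict

# Sequential sketch of the grouped aggregation that the article's Spark
# processing layer would perform over fetched channel records.
records = [
    {"channel": "news",  "views": 1200, "likes": 90},
    {"channel": "music", "views": 5000, "likes": 640},
    {"channel": "news",  "views": 800,  "likes": 60},
]

totals = defaultdict(lambda: {"views": 0, "likes": 0})
for rec in records:  # conceptually: map by channel, then reduce by key
    totals[rec["channel"]]["views"] += rec["views"]
    totals[rec["channel"]]["likes"] += rec["likes"]

for channel, t in sorted(totals.items()):
    print(channel, t["views"], t["likes"])
```

Spark distributes exactly this kind of keyed reduction across a cluster, which is what makes it suitable for comment and view counts at Dailymotion scale.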
|
44
|
Playing-Side Analytics in Team Sports: Multiple Directions, Opportunities, and Challenges. Front Sports Act Living 2021; 3:671601. [PMID: 34291203 PMCID: PMC8287128 DOI: 10.3389/fspor.2021.671601] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 02/24/2021] [Accepted: 05/26/2021] [Indexed: 11/29/2022] Open
Abstract
This paper describes developments in player-side analytics in major team sports. We take a decision-making lens to the role of analytics in the player decisions made by general managers and coaches. We outline key accelerators and inhibitors of data analytics playing a greater role in the decisions of clubs.
|
45
|
New Insights into the Degradation Path of Deltamethrin. Molecules 2021; 26:molecules26133811. [PMID: 34206625 PMCID: PMC8270271 DOI: 10.3390/molecules26133811] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Received: 05/27/2021] [Revised: 06/17/2021] [Accepted: 06/18/2021] [Indexed: 11/17/2022] Open
Abstract
Pyrethroids are among the insecticidal compounds indicated by the World Health Organization for mitigation of vector-borne diseases. Active deltamethrin (with chiral configuration α-S,1-R-cis) is one of the most effective pyrethroids, characterized by low toxicity to humans, and it is currently being tested as an active ingredient for insecticidal paints. Nevertheless, several degradation processes can occur in the complex paint matrix and affect the insecticidal efficacy. In the present study, a detailed NMR analysis of deltamethrin stability has been carried out under stress conditions mimicking a water-based insecticidal paint environment. Two novel by-products, having a diastereomeric relationship, were identified, and their structure was elucidated by combining NMR, HPLC, GC-MS, and ESI-MS analyses. These compounds result from a nucleophilic addition involving deltamethrin and one of its major degradation products, 3-phenoxybenzaldehyde. Given the known toxicity of the aldehyde, this reaction could represent a way to reduce its concentration in the matrix. On the other hand, the toxicology of these compounds toward humans should be addressed, as their presence may adversely affect the performance of deltamethrin-containing products.
|
46
|
Quantifying and interpreting inequality of surgical site infections among operating rooms. Can J Anaesth 2021; 68:812-824. [PMID: 33547628 DOI: 10.1007/s12630-021-01931-5] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Received: 09/23/2020] [Revised: 11/11/2020] [Accepted: 11/12/2020] [Indexed: 01/27/2023] Open
Abstract
PURPOSE The incidence of surgical site infection differs among operating rooms (ORs). However, cost effectiveness of interventions targeting ORs depends on infection counts. The purpose of this study was to quantify the inequality of infection counts among ORs. METHODS We performed a single-centre historical cohort study of elective surgical cases spanning a 160-week period from May 2017 to May 2020, identifying cases of infection within 90 days using International Classification of Diseases, Tenth Revision, Clinical Modification diagnosis codes. We used the Gini index to measure inequality of infections among ORs. As a reference, the Gini index for inequality of household disposable income in the US in 2017 was 0.39, and 0.31 for Canada. RESULTS There were 3,148 (3.67%) infections among the 85,744 cases studied. The 20% of 57 ORs with the most and least infections accounted for 44% (99% confidence interval [CI], 36 to 52) and 5% (99% CI, 2 to 8), respectively. The Gini index was 0.40 (99% CI, 0.31 to 0.50), which is comparable to income inequality in the US. There were more infections in ORs with more minutes of cases (Spearman correlation ρ = 0.68; P < 0.001), but generally not in ORs with more total cases (ρ = 0.11; P = 0.43). Moderately long (3.3 to 4.8 hr) cases had a large effect, having greater incidences of infection, while not being so long as to have just one case per day per OR. There was substantially greater inequality in infection counts among the 557 observed combinations of OR specialty (Gini index 0.85; 99% CI, 0.81 to 0.88). CONCLUSIONS Inequality of infections among ORs is substantial and caused by both inequality in the incidence of infections and inequality in the total minutes of cases. Inequality in infections among OR and specialty combinations is due principally to inequality in total minutes of cases.
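The Gini index used in this study can be computed from the mean absolute difference between all pairs of operating rooms. A small sketch; the per-OR infection counts below are hypothetical (the study itself reported a Gini index of 0.40 over 57 ORs):

```python
# Sketch: Gini index over per-OR infection counts, via the mean
# absolute difference between all pairs of ORs. Counts are hypothetical.

def gini(counts):
    """Gini index: mean absolute difference normalised by twice the mean."""
    n = len(counts)
    mean = sum(counts) / n
    mad = sum(abs(a - b) for a in counts for b in counts) / (n * n)
    return mad / (2 * mean)

infections = [1, 2, 2, 3, 5, 8, 14, 25]  # hypothetical counts per OR
g = gini(infections)
print(f"Gini index = {g:.2f}")

# Share of infections in the top 20% of ORs (~the top 2 of 8 here)
top_share = sum(sorted(infections)[-2:]) / sum(infections)
print(f"top-20% share = {top_share:.0%}")
```

A value of 0 means infections are spread evenly across ORs; values approaching 1 mean a few ORs account for nearly all infections, which is what makes OR-targeted interventions worth evaluating.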
|
47
|
College Football Overtime Outcomes: Implications for In-Game Decision-Making. Front Artif Intell 2021; 3:61. [PMID: 33733178 PMCID: PMC7861217 DOI: 10.3389/frai.2020.00061] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Received: 02/20/2020] [Accepted: 07/14/2020] [Indexed: 11/20/2022] Open
Abstract
The use of AI and machine learning in sports is increasingly prevalent, including for in-game strategy and tactics. This paper reports on the application of machine learning techniques to the analysis of U.S. Division I-A college football overtime games. The present overtime rules for tie games in Division I-A college football were adopted in 1996. Previous research (Rosen and Wilson, 2007) found little to suggest that the predominantly used strategy of going on defense first was advantageous. Over the past decade, even with significant transformation of offensive and defensive strategies, college football coaches still opt for the same conventional-wisdom strategy. In revisiting this analysis of overtime games using both logistic regression and inductive learning/decision tree analysis, the study confirms that there remains no advantage to the defense-first strategy in overtime. The study found evidence that point spread (as an indicator of team strength) and the red zone offense performance of both teams are useful for predicting game results. Additionally, by altering the decision-making "frame," specific scenarios are illustrated in which a coach can use these machine-learning-discovered relationships to inform end-of-regulation decisions that may increase the likelihood of winning, whether in regulation time or in overtime.
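The logistic-regression step can be illustrated with a toy model predicting a win from the pre-game point spread. The data here are synthetic, generated with an assumed coefficient of 0.3, not the college-football dataset used in the study:

```python
import math
import random

# Toy logistic regression: recover the relationship between point
# spread and win probability from synthetic games.
random.seed(0)
spreads = [random.uniform(-14, 14) for _ in range(400)]
# Positive spread = favoured team; favoured teams win more often.
games = [(s, 1 if random.random() < 1 / (1 + math.exp(-0.3 * s)) else 0)
         for s in spreads]

w = b = 0.0
lr = 0.05
for _ in range(2000):  # plain batch gradient descent on the log-loss
    gw = gb = 0.0
    for s, y in games:
        p = 1 / (1 + math.exp(-(w * s + b)))
        gw += (p - y) * s
        gb += p - y
    w -= lr * gw / len(games)
    b -= lr * gb / len(games)

print(f"recovered spread coefficient: {w:.2f}")
```

A positive fitted coefficient reproduces the study's finding that point spread carries predictive signal; the decision-tree analysis complements this by surfacing threshold-style rules.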
|
48
|
"To Tech or Not to Tech?" A Critical Decision-Making Framework for Implementing Technology in Sport. J Athl Train 2021; 55:902-910. [PMID: 32991702 DOI: 10.4085/1062-6050-0540.19] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Indexed: 12/21/2022]
Abstract
The current technological age has created exponential growth in the availability of technology and data in every industry, including sport. It is tempting to get caught up in the excitement of purchasing and implementing technology, but technology has a potential dark side that warrants consideration. Before investing in technology, it is imperative to consider the potential roadblocks, including its limitations and the contextual challenges that compromise implementation in a specific environment. A thoughtful approach is therefore necessary when deciding whether to implement any given technology into practice. In this article, we review the vision and pitfalls behind technology's potential in sport science and medicine applications and then present a critical decision-making framework of 4 simple questions to help practitioners decide whether to purchase and implement a given technology.
|
49
|
Targeting cancer with antibody-drug conjugates: Promises and challenges. MAbs 2021; 13:1951427. [PMID: 34291723 PMCID: PMC8300931 DOI: 10.1080/19420862.2021.1951427] [Citation(s) in RCA: 14] [Impact Index Per Article: 4.7] [Received: 02/15/2021] [Revised: 06/29/2021] [Accepted: 06/29/2021] [Indexed: 01/03/2023] Open
Abstract
Antibody-drug conjugates (ADCs) are a rapidly expanding class of biotherapeutics that utilize antibodies to selectively deliver cytotoxic drugs to the tumor site. As of May 2021, the U.S. Food and Drug Administration (FDA) has approved ten ADCs, namely Adcetris®, Kadcyla®, Besponsa®, Mylotarg®, Polivy®, Padcev®, Enhertu®, Trodelvy®, Blenrep®, and Zynlonta™ as monotherapy or combinational therapy for breast cancer, urothelial cancer, myeloma, acute leukemia, and lymphoma. In addition, over 80 investigational ADCs are currently being evaluated in approximately 150 active clinical trials. Despite the growing interest in ADCs, challenges remain to expand their therapeutic index (with greater efficacy and less toxicity). Recent advances in the manufacturing technology for the antibody, payload, and linker combined with new bioconjugation platforms and state-of-the-art analytical techniques are helping to shape the future development of ADCs. This review highlights the current status of marketed ADCs and those under clinical investigation with a focus on translational strategies to improve product quality, safety, and efficacy.
|
50
|
Transforming data into insight: Establishment of a pharmacy analytics and outcomes team. Am J Health Syst Pharm 2020; 78:65-73. [PMID: 33325486 DOI: 10.1093/ajhp/zxaa411] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 11/13/2022] Open
Abstract
PURPOSE A data management strategy is increasingly necessary for demonstrating value and driving performance within pharmacy departments, yet data analytics capabilities often do not match the pace of data accumulation. At our organization, the establishment of an embedded pharmacy analytics and outcomes (PAO) team has been instrumental in helping pharmacy services generate and demonstrate value and proactively support a business intelligence strategy grounded in a data-driven culture. SUMMARY The PAO team was established to support the operational and strategic needs of clinical, financial, and operational pharmacy services. The team is charged with implementing the vision of extending medication-use influence and data insight to drive value-based patient care outcomes while decreasing waste, optimizing therapeutic decisions, and achieving medication management standardization across the continuum of healthcare. The PAO team is composed of 3 pharmacist full-time equivalents (FTEs), 5 business analyst FTEs, 1 biostatistician FTE, 0.2 pharmacy intern FTE, and 1 pharmacy manager FTE. Pharmacy services leaders believe it is necessary to have a mix of both clinical and analytical skill sets, given the clinical nature of the data managed by the team and the complexities of the medication-use process. CONCLUSION Pharmacy reporting and analytics should require the same depth of scrutiny and oversight as any other step in the medication-use process where pharmacists are held accountable. For our organization, it was critical to establish pharmacist-level oversight of every portion of the analytics process where medication data are involved. This structure has led to measurable improvements in patient outcomes, operational efficiency, and financial performance.
|