1
Abbessi R, Verrier N, Taddese AM, Laroche S, Debailleul M, Lo M, Courbot JB, Haeberlé O. Multimodal image reconstruction from tomographic diffraction microscopy data. J Microsc 2022;288:193-206. PMID: 35775607. DOI: 10.1111/jmi.13131.
Abstract
Tomographic Diffraction Microscopy (TDM) is a tool of choice for high-resolution, marker-less 3D imaging of biological samples. Based on a generalization of Digital Holographic Microscopy (DHM) with full control of the sample's illumination, TDM measures the diffracted fields, in both phase and amplitude, from many illumination directions. The photon budget associated with TDM imaging is low, so TDM is not limited by photo-toxicity issues. The recorded information makes it possible to reconstruct the 3D refractive index distribution (with both refraction and absorption contributions) of the object under scrutiny, without any staining. In this contribution, we show an alternative use of this information: a tutorial for multimodal image reconstruction, covering both intensity and phase contrasts, from the image formation model to the final reconstruction with both 2D and 3D rendering, turning TDM into a kind of "universal" digital microscope.
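As a minimal illustration of the multimodal idea (a single complex-valued measurement yields several classical contrasts), the sketch below derives an intensity-contrast image and a phase-contrast image from one hypothetical complex field. The array values are invented for illustration; the authors' actual reconstructions operate on full 3D tomographic data.

```python
import numpy as np

# Hypothetical reconstructed complex field (not the authors' data): a weakly
# absorbing, phase-shifting "sample" on a 4x4 pixel grid.
amplitude = np.full((4, 4), 0.9)                      # attenuation from absorption
phase = np.linspace(0, np.pi / 4, 16).reshape(4, 4)   # optical path delay
field = amplitude * np.exp(1j * phase)

# Two complementary contrasts derived from the same complex measurement:
intensity_contrast = np.abs(field) ** 2   # what a brightfield camera records
phase_contrast = np.angle(field)          # refraction-induced phase delay

print(intensity_contrast[0, 0])  # ~0.81 (the squared amplitude)
```

The point is that amplitude and phase are both recorded, so intensity-like and phase-like renderings come from the same dataset rather than from different microscopes.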
Affiliation(s)
- Riadh Abbessi, Nicolas Verrier, Asemare Mengistie Taddese, Steve Laroche, Matthieu Debailleul, Mohamed Lo, Jean-Baptiste Courbot, Olivier Haeberlé
- All authors: Institut Recherche en Informatique, Mathématiques, Automatique et Signal (IRIMAS UR UHA 7499), Université de Haute-Alsace, IUT Mulhouse, 61 rue Albert Camus, Mulhouse Cedex, 68093, France
2
Ueda T, Iwai D, Sato K. IlluminatedZoom: spatially varying magnified vision using periodically zooming eyeglasses and a high-speed projector. Opt Express 2021;29:16377-16395. PMID: 34154202. DOI: 10.1364/oe.427616.
Abstract
Spatial zooming and magnification, which controls the size of only a portion of a scene while maintaining its context, is an essential interaction technique in augmented reality (AR) systems. It has been applied in various AR applications, including surgical navigation, visual search support, and human behavior control. However, spatial zooming has so far been implemented only on video see-through displays, not on optical see-through displays: achieving spatial zooming of an observed real scene with near-eye optics is not trivial. This paper presents the first optical see-through spatial zooming glasses that enable interactive control of the perceived sizes of real-world appearances in a spatially varying manner. The key to our technique is the combination of periodically fast-zooming eyeglasses and a synchronized high-speed projector. We stack two electrically focus-tunable lenses (ETLs) for each eyeglass and sweep their focal lengths to modulate the magnification periodically, from one (unmagnified) to a higher value (magnified), at 60 Hz, in a manner that prevents the user from perceiving the modulation. We use a 1,000 fps high-speed projector to provide high-resolution spatial illumination of the real scene around the user. A portion of the scene that is to appear magnified is illuminated by the projector when the magnification is greater than one, while the rest is illuminated when the magnification equals one. Through experiments with a prototype system, we demonstrate spatial zooming at up to 30% magnification. Our technique has the potential to expand the application field of spatial zooming interaction in optical see-through AR.
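The temporal-multiplexing principle described in this abstract can be sketched as a scheduling problem: given the instantaneous magnification of the sweeping eyeglasses, decide which part of the scene each 1,000 fps projector frame should illuminate. The sinusoidal sweep profile and the tolerance threshold below are assumptions made for illustration, not the authors' actual parameters.

```python
import math

PROJECTOR_FPS = 1000   # high-speed projector frame rate (from the paper)
SWEEP_HZ = 60          # periodic zoom-modulation rate (from the paper)
THRESHOLD = 1.05       # tolerance band for this discretized sketch (assumed)

def magnification(t):
    """Hypothetical focal sweep: magnification oscillates between
    1.0 (unmagnified) and 1.3 (the paper's reported 30% maximum)."""
    return 1.15 + 0.15 * math.sin(2 * math.pi * SWEEP_HZ * t)

def lights_magnified_region(t):
    """True  -> this projector frame illuminates the region to appear magnified;
    False -> it illuminates the surrounding, unmagnified scene."""
    return magnification(t) > THRESHOLD

# Classify the projector frames that fall within one zoom-sweep period.
frames = [lights_magnified_region(i / PROJECTOR_FPS)
          for i in range(PROJECTOR_FPS // SWEEP_HZ)]
```

Because roughly 16 projector frames fit inside each 60 Hz sweep period, both phases of the sweep receive several illumination frames, which is what lets the two scene regions be lit at different effective magnifications.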
3
Akiyama R, Yamamoto G, Amano T, Taketomi T, Plopski A, Sandor C, Kato H. Robust Reflectance Estimation for Projection-Based Appearance Control in a Dynamic Light Environment. IEEE Trans Vis Comput Graph 2021;27:2041-2055. PMID: 31514141. DOI: 10.1109/tvcg.2019.2940453.
Abstract
We present a novel method that robustly estimates reflectance even in an environment with dynamically changing light. To control the appearance of an object using a projector-camera system, an appropriate estimate of the object's reflectance is vital for creating an appropriate projection image. Most conventional estimation methods assume static light conditions; in practice, however, the appearance is affected by both the reflectance and the environmental light. In an environment with dynamically changing light, conventional reflectance estimation methods require recalibration every time the conditions change. In contrast, our method requires no additional calibration because it estimates the reflectance and the environmental light simultaneously. Our method creates two different light conditions by switching the projection at a rate faster than the human eye can perceive, and captures images of the target object separately under each condition. The reflectance and the environmental light are then estimated simultaneously from the pair of images acquired under these two conditions. We implemented a projector-camera system that switches the projection on and off at 120 Hz. Experiments confirm the robustness of our method under changing environmental light. Further, our method can robustly estimate reflectance under practical indoor lighting conditions.
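The core of the two-condition idea admits a compact per-pixel sketch. Assuming a simple linear model in which the camera reading is reflectance times total illuminance (this model and all numbers are illustrative, not the authors' exact formulation), one projector-on and one projector-off capture suffice to recover both unknowns:

```python
# Per-pixel sketch: capture with the projector on and off within one
# imperceptible 120 Hz cycle, then solve for reflectance R and environmental
# light E simultaneously. Assumed single-channel model:
#   C_on  = R * (P + E)   with P the known projector illuminance
#   C_off = R * E
# => R = (C_on - C_off) / P,  E = C_off / R

def estimate_reflectance(c_on, c_off, p):
    r = (c_on - c_off) / p
    e = c_off / r if r > 0 else 0.0
    return r, e

# Synthetic ground truth for one pixel: R = 0.5, E = 10, P = 100.
c_on, c_off = 0.5 * (100 + 10), 0.5 * 10
r, e = estimate_reflectance(c_on, c_off, 100)
print(r, e)  # 0.5 10.0
```

Because E is estimated alongside R from every on/off pair, a change in the environmental light simply changes the next estimate of E rather than invalidating a one-time calibration.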
4
Takeda S, Iwai D, Sato K. Inter-reflection Compensation of Immersive Projection Display by Spatio-Temporal Screen Reflectance Modulation. IEEE Trans Vis Comput Graph 2016;22:1424-1431. PMID: 26780805. DOI: 10.1109/tvcg.2016.2518136.
Abstract
We propose a novel inter-reflection compensation technique for immersive projection displays in which we spatially modulate the reflectance pattern on the screen to improve the compensation performance of conventional methods. Because the luminance of light reflected from a projection surface is mathematically the product of the illuminance of the incident light and the surface reflectance, we can reduce the undesirable intensity elevation caused by inter-reflections by decreasing the surface reflectance. Based on this principle, we improve conventional inter-reflection compensation techniques by applying reflectance pattern modulation. We realize spatial reflectance modulation of a projection screen by painting it with a photochromic compound, which changes color (i.e., the reflectance of the screen) when ultraviolet (UV) light is applied, and by controlling the UV irradiation with a UV LED array placed behind the screen. The main contribution of this paper is a computational model that optimizes a reflectance pattern for accurate reproduction of a target appearance, decreasing the intensity elevation caused by inter-reflection while maintaining the maximum intensity of the target appearance. Through simulation and physical experiments, we demonstrate the feasibility of the proposed model and confirm its advantage over conventional methods.
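The multiplicative principle stated in this abstract (reflected luminance equals incident illuminance times surface reflectance) can be illustrated with a toy two-patch inter-reflection model. The form factors and reflectance values below are invented for illustration; the paper's actual computational model optimizes a full spatial reflectance pattern.

```python
import numpy as np

# Toy two-patch model: observed radiance B satisfies B = R * (P + F @ B),
# where R is per-patch reflectance, P the direct projector illuminance, and
# F a form-factor matrix coupling the patches through inter-reflection.
# Solving the linear system gives B = (I - diag(R) F)^-1 diag(R) P.
def radiance(reflectance, P, F):
    R = np.diag(reflectance)
    return np.linalg.solve(np.eye(len(P)) - R @ F, R @ P)

F = np.array([[0.0, 0.4],
              [0.4, 0.0]])      # mutual inter-reflection between two patches
P = np.array([100.0, 0.0])      # only patch 0 is lit directly

bright = radiance([0.9, 0.9], P, F)  # uniformly high screen reflectance
dimmed = radiance([0.9, 0.3], P, F)  # photochromically darkened patch 1

# Lowering patch 1's reflectance cuts the unwanted elevation on patch 1:
print(bright[1] > dimmed[1])  # True
```

Patch 1 receives no direct projection, so all of its observed radiance is inter-reflection spill; darkening it with the photochromic layer scales that spill down, which is the effect the paper's optimization exploits.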