A1 Journal article (refereed)
Reflectance Measurement Method Based on Sensor Fusion of Frame-Based Hyperspectral Imager and Time-of-Flight Depth Camera (2022)


Rahkonen, S., Lind, L., Raita-Hakola, A.-M., Kiiskinen, S., & Pölönen, I. (2022). Reflectance Measurement Method Based on Sensor Fusion of Frame-Based Hyperspectral Imager and Time-of-Flight Depth Camera. Sensors, 22(22), Article 8668. https://doi.org/10.3390/s22228668


Publication details

All authors or editors: Rahkonen, Samuli; Lind, Leevi; Raita-Hakola, Anna-Maria; Kiiskinen, Sampsa; Pölönen, Ilkka

Journal or series: Sensors

eISSN: 1424-8220

Publication year: 2022

Publication date: 10/11/2022

Volume: 22

Issue number: 22

Article number: 8668

Publisher: MDPI AG

Publication country: Switzerland

Publication language: English

DOI: https://doi.org/10.3390/s22228668

Publication open access: Openly available

Publication channel open access: Open Access channel

Publication is parallel published (JYX): https://jyx.jyu.fi/handle/123456789/84327


Abstract

Hyperspectral imaging and distance data have previously been used in aerial, forestry, agricultural, and medical imaging applications. Extracting meaningful information from a combination of different imaging modalities is difficult, as image sensor fusion requires knowing the optical properties of the sensors, selecting the right optics, and finding the sensors' mutual reference frame through calibration. In this research we demonstrate a method for fusing data from a Fabry–Perot interferometer hyperspectral camera and a Kinect V2 time-of-flight depth-sensing camera. We created an experimental application that uses the depth-augmented hyperspectral data to measure emission-angle-dependent reflectance from a multi-view inferred point cloud. We determined the intrinsic and extrinsic camera parameters through calibration, used global and local registration algorithms to combine point clouds from different viewpoints, created a dense point cloud, and determined the angle-dependent reflectances from it. The method successfully combined the 3D point cloud data and hyperspectral data from different viewpoints of a reference colorchecker board. The point cloud registrations reached a fitness of 0.29–0.36 for inlier point correspondences, and the RMSE was approximately 2, which indicates a fairly reliable registration result. The RMSE of the measured reflectances between the front view and the side views of the targets varied between 0.01 and 0.05 on average, and the spectral angle varied between 1.5 and 3.2 degrees. The results suggest that a changing emission angle has only a very small effect on the surface reflectance intensity and spectrum shape, which was expected with the colorchecker used.
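
As an illustration of the pipeline sketched in the abstract, the registration step (a global feature-based alignment refined by local ICP, evaluated through inlier fitness and RMSE) and the spectral angle metric can be approximated with the open-source Open3D and NumPy libraries. This is a minimal sketch, not the authors' implementation: the file names, voxel size, and distance thresholds below are hypothetical, and spectral_angle_deg simply implements the standard spectral angle mapper formula.

    # Hypothetical sketch: coarse-to-fine point cloud registration and the
    # spectral angle metric, in the spirit of the method described above.
    import numpy as np
    import open3d as o3d

    def register_views(source, target, voxel=0.01):
        """Global (RANSAC + FPFH features) then local (point-to-plane ICP)
        registration of two point clouds from different viewpoints."""
        src = source.voxel_down_sample(voxel)
        tgt = target.voxel_down_sample(voxel)
        for pcd in (src, tgt):
            pcd.estimate_normals(
                o3d.geometry.KDTreeSearchParamHybrid(radius=2 * voxel, max_nn=30))
        feats = [o3d.pipelines.registration.compute_fpfh_feature(
                     pcd, o3d.geometry.KDTreeSearchParamHybrid(radius=5 * voxel,
                                                               max_nn=100))
                 for pcd in (src, tgt)]
        # Global registration: coarse alignment from feature correspondences.
        coarse = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
            src, tgt, feats[0], feats[1], True, 3 * voxel,
            o3d.pipelines.registration.TransformationEstimationPointToPoint(False),
            3,
            [o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(3 * voxel)],
            o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))
        # Local registration: ICP refinement of the coarse transform. The
        # result carries .fitness (inlier correspondence ratio) and
        # .inlier_rmse, the same kinds of metrics quoted in the abstract.
        return o3d.pipelines.registration.registration_icp(
            src, tgt, voxel, coarse.transformation,
            o3d.pipelines.registration.TransformationEstimationPointToPlane())

    def spectral_angle_deg(a, b):
        """Spectral angle (degrees) between two reflectance spectra."""
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

    # Hypothetical usage with two point clouds of the colorchecker target:
    # result = register_views(o3d.io.read_point_cloud("front.ply"),
    #                         o3d.io.read_point_cloud("side.ply"))
    # print(result.fitness, result.inlier_rmse, result.transformation)

In Open3D's terms, a fitness of 0.29–0.36 would mean that roughly a third of the source points found inlier correspondences after refinement; whether the paper's figures use exactly these definitions is an assumption here.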


Keywords: reflectance; measuring methods; optical properties; optical detectors

Free keywords: hyperspectral; depth data; kinect; sensor fusion; reflectance



Ministry reporting: Yes

Reporting Year: 2022

JUFO rating: 1


Last updated on 2024-04-30 at 18:17