A1 Original article in a scientific journal
Reflectance Measurement Method Based on Sensor Fusion of Frame-Based Hyperspectral Imager and Time-of-Flight Depth Camera (2022)
Rahkonen, S., Lind, L., Raita-Hakola, A.-M., Kiiskinen, S., & Pölönen, I. (2022). Reflectance Measurement Method Based on Sensor Fusion of Frame-Based Hyperspectral Imager and Time-of-Flight Depth Camera. Sensors, 22(22), Article 8668. https://doi.org/10.3390/s22228668
JYU authors or editors
Publication details
All authors or editors of the publication: Rahkonen, Samuli; Lind, Leevi; Raita-Hakola, Anna-Maria; Kiiskinen, Sampsa; Pölönen, Ilkka
Journal or series: Sensors
eISSN: 1424-8220
Year of publication: 2022
Publication date: 10.11.2022
Volume: 22
Issue number: 22
Article number: 8668
Publisher: MDPI AG
Country of publication: Switzerland
Language of publication: English
DOI: https://doi.org/10.3390/s22228668
Open access of the publication: Openly available
Open access of the publication channel: Fully open publication channel
The publication is self-archived (JYX): https://jyx.jyu.fi/handle/123456789/84327
Abstract
Hyperspectral imaging and distance data have previously been used in aerial, forestry, agricultural, and medical imaging applications. Extracting meaningful information from a combination of different imaging modalities is difficult, as image sensor fusion requires knowing the optical properties of the sensors, selecting the right optics, and finding the sensors' mutual reference frame through calibration. In this research we demonstrate a method for fusing data from a Fabry–Perot interferometer hyperspectral camera and a Kinect V2 time-of-flight depth-sensing camera. We created an experimental application to demonstrate the use of the depth-augmented hyperspectral data to measure emission-angle-dependent reflectance from a point cloud inferred from multiple views. We determined the intrinsic and extrinsic camera parameters through calibration, used global and local registration algorithms to combine point clouds from different viewpoints, created a dense point cloud, and determined the angle-dependent reflectances from it. The method successfully combined the 3D point cloud data and hyperspectral data from different viewpoints of a reference colorchecker board. The point cloud registrations achieved a fitness of 0.29–0.36 for inlier point correspondences, and the RMSE was approximately 2, which indicates a fairly reliable registration result. The RMSE of the measured reflectances between the front view and side views of the targets varied between 0.01 and 0.05 on average, and the spectral angle varied between 1.5 and 3.2 degrees. The results suggest that changing the emission angle has a very small effect on the surface reflectance intensity and spectrum shapes, which was expected with the colorchecker used.
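The abstract compares spectra measured from different viewpoints using two metrics: the reflectance RMSE and the spectral angle. A minimal sketch of how these metrics are commonly computed (assuming NumPy; the function names and sample spectra below are illustrative, not taken from the paper):

```python
import numpy as np

def spectral_angle_deg(a, b):
    """Spectral angle (in degrees) between two reflectance spectra.

    A small angle means the spectral shapes are similar, regardless
    of overall intensity scaling.
    """
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def rmse(a, b):
    """Root-mean-square error between two reflectance spectra."""
    a, b = np.asarray(a), np.asarray(b)
    return np.sqrt(np.mean((a - b) ** 2))

# Hypothetical front-view and side-view reflectance spectra of one patch
front = np.array([0.42, 0.45, 0.50, 0.48])
side = np.array([0.40, 0.44, 0.49, 0.47])
print(spectral_angle_deg(front, side))  # near zero: similar spectral shape
print(rmse(front, side))
```

Note that the spectral angle is invariant to intensity scaling, while RMSE is not; reporting both, as the paper does, separates shape changes from brightness changes across emission angles.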
YSO keywords: reflectance; measurement methods; optical properties; optical sensors
Free keywords: hyperspectral; depth data; kinect; sensor fusion; reflectance
Related organisations
Projects in which the publication was produced
- iADDVA - Adding Value by Creative Industry Platform development project
- Sajavaara, Timo
- Pirkanmaan liitto
Reporting to the Ministry of Education and Culture (OKM): Yes
Reporting year: 2022
JUFO level: 1