Article Type: Research Article

Authors

1 PhD student in Remote Sensing, Faculty of Civil Engineering and Transportation, University of Isfahan

2 Assistant Professor, Department of Surveying (Geomatics) Engineering, Faculty of Civil Engineering and Transportation, University of Isfahan, Isfahan, Iran

Abstract

The IHS method is one of the simplest and best-known approaches to satellite image fusion. According to previous studies, when different versions of the IHS method are applied to high-spatial-resolution satellite images, considerable distortion arises because of the difference between the spectral response curves of the multispectral and panchromatic images. In this research, the useful information contained in the spectral response curves of the sensor's multispectral and panchromatic bands is used in the fusion process. Based on the FIHS fusion method, seven different schemes for forming the intensity image are examined, using coefficients computed from the sensor's spectral response curves. Specifically, the overlapping and non-overlapping portions of the spectral response curves of the multispectral and panchromatic sensors are evaluated, and different coefficients are estimated for the multispectral bands when computing the intensity component. Fusion is then performed with the FIHS method using the new intensity component. The results were assessed with three spectral criteria and one edge criterion. Because the criteria gave differing results, a combined ranking was derived from the four criteria, and the tested methods received ranks from 1 to 8. The values of the various spectral and spatial criteria show that using the portions lying outside the panchromatic band's spectral response curve yields the best result. The proposed method outperforms the other IHS fusion methods that use the spectral response curve. Favorable speed and accuracy are among the benefits of using the spectral response curve information of the sensor bands in satellite image fusion.

Keywords

Article Title [English]

Satellite image fusion using the FastIHS method and spectral response curves

Authors [English]

  • Kosar Kabiri 1
  • Sayyed Bagher Fatemi 2

1 PhD Candidate in Remote Sensing, Department of Geomatics Engineering, Faculty of Civil Engineering and Transportation, University of Isfahan

2 Assistant Professor, Department of Geomatics Engineering, Faculty of Civil Engineering and Transportation, University of Isfahan

Abstract [English]

Extended Abstract
Introduction
Different image fusion methods primarily seek to improve the spectral and spatial content of the final result. However, the fused image often suffers from spectral distortions, and some fusion methods are computationally slow. Image fusion based on the IHS transformation is known as a fast method; unfortunately, images fused with IHS also exhibit spectral distortions, and several variants of the method have therefore been developed. Defining the weight of each band used to generate the intensity component is one of the main problems discussed in the literature. Spectral response curves are one of the major sources for defining the relative weight of each spectral band, and published results indicate that they can improve the quality of the final fused image.
The weight of each band is usually calculated from the overlapping area of the spectral response curves of the panchromatic and multispectral bands. However, other information, such as the non-overlapping areas of the curves, can also play a role in the weight calculation. The present comparative study investigates the potential of using this information.
 
Materials & Methods
A multispectral GeoEye-1 satellite image with 2 m spatial resolution and four spectral bands, together with the corresponding panchromatic band at 0.5 m spatial resolution, was used to test the idea. Seven variants of the FastIHS fusion method were developed, differing in how the intensity component is estimated from the information contained in the spectral response curves. The tested methods were compared with the original FastIHS image fusion method; their only difference lies in how the weight of each band is calculated.
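As a point of reference, the FastIHS baseline forms the intensity component as a weighted sum of the multispectral bands and injects the panchromatic detail (Pan − I) into every band. The minimal sketch below illustrates this scheme; the array shapes and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fast_ihs_fusion(ms, pan, weights):
    """Fuse a multispectral stack with a panchromatic band using the FastIHS scheme.

    ms      : float array of shape (bands, rows, cols), upsampled to the pan grid
    pan     : float array of shape (rows, cols)
    weights : per-band weights used to build the intensity component
    """
    w = np.asarray(weights, dtype=float).reshape(-1, 1, 1)
    intensity = np.sum(w * ms, axis=0)   # I = sum_k w_k * MS_k
    detail = pan - intensity             # spatial detail to inject into every band
    return ms + detail                   # F_k = MS_k + (Pan - I)
```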
The seven tested weighting schemes were as follows (two of them are sketched in code below):
1) the ratio of the overlapping area of the panchromatic and multispectral response curves to the area of the multispectral response curve;
2) the ratio of the area of the multispectral response curve to the area of the panchromatic response curve;
3) the inverse of the distance between the central wavelengths of the panchromatic and multispectral response curves;
4) the ratio of the overlapping area of the panchromatic and multispectral response curves to the area of the panchromatic response curve;
5) the ratio of the non-overlapping area of the panchromatic and multispectral response curves to the area of the multispectral response curve;
6) the ratio of the overlapping area of the panchromatic and multispectral response curves to the area of the panchromatic response curve minus the area of the multispectral response curve;
7) the ratio of the non-overlapping area of the panchromatic and multispectral response curves to the area of the multispectral response curve, multiplied by the ratio of the area of the multispectral response curve to the overlapping area of the panchromatic and multispectral response curves.
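All of these variants reduce to ratios of areas measured on the sensors' spectral response curves. The sketch below shows variants 1 and 5 under the assumptions that the curves are sampled on a common wavelength grid and that the overlap is taken as the pointwise minimum of the two curves; the function names are hypothetical.

```python
import numpy as np

def srf_areas(wavelengths, ms_srf, pan_srf):
    """Areas under one MS band's response curve, the Pan curve, and their overlap."""
    ms_area = np.trapz(ms_srf, wavelengths)
    pan_area = np.trapz(pan_srf, wavelengths)
    overlap = np.trapz(np.minimum(ms_srf, pan_srf), wavelengths)  # common part of the two curves
    return ms_area, pan_area, overlap

def weights_variant1(wavelengths, ms_srfs, pan_srf):
    """Variant 1: overlap(MS_k, Pan) / area(MS_k) for each multispectral band k."""
    weights = []
    for srf in ms_srfs:
        ms_area, _, overlap = srf_areas(wavelengths, srf, pan_srf)
        weights.append(overlap / ms_area)
    return np.array(weights)

def weights_variant5(wavelengths, ms_srfs, pan_srf):
    """Variant 5: non-overlapping area of MS_k with Pan, divided by area(MS_k)."""
    weights = []
    for srf in ms_srfs:
        ms_area, _, overlap = srf_areas(wavelengths, srf, pan_srf)
        weights.append((ms_area - overlap) / ms_area)
    return np.array(weights)
```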
 
Results & Discussion
To evaluate the fused images, four criteria were used: ERGAS, RMSE, the correlation coefficient, and the edge correlation with the panchromatic band. To compute the edge correlation coefficient, a Sobel filter was applied to the panchromatic and fused bands, and the correlation coefficient between each filtered spectral band and the filtered panchromatic band was then calculated. All eight methods were ranked according to the four evaluation criteria. Because the individual rankings were inconsistent, the four criteria were combined and a final ranking was derived from the merged results. In this final ranking, the fifth method comes first and the second method comes eighth; the ordered list of methods is IHS5, IHS3, IHS6, IHS1, IHS4, IHS7, FastIHS, and IHS2. As the ranking shows, almost all tested methods are more accurate than the base method (FastIHS).
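For reference, ERGAS and the edge-correlation criterion could be computed along the following lines. This is only a sketch under the usual definitions; the resolution ratio 0.5/2 for GeoEye-1 and the Sobel gradient-magnitude formulation are assumptions, not the paper's exact code.

```python
import numpy as np
from scipy import ndimage

def ergas(reference, fused, ratio=0.5 / 2.0):
    """ERGAS = 100 * (h/l) * sqrt(mean_k((RMSE_k / mean_k)^2)) over all bands."""
    terms = []
    for ref_b, fus_b in zip(reference, fused):
        rmse = np.sqrt(np.mean((ref_b.astype(float) - fus_b.astype(float)) ** 2))
        terms.append((rmse / np.mean(ref_b)) ** 2)
    return 100.0 * ratio * np.sqrt(np.mean(terms))

def sobel_magnitude(img):
    """Gradient magnitude from horizontal and vertical Sobel responses."""
    img = img.astype(float)
    return np.hypot(ndimage.sobel(img, axis=0), ndimage.sobel(img, axis=1))

def edge_correlation(fused_band, pan):
    """Correlation coefficient between the Sobel edge maps of a fused band and the Pan band."""
    return np.corrcoef(sobel_magnitude(fused_band).ravel(),
                       sobel_magnitude(pan).ravel())[0, 1]
```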
 
Conclusion
The results indicate that using the information obtained from spectral response curves can improve the results of FastIHS image fusion. This information can increase the fusion speed and reduce the spectral distortions of the final fused image. However, while the spectral characteristics of the data are preserved, the total number of detected edges decreases. Spectral response curves are directly tied to the physics of the imaging process, so using their information can produce more natural fused images with better visual quality and enhanced spatial content.

Keywords [English]

  • FastIHS method
  • Image fusion
  • Satellite image
  • Spectral response curve