One of the important features of desert (arid and semi-arid) areas is the dust phenomenon, which occurs on most days of the year, especially in tropical regions. In some parts of the world, including Africa, Australia, and the Middle East, the annual volume of sediment carried by wind exceeds the volume carried by rivers. Today, dust phenomena are among the most important environmental hazards, putting human and environmental health at serious risk. According to the country's comprehensive water plan, the area of Iran's true deserts has increased to 4.7 million hectares, or 35.5 percent of the country's land area.
Materials & Methods
The study area was southwestern Iran, including Khuzestan and the northern Persian Gulf region. In recent years, these regions have been strongly affected by dust from internal sources and, especially, from external sources in Iraq, Syria, and Saudi Arabia. In this research, we employed the library (documentary) method and identified dust-storm days using the province's weather data. We used satellite data from the MODIS sensor and several image-processing algorithms to detect dust.
To evaluate the different dust-detection methods, it is necessary to compare the algorithms' results with an independent source, such as natural-color images, aerosol products, MODIS dust indicators, or the products of other sensors. In this research, we first loaded the HDF file of the MOD021k MODIS images into ENVI 5.2 to visualize the dust. After preprocessing the satellite images, we applied several methods to detect dust in the imagery: false-color composites, the BTD and NDDI algorithms, and a neural network. To this end, we saved the bands required for the NDDI and BTD algorithms as a single layer stack in ENVI and imported it into MATLAB, where the detection algorithms were applied.
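The two index algorithms can be sketched in a few lines. Below is a minimal NumPy sketch, assuming the commonly used NDDI definition based on MODIS band 7 (2.13 µm) and band 3 (0.469 µm) reflectances (the source does not spell out the formula), and treating BTD as a simple per-pixel difference of brightness temperatures; the function names are illustrative:

```python
import numpy as np

def nddi(band7_refl, band3_refl):
    # Normalized Difference Dust Index: contrast between the 2.13 um
    # (band 7) and 0.469 um (band 3) reflectances. Dust raises band 7
    # relative to band 3, pushing NDDI toward positive values.
    num = band7_refl - band3_refl
    den = band7_refl + band3_refl
    return np.where(den != 0, num / den, 0.0)

def btd(bt_a, bt_b):
    # Brightness Temperature Difference between two thermal bands,
    # e.g. BTD(31-32) or BTD(23-31), in kelvin.
    return bt_a - bt_b
```

Both functions operate element-wise, so they can be applied directly to whole band arrays exported from the layer stack.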
Given the importance of remote sensing and satellite imagery, and the efficiency of artificial neural networks, we classified the MODIS images using both an Artificial Neural Network and dust-detection indices. In general, MODIS thermal-infrared bands 20, 23, 31, and 32 are the bands most used to detect dust storms; the Brightness Temperature Difference between them can distinguish dust storms from other phenomena. In this study, a Feed-Forward Neural Network (FFNN) was used to detect dust storms over Khuzestan and the northern Persian Gulf, using 20 data sets for daytime and 11 for nighttime. To classify pixels in the neural network on the basis of BTD values, the inputs were the BTD of bands 20-31, the BTD of bands 23-31, the BTD of bands 31-32, and bands 1, 3, and 4.
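The per-pixel inputs described above can be assembled and passed through a feed-forward network. This is a minimal sketch assuming a single sigmoid hidden layer; the study's actual layer sizes and weights are not given, so the shapes here are illustrative:

```python
import numpy as np

def dust_features(bt20, bt23, bt31, bt32, b1, b3, b4):
    # Six inputs per pixel, as in the text: BTD(20-31), BTD(23-31),
    # BTD(31-32), plus visible bands 1, 3 and 4.
    return np.stack([bt20 - bt31, bt23 - bt31, bt31 - bt32, b1, b3, b4],
                    axis=-1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ffnn_forward(x, w_hidden, b_hidden, w_out, b_out):
    # One-hidden-layer feed-forward pass; a pixel is called "dust"
    # when the output is closer to the 0.9 target than to 0.1.
    h = sigmoid(x @ w_hidden + b_hidden)
    return sigmoid(h @ w_out + b_out)
```

For nighttime scenes only the thermal inputs are available, so the visible bands 1, 3 and 4 would be dropped from the feature vector.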
MODIS bands 1, 3, and 4 were used to create true-color images for better identification of surface phenomena; these three bands were used only for the MODIS daytime images.
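A true-color composite from those three bands is just a channel stack. A sketch, assuming reflectances scaled to [0, 1] and the standard MODIS band-to-channel mapping (band 1 → red, band 4 → green, band 3 → blue):

```python
import numpy as np

def true_color(band1, band4, band3):
    # Band 1 (0.645 um) -> red, band 4 (0.555 um) -> green,
    # band 3 (0.469 um) -> blue; clip to the displayable range.
    rgb = np.stack([band1, band4, band3], axis=-1)
    return np.clip(rgb, 0.0, 1.0)
```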
The results show that the emissivity of sand in band 31 (0.96) is slightly lower than in band 32 (0.98), while the soil emissivity in both bands is 0.97 and the water emissivity 0.99. The cloud emissivity is 0.98 in band 31 and 0.95 in band 32. There is also a difference between the band-23 and band-31 emissivities of soil, sand, and water, which can be used to separate dust from other surfaces. The brightness temperatures of a dust storm (298.4 K) and of cloud (276 K) in band 23 (4.6 µm) are higher than those of a dust storm (287 K) and cloud (271 K) in band 31 (11.02 µm), whereas the brightness temperatures of water (285 K), ground (310 K), and vegetation (295 K) in band 23 are lower than the corresponding values in band 31 (water 286 K, ground 310 K, vegetation 296 K). For these reasons, the brightness temperature difference between bands 23 and 31 is useful for separating dust from ground, vegetation, cloud, and water.
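These brightness temperatures can be checked directly. A small worked example, using the values reported above (it is the signs of the differences, more than their exact magnitudes, that the discrimination relies on):

```python
# Brightness temperatures (kelvin) reported above for bands 23 and 31.
bt23 = {"dust": 298.4, "cloud": 276.0, "water": 285.0,
        "ground": 310.0, "vegetation": 295.0}
bt31 = {"dust": 287.0, "cloud": 271.0, "water": 286.0,
        "ground": 310.0, "vegetation": 296.0}

btd_23_31 = {k: bt23[k] - bt31[k] for k in bt23}
for surface, diff in btd_23_31.items():
    print(f"{surface:10s} BTD(23-31) = {diff:+.1f} K")
# Dust gives the largest positive difference (+11.4 K); water and
# vegetation go slightly negative and ground is near zero, so a
# strongly positive BTD(23-31) flags a dust candidate.
```

Cloud is also positive here (+5.0 K), which is where the emissivity contrast matters: sand emits less in band 31 (0.96) than in band 32 (0.98) while cloud does the opposite (0.98 vs. 0.95), so BTD(31-32) separates dust from cloud.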
In the artificial neural network, the correlation coefficients of the training, validation, test, and total data were R = 0.996, R = 0.99505, R = 0.99559, and R = 0.9958, respectively, showing the strong capability of the neural network in detecting dust. The data were divided into two classes, dust (target 0.9) and no dust (target 0.1): the various inputs entered the network and were assigned to one of these two classes. The results showed that the error started from a large value and gradually decreased. Each pass of the data through the network is called an epoch: an input passes through the network and generates an overall error, and the weights are corrected with the help of that error; the number of such corrections is the number of repetitions, or epochs. As shown in the figure, training ended after 151 epochs.
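The epoch mechanics described here can be sketched with a toy single-neuron model on synthetic data. This is an illustration only, not the study's network: the 0.9/0.1 targets and the idea of one weight correction per pass are from the text, while the data, model size, and learning rate are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic feature matrix (200 pixels x 6 inputs) and 0.9/0.1 targets;
# here we pretend the second feature, BTD(23-31), drives the label.
X = rng.normal(size=(200, 6))
y = np.where(X[:, 1] > 0, 0.9, 0.1)

w = np.zeros(6)
b = 0.0
lr = 0.1
errors = []
for epoch in range(151):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # forward pass
    err = p - y                             # overall error for this pass
    w -= lr * X.T @ err / len(y)            # weight correction from the error
    b -= lr * err.mean()
    errors.append(float(np.mean(err ** 2)))
```

The recorded `errors` start large and fall as the epochs accumulate, which is the behavior the training curve in the figure describes.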
The neural network output images show that dust is well distinguished over both water and land, and the separation improves at higher dust concentrations. The ACC parameter indicates that the neural network method has good accuracy and performance. The results show that the neural network is a more suitable method than the BTD index for dust detection and, unlike the index, does not require a threshold to be determined for each image.
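The ACC parameter mentioned here is plain pixel-wise agreement with a reference mask. A minimal sketch, assuming such a mask (e.g. from a manually interpreted true-color image) is available:

```python
import numpy as np

def accuracy(pred_mask, truth_mask):
    # ACC = (TP + TN) / total: the fraction of pixels whose
    # dust/no-dust label agrees with the reference mask.
    pred_mask = np.asarray(pred_mask, dtype=bool)
    truth_mask = np.asarray(truth_mask, dtype=bool)
    return float((pred_mask == truth_mask).mean())
```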
The results of the NDDI index show that this parameter alone cannot distinguish atmospheric dust pixels from sand and other non-dust pixels, and its accuracy is poor in images containing cloud or water. This low efficiency appears to be related to surface characteristics such as land use, land cover, and topographic differences, as well as to the chemical properties of the region's dust minerals. By contrast, according to the results of this study, the BTD index performed well in detecting dust. In the present research, the artificial neural network showed fairly good accuracy and performance for the daytime images, with an accuracy of 60%.