Global warming and its consequence, climate change, are among the world's major problems in the current century. Climate change and the warming of the Earth have adverse effects on resources such as water, forests, pastures, agricultural land, and industry, and ultimately on human life. The initial effect of climate change is on atmospheric elements, particularly precipitation and temperature. Evaluating long-term temperature trends can provide better insight into how to plan for the coming years.
Temperature is one of the elements influencing this issue, which is why monitoring and assessing its behavior is very important to humans. Therefore, simulating these variables can be vital for gaining a perception of the human future. There are various methods for simulating and predicting climate variables; the most reliable is using data from atmospheric general circulation models (GCMs). GCMs are only able to simulate atmospheric general circulation over large areas, and running them for long periods is time consuming and requires high processing speeds. To overcome this problem, some simplifications must be made, including reducing the spatial resolution and removing some physical and thermodynamic processes at the micro scale. These simplifications increase the errors of the circulation models and thereby cause errors in the prediction and evaluation of the Earth's future climate. To address this, the outputs of general circulation models are downscaled through dynamical and statistical methods. In recent years, among the various downscaling approaches, researchers have been more interested in statistical downscaling than in the other methods. In statistical downscaling, statistical techniques such as regression and weather generator models can be used. Statistical downscaling methods, which include the SDSM model, perform the scale reduction based on the statistical relationship between large-scale predictors and the dependent variables. SDSM is one of the most widely used models for downscaling GCM data. In this study, the competency of this model for downscaling mean temperature was evaluated at Kermanshah station. Several data series were used: the mean daily temperature at Kermanshah station, data from the National Center for Environmental Prediction (NCEP), and output from the HadCM3 general circulation model under the A2 and B2 scenarios.
Under the A2 scenario, a world is imagined in which countries operate independently and are self-reliant, the world's population increases constantly, and economic development is region-based. Under the B2 scenario, the population increases steadily but at a lower growth rate than in A2; the emphasis is on local rather than global solutions for economic, environmental, and social stability, with moderate economic development and rapid technological change. The Kermanshah station data comprise daily mean temperatures from the beginning of 1961 to the end of 2010, which were used to calibrate the model. To this end, the independent variables were assembled and the model was calibrated for mean temperature using the daily temperature data of Kermanshah's synoptic station and the data from the National Center for Environmental Prediction (NCEP). For calibration, the observed data from Kermanshah station and the NCEP data were divided into two 15-year periods (1961-1975 and 1976-1990). The first 15 years were used to calibrate the model by least-squares error optimization; the same procedure was also carried out for the 40-year period 1961-2000. The mean temperature for the 10-year period 2001-2010 was then predicted from the two base periods, 15 years (1961-1975) and 40 years (1961-2000), under scenarios A2 and B2, and compared with the observed data for that period to evaluate the model's predictive performance. The results for the evaluation periods (1976-1990 and 1961-2000) using NCEP data showed that the SDSM model has an acceptable capability for simulating variables such as mean temperature in both the evaluation and base periods. It should be noted that when the prediction base period was increased to 40 years, the differences between the NCEP-driven simulations and the observed data approached zero.
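The calibrate-then-validate procedure above can be sketched in a few lines. This is a minimal illustration of regression-based downscaling, not the actual SDSM implementation: the predictor and observation series here are synthetic stand-ins for the NCEP predictors and the Kermanshah observations, and the split sizes simply mirror the 15-year base period described in the text.

```python
# Minimal sketch of the regression step in statistical downscaling.
# All data here are synthetic and illustrative, not Kermanshah values.
import random

random.seed(0)

# Synthetic "large-scale predictor" (e.g., a coarse grid-cell temperature)
# and a synthetic "station observation" linearly related to it plus noise.
n_years = 40
predictor = [10 + 0.5 * t + random.gauss(0, 1) for t in range(n_years)]
observed = [0.8 * x + 3 + random.gauss(0, 0.5) for x in predictor]

def fit_least_squares(x, y):
    """Ordinary least squares for y = a*x + b -- the linear-regression
    core that SDSM-style downscaling relies on."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    a = sxy / sxx
    return a, my - a * mx

# Calibrate on the first 15 "years", then predict the remaining ones
# and score the prediction against the held-out observations.
a, b = fit_least_squares(predictor[:15], observed[:15])
predicted = [a * x + b for x in predictor[15:]]

bias = sum(p - o for p, o in zip(predicted, observed[15:])) / len(predicted)
rmse = (sum((p - o) ** 2 for p, o in zip(predicted, observed[15:]))
        / len(predicted)) ** 0.5
print(f"slope={a:.2f} intercept={b:.2f} bias={bias:.2f} rmse={rmse:.2f}")
```

The real SDSM workflow adds predictor screening, conditional/unconditional process handling, and multiple predictors, but the calibration step reduces to exactly this kind of least-squares fit on a base period.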
This can be considered one of the model's defects, and it stems from the use of linear regression: with a shorter base period, the simulated mean temperature falls away from the average of the observed period, whereas increasing the period duration makes the outcomes valid. Also, the variance and the maximum and minimum temperatures that the model applies to calculate the mean temperature are not adequate, and the model commits several errors there. This may be caused by the model's poor ability to evaluate and reveal temperature fluctuations, a consequence of its adherence to linear regression, although the station's local conditions and the HadCM3 model's errors could intensify this inability.
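The weakness in reproducing fluctuations can be illustrated directly: a linear regression fit recovers the mean of a series well but, by construction, discards the residual noise and therefore underestimates its variance. The sketch below uses synthetic data (illustrative only, not Kermanshah values) to show this variance deflation.

```python
# Illustration of variance deflation in regression-based downscaling:
# the fitted series is systematically less variable than the observations.
# Synthetic data; coefficients and numbers are illustrative only.
import random

random.seed(1)

predictor = [random.gauss(15, 3) for _ in range(1000)]
observed = [0.8 * x + 3 + random.gauss(0, 2) for x in predictor]

def variance(v):
    m = sum(v) / len(v)
    return sum((x - m) ** 2 for x in v) / len(v)

# Even with the *true* regression line y = 0.8*x + 3, the fitted values
# lose the residual noise, so their variance is smaller than observed.
fitted = [0.8 * x + 3 for x in predictor]

print(round(variance(fitted), 2), round(variance(observed), 2))
```

This is why regression-only downscaling tends to smooth out extremes; SDSM mitigates it with variance inflation and stochastic components, but the underlying tendency remains.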