Fengyun-3D MERSI True Color Imagery Developed for Environmental Applications


    Corresponding author: Xiuzhen HAN, hanxz@cma.gov.cn
  • 1. National Satellite Meteorological Center, China Meteorological Administration, Beijing 100081
  • 2. Beijing Piesat Information Technology Co. Ltd., Beijing 100195
  • 3. Nanjing University of Information Science & Technology, Nanjing 210044
Funds: Supported by the National Key Research and Development Program of China (2018YFC1506500)

Abstract: Many techniques have been developed for creating true color images from satellite solar reflective bands, and the derived images have been widely used for environmental monitoring. For the newly launched Fengyun-3D (FY-3D) satellite, the same capability is required for its Medium Resolution Spectral Imager-II (MERSI-II). For processing the MERSI-II true color image, a comprehensive technique is developed, including atmospheric correction, nonlinear enhancement, and image splicing. The effect of atmospheric molecular scattering on the total reflectance is corrected by using a parameterized radiative transfer model. A nonlinear stretch of the solar band reflectance is applied to increase the image contrast. The discontinuity in composing images from multiple orbits and different granules is eliminated through the distance weighted pixel blending (DWPB) method. Through these processing steps, the MERSI-II true color imagery can vividly capture many natural events such as sand and dust storms, snow, algal bloom, fire, and typhoon. Through a comprehensive analysis of the true color imagery, specific natural disaster events and their magnitudes can be quantified much more easily than by using individual channel data.

    • Remote sensing images can be displayed in true or false colors. A true color image is usually composed of three measurements at red, green, and blue wavelengths, and looks very close to the color of natural scenes. With some training, people can relate true color images to natural atmospheric and surface features. For example, in a true color image, cloud and snow look bright white while haze and smoke are milky grey. Under clear atmospheric conditions, forest looks green, desert is brown, and the ocean and lakes are blue. Therefore, true color imagery improves the visualization of remote sensing data and can be a very powerful tool for disaster monitoring and risk mitigation. Additionally, true color imagery can help extract environmental information by supporting the validation of relevant retrieval products.

      The first remote sensing true color image was made from the Multicolor Spin-Scan Cloudcover Camera onboard the US geostationary satellite ATS-III (Application Technology Satellite-III) in 1967 (Greaves and Shenk, 1985). The Sea-viewing Wide Field-of-view Sensor (SeaWiFS) onboard SeaStar/OrbView-2, launched successfully in 1997, generated true color images of specific regions for special missions. For the Moderate Resolution Imaging Spectroradiometer (MODIS) onboard the Terra/Aqua satellites, the data at the green and blue bands are remapped to a 250-m resolution, and true color images are made separately for the morning and afternoon orbits, which are then tiled and distributed through Worldview. The Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi National Polar-orbiting Partnership was launched in 2011; VIIRS bands 3–5 were used to produce 750-m-resolution true color images released by NASA Earth Observatory (Hillger et al., 2014). After the Japan Meteorological Agency launched the Himawari satellite in 2014, the GeoColor algorithm was developed for the Advanced Himawari Imager (AHI) by the Cooperative Institute for Research in the Atmosphere to generate true color images. For the Advanced Baseline Imager (ABI) on the GOES-R satellite, a green band is simulated from a MODIS surface reflectance database. This database is produced from 16 days of MODIS observations of the solar reflectivity at the red (0.64 μm), near-infrared (IR; 0.86 μm), and blue (0.47 μm) bands. The reflectance of the ABI green band is then derived from the statistical relationship among the red, near-IR, and blue bands. The simulated ABI green band, together with the measured red and blue bands, is used to produce the true color imagery (Miller et al., 2016). Today, the true color image is normally defined as a key performance parameter or product for both polar-orbiting and geostationary satellites worldwide.

      FY-3D satellite is a new-generation Chinese operational polar-orbiting meteorological satellite, which was launched successfully on 15 November 2017. The Medium Resolution Spectral Imager (MERSI)-II onboard the FY-3D satellite has three visible bands at red, green, and blue wavelengths with a spatial resolution of 250 m. Its wide swath completes global coverage twice a day, and its spectral bands meet the true color imaging requirements. This study presents the technical approach for MERSI-II true color processing. Schemes for the atmospheric correction, image enhancement, and splicing are developed. Various natural events are displayed with true color images, including sand and dust storms, snow, algal bloom, fire, and typhoon.

    2.   Datasets
    • FY-3D is configured to the afternoon orbit and flies at an altitude of 836 km above the earth. Its orbital inclination is 98.75° to the equator, and it completes fourteen orbits per day, covering the globe twice daily. Each orbital cycle takes about 102 min. The satellite platform carries 10 advanced instruments, including the Micro-Wave Humidity Sounder-II, Micro-Wave Temperature Sounder-II, Hyperspectral Infrared Atmospheric Sounder, Micro-Wave Radiation Imager, Greenhouse-gases Absorption Spectrometer, Wide-field Auroral Imager, Ionospheric PhotoMeter, GNSS Radio Occultation Sounder, Space Environment Monitor, and MERSI-II. Datasets retrieved from these instruments have great impacts on global numerical weather prediction, climate change studies, ecosystem monitoring, and space weather prediction.

      As the core optical instrument onboard the FY-3D satellite, MERSI-II has 6 visible channels, 10 visible/near-IR channels, 3 shortwave IR channels, and 6 medium- and long-wave IR channels (Table 1). The spatial resolution at nadir is 250 or 1000 m, and the spectral resolution is 20 nm, 50 nm, or 1.0 μm, depending on the channel. It not only inherits the mature technology of its predecessors (the scanning radiometer and the medium-resolution spectral imager) onboard the FY-3A/B/C satellites, but also adds 6 IR channels with a resolution of 250 m. Compared to MERSI on the FY-3C satellite, MERSI-II improves the calibration accuracy and sensitivity, and is recognized as one of the most advanced wide-swath imagers. MERSI-II data can be used to construct global true color imagery with nearly no gaps, and to provide information for global ecosystem monitoring, disaster monitoring, and climate assessment.

      Channel   Central wavelength (μm)   Resolution (m)   Primary application
       1        0.470                      250             Land, PBL, features
       2        0.550                      250
       3        0.650                      250
       4        0.865                      250
       5        1.380                     1000
       6        1.640                     1000
       7        2.130                     1000
       8        0.412                     1000             Ocean color, plankton, biology, earth chemistry
       9        0.443                     1000
      10        0.490                     1000
      11        0.555                     1000
      12        0.670                     1000
      13        0.709                     1000
      14        0.746                     1000
      15        0.865                     1000
      16        0.905                     1000             Atmosphere, water vapor
      17        0.936                     1000
      18        0.940                     1000
      19        1.030                     1000             Cirrus
      20        3.800                     1000             Land, water, cloud
      21        4.050                     1000
      22        7.200                     1000             Atmosphere, water vapor
      23        8.550                     1000
      24       10.800                      250             Land, water, cloud
      25       12.000                      250

      Table 1.  MERSI-II characteristics including the central wavelength (μm), spatial resolution (m), and primary applications (PBL: planetary boundary layer)

      In the FY-3 satellite ground data operational system, satellite data are processed into L0, L1, and L2 levels. Specifically, L0 data are raw data records with digital counts; L1 data are calibrated radiances with geolocation information; and L2 data contain geophysical parameters from retrieval algorithms, which reveal distributions and variations of the atmosphere, cloud, land, and ocean parameters. The true-color image processing requires MERSI-II L1 data in individual HDF files within a 5-min duration with two different resolutions (250 m and 1 km). The relevant geolocation data are stored in different files (GEOQK and GEO1K), which include geolocation (latitude and longitude) and geometry data (zenith and azimuth angles) for observed pixels with a spatial resolution of 250 m and 1 km, respectively. Datasets needed by the true color image are shown in Table 2.

      File name                                        Dataset name      Description
      FY-3D_MERSI_GBAL_L1_20180806_0510_0250M_MS.HDF   EV_250_RefSB_b1   MERSI-II Channel 1 (0.470 μm) DN data
                                                       EV_250_RefSB_b2   MERSI-II Channel 2 (0.550 μm) DN data
                                                       EV_250_RefSB_b3   MERSI-II Channel 3 (0.650 μm) DN data
                                                       VIS_Cal_Coeff     Visible channel calibration coefficients (k0, k1, and k2); reflectance (ρ) is computed as ρ = k0 + k1·DN + k2·DN2
      FY-3D_MERSI_GBAL_L1_20180806_0510_GEOQK_MS.HDF   DEM               DEM data (m) used in the correction of Rayleigh scattering
                                                       Sensor azimuth    Sensor azimuth (degree, scale factor 0.01)
                                                       Sensor zenith     Sensor zenith (degree, scale factor 0.01)
                                                       Solar azimuth     Solar azimuth (degree, scale factor 0.01)
                                                       Solar zenith      Solar zenith (degree, scale factor 0.01)
      Table 2.  MERSI-II L1 dataset structures including the reflectance count (DN), calibration coefficients, sensor azimuth and zenith angles, solar azimuth, and zenith angles
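      As a sketch of this first processing step, the quadratic calibration of Table 2 can be applied and then normalized by the solar zenith cosine to obtain TOA reflectance. The function name and coefficient values below are illustrative, not actual MERSI-II VIS_Cal_Coeff values:

```python
import numpy as np

def dn_to_toa_reflectance(dn, k0, k1, k2, sun_zenith_deg):
    """Quadratic calibration (Table 2): rho = k0 + k1*DN + k2*DN^2,
    then division by the solar zenith cosine to get TOA reflectance."""
    rho = k0 + k1 * dn + k2 * dn ** 2
    mu_s = np.cos(np.deg2rad(sun_zenith_deg))
    return rho / mu_s

# Illustrative coefficients (not real calibration values)
dn = np.array([100.0, 200.0])
refl = dn_to_toa_reflectance(dn, k0=0.0, k1=0.001, k2=0.0, sun_zenith_deg=60.0)
```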

    3.   Methodology
    • The procedures for generating the global true color imagery include three steps: (1) atmospheric correction, which removes the contributions of atmospheric molecular scattering and absorption from the reflectance at the top of the atmosphere (TOA); (2) image enhancement, which brings out the texture of low-reflectance targets; and (3) image splicing, which smooths the transition between adjacent orbits and granules.

    • The atmospheric correction is important in remote sensing image processing. At visible bands, it mainly removes the effect of scattering and absorption of atmospheric molecules on TOA reflectance so that surface targets can be depicted accurately and objectively. Currently, methods for atmospheric corrections include the invariable-object methods (Zheng et al., 2007), histogram matching method (Richter, 1996), dark object method (Teillet and Fedosejevs, 1995), contrast reduction methods (Tanré and Legrand, 1991), and some radiative transfer models: Low Resolution Transmission (LOWTRAN; Kneizys et al., 1980), Moderate Resolution Atmospheric Transmission (MODTRAN; Xu et al., 2008), Atmosphere Removal Model (ATREM; Gao et al., 2000), and Second Simulation of the Satellite Signal in the Solar Spectrum (6S; Vermote et al., 1997) and so on. 6S is based on the radiative transfer theory and is designed for several sensors with multiple bands, which can be applied for different geolocation and variable targets. 6S is thus widely used by the remote sensing community for atmospheric radiative transfer.

      Here, the 6S radiative transfer model is briefly described and applied for the atmospheric correction of the MERSI-II channels at 470, 550, and 650 nm. The surface is approximated as a Lambertian reflector. In the absence of aerosol scattering, the TOA reflectance can be expressed as:

      $$\rho^{\rm TOA}(\theta_s, \theta_v, \varphi) = T^{\rm O} T^{\rm H} \left[ \rho_{\rm R}(\theta_s, \theta_v, \varphi) + \frac{T_{\rm R}^{\downarrow}(\mu_s)\, T_{\rm R}^{\uparrow}(\mu_v)\, \rho_t}{1 - S \rho_t} \right], \qquad (1)$$

      where ${\rho_{\rm R}^{}}$ is the reflectance from Rayleigh scattering, ${\rho_{\rm t}}$ is the reflectance from surface, ${T^{\rm O}}$ is the atmospheric transmittance of ozone absorption, ${T^{\rm H}}$ is the atmospheric transmittance of water vapor, $T_{\rm R}^ \downarrow $ and $T_{\rm R}^ \uparrow $ are the downwelling and upwelling radiative transmittances of atmospheric molecules respectively, ${\mu _s}$ and ${\mu _v}$ are the cosine of solar and sensor zenith angles respectively, and $\varphi $ is the relative azimuthal angle between the sun and sensor.


      where ${\theta _s}$, ${\theta _v}$, ${\varphi _s}$, and ${\varphi _v}$ are the solar zenith, sensor zenith, solar azimuth, and sensor azimuth respectively; $S$ is the atmospheric spherical albedo. From Eq. (1), the surface reflectance ${\rho _t}$ can be accurately derived from satellite observations, atmospheric parameters, and other spectral information.

      Given the raw reflectance count (DN) data at 470, 550, and 650 nm from the MERSI-II L1 data (Fig. 1), the reflectance at TOA (${\rho ^{\rm TOA}}$) can be retrieved with the calibration coefficients and the cosine of the solar zenith angle (${\mu _s}$). The GEOQK files in the MERSI-II L1 data contain the solar zenith (${\theta _s}$), solar azimuth (${\varphi _s}$), sensor zenith (${\theta _v}$), sensor azimuth (${\varphi _v}$), and digital elevation model (DEM) data for each observed pixel. The atmospheric molecular optical depth ($\tau $) can be computed from the DEM. Thus,

      Figure 1.  The MERSI-II atmospheric correction flow chart. The atmospheric correction process mainly includes corrections of the atmospheric path radiation reflectance, atmospheric transmittance (TR), and spherical albedo (S).
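      Assuming the TOA coupling takes the standard 6S-style form, solving for the surface reflectance is a simple algebraic inversion. A minimal sketch (all numeric values are illustrative):

```python
def surface_reflectance(rho_toa, rho_r, t_o, t_h, t_down, t_up, s):
    """Invert rho_toa = T_O*T_H*(rho_R + T_down*T_up*rho_t/(1 - S*rho_t))
    for the surface reflectance rho_t (assumed standard 6S coupling)."""
    y = (rho_toa / (t_o * t_h) - rho_r) / (t_down * t_up)
    return y / (1.0 + s * y)

# Illustrative values for a blue-band land pixel
rho_t = surface_reflectance(rho_toa=0.18, rho_r=0.08, t_o=0.98, t_h=1.0,
                            t_down=0.92, t_up=0.94, s=0.14)
```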

      In Eq. (1), ${\rho _{\rm R}^{}}$ can be solved by combining single and multiple scattering as follows (NOAA NESDIS Center for Satellite Applications and Research, 2012):

      where $\tau $ is the relative spectral optical depth of atmospheric molecular scattering (Table 3), obtained from the spectral optical depth (${\tau _0}$) of atmospheric molecules by scaling with the ratio of the atmospheric pressure ($P$) to the standard atmospheric pressure (${P_0}$); ${\tau _0}$ is calculated by running the 6S model with the MERSI spectral response functions; ${\delta _{0,m}}$ is the Kronecker delta; and ${P^m}$ is the mth term of the Fourier expansion of the molecular scattering phase function. Moreover, $\rho _l^m$ is the mth single scattering reflectance, and ${\Delta ^m}(\tau)$ is the multiple scattering correction term for the molecular optical depth. The first three terms in Eq. (7) are expressed as:

      Spectral name   Central wavelength (nm)   Atmospheric molecular optical depth $\tau $

      Table 3.  The atmospheric molecular optical depth of three MERSI-II channels with the central wavelength at 470, 550, and 650 nm, respectively
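      The pressure scaling of the molecular optical depth, $\tau = \tau_0 P/P_0$, can be sketched as below, assuming an exponential barometric profile to convert DEM height into the pressure ratio; the paper does not state its pressure model, and the $\tau_0$ value here is only illustrative:

```python
import math

def rayleigh_tau(tau0, elevation_m, scale_height_m=8000.0):
    """Scale the sea-level molecular optical depth tau0 to the surface
    pressure implied by the DEM height, assuming P = P0*exp(-z/H)
    (the operational pressure model is not given in the text)."""
    pressure_ratio = math.exp(-elevation_m / scale_height_m)  # P / P0
    return tau0 * pressure_ratio

tau_sea = rayleigh_tau(0.186, 0.0)      # sea level: unchanged
tau_high = rayleigh_tau(0.186, 4000.0)  # high plateau: reduced
```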


      In Eq. (12), when m is set to 0, ${a^0}$ and ${b^0}$ can be solved by:

      where $a_{0 - 4}^0$ is 0.332438, −0.309248, 0.114933, 0.162854, and −0.103244, respectively; and $b_{0 - 4}^0$ is −0.067771, −0.012409, −0.035037, 0.001577, and 0.032417, respectively.

      In Eq. (12), when m is set as 1 and 2, the corresponding ${a^1}$, ${a^2}$, ${b^1}$, and ${b^2}$ are 0.19666, 0.145459, −0.054391, and −0.029108, respectively.

      Neglecting the contribution of aerosol, the atmospheric transmittances at the red, green, and blue bands are a function of ozone absorption (${T^{\rm O}}$), water vapor absorption (${T^{\rm H}}$), and atmospheric molecular scattering (the downwelling $T_{\rm R}^ \downarrow $ and upwelling $T_{\rm R}^ \uparrow $ components). ${T^{\rm O}}$ can be computed by:

      where $U$ is the atmospheric ozone content, set to a constant of 0.319 cm, and $A$ is the ozone absorption coefficient, following Kneizys et al. (1980). The values of $A$ are given in Table 4.

      Spectral name   Central wavelength (nm)   Ozone absorption coefficient $A$

      Table 4.  The ozone absorption coefficient of three MERSI-II channels with the central wavelength at 470, 550, and 650 nm, respectively
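      A hedged sketch of the ozone correction, assuming a Beer–Lambert form with a two-way air mass, which is a common parameterization; the paper's exact equation is not reproduced in this text, and the coefficient value used below is illustrative:

```python
import math

def ozone_transmittance(a_coeff, u_ozone_cm, sun_zenith_deg, view_zenith_deg):
    """Two-way Beer-Lambert ozone transmittance T = exp(-A*U*m) with
    air mass m = 1/cos(theta_s) + 1/cos(theta_v). The paper's own
    (unreproduced) equation may differ in form."""
    mu_s = math.cos(math.radians(sun_zenith_deg))
    mu_v = math.cos(math.radians(view_zenith_deg))
    m = 1.0 / mu_s + 1.0 / mu_v
    return math.exp(-a_coeff * u_ozone_cm * m)

# Illustrative absorption coefficient, U = 0.319 cm as in the text
t_o = ozone_transmittance(a_coeff=0.0715, u_ozone_cm=0.319,
                          sun_zenith_deg=30.0, view_zenith_deg=0.0)
```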

      ${T^{\rm H}}$ can be calculated by:

      where ${U^{\rm H}}$ is water vapor content (2.93 g cm−2 here), and ${A^{\rm H}}$ and ${B^{\rm H}}$ are the water vapor absorption coefficients (Table 5).

      Spectral name     Central wavelength (nm)   ${A^{\rm H}}$   ${B^{\rm H}}$
      EV_250_RefSB_b1   470                       0.0000          0.0000
      EV_250_RefSB_b2   550                       0.0000          0.0000

      Table 5.  The water vapor absorption coefficient of three MERSI-II channels with the central wavelength at 470, 550, and 650 nm, respectively
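      A sketch of the water vapor transmittance, assuming the common two-parameter form $T^{\rm H} = \exp[-A^{\rm H}(U^{\rm H} m)^{B^{\rm H}}]$; this functional form is an assumption, since the paper's own equation is not reproduced here. For the blue and green bands, $A^{\rm H}=0$ (Table 5) and the transmittance is identically 1:

```python
import math

def water_vapor_transmittance(a_h, b_h, u_h, sun_zenith_deg, view_zenith_deg):
    """Two-parameter water-vapor transmittance (assumed form
    T = exp(-A_H * (U_H * m)**B_H)); returns 1 when A_H = 0, as for
    the blue and green bands in Table 5."""
    if a_h == 0.0:
        return 1.0
    mu_s = math.cos(math.radians(sun_zenith_deg))
    mu_v = math.cos(math.radians(view_zenith_deg))
    m = 1.0 / mu_s + 1.0 / mu_v
    return math.exp(-a_h * (u_h * m) ** b_h)

t_blue = water_vapor_transmittance(0.0, 0.0, 2.93, 30.0, 10.0)    # no absorption
t_red = water_vapor_transmittance(0.005, 0.87, 2.93, 30.0, 10.0)  # illustrative A_H, B_H
```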

      Assuming the atmospheric molecule only has scattering and the scattering asymmetry factor is zero, the downwelling radiation transmittance ($T_{\rm R}^ \downarrow $) and upwelling radiation transmittance ($T_{\rm R}^ \uparrow $) are given as (Vermote et al., 1997; NOAA NESDIS Center for Satellite Applications and Research, 2012):

      For those scattering molecules with no absorption, their $S$ values can be calculated by:

      where the coefficients ${c_i}$ = −0.57721566, 0.99999193, −0.24991055, 0.05519968, −0.00976004, and 0.00107857 for i = 0 to 5.
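      These $c_i$ match the Abramowitz–Stegun series for the exponential integral $E_1$, which suggests the same parameterization used in the NOAA/NASA corrected-reflectance (CREFL) code. A sketch of the Rayleigh transmittance and spherical albedo under that assumption (the paper's exact equations are not reproduced in this text):

```python
import math

# A&S series coefficients for E1(x) + ln(x) on 0 < x <= 1,
# matching the c_i values quoted in the text
C = [-0.57721566, 0.99999193, -0.24991055, 0.05519968, -0.00976004, 0.00107857]

def expint_e1(x):
    """Exponential integral E1 for 0 < x <= 1 (A&S 5.1.53)."""
    return -math.log(x) + sum(c * x ** i for i, c in enumerate(C))

def rayleigh_transmittance(tau, mu):
    """Two-stream transmittance of a non-absorbing Rayleigh layer with
    zero asymmetry factor (CREFL-style form; an assumption here)."""
    return ((2.0 / 3.0 + mu)
            + (2.0 / 3.0 - mu) * math.exp(-tau / mu)) / (4.0 / 3.0 + tau)

def spherical_albedo(tau):
    """Spherical albedo of a purely scattering Rayleigh layer via the
    third exponential integral E3 (CREFL csalbr form; an assumption)."""
    e3 = (math.exp(-tau) * (1.0 - tau) + tau ** 2 * expint_e1(tau)) / 2.0
    return (3.0 * tau - e3 * (4.0 + 2.0 * tau)
            + 2.0 * math.exp(-tau)) / (4.0 + 3.0 * tau)

t_down = rayleigh_transmittance(0.186, math.cos(math.radians(30.0)))
s = spherical_albedo(0.186)
```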

      In Fig. 2, true color images over ocean and land are shown before and after the atmospheric correction. Before the correction, the scenes appear covered by a light milky haze, which makes it difficult to detect the detailed texture of dark targets (e.g., forest). After the atmospheric correction, the images look clean and reveal the surface details much more vividly.

      Figure 2.  The MERSI true color composite images over two surface types before (left) and after (right) atmospheric corrections.

    • The surface reflectance derived from the above processes varies from 0 to 1. To generate the RGB true color image, the reflectance must be converted into an integer value between 0 and 255. The transformation can be performed by a linear stretch:

      $$y = {\rm int}\left(255 \times \rho_t\right),$$

      where $y$ is the output eight-bit integer, int represents the rounding function, and ${\rho _t}$ is the corrected reflectance. After this linear stretch of the reflectances at 650, 550, and 470 nm, as shown in Fig. 3, it is still hard to discern the texture and gray-scale information of dark targets (dense vegetation and water) in the true color composite image. To enhance the features of dark targets, a nonlinear brightness enhancement table (Table 6), developed by Jacques Descloitres, is employed here (Gumley et al., 2010). Compared with the linear stretch, the nonlinear method better reveals the texture and detailed information of dark targets.

      Figure 3.  The linear (left) and non-linear (right) brightness enhanced true color composite images.


      Table 6.  The non-linear brightness enhancement table adopted from Gumley et al. (2010). The inputs and outputs are the original and non-linear stretched eight-bit integers
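      The two-step stretch can be sketched as follows. Table 6's values are not reproduced in this text, so the breakpoints below are the widely circulated MODIS rapid-response enhancement attributed to Descloitres; treat them as an assumption rather than the exact MERSI-II table:

```python
import numpy as np

# Assumed piecewise-linear breakpoints (MODIS rapid-response style),
# not the verbatim contents of Table 6
IN_PTS = np.array([0, 30, 60, 120, 190, 255], dtype=np.float64)
OUT_PTS = np.array([0, 110, 160, 210, 240, 255], dtype=np.float64)

def enhance(reflectance):
    """Linear stretch of surface reflectance (0-1) to 8-bit counts,
    then piecewise-linear brightness enhancement that boosts dark
    targets such as dense vegetation and water."""
    x = np.clip(np.rint(reflectance * 255.0), 0, 255)
    return np.interp(x, IN_PTS, OUT_PTS).astype(np.uint8)

dark = enhance(np.array([0.05]))    # dark vegetation pixel gets boosted
bright = enhance(np.array([0.95]))  # bright cloud pixel barely changes
```

With these breakpoints the slope is steep near zero and flat near 255, so contrast is redistributed toward the dark end of the histogram.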

    • Owing to the orbit geometry of a polar-orbiting satellite, a specific large area can rarely be covered by a single 5-min granule. Hence, a seamless splicing method is required to compose a large image. Unless otherwise noted, all reflectance values below are remapped onto an equal latitude–longitude grid. Two splicing techniques are developed as follows.

      The same-track splicing is used to combine two or more scenes that do not share the same geolocation. The scenes can be stitched directly by latitude and longitude. The spliced image should show no stitching marks, since all scenes along one orbit are exposed to continuously changing illumination (Fig. 4).

      Figure 4.  The mosaic effect of two adjacent scenes in the same orbit on 5 December 2018: (a) 0520 UTC image, (b) 0515 UTC image, and (c) mosaic result.

      Since different scenes over an overlapping area do not share continuous illumination and Bidirectional Reflectance Distribution Function (BRDF) conditions, a discontinuity may appear in the image if the same-track splicing is used. Within the overlapping area, it is also a challenge to define the location of the mosaic line. To solve this problem, the distance weighted pixel blending (DWPB) method is proposed (see Fig. 5). Over the overlapping areas of different scenes, the raw reflectance DN values are weight-averaged about the mosaic line, as follows,

      Figure 5.  Illustration of the distance weighted pixel blending (DWPB) method. The blue, green, and orange boxes represent pixels of the left edge in the overlapping area, center line between left and right edges, and right edge in the overlapping area, respectively. The boxes with and without the dashed line represent the overlapping and individual pixels, respectively.

      where DN is the blended reflectance DN value; ${\rm DN}_{\rm left}$ and ${\rm DN}_{\rm right}$ are the raw reflectance DN values of the left and right scenes, respectively; and ${W_{\rm left}}$ and ${W_{\rm right}}$ are the weights of the left and right scenes, ranging from 0 to 1, which can be determined by:

      where ${x_0}$ is the abscissa of the mosaic line, computed as the midpoint between the leftmost and rightmost abscissas of the overlapping area; ${x_l}$ and ${x_r}$ are the distances from ${x_0}$ to the left and right blending boundaries, respectively. If these distances are too small, the transition at the mosaic line appears abrupt; if they are too large, the true color image becomes blurred near the mosaic line. After several experiments, the optimal ${x_l}$ and ${x_r}$ are set to half the width of the overlapping area along the horizontal axis, capped at 200 pixels. Figures 6 and 7 compare the images with and without the DWPB method. The image after splicing clearly shows very good quality and a smooth transition.
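      The DWPB blending over an overlapping strip can be sketched as below; the array layout, default mosaic-line position, and linear weight profile are assumptions consistent with the description above, not the operational implementation:

```python
import numpy as np

def dwpb_blend(left, right, x0=None, half_width_cap=200):
    """Distance weighted pixel blending across the overlapping columns
    of two scenes. Weights fall linearly from 1 on a scene's own side
    of the mosaic line to 0 on the other side; x0 defaults to the
    center column and the blend half-width is capped at 200 columns,
    as described in the text."""
    rows, cols = left.shape
    if x0 is None:
        x0 = cols // 2
    half = min(cols // 2, half_width_cap)
    x = np.arange(cols)
    # weight of the left scene: 1 left of the blend zone, 0 right of it
    w_left = np.clip((x0 + half - x) / (2.0 * half), 0.0, 1.0)
    w_left = np.broadcast_to(w_left, (rows, cols))
    return w_left * left + (1.0 - w_left) * right

left = np.full((2, 100), 100.0)   # overlapping strip from the left scene
right = np.full((2, 100), 200.0)  # same strip from the right scene
blended = dwpb_blend(left, right)
```

At the mosaic line both scenes contribute equally, so a brightness step between scenes becomes a smooth ramp instead of a visible seam.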

      Figure 6.  An example of the true color image combination of MERSI-II using DWPB (left) and normal (right) methods over the typhoon area.

      Figure 7.  As in Fig. 6, but over a land area.

    4.   Environmental monitoring using MERSI-II true color imagery
    • After the atmospheric correction, nonlinear enhancement, and smoothing of discontinuities between adjacent orbits, the MERSI-II true color image is of high quality and is now operationally used to monitor natural events and environmental disasters such as snow, sandstorm, algal bloom, fire, and tropical cyclones.

    • In late March 2018, there was widespread snowfall in eastern Europe and Russia. The surface snow on 22 March 2018 appears white (Fig. 8b), whereas the MERSI-II image on 25 March shows it in orange (Fig. 8c). On 22 March, a dust storm formed in North Africa and passed over the Mediterranean toward eastern Europe (Fig. 8a). The massive dust from the Sahara Desert was blown over the Mediterranean Sea by strong winds and settled in Bulgaria, Romania, Moldova, Ukraine, and Russia. Therefore, the surface snow on 25 March appears orange after the deposition of dust.

      Figure 8.  Examples of the true color image combination of MERSI-II Channels 1 (0.470 μm), 2 (0.550 μm), and 3 (0.650 μm) at the 250-m spatial resolution. (a) Dust transport over Mediterranean Sea on 22 March 2018, (b) white snow in eastern Europe on 22 March 2018, and (c) orange snow in eastern Europe on 25 March 2018.

    • A sandstorm is caused by strong winds carrying sand and/or dust, and can be well monitored by the MERSI-II RGB true color image. Dusty weather appeared in Xinjiang at 1220 UTC 2 June 2018. In Fig. 9, the dusty area is mainly located in the western part of the South Xinjiang basin, with some clouds over the area as well. It is estimated that the visible dust area was about 190,000 km2. Affected by the dusty weather, the air quality and visibility declined in these areas. Combined with conventional meteorological data, the MERSI-II true color image can be used to predict and track sandstorms.

      Figure 9.  As in Fig. 8, but for the dust storm in Xinjiang region of Northwest China on 2 June 2018.

    • Algae can produce a class of toxins named microcystins (also called cyanoginosins), which are harmful to marine life. The life evolution of algae can be well monitored with the MERSI-II true color imagery. Nutrient contents in the sea are highest during late spring and early summer, when abundant sunshine, rising water temperature, and nutrients cause the proliferation of phytoplankton (Capuzzo et al., 2018). On 6 May 2018, a large area of algal bloom is shown on the MERSI-II true color image, moving westward in the North Sea (Fig. 10). The clear contrast between open water and the algal bloom largely benefits from the nonlinear enhancement technique in the RGB image processing. The areas in light milky white and in green may contain coccolithophores and diatoms, respectively. In addition, the image brightness reflects the density of phytoplankton, while the various vortices and shapes reveal complex movements of ocean currents, eddies, and tides.

      Figure 10.  As in Fig. 8, but for the algal bloom in the North Sea on 6 May 2018.

    • In summer 2018, the western US experienced high temperature, dry air, and lightning. As a result, forest fires occurred in the region. Figure 11 is a MERSI-II true color image of a large area of wildfire at 2145 UTC 28 July 2018. There were 11 large fire spots concentrated in four areas (California, Idaho, Nevada, and Utah), with an impact range of approximately 298.8 and 1.277 km2 for the open fire (Fig. 11). The grey smoke is easily seen over the dark green forests, spreading across the desert under gusty winds.

      Figure 11.  As in Fig. 8, but for the wild fires that occurred in the western US on 28 July 2018.

    • According to the 64th Satellite Typhoon Monitoring Report issued by the National Satellite Meteorological Center of the China Meteorological Administration, the 22nd typhoon of 2018, named Mangkhut, was observed in the FY-3D MERSI-II true color imagery. At 0540 UTC 16 September 2018, Mangkhut approached the eastern coast of China. At 1000 UTC 16 September 2018, the center of Typhoon Mangkhut was located at 20.8°N, 115.0°E, with a maximum wind speed of 48 m s−1 and a minimum air pressure of 945 hPa near its center. Figure 12 shows Typhoon Mangkhut on 19 September 2018 when it reached full maturity, and its peripheral spiral cloud and rain bands began to affect most of Guangdong Province.

      Figure 12.  As in Fig. 8, but for Typhoon Mangkhut on 19 September 2018.

    5.   Summary and conclusions
    • This paper presented an algorithm for generating the MERSI-II RGB true color images and introduced associated environmental applications. The atmospheric correction, nonlinear enhancement, and weighted splicing methods are developed for the true color image processing. In the atmospheric correction, the ozone and water vapor transmittances, the total $T_{\rm R}^ \downarrow $ and $T_{\rm R}^ \uparrow $, and the spherical albedo $S$ are all parameterized and updated. The nonlinear enhancement aims at discriminating dark targets, and the DWPB splicing method is developed to create a smooth true color image across adjacent orbits and different granules.

      From the MERSI-II true color images, environmental disasters and natural events such as sandstorm, snow, algae, fire, and typhoons are well detected and monitored. The results show that the true color image improves the contrast of dark targets and restores the true color of surfaces. MERSI-II has the capacity to provide the detailed texture and useful information in interpreting natural events.
