Thursday, September 5, 2019
Theories of Satellite Imagery and Fractal Concepts
Introduction

Many applications that use satellite imagery in a quantitative fashion require classifying image regions into a number of relevant categories or distinguishable classes. Classification is a means of complementing retrieval. Satellite image classification is a clustering method based on image features, and its results are presented through visualization techniques [Ant05]. Fractal geometry provides a suitable framework for textural image classification by studying the irregular natural shapes in an image, since it can easily describe such fractal images. Fractal geometry can recognize small image segments characterized by spectral uniformity, which requires segmenting the image before classification. The main characteristic of fractal images is that they are continuous but not differentiable, which allows fine details to appear at any arbitrarily small scale [Iod95].

This chapter introduces the fundamentals of satellite imagery and fractal concepts. The satellite imagery part briefly describes satellite imaging technology, which is useful for understanding the main characteristics of satellite images, while the fractal concepts explain considerations that arise when applying fractal geometry techniques in digital image processing. Fractal characterizations are discussed to show the fractal features that may be found in satellite images, and some fractal measurements of interest are described for the purpose of classifying satellite images. Finally, the relevant techniques of image segmentation and classification are presented.

Satellite Imagery

Satellites are widely used in remote sensing imagery; they have several unique characteristics that enable remote sensing of the Earth's surface [Pan96].
The satellite senses electromagnetic energy at different wavelengths reflected by objects to produce satellite images, as shown in Figure (2.1). Visible satellite images are formed from signals received by visible channels that sense reflected solar radiation. Visible imagery is available only during daylight, since it is produced by reflected sunlight. Its major advantage is that it gives higher resolution than other imagery bands, so smaller features can be distinguished with visible imagery. A problem facing visible imagery is that clouds appear white while land and water surfaces are shaded; clouds are part of the Earth's atmosphere that absorb and reflect incoming solar radiation, hiding the fine details of the Earth's surface under consideration [San04].

The two primary sensor types carried by satellites are optical and radar. Optical sensors image using visible and infrared radiation, while radar sensors use microwaves to create an image, which enables the sensor to see through clouds and at night. In addition, multi-spectral, hyper-spectral, and multi-polarization sensors operate at different bands to improve the detection of objects under the sea or ground. Table (2.1) presents the characteristics of the most popular optical satellites [ERD13].
Table (2.1) The most popular optical satellites [ERD13]. Spatial resolutions are in metres.

Satellite      Mission life   Panchromatic   Multispectral   Hyperspectral
IRS            1988/03        0.80           73.00
Landsat 7      1999/07        15.00          30.00           60.00
IKONOS         1999/09        1.00           4.00
RapidEye       1999/12                       5.00
ASTER          1999/12                       15.00           30.00 - 90.00
MODIS          1999/12        250.00         500             1000
EROS           2000/12        0.50 - 0.90
QuickBird      2001/10        0.61           2.40
SPOT 5         2002/02        2.50 - 5.00    10.00
OrbView-3      2003/09        1.00           4.00
ALOS           2006/06                       10.00
WorldView-1    2007/09        0.40
GeoEye-1       2008/09        0.41           1.65
WorldView-2    2009/06        0.41           1.80
Pleiades       2011/07        0.50           1.00

Many satellite imagery platforms follow an orbit that runs from the north to the south of the Earth, in conjunction with the Earth's rotation from west to east. This arrangement allows them to cover most of the Earth's surface (the covered strip is called the swath) over a certain period of time. More details about the satellite orbit and swath are given in the following subsections [Asr89]:

Satellite Orbit

An orbit is the path followed by a satellite. Satellite orbits are chosen according to the capability and objective of the sensors carried; the selection depends on the altitude, orientation, and rotation of the satellite relative to the Earth. Geostationary satellites revolve at speeds matching the rotation of the Earth, at altitudes of approximately 36,000 km, as Figure (2.2-a) shows. This lets them observe and collect information continuously over the areas of interest; such orbits are typical of weather and communications satellites. Most imaging satellites, in contrast, use near-polar orbits, in which the satellite moves northward along one side of the Earth and then toward the southern pole on the second half of its orbit, as Figure (2.2-b) shows. The two halves of this trajectory are called the ascending and descending passes, which are clearly shown in Figure (2.2-c).
Moreover, there are sun-synchronous satellite orbits that cover each area of the Earth's surface at a constant local time of day, called the local sun time. The ascending pass of a sun-synchronous satellite mostly covers the shadowed side of the Earth, while the descending pass covers the sunlit side. This motion provides the same illumination conditions when imaging a specific area in the same season over successive years [Pan96].

Swath

The swath is the area imaged on the surface of the Earth as the satellite revolves, as Figure (2.3) shows. It covers an area varying from tens to hundreds of kilometres wide. As the satellite rotates about the Earth from pole to pole, its ground track appears to shift westward because of the Earth's rotation from west to east. This motion enables the swath to cover a new area on each successive pass; the satellite's orbit and the Earth's rotation thus work together to give complete coverage of the Earth's surface over one orbital cycle. In near-polar orbits, areas at high latitudes are imaged more frequently than those lying in the equatorial zone, because adjacent swaths overlap where the orbit paths come closer together near the poles [Cam02]. If the orbit cycle is counted from any randomly selected pass, it is completed when the satellite retraces its path, that is, when the same point on the Earth's surface lies directly below the satellite (this point is called the nadir point) for a second time. The exact period of the orbital cycle varies from satellite to satellite, so the time required to complete one orbital cycle is not the same as the revisit period [Sab97].

Satellite Image Scanning

Satellite scanning produces digital images using detectors that measure the brightness of reflected electromagnetic energy.
The scanner employs a detector with a narrow field of view that sweeps across the terrain; the parallel scan lines are combined to produce an image, as Figure (2.4) shows [Add10]. The most widely used type of scanner is the across-track scanner (such as the whiskbroom used in the QuickBird satellite), which uses rotating mirrors to scan the Earth's surface from side to side, perpendicular to the direction of the sensor platform. The rotating mirrors redirect the reflected light so that it is focused on the sensor detector(s). The moving mirrors create spatial distortions, which must be corrected by processing the received data before the image data are delivered to the user. The most significant advantage of the whiskbroom scanner is its small number of detectors, which keeps data calibration simple. Another type is the along-track scanner (such as the push broom scanner used in the SPOT satellite), which uses no rotating mirrors; its sensor detectors are arranged in a row called a linear array. Instead of scanning from side to side as the sensor system moves forward, the one-dimensional sensor array captures each scan line at once. Furthermore, some recent scanners are step-stare scanners, which contain two-dimensional arrays in rows and columns for each band. It is worth noting that the push broom scanner is smaller, lighter, and less complex than the whiskbroom scanner because it has fewer moving parts, and it gives better radiometric and spatial resolution. Its major disadvantage is the calibration required for the large number of detectors in the sensor system [Bui93]. A multi-spectral scanner is a spaceborne remote sensing system that simultaneously acquires images of the same scene at different wavelengths. The sensors of a multi-spectral scanner normally work in specific parts of the spectral range from 0.35 µm up to 14 µm.
These specific parts of the spectrum in which remote sensing observations are made are called bands or channels; their number varies largely from one system to another [Add10]. Two important advantages of multi-spectral scanning are [Lil04]:

1. Objects at the surface of the Earth have varying reflection behaviour across the optical spectrum, so they can be recognized and/or identified more easily using several spectral bands than using just one band.
2. A large number of objects do not reflect radiation well in the visible part of the spectrum. Remote sensing observations outside the visible wavelengths, or in combination with observations in the visible spectrum, produce a much more contrasting image, which helps to identify objects or to determine their condition.

Satellite Image Resolution

Image resolution is the capability of a sensor to observe the smallest object clearly, with distinct boundaries. Resolution is often expressed as a pixel count in a digital image. Usually, the pixel resolution is described by a pair of positive integers, where the first number is the width of the image (the number of pixel columns) and the second is its height (the number of pixel rows). The overall resolution is the total count of pixels in the image, typically given in megapixels; it is calculated by multiplying the width by the height of the image and dividing by one million, as Figure (2.5) shows [Zho10]. In satellite imagery, the ground resolution indicates the Ground Sample Distance (GSD), which is the size of the ground area covered by one pixel. In an image of 0.6 m ground resolution, each pixel records the average reflected colour of a 0.6 m by 0.6 m area. The fewer metres per pixel, the higher the resolution of the image. The ground resolution is a particularly important parameter when taking vertical aerial images. Satellites of various ground resolutions are listed in Table (2.2) [ERD13].
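The pixel-count and GSD arithmetic above can be sketched in a few lines of Python. The 4000 x 3000 image size is an illustrative assumption; the 0.6 m GSD is the example figure from the text, not a parameter of any particular satellite product:

```python
def megapixels(width_px, height_px):
    """Overall resolution in megapixels: width x height / 1,000,000."""
    return width_px * height_px / 1_000_000

def pixel_ground_area(gsd_m):
    """Ground area (square metres) covered by one pixel of the given GSD."""
    return gsd_m * gsd_m

# A hypothetical 4000 x 3000 pixel image has an overall resolution
# of 12 megapixels.
print(megapixels(4000, 3000))   # 12.0

# At 0.6 m ground resolution each pixel records a 0.6 m x 0.6 m patch,
# i.e. about 0.36 square metres of ground.
print(pixel_ground_area(0.6))
```

Comparing the same ground patch at two GSDs makes the "fewer metres per pixel" rule concrete: halving the GSD quadruples the number of pixels needed to cover the same area, which is why higher-resolution products are much larger.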
In addition to the GSD, there are four types of resolution to consider when discussing satellite imagery: spatial, spectral, radiometric, and temporal. More details about each are given in the following subsections [Ren99]:

Table (2.2) Features of the satellites of most interest [ERD13]

Feature         QuickBird   Landsat-7   GeoEye-1   IKONOS     WorldView-2   Pleiades
GSD             0.61 m      15 m        0.41 m     1 m        0.5 m         0.5 m
Swath width     16.5 km     185 km      15 km      13 km      16.4 km      20 km
Multispectral   yes         yes         yes        yes        yes          yes
Revisit time    3-4 days    16 days     2-3 days   1-3 days   2-3 days     2-3 days

Spatial Resolution

The discrimination of image details depends on the spatial resolution of the sensor, which refers to its ability to detect the smallest possible feature in the image. The spatial resolution of a sensor depends primarily on its Instantaneous Field of View (IFOV). The IFOV is the angular cone of visibility of the sensor; it determines the ground area seen from a given altitude at one particular moment in time. The size of the viewed area is found by multiplying the IFOV by the distance from the sensor to the ground; for example, an IFOV of 0.1 milliradians viewed from an altitude of 700 km corresponds to a ground cell about 70 m across. This ground area is called the resolution cell and determines the maximum spatial resolution of the sensor [Sab97].

Spectral Resolution

Many remote sensing systems use several separate wavelength ranges at various spectral resolutions when imaging ground areas; these are referred to as multi-spectral sensors. Advanced multi-spectral sensors, called hyperspectral sensors, detect hundreds of very narrow spectral bands in the visible, near-infrared, and mid-infrared portions of the electromagnetic spectrum. High spectral resolution improves the information gathered about a specific ground area, since very narrow bands allow fine discrimination between different targets based on their spectral responses [Lil04].

Radiometric Resolution

The radiometric characteristics describe the actual information content of an image.
Radiometric resolution is the sensitivity of the sensor to the magnitude of the electromagnetic energy; it describes the sensor's ability to discriminate small differences in energy. The finer the radiometric resolution of a sensor, the more sensitive it is in detecting small differences in reflected or emitted energy [Bui93]. Digitally, the radiometric resolution is the number of bits comprising each pixel in the image, which determines the brightness level of that pixel. Brightness levels are represented by positive numbers ranging from 0 up to a selected power of 2; the maximum available brightness level depends on the number of bits used to represent the recorded energy. Thus, if the sensor uses 8 bits for data recording, there are 2^8 = 256 digital values available, within the range 0 to 255 [San04].

Temporal Resolution

Temporal resolution is related to the revisit period of a satellite sensor: it is the period after which a remote sensing system images the same area, at the same viewing angle, for a second time. The actual temporal resolution is typically measured in days and depends on three factors: the satellite's capabilities, the swath overlap, and the latitude. The ability to collect images of the same area at different times is an important element in applying remote sensing data. The spectral characteristics of a given area may change over time, and these changes can be detected by collecting and comparing multi-temporal images. When imaging on a continuing basis at different times, changes on the Earth's surface, whether naturally occurring or induced by humans, can be monitored [Lev99].

Fractal Theory

In the 1970s, Benoit B. Mandelbrot introduced his discovery as a new field of mathematics named fractal geometry (from the Latin fractus, meaning irregular or fragmented). He claimed that fractal geometry would provide a useful tool for explaining a variety of naturally occurring phenomena [Man83].
A fundamental characteristic of fractal objects is that their measured metric properties, such as length or area, are a function of the scale of measurement [Sun06]. Mandelbrot's fractal geometry is the best approximation and the most widely used successful mathematical model [Man88]. Fractal objects can be found everywhere in nature, such as coastlines, fern trees, snowflakes, clouds, and mountains. The most important properties of fractals are self-similarity, scale invariance, and non-integer dimension [Man83]. Fractal geometry is not concerned with the explicit shape of objects; instead, it quantifies the shape of an object's surface by a value called the fractal dimension DF. For example, a line is commonly thought of as a 1D object, a plane as a 2D object, and a prism as a 3D object, and all these dimensions have integer values. However, the surfaces of many natural objects cannot be described with an integer value; such objects are said to have a fractional dimension. According to Mandelbrot, a fractal can be defined as "a rough or fragmented geometric shape that can be subdivided in parts, each of which is (at least approximately) a reduced-size copy of the whole". In mathematical terms, a fractal can be defined as "a set of points whose fractal dimension exceeds its topological dimension" [Man83]. Fractal geometry uses the fractal features to describe the irregular or fragmented shapes of natural features, as well as other complex objects that traditional Euclidean geometry fails to analyze [Ana11].
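The fractal dimension DF discussed above is commonly estimated by box counting: cover the point set with grids of different box sizes s, count the occupied boxes N(s), and take DF as the slope of log N(s) against log(1/s). The sketch below uses only the standard library, and the filled-square and line test sets are illustrative assumptions rather than satellite data:

```python
import math

def box_count(points, size):
    """Number of size x size grid boxes containing at least one point."""
    return len({(x // size, y // size) for x, y in points})

def fractal_dimension(points, sizes=(1, 2, 4, 8, 16)):
    """Estimate DF as the least-squares slope of log N(s) vs log(1/s)."""
    xs = [math.log(1.0 / s) for s in sizes]
    ys = [math.log(box_count(points, s)) for s in sizes]
    n = len(sizes)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# A filled 64 x 64 square is a genuinely two-dimensional set,
# while a straight line of 64 points is one-dimensional.
square = [(x, y) for x in range(64) for y in range(64)]
line = [(x, 0) for x in range(64)]
print(round(fractal_dimension(square), 2))  # 2.0
print(round(fractal_dimension(line), 2))    # 1.0
```

For an irregular natural boundary such as a coastline, the fitted slope falls strictly between the topological dimension and that of the embedding plane; this non-integer DF is the textural measure that fractal-based classification of satellite images exploits.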