
Physics-Based Visualization of Dense Natural Clouds. II. Cloud-Rendering Algorithm

Sean G. O'Brien and David H. Tofsted

We discuss the representation of aerosol-scattering properties, boundary information, and the use of these results in line-of-sight rendering applications for visualization of a modeled atmosphere based on a discrete ordinates three-dimensional radiative-transport method. The outputs of the radiative-transfer model provide spatial and angular distributions of limiting path radiance, given an input density distribution and external illumination conditions. We discuss the determination of the direct attenuated radiance, integrated path radiance, and background radiance for each pixel in the rendered scene. Orthographic and perspective projection approaches for displaying these results are described, and sample images are shown.

OCIS codes: 010.1290, 010.1300, 010.1310, 280.1310, 290.1090, 290.4020, 010.1320.

1. Introduction

The ability to visualize the appearance of natural atmospheres accurately is critical for many simulation tasks. For the determination of sensor performance, for realistic training, for mission rehearsal, and for systems development, the appearance of the natural atmosphere in which systems must operate has many unique features. Spectral-scattering properties of clouds influence the light that reaches the surface. Line-of-sight (LOS) information is also dependent on scattering behavior. Although many recent studies have highlighted the ability to model the appearance of clouds, virtually all include high orders of approximation for the treatment of radiance properties. In this study, the second in a series, the visualization of cloud fields is considered with the outputs of a highly accurate three-dimensional (3-D) radiative transfer (RT) code1 and with parameterized representations of the cloud-scattering properties. Recently, Hembree et al.2 discussed the development of real-time flythrough visualization of cloudy atmospheres against background simulated terrain, with the clouds constructed with appearance overlays. Similar methods were used by Max,3 in which

S. G. O'Brien is with the Physical Science Laboratory, New Mexico State University, Las Cruces, New Mexico 88003. His e-mail address is [email protected]. D. H. Tofsted is with the U.S. Army Research Laboratory, White Sands Missile Range, New Mexico 88002. His e-mail address is [email protected].

Received 11 December 1997; revised manuscript received 20 May 1998.

APPLIED OPTICS / Vol. 37, No. 33 / 20 November 1998

simplified cloud geometries and slab-illumination models were employed. In many cases the volumetric properties of the atmosphere have been studied by employment of the small-angle approximation (Kuga et al.,4 Zardecki and Tam5). However, these studies are limited to small fields of view. For larger fields of view, such as are commonly encountered by airborne sensor systems,6 the properties of different LOS's can be significantly altered by spatially dependent aerosols. A related area of study is the appearance of surfaces and their interdependence with the state of the atmosphere. Gerstl and Zardecki7 considered coupled canopy–atmosphere issues. Kimes and Kirchner8 developed detailed models of surface structure appearance as a function of varying direct and diffuse illumination conditions. In a treatment of the inverse problem under similar assumptions, Diner et al.9 used models of atmosphere/surface coupling to perform the analysis and extraction of surface properties in the presence of varying illumination. Our problem is to develop tools for understanding and characterizing the appearance of the surface and the atmosphere under arbitrary illumination and weather conditions. The ability of imaging sensors to perform within these environments can be predicted with such characterizations,10 and the information derived can be useful in distributed simulation applications.11 The spatial and radiometric quality of a remotely sensed target image can be reduced by a number of atmospheric phenomena. For example, atmospheric optical turbulence over a long LOS can cause long-exposure

blurring or short-exposure shifting or breakup of images. Another degradation mechanism occurs when haze layers or natural water clouds occupy a considerable portion of the field of view; in that instance, atmospheric absorption, scattered radiance (at visible wavelengths), or thermal emission (in the infrared) can also reduce the apparent contrast of a target against its background. We focus on the simulation of this latter effect. In the first paper,1 we considered the adaptation of the discrete ordinates method (DOM) solution of the RT equation for optically thick natural clouds. In this paper we are concerned with the accurate conditioning of aerosol phase function and concentration parameters that are used as input to the DOM solution, and we display some of the methods and results that have been used to create image maps of visible-band-radiance outputs from the DOM model. In particular, we examine scattered-radiance effects as they pertain to the generation of a three-channel color [red-green-blue (RGB) or National Television System Committee (NTSC)] image of an 8-km-wide × 8-km-long × 4-km-high example volume occupied by optically thick clouds and haze. Here we describe how scattering phase functions (PFN's) for haze aerosols are determined according to an aerosol class (rural, urban, or maritime) and relative humidity,12 as embodied in the U.S. Army's EOSAEL (Electro-Optical Systems Atmospheric Effects Library) PFNDAT PFN database.13 We also use two U.S. Air Force models to assist in this task: the MODTRAN14 and CSSM15 codes. MODTRAN, a moderate-spectral-resolution one-dimensional (1-D) RT code, is used to evaluate incident direct and diffuse radiances at the top of the model volume. MODTRAN is run in its irradiance mode to compute direct solar or lunar radiance and in its multiple-scattering mode for diffuse radiance in each of the downward stream directions. CSSM (the Cloud Scene Simulation Model) builds a 3-D map of the cloud liquid-water content (LWC, in g m⁻³) for a rectilinear grid. Coupling the CSSM output with information about cloud type and cell height above cloud base enables us to estimate the particle size distribution (PSD) for each cloud cell from CLTRAN,16 the Army's EOSAEL cloud transmission model. Using the computed PFN's for a family of PSD's allows us to assign scattering properties. Figure 1 shows the processing stages needed to generate the input data for the DOM RT model. In the figure, processing steps are contained within ovals and data sets are contained within rectangles. The stages contained in the dashed box on the left-hand side represent preprocessing to generate a series of PFN's for the family of PSD's, which are then indexed to the conditions in the CSSM output. After the DOM RT model is run with these aerosol data, two basic methods for projecting scene radiance onto an image plane are used to create a physically correct image of a target embedded in a 3-D environment. Orthographic projection emulates what would be seen by an observer at distances large compared with any linear dimension in the target's environment.

Fig. 1. Processing procedure for the RT model. The sequence starting with (1) produces the modeled volume internal structure. Sequence (2) generates upper-boundary conditions. Sequence (3) provides surface feature information.

Hence all calculated radiances point in a single direction (toward the distant observer/sensor). The image plane in this instance is simply the geometric projection of the target's environment onto a plane that is perpendicular to the look direction. The second rendering method views the target scene in a perspective projection, in which the sensor is close enough to the scene that the radiance from different spatial elements converges on the sensor from significantly different directions. The sensor itself could be modeled as a camera, with an objective lens situated at the sensor's exact position. A focal length and an image-plane size can then be specified to delineate the solid angle imaged by the sensor. A schematic diagram of the ray-trace geometry for perspective-projection image generation is shown in Fig. 2. The two image-projection methods were implemented in independent computer codes. The two code versions use the same input radiance and aerosol-concentration data files. Both codes can treat laterally periodic boundary conditions, in which the lateral diffuse and direct radiances exiting one volume tile enter the volume of an adjacent tile. Both codes have been configured to produce either monochromatic (256-level gray scale) or color (RGB) image outputs. Output files for each version of the orthographic and perspective projection codes have standard RGB formats to simplify subsequent display on the Silicon Graphics equipment used in this study.

2. Representation of Aerosol Properties

Fig. 2. Schematic illustration of perspective-projection geometry.

To generate aerosol optical propagation data, the LWC data produced by CSSM are coupled to the Khrgian–Mazin (K–M) PSD parameterizations contained in CLTRAN. The K–M PSD17,18 is a modified gamma distribution,

n(a) = A a² exp(−Ba),    (1)

where a is the particle radius. Here n(a) is related to the number density N (particles cm⁻³) and the mode radius ā through parameters A and B as

N = ∫₀^∞ n(a) da = 2A/B³,    ā = (1/N) ∫₀^∞ a n(a) da = 3/B.    (2)

N is related to the LWC (w_L, in g m⁻³ of liquid water droplets per volume of air) by means of the bulk density of water (ρ_w, in g cm⁻³, with ā in cm) and ā as

N = 1.07 × 10⁻⁷ w_L/(ρ_w ā³).    (3)
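Taken together, Eqs. (1)-(3) determine the full PSD once the LWC and mode radius are fixed. The following is a minimal sketch, assuming radii in cm and ρ_w in g cm⁻³ (the unit choice under which the 1.07 × 10⁻⁷ coefficient is dimensionally consistent); function and variable names are illustrative, not from the paper:

```python
import math

# Recover the K-M PSD parameters A and B from an LWC value w_L and a
# mode radius a_bar (Eqs. 1-3), then evaluate n(a).
def km_psd(w_L, a_bar_um, rho_w=1.0):
    """w_L in g m^-3, a_bar_um in micrometers, rho_w in g cm^-3.
    Returns (N, A, B), with radii expressed in cm."""
    a_bar = a_bar_um * 1e-4                    # mode radius in cm
    N = 1.07e-7 * w_L / (rho_w * a_bar ** 3)   # Eq. (3): number density, cm^-3
    B = 3.0 / a_bar                            # Eq. (2): a_bar = 3/B
    A = N * B ** 3 / 2.0                       # Eq. (2): N = 2A/B^3
    return N, A, B

def n_of_a(a_cm, A, B):
    """Eq. (1): modified gamma distribution n(a) = A a^2 exp(-B a)."""
    return A * a_cm ** 2 * math.exp(-B * a_cm)
```

For w_L = 1 g m⁻³ and ā = 10 μm this gives N ≈ 107 droplets cm⁻³, a plausible cloud droplet concentration, and numerically integrating n(a) recovers N, confirming the consistency of Eqs. (2) and (3).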

Thus, given LWC from CSSM, the K–M model requires only an estimate of ā to produce a PSD. Low and O'Brien16 parameterized mode radius as a function of the height above cloud base for different cloud types. These curves are modeled as linear with a maximum-value cutoff (Table 1) by use of the formula

ā = min(A₀ + A₁H, A_M),    (4)

with H the cell height above cloud base in kilometers. Although other models of PSD for various cloud types exist, the K–M approach allowed for a solution

Fig. 3. Phase functions for K–M PSD aerosols for 0.55-μm radiation, with varying mode radius. The inset shows the forward peak region.

based on available data. In the future, given augmented input data, we expect to improve this approach. To exploit the relationship between the CSSM output and the K–M family of PSD's, the associated scattering properties for this family must be evaluated: extinction coefficient (σ, in km⁻¹), single-scattering albedo (ω), and PFN's representing the probability distributions of angular scatter [P(μ), in inverse steradians, where μ is the cosine of the scattering angle]. A Mie scattering model developed to evaluate these properties19 was used. Twenty K–M PSD's were evaluated with ā ranging from 1.5 to 30 μm in increments of 1.5 μm. PFN results were evaluated at nine wavelengths spanning the visible band (0.35–0.75 μm in increments of 0.05 μm) and at 65 scattering angles primarily concentrated in the forward-scattering region. Each calculation was based on w_L = 1 g m⁻³. Results at 0.55 μm are shown in Fig. 3.

3. Implementation of Rendering Method

A. Data Preparation

The DOM RT model described in Ref. 1 produces radiance output for each spectral band in 24 directions for each cell face of a 3-D spatial array of cubical cells. A processing code that performs two major functions then conditions these data for subsequent manipulation. First, radiance results produced by the DOM model are used to evaluate the limiting path radiance L_s in a sequence of 42 sample directions Ω. The continuous-form relation is given as

Table 1. Mode Radius Vertical Structure for Various Cloud Types

Cloud Type           A₀ (μm)   A₁ (μm/km)   A_M (μm)
Stratus                2.0         16          13.5
Altostratus            2.0          5           6.5
Nimbostratus           2.5          9          10.5
Stratocumulus          1.5         16          12.5
Cumulus Humilis        1.5          9           6.5
Cumulus Congestus      2.0         13          30.0


L_s(r̄, Ω) = ω ∫_{4π} I(r̄, Ω′) P(r̄, Ω, Ω′) dΩ′,    (5)

Fig. 4. G-band limiting-path radiance for a horizontal layer of an aerosol cloud field. Forward-scattering direction (left) and backscattering direction (right).

where r̄ is a position within the scattering medium, Ω′ is the direction of incident radiance I, ω is the aerosol single-scattering albedo at position r̄, and P(r̄, Ω, Ω′) is the phase function for the scattering medium for incident direction Ω′. The phase function conforms to the normalization

∫_{4π} P(r̄, Ω, Ω′) dΩ′ = 1.    (6)
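The scattering integral of Eq. (5), subject to the normalization of Eq. (6), is evaluated in practice by quadrature over a finite set of stream directions. The following is a minimal sketch, with a toy four-stream quadrature and an isotropic phase function standing in for the paper's 42 streams and Mie-derived PFN's; names are illustrative:

```python
import math

# Quadrature of Eq. (5): four equal streams of solid angle pi sr each
# (4 x pi = 4pi sr total) and an isotropic phase function P = 1/(4pi).
def scatter_source(omega, I_incident, P, d_omega):
    """omega: single-scattering albedo; I_incident[j]: radiance in stream j;
    P[j]: phase-function value toward the output direction (sr^-1);
    d_omega[j]: solid angle of stream j (sr)."""
    norm = sum(p * dw for p, dw in zip(P, d_omega))
    assert abs(norm - 1.0) < 1e-9          # Eq. (6) in discrete form
    return omega * sum(i * p * dw
                       for i, p, dw in zip(I_incident, P, d_omega))

d_omega = [math.pi] * 4
P_iso = [1.0 / (4.0 * math.pi)] * 4
Ls = scatter_source(0.9, [1.0, 2.0, 3.0, 4.0], P_iso, d_omega)
```

For the isotropic case the result reduces to ω times the mean incident radiance, here 0.9 × 2.5 = 2.25, which is a useful sanity check on any stream quadrature.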

The numerical output from the DOM model is actually treated with discrete forms of these relations (at each cell center position):

L_{s,i} = ω Σ_{j=1}^{42} Ī_j P_{ij} ΔΩ′_j,    (7)

Σ_{j=1}^{42} P_{ij} ΔΩ′_j = 1,    (8)

where the subscripts i and j denote the output and input stream directions, respectively; ΔΩ′_j is the solid angle subtended by stream j; and Ī_j is the unscattered cell-averaged diffuse radiance flowing in direction Ω′_j. Figure 4 shows examples of limiting path radiance L_s for the fourth horizontal layer above the ground in a 32 × 32 × 16 DOM output grid (or 8 × 8 × 4 km in external dimensions), for a plane-parallel solar-illumination source and downwelling diffuse radiation (sky background) computed by MODTRAN14 at the upper boundary of the study volume. The view in the figure looks downward, such that the +Y (north) axis points upward on the page and +X (east) points to the right-hand side. With the convention that azimuth is measured counterclockwise from +X, the incoming plane-wave vector points in a direction that has a zenith angle of 148° and an azimuth of 163°. The output direction for each cell in the grid on the left-hand side of the figure is along the incoming plane-wave direction (forward scattering); the right-hand side shows results for the backscattering direction. The results are weighted for the NTSC G band, are normalized to the same gray scale, and show dark regions where L_s is small, owing to self-shadowing by the cloud aerosol. Lighter regions correspond to volumes that are strongly illuminated by direct sunlight at cloud edges or in clear air. The second conditioning operation performed by the data-processing code is designed to reduce storage requirements for the limiting-path-radiance data

Fig. 5. Comparison of the finely gridded concentration produced by the CSSM with the coarse grid produced by the DOM model.

derived above. This method (which is analogous to some commonly used graphics-compression techniques20) uses templates or bins that classify the angular pattern of the normalized limiting path radiance into a restricted catalog of shapes. Classes or bins within the shape catalog are distinguished by variances from the mean for both individual streams and composite stream patterns. The template catalog is expanded by subdivision until either zero variances are achieved in all classes or the number of templates reaches a set maximum value. With this approach, an individual cell's 42-stream limiting-path-radiance pattern for each of the RGB color bands is characterized by a class number and a maximum limiting-path-radiance value. Some accuracy is lost by the binning procedure, but it is estimated that this amounts to no more than 10% for any given stream. Compression factors achieved by this method are in excess of 10–20 for normal data sets.

B. Grid Preparation

The visual rendering of a 3-D cloud scene is performed once the limiting-path-radiance and aerosol-concentration data are determined. In general, it is assumed that the limiting-path radiance displays more gradual spatial variation than does the aerosol concentration (or optical density).21 This assumption derives from the observation [as can be seen in Eqs. (5) and (7)] that the limiting path radiance is integrated over all solid angles. As a consequence, in normal scenarios, the concentration data are specified on both fine and coarse spatial grids. Cloud-concentration models such as the U.S. Air Force CSSM15 are employed to produce the fine-spatial-resolution grid, which may require considerable storage for even small volumes. The coarse spatial grid is used by the DOM code in its determination of limiting path radiances at each grid point and permits reasonable execution times for large spatial volumes. Coarse-grid concentrations are derived from the fine grid by nearest-neighbor interpolation about each coarse grid point, with each neighbor's contribution weighted with a normalized Gaussian function of distance. Figure 5 compares the CSSM fine and interpolated coarse grids for horizontal layers that are near 900 m above ground level.
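The Gaussian-weighted nearest-neighbor averaging just described can be sketched in one dimension; the weighting width sigma is an assumed parameter (the paper does not specify it), and names are illustrative:

```python
import math

# Coarse-grid concentration from the fine grid: each coarse value is a
# normalized Gaussian-weighted average of fine-grid cells about the
# coarse cell center (1-D analogue of the 3-D interpolation described).
def coarse_from_fine(fine, fine_dx, coarse_dx, sigma=None):
    sigma = sigma if sigma is not None else coarse_dx / 2.0
    n_coarse = round(len(fine) * fine_dx / coarse_dx)
    coarse = []
    for i in range(n_coarse):
        xc = (i + 0.5) * coarse_dx                 # coarse cell center
        w_sum = c_sum = 0.0
        for j, c in enumerate(fine):
            xf = (j + 0.5) * fine_dx               # fine cell center
            w = math.exp(-0.5 * ((xf - xc) / sigma) ** 2)
            w_sum += w
            c_sum += w * c
        coarse.append(c_sum / w_sum)               # normalized weighting
    return coarse
```

With the grid sizes of Fig. 5 (62.5-m fine cells, 250-m coarse cells), each coarse value draws mainly on the nearest few fine cells; a uniform fine field maps to the same uniform coarse field, and a step in concentration is smoothed rather than aliased.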


Fig. 6. Comparison of the radiance maps of limiting path radiance for a wide-angle sensor; side scattering (left-hand side) and near backscattering (right-hand side).
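The wide-angle sensor used for Fig. 6 is an ideal pinhole with a specified focal length and square image plane; under that assumption, the per-pixel view direction and the corner-to-corner field of view follow directly. A sketch with illustrative names:

```python
import math

# Ideal pinhole sensor: focal length f_mm and square image plane of
# width w_mm. pixel_ray returns the unit view direction (camera frame,
# +z along the optical axis) for pixel (ix, iy) on an n x n image.
def pixel_ray(ix, iy, n_pix, f_mm, w_mm):
    x = ((ix + 0.5) / n_pix - 0.5) * w_mm      # image-plane offsets
    y = ((iy + 0.5) / n_pix - 0.5) * w_mm
    r = math.sqrt(x * x + y * y + f_mm * f_mm)
    return (x / r, y / r, f_mm / r)

def diagonal_fov_deg(f_mm, w_mm):
    """Angle subtended by opposite image-plane corners."""
    half_diag = w_mm * math.sqrt(2.0) / 2.0
    return 2.0 * math.degrees(math.atan(half_diag / f_mm))
```

For the 10-mm focal length and 24-mm square image plane used in the text, diagonal_fov_deg(10.0, 24.0) gives approximately 119°, consistent with the corner-to-corner angle quoted for Fig. 6.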

The fine-grid cubical cell edge length is 62.5 m, and the coarse-grid cell size is set at 250 m in the figure. Before the path integration is performed, and if the scene is to be orthographically rendered for a particular observer/sensor look direction, the 42-direction limiting-path-radiance data at each point in the coarse grid are interpolated at zenith and azimuth angles with cubic splines to obtain the limiting-path radiance for that single direction. Perspective rendering of the laterally periodic space will cause the evaluation direction for the limiting-path radiance to vary with position in the image plane. Thus, in this latter case, the limiting-path radiance is calculated (and stored) for five exit directions that correspond to the center and four corners of the viewing window, at each point in the coarse grid. As is shown below, these five values are useful for angular interpolation at a later stage of the perspective-rendering process. Figure 6 shows gray-scale maps of L_s for a physical situation identical to that used for Fig. 4. The scene is viewed by a sensor with a 10-mm focal length and a 24-mm square image plane, so that opposite corners in the image plane subtend an angle of approximately 119°. The sensor looks straight down. In this case, the right-hand side of the figure shows an L_s output map of the same layer as in Fig. 4, for the viewport corner direction nearest to the solar backscatter direction (with a scattering angle of 146.5° relative to the incident solar direct radiation). The left-hand side of the figure shows the L_s map for the opposite corner of the viewport, which is in the side-scattering domain (scattering angle of 91.6°). The same gray-scale normalization is used for both scattering directions and shows the expected characteristics that the side-scattered L_s is weaker and displays less contrast than the backscattered result.

C. Line-of-Sight Ray-Trace Rendering

For both image-projection methods, we generated each pixel by first determining the pixel position in the image plane and the direction from which scene radiance impinges on it. The integrated path and attenuated background radiances seen by each pixel are calculated by backward integration from the pixel

Fig. 7. Schematic diagram showing variable-integration step size keyed to concentration gradient.

position, in the direction opposite the direction of incidence. The discrete summation terminates when the LOS encounters a background surface, a large range limit is exceeded, or the total optical depth of the path exceeds a large fixed value (and hence no further significant path-radiance or transmission contributions can be seen). In the examples given in this paper we selected a maximum optical-depth cutoff of 5. The initial (and maximum) path-integration step size is chosen to be comparable to the width of a fine-grid (concentration) cell. Before the visualization code is executed, another utility checks each cell in the concentration array to determine whether any of the cell's 26 nearest neighbors have a density much larger or smaller than its own. If such a gradient is identified, the cell is flagged with an integer whose magnitude corresponds to that of the largest observed gradient. Later, during the LOS radiance-integration phase of visualization, the gradient index is checked at the end of each step. If a large gradient is indicated, the step size is reduced in proportion to the gradient's magnitude, as shown schematically in Fig. 7 with the smaller steps in the vicinity of the darker (more dense) regions. In this way, LOS path segments that are close to cloud edges are more finely divided than are segments in uniformly thin or thick regions. Fine spatial details at cloud edges or in small puffs are thus registered, but not with such abruptness that the cloud boundaries appear excessively jagged. After the position, pointing direction, and characteristics of the viewing sensor are specified, each pixel in the image plane is examined to calculate an appropriate integrated path radiance. In discrete form, the integration proceeds as a path summation with variable step increments as described above. The assumption that the limiting-path radiance L_s is constant over each path segment allows the equation of transfer to be integrated in a given wave band from

the sensor to some point on the background, yielding the result

L_p = Σ_{m=1}^{N} T_{m−1} (1 − T_m) L_{s,m} + L_b Π_{m=1}^{N} T_m,    (9)

where L_p is the total integrated path radiance, T_{m−1} is the total transmission between the observer and the beginning of path segment m, T_m is the transmission of LOS segment m, and L_b is the background radiance.

4. Visualization Model Performance
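The renderings in this section all accumulate Eq. (9) along each pixel's LOS. A minimal sketch of that summation, assuming per-segment optical depths as input and using the optical-depth cutoff of 5 mentioned above; names are illustrative:

```python
import math

# Backward LOS accumulation of Eq. (9): per-segment optical depths
# tau_m and limiting path radiances Ls_m, plus a background radiance
# Lb. The segment transmission is T_m = exp(-tau_m).
def integrate_los(taus, Ls, Lb, tau_cutoff=5.0):
    Lp = 0.0
    T_cum = 1.0        # transmission from sensor to start of segment m
    tau_total = 0.0
    for tau_m, Ls_m in zip(taus, Ls):
        T_m = math.exp(-tau_m)
        Lp += T_cum * (1.0 - T_m) * Ls_m    # scattered path contribution
        T_cum *= T_m
        tau_total += tau_m
        if tau_total > tau_cutoff:          # path opaque: background unseen
            return Lp
    return Lp + Lb * T_cum                  # attenuated background term
```

In clear air (all tau_m near zero) the result reduces to the background radiance L_b, while for an optically thick path it saturates at the local limiting path radiance, which is the behavior Eq. (9) is designed to capture.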

The visualization approach discussed above was tested by creating a sample partially cloudy situation with the CSSM package. The 8 km × 8 km × 4 km base tile was populated at middle levels by a 30% cumuliform cloud cover and at high levels by a 25% stratiform cover. The resulting LWC concentration field was then combined with a vertically structured background aerosol with a surface visibility of 40 km. To model this volume, we created a 128 × 128 × 64 cell extinction-coefficient field. Multispectral Landsat images of an area near Las Cruces, New Mexico, were chosen to model the volume base. The Landsat images have a resolution of approximately 30 m and were used to estimate relative surface albedos in the three spectral channels that had the closest correspondence to the NTSC RGB bands. The average surface albedo in the visible wavelength region was set to 0.14. As in the examples cited above, the direct beam-illumination source was the Sun, placed at a zenith angle of 32° and an azimuth of 217° (measured by our south-of-east convention). A diffuse blue-sky background radiance source was calculated with MODTRAN. The extinction-coefficient field was regridded to a 32 × 32 × 16 cell array for use in RT calculations. The DOM model described in part I was then employed to calculate the multiply scattered diffuse radiance at each cell in the base tile, as well as the surface irradiance. The DOM calculations were performed for seven wave bands in the visible, positioned such that the dual-peak R and single-peak G and B bands were adequately represented.

A. Orthographic Viewing Examples

The 8-km square patch of surface was rendered under orthographic projection for a view looking straight up from the surface and straight down from above the top of the situation volume. Figures 8 and 9 show the respective results for the scaled Landsat surface albedo map. In both illustrations, the center of the figure is at the coordinates X = +1 km, Y = +1 km, measured relative to the center of the base tile.

Fig. 8. Orthographic view looking up; the 90° azimuth is toward the bottom of the page.

Fig. 9. Orthographic view looking down; the 90° azimuth is toward the top of the page.

B. Perspective Viewing Examples

With the sensor focal length set at 15 mm and the square image-plane width set at 24 mm, perspective projection yielded the images shown in Figs. 10, 11, 12, and 13 for an uplooking path, a downlooking path, and two near-horizontal slant paths, respectively.

Fig. 10. Upward-looking slant path in perspective view; the center zenith angle is 45° and the azimuth angle is 80°.

Fig. 11. Downward-looking slant path in perspective view; the center zenith angle is 140° and the azimuth angle is 180°.

Fig. 12. Lateral view in perspective; the center zenith angle is 70° and the azimuth angle is 0°.

Fig. 13. Perspective view slant path; the zenith angle is 70° and the azimuth angle is 97°.

C. Spatial-Resolution Considerations

Close examination of images rendered with the technique described above reveals anomalies consistent with the granularity of the RT and LOS calculations. In certain viewing geometries, cell edges are apparent in the integrated radiance field, especially at cloud edges or at cloud surfaces viewed under backscattering conditions. In part, such problems arise from rapid concentration transitions at cloud boundaries for the low resolution of even the fine output grid. When this situation is combined with the speed-mandated practice of interpolation over only the eight closest grid points to a given LOS segment location, seams in the radiance field appear when the LOS passes between regions bounded by different sets of nearest neighbors. Another problem is observed when the sensor looks in near-forward-scattering directions (toward the Sun). Some enhanced radiance is expected at the cloud edges (silver lining), but rendered images do not show this phenomenon with the anticipated contrast or continuity around the cloud edge. These effects would be visible if the DOM were run with a higher-order stream model. Unfortunately, current limitations of computer memory storage and computation time present practical constraints. The phase-function delta-correction procedure removes some energy from small angles about the forward-scattering direction. The near-forward-scattered radiance predicted by the model will thus be lower relative to off-axis radiances than is actually the case. Also, the relatively strong forward peak in the corrected phase function will cause a large gradient in the limiting path radiance near the cloud surface for near-forward-scattering directions. Under these conditions, the validity of combining a coarse radiance grid with a fine-optical-density grid becomes problematic. This shortcoming is reflected by discontinuities in the near-forward radiance scattered along the projected perimeters of unshadowed clouds.

5. Conclusions

The visualization approach outlined above provides multiband results that have superior physical realism to those that simply render visually plausible images of liquid-water content (LWC) or aerosol concentration. The technique uses results from a three-dimensional (3-D) radiative transfer (RT) model for multiple scattering in an inhomogeneous volume and so is also superior to visualizations that rely on single-scattering, plane-parallel geometry or on homogeneous finite-volume transport models. The model thus provides a basis for more accurate estimates of target contrast for objects of arbitrary shape, reflectivity, and location within the examined volume. The accuracy of simulation for the response of autoranging sensors is improved with the above image-generation approach, particularly for those instances in which the sensor has a large angle of acceptance. Large variations of scattering efficiency over the field of view are accommodated, as are shadows on the ground or on the clouds themselves. Especially important is the ability of the method to render surface diffuse radiance levels properly inside shadows under partly cloudy conditions. Sensors with a restricted dynamic range may not be capable of imaging the entirety of individual targets or groups of targets if they span cloud-shadow boundaries. The extent of this limitation (which cannot be easily estimated by simplified-geometry RT models) is readily determined from the above visualization method. The coupling of high-resolution concentration data to low-resolution limiting-path-radiance data appears to be justified if speed of execution or the amount of available storage in the rendering process is an important consideration. Results are reasonably accurate and are obtained in practical amounts of time, although near-real-time image generation would require extensive integration of the LOS calculations and the extinction and radiance databases into the hardware.
Although it is not fast, the visualization method described above can provide a benchmark with which more rapid and approximate approaches can be compared. The speed of the visualization could be considerably improved by its implementation on parallel-processing machines. Individual pixel radiances are essentially independent, so groups of pixels could be parceled off to individual processors. To keep individual processor cache sizes at practical levels, one would need to use the visualization software to allocate specific zones in the volume of each cache. This allocation would need to be dynamic, because, under periodic boundary conditions, most of the

study volume might be traversed when the LOS is not parallel to any coordinate axis. One problem observed during the development of the visualization software was the occasional blocky or terraced appearance of cloud surfaces, particularly when they are viewed tangentially or under strong direct illumination. As noted above, this difficulty was apparently caused by abrupt changes in aerosol concentration near the cloud surface rather than by changes of limiting-path radiance. Speed-of-execution considerations precluded the use of more nearby array cells in the interpolation of a given LOS integration point, so that the tactics of gradient-keyed variable step size, convolution of the concentration array with a scanning function, and interpolation over the logarithms of concentration values had to be used. These approaches do ameliorate the blockiness problem, but they do not eliminate it. A more permanent solution involving higher concentration-grid densities and interpolation over more than eight nearest-neighbor grid points awaits improvement in computer storage and speed capabilities over those currently available for this study.

The authors acknowledge the contributions of B. T. Davis and A. Miller, whose development of a modified Mie scattering code enabled the generation of scattering-property data for the aerosols studied.

References
1. D. H. Tofsted and S. G. O'Brien, "Physics-based visualization of dense natural clouds. I. Three-dimensional discrete ordinates radiative transfer," Appl. Opt. 37, 7718-7728 (1998).
2. L. Hembree, S. Brand, W. C. Mayse, M. Cianciolo, and B. Soderberg, "Incorporation of a cloud simulation into a flight mission rehearsal system: prototype demonstration," Bull. Am. Meteorol. Soc. 78(5), 815-822 (1997).
3. N. L. Max, "Light diffusion through clouds and haze," Comput. Vision Graph. Image Process. 33, 280-292 (1986).
4. Y. Kuga, A. Ishimaru, H.-W. Chang, and L. Tsang, "Comparisons between the small-angle approximation and the numerical solution for radiative transfer theory," Appl. Opt. 25, 3803-3805 (1986).
5. A. Zardecki and W. G. Tam, "Iterative method for treating multiple scattering in fogs," Can. J. Phys. 57, 1301-1308 (1979).
6. W. Baer, "New approach to earth surface modeling for real-time rendering perspective views," in Image Modeling, L. A. Ray and J. R. Sullivan, eds., Proc. SPIE 1904, 208-221 (1993).
7. S. A. W. Gerstl and A. Zardecki, "Coupled atmosphere/canopy model for remote sensing of plant reflectance features," Appl. Opt. 24, 94-103 (1985).
8. D. S. Kimes and J. A. Kirchner, "Radiative transfer model for heterogeneous 3-D scenes," Appl. Opt. 21, 4119-4129 (1982).
9. D. J. Diner, J. V. Martonchik, E. D. Danielson, and C. J. Breugge, "Application of 3-D radiative transfer theory to atmospheric correction of land surface images," in Proceedings of the IEEE Geoscience and Remote Sensing Society '88 Symposium (Institute of Electrical and Electronics Engineers, Piscataway, N.J., 1988), pp. 1215-1218.
10. D. H. Tofsted and S. G. O'Brien, "Characterizing the effects of natural clouds on scene simulations," in Targets and Backgrounds: Characterization and Representation III, W. R. Watkins and D. Clements, eds., Proc. SPIE 3062, 188-198 (1997).
11. S. G. O'Brien and D. H. Tofsted, "Visualization of dense cloud radiation data in modeling and simulations," in Visualization of Temporal and Spatial Data for Defense Applications, N. L. Faust, ed., Proc. SPIE 3085, 82-93 (1997).
12. E. P. Shettle and R. W. Fenn, "Models for the aerosols of the lower atmosphere and the effects of humidity variations on their optical properties," Rep. AFGL-TR-79-0124 (Air Force Geophysics Laboratory, Hanscom Air Force Base, Mass., 1979).
13. D. H. Tofsted, B. T. Davis, A. E. Wetmore, J. Fitzgerrel, R. C. Shirkey, and R. A. Sutherland, "EOSAEL 92 aerosol phase function data base PFNDAT," Rep. ARL-TR-273-9 (Army Research Laboratory, White Sands Missile Range, N.M., 1997).
14. A. Berk, L. S. Bernstein, and D. C. Robertson, "MODTRAN: a moderate resolution model for LOWTRAN 7," Rep. GL-TR-89-0122 (U.S. Air Force Geophysics Laboratory, Hanscom Air Force Base, Mass., 1988).
15. M. E. Cianciolo and R. G. Rasmussen, "Cloud scene simulation modeling, the enhanced model," Rep. PL-TR-92-2106 (Phillips Laboratory, Hanscom Air Force Base, Mass., 1992).
16. R. D. H. Low and S. G. O'Brien, "EOSAEL 87, cloud transmission module CLTRAN," Rep. TR-0221-9 (Atmospheric Sciences Laboratory, White Sands Missile Range, N.M., 1987), Vol. 9.
17. H. R. Pruppacher and J. D. Klett, Microphysics of Clouds and Precipitation (Reidel, Boston, 1980).
18. A. M. Borovikov, I. I. Gaivoronskii, E. G. Zak, V. V. Kostarev, I. P. Mazin, V. E. Minervin, A. Khrgian, and S. M. Shmeter, Cloud Physics (Fizika Oblakov), translated from the Russian by the Israel Program for Scientific Translations (Jerusalem, 1963); available from the Office of Technical Services, U.S. Dept. of Commerce, Washington, D.C.
19. A. Miller, "Mie code AGAUS 82," Rep. ASL-CR-83-0100-3 (U.S. Army Atmospheric Sciences Laboratory, White Sands Missile Range, N.M., 1983).
20. C. W. Therrien, Decision Estimation and Classification: An Introduction to Pattern Recognition and Related Topics (Wiley, New York, 1989).
21. D. W. Hoock, "Modeling time-dependent obscuration for simulated imaging of dust and smoke clouds," in Characterization, Propagation, and Simulation of Sources and Backgrounds, W. R. Watkins and D. Clement, eds., Proc. SPIE 1486, 164-175 (1991).
