

Thursday, April 05, 2007

CLIMATE CHANGE AND WATERSHED PLANNING IN WASHINGTON STATE

This paper draws on interviews with Washington State Watershed Planning Leads (Planning Leads) and interactions with local watershed planning units to identify factors that may influence the inclusion of climate change in watershed planning efforts in Washington State. These factors include the interest of individual planning unit members in climate change; Planning Lead familiarity with climate impacts; the influence of trust, leadership, and "genetic knowledge" on planning units; and perceptions of strategic gain. The research also identifies aspects of the planning process that may create opportunities for addressing climate impacts in future planning. These aspects include continuation of watershed planning units after plans are developed; commitment to updating watershed plans; recognition of climate impacts in planning documentation; dedicated incentive funding; and the availability of hydrologic modeling tools for assessing hydrologic impacts. Additional types of technical assistance that could support integration of climate impacts are also identified. It is hoped that the insight provided by this analysis will help individuals involved in stakeholder-based watershed planning recognize the various dynamics potentially affecting the inclusion of climate change in watershed planning and in doing so, contribute to the development of planning approaches and tools that will support local efforts to adapt to climate impacts.

(KEY TERMS: climate change; watershed planning; Washington State; Pacific Northwest; watershed management.)

INTRODUCTION

Climate change is expected to exacerbate water resource challenges in the Pacific Northwest (PNW) by altering the type and timing of precipitation throughout the region. These changes, which are projected to result in increased winter precipitation, reduced spring snowpack, and reduced summer streamflows, have the potential to disrupt natural systems, local economies, and community lifestyles (see Mote et al., 1999, 2003). Early recognition and assessment of potential climate impacts at a local level give communities time to develop the capacity to adapt to climate impacts, potentially reducing disruptive effects.

Washington State's Watershed Planning Program provides an important opportunity to begin addressing the potential impacts of climate change on PNW water resources at a watershed scale. The long-term perspective required by the program, the number of watershed planning issues potentially affected by climate change (Figure 1), and the potential for binding agreements on water supplies, infrastructure, and instream flow targets point to the importance of recognizing climate impacts in these planning efforts. Conversations with state watershed planning staff indicated, however, that information on climate impacts was not readily finding its way into the watershed planning process.

The purpose of this research is to gain insight into the challenges and opportunities for integrating climate change information into watershed planning in Washington State. More specifically, the research

* Identifies the extent to which climate impacts were being discussed and incorporated into watershed planning conducted under the Watershed Planning Program prior to 2003.

* Considers factors that appear to influence the inclusion of climate impacts in watershed planning efforts prior to 2003.

* Considers key aspects of the watershed planning process that may create opportunities for integrating climate impacts in future watershed planning efforts.

* Identifies additional types of support that may facilitate including climate change impacts in watershed plans.

The analysis draws on information gathered through interviews and meetings with Washington State Department of Ecology watershed planning staff in 2002 and 2003, as well as presentations and other informal exchanges with Washington watershed planning groups. It is hoped that the information presented will help researchers, agency staff, and other individuals involved in stakeholder-based watershed planning recognize the various dynamics potentially affecting the inclusion of climate change in watershed planning and thus contribute to the development of planning approaches and tools designed to support adapting to climate impacts at the watershed scale.

CLIMATE CHANGE IMPACTS IN THE PACIFIC NORTHWEST

Several studies have evaluated the impacts of climate change on PNW climate, hydrology, and water management (see Hamlet and Lettenmaier, 1999; Lettenmaier et al., 1999; Leung and Wigmosta, 1999; Mote et al., 1999, 2003; Miles et al., 2000; Payne et al., 2004). A key concern identified in these studies is the impact of climate change on regional snowpack and streamflow. Most PNW watersheds are highly dependent on the accumulation of winter snowpack for meeting summer (April-September) water supply needs. This reliance is shaped by the seasonality of PNW precipitation, given that approximately two-thirds of the PNW's annual precipitation falls between October and March, and the low storage capacity of many of the region's reservoirs in relation to total annual flow. The dependence on winter snowpack can leave water supplies sensitive to climatic variations and changes affecting snowpack and streamflow timing.

INFLUENTIAL LEADERSHIP

Success in reaching sales goals can depend on a leader's reaction to employees' questions

Editor's note:

This new column, to appear periodically, will offer guidance on a wide variety of management issues affecting top-line and bottom-line growth. Demmie Hicks, president and CEO of DBH Consulting, whose work with independent agencies was profiled in our March 2006 issue, will direct the column. This month the column is written by two of DBH's consultants.

What are the various ways in which you serve as a leader in your organization and in your private life? Leadership is not something that only a few at the top engage in; individuals at any level of an organization can assume such a role. When you don't wait to be told what to do, but think about what needs to be done; when you think outside the box and influence others to do the same; when you think creatively and bring others on board with your vision of how something should be done; when you look into the future and think about possibilities as opposed to obstacles, and inspire others to see the future in a similar manner, you are engaging in leadership no matter where you are in the organizational chart hierarchy.

So, do you see yourself as the leader that you have the potential to be? Have you embraced and given voice to the leader within you? The task is hardly to become a leader; the task is to learn to bring out the leader within you.

Leadership is often defined as the art and practice of achieving desired results through others. What are those qualities that make a leader an influential leader?

To begin with, let's focus on the word influence. According to the Merriam-Webster dictionary, influence has to do with "the act or power of producing an effect without apparent exertion of force or direct exercise of command; the power or capacity of causing an effect in indirect or intangible ways." Influential leadership, then, is the type of leadership that relies on influence as opposed to coercion. It is the type of leadership that creates followers who want to follow as opposed to followers who believe that they have to follow.

The DBH Consulting Influential Leadership Model breaks down the components of influential leadership so that it can be utilized by everyone who is interested in being a more effective leader.

The base of the model stresses that a leader needs to possess content knowledge and expertise. Such expertise may include knowledge of the business and industry, mergers and acquisitions, management discipline, and function. Followers would be hard pressed to follow a leader whose technical expertise and knowledge they don't trust.

A leader also needs to possess complementary skills. Fundamental among them are the skills of delegating, conflict resolution, energy awareness, system thinking, self-care, management of diversity, organizational culture awareness, stakeholder balance, collaboration, process thinking, and timing. While knowledge, expertise, and skills are fundamental necessities for any leader, the heart of influential leadership comes from the leader's core values, such as authenticity, integrity and service.

Influential leaders believe in the value of authenticity. That is, they are who they are; what they do is reflective of their personality and character. In his Authentic Leadership, author Bill George, the former chairman and CEO of Medtronic, says that "authentic leaders genuinely desire to serve others through their leadership. They are more interested in empowering the people they lead to make a difference than they are in power, money, or prestige for themselves. They are as guided by qualities of the heart, by passion and compassion, as they are by qualities of the mind."

Another core value of influential leaders is integrity; that is, being consistently honest, forthright, and ethical; doing what they say and saying what they do. They walk their talk. Followers need to be able to trust the leader, and without that trust, influence is impossible.

Influential leaders believe in humility; they are willing to acknowledge that they don't know everything; they are open to learning from others.

Influential leaders also hold service at a high value. They want to be of value to others, contributing to the benefits of others, whether it is their employees, their business, their industry, their family, or their peers.

Leadership skills

Along with the above values, influential leaders develop a set of fundamental influencing skills that appear to be deceptively simple. Those skills include listening, reflecting, dialoguing, modeling, and use of self.

There is an art to listening and engaging in dialogue. Think of the last time someone challenged you in a conversation. Were you truly listening? We often listen only half-heartedly. For example, if you have a tendency to formulate your rebuttal while someone is talking to you, you are not fully listening. Many problems in business occur because of poor communication, and good listening is the first step toward better communication.

STEP WISE, MULTIPLE OBJECTIVE CALIBRATION OF A HYDROLOGIC MODEL FOR A SNOWMELT DOMINATED BASIN

The ability to apply a hydrologic model to large numbers of basins for forecasting purposes requires a quick and effective calibration strategy. This paper presents a step wise, multiple objective, automated procedure for hydrologic model calibration. This procedure includes the sequential calibration of a model's simulation of solar radiation (SR), potential evapotranspiration (PET), water balance, and daily runoff. The procedure uses the Shuffled Complex Evolution global search algorithm to calibrate the U.S. Geological Survey's Precipitation Runoff Modeling System in the Yampa River basin of Colorado. This process assures that intermediate states of the model (SR and PET on a monthly mean basis), as well as the water balance and components of the daily hydrograph are simulated consistently with measured values.

(KEY TERMS: Precipitation Runoff Modeling System; Shuffled Complex Evolution; Colorado; optimization; solar radiation; potential evapotranspiration; water balance; runoff.)


INTRODUCTION

Runoff from winter snowpack is the main supply of water in the intermountain western United States (U.S.). The NOAA National Weather Service (NWS) and the USDA Natural Resources Conservation Service (NRCS) issue runoff forecasts for the western U.S. Both of these agencies are attempting to modernize their runoff forecasting tools by incorporating more spatially distributed hydrologic modeling techniques (e.g., Spatially Distributed Hydrologic Modeling, USDA-NRCS, 1998; Carter, 2005). To this end, the NRCS is currently configuring a version of the U.S. Geological Survey's (USGS) Precipitation Runoff Modeling System (PRMS) for 35 snowmelt dominated basins by 2006, with possible extension to the entire western U.S. (K. Rojas and F. Geber, NRCS, personal communication, May 2005). PRMS is a distributed parameter, physically based hydrologic model. The ability to apply a distributed hydrologic model to a large number of basins in a timely and efficient manner for runoff forecasting purposes requires a quick and effective calibration strategy.

Traditional approaches to calibration and evaluation of distributed hydrologic models have compared observed and simulated runoff at the outlet of the basin. This traditional approach is not sufficient by itself for the evaluation of distributed hydrologic models (Refsgaard, 1997). While incorporation of spatial data into the calibration and evaluation process is ideal, research in this area has occurred mainly in heavily instrumented research basins (Refsgaard, 2000). In general, the data available for calibration and evaluation of distributed hydrologic models are limited for the basins in which NOAA and NRCS are forecasting runoff.

Gupta et al. (1998) argued that hydrologic model calibration must consider the multiple objective nature of the problem. The use of multiple objective functions in the calibration of hydrologic models has become increasingly popular. For example, Hogue et al. (2000) examined recessions and low flows, higher flows, and base flows; Turcotte et al. (2000) examined droughts, annual and monthly flow volumes, high flows, high flow synchronization, and snowmelt runoff; Madsen (2000) examined the water balance, hydrograph shape, peak flows, and low flows; and Boyle et al. (2000, 2003) examined three components of the hydrograph described as driven, nondriven quick, and nondriven slow. While these studies used multiple objectives, the only data used were runoff; different portions and time steps of the hydrograph were configured for the multiple objective calibrations. As a result, intermediate variables computed by the hydrologic model (such as solar radiation, potential evapotranspiration, snow water equivalent, snow covered area, and soil moisture) could be characterized by parameter values that do not replicate those hydrological processes in the physical system.

In this paper, a multiple objective calibration strategy that incorporates additional, easily obtainable data sets is presented. Four variables simulated by PRMS are used as calibration data sets: (1) solar radiation (SR), (2) potential evapotranspiration (PET), (3) water balance, and (4) daily runoff components. The SR and PET data sets are monthly mean values derived from nationwide data sets, making them readily accessible for application in a large number of basins. The parameters influencing each of the model variables are calibrated in a step wise, multiple objective procedure similar to that presented by Hogue et al. (2000). This process gives the user higher confidence in the model output by assuring that intermediate states of the model (as described by monthly mean values of SR and PET), as well as the water balance and components of the daily hydrograph, are simulated consistently with measured values.
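The step-wise idea can be sketched in miniature: split the parameters into groups, then calibrate each group in sequence against its own objective while the previously calibrated groups stay fixed. The toy model, parameter names, and the simple random search below are illustrative stand-ins only; they are not PRMS and not the Shuffled Complex Evolution algorithm.

```python
import random

def toy_model(params):
    # Stand-in model: each output depends mainly on one parameter group,
    # with downstream outputs depending on upstream ones (as SR feeds PET,
    # and both feed runoff in the procedure described above).
    sr = params["sr_coef"] * 10.0            # "solar radiation"
    pet = params["pet_coef"] * sr * 0.5      # "potential evapotranspiration"
    runoff = params["ro_coef"] * (sr - pet)  # "daily runoff"
    return {"sr": sr, "pet": pet, "runoff": runoff}

# Hypothetical "measured" values for each calibration data set.
OBSERVED = {"sr": 12.0, "pet": 4.0, "runoff": 6.0}

def calibrate_group(params, name, key, bounds, n_iter=2000, seed=0):
    """Calibrate one parameter against one objective (sum of squared error),
    holding all other parameters fixed. A simple random search stands in
    for the global search algorithm."""
    rng = random.Random(seed)
    best_val, best_err = params[name], float("inf")
    for _ in range(n_iter):
        trial = dict(params)
        trial[name] = rng.uniform(*bounds)
        err = (toy_model(trial)[key] - OBSERVED[key]) ** 2
        if err < best_err:
            best_err, best_val = err, trial[name]
    params[name] = best_val
    return params

params = {"sr_coef": 1.0, "pet_coef": 1.0, "ro_coef": 1.0}
# Step-wise sequence mirroring the paper: SR first, then PET, then runoff.
for name, key in [("sr_coef", "sr"), ("pet_coef", "pet"), ("ro_coef", "runoff")]:
    params = calibrate_group(params, name, key, bounds=(0.1, 3.0))

sim = toy_model(params)
```

Because each step freezes the upstream parameters, the intermediate states (SR and PET here) end up consistent with their own measurements rather than being distorted to compensate for errors elsewhere, which is the point of the procedure.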

STUDY AREA

The Yampa River basin at Steamboat Springs (USGS streamflow gaging Station 09239500) (USGS 2005a) in northwestern Colorado was chosen as the study area (see Figure 1). The Yampa River basin is a mountainous basin where the runoff is strongly dependent on snowmelt, peaking during May. The basin is 1,430 km² in area and ranges in elevation from 2,000 to 3,800 meters. Figure 2 shows monthly means of daily basin precipitation and maximum and minimum temperature for two eight-year periods: Water Years (WYs) 1996-2003 and 1988-1995. WYs run from October through September. These two eight-year periods were chosen as the calibration and evaluation periods, respectively. The wettest month for the Yampa River basin is February and the driest month is June. The warmest months are July and August and the coldest are December and January.

Destination: Titan: this January, a small space probe will parachute to the surface of Saturn's largest moon

Around 4:30 A.M. eastern standard time on the morning of January 14, 2005, a flying-saucer-shaped object named Huygens will encounter an atmosphere for the first time since it left Earth, in 1997. In that atmosphere's thin, cold gas, the object, roughly nine feet in diameter and hurtling through space at 13,500 miles an hour, will make its first palpable contact with Titan, the largest moon of the planet Saturn. Ever so slightly, the friction with Titan's atmosphere will slow down the spacecraft, triggering a complex sequence of events that, in the ensuing few hours, should unravel some of the secrets of what could be the most exotic environment in our solar system. By all signs to date, that environment could be dominated by complex organic molecules and seas made of liquefied hydrocarbons. A day at a Titanian beach would be spent freezing to death under hazy skies made of methane and nitrogen, a scenario similar to what would have taken place on the early Earth (though our young planet probably wasn't freezing).

The high-speed entrance of a flying saucer into Titan's atmosphere is not so much a beginning as a culmination of a much longer process, which started more than twenty years ago. Not long after the two Voyager spacecraft visited Saturn, in 1980 and 1981, three planetary scientists, each studying different aspects of the outer solar system, hatched a plot to convince two space agencies, NASA and the European Space Agency (ESA), to mount a major mission specifically to study the Saturnian system.

Voyager 1 and Voyager 2 had flown past Saturn and its retinue of satellites. Observations by those spacecraft had revealed the staggering beauty and complexity of the planet's ring system as never before, not to mention the diverse nature of its satellites and the enormous extent of the planet's magnetosphere, the magnetic "bubble" that surrounds Saturn. Those data, though, were mere snapshots, taken by two interplanetary tourists as they raced on epic journeys through and beyond our solar system. The three planetary scientists--Daniel Gautier of the University of Paris, Tobias C. Owen of the University of Hawai'i in Honolulu, and Wing-Huen Ip, then at the Max Planck Institute for Aeronomy in Katlenburg-Lindau, West Germany--felt that what the Voyagers had shown deserved a second look. A dedicated mission of long duration could refine and expand on the many tantalizing clues about the unique Saturnian system that investigators first glimpsed in the early 1980s.

Such a project would be neither cheap nor easy to mount, and numerous other scientific and technical projects would be competing for the resources of the space agencies. Persuading those agencies to undertake a new mission to Saturn would be an uphill battle. Furthermore, the time schedules announced by the two agencies for selecting and approving space missions rarely seemed to coincide. Yet, whether by good fortune, the power of their arguments, or the support of the relevant scientific communities in the United States and Europe, the three investigators succeeded. In the late 1980s both NASA and ESA approved a mission for which NASA would provide a craft to orbit Saturn and ESA would provide a dedicated probe. Once in the vicinity of Saturn, the orbiter would release the probe, which would descend through Titan's atmosphere to land on its surface.

Gautier, Owen, and Ip are among the latest in a long line of investigators that stretches back to the era just after the invention of the telescope. One of the first observers of Titan was the Dutch physicist and astronomer Christiaan Huygens, who deduced in 1655 that Titan was in orbit around Saturn. At the time, little information could be gleaned, aside from the fact that Titan was fairly large. The state of knowledge about the moon remained much as Huygens had left it until 1944, when Gerard Kuiper, a Dutch astronomer working in the U.S., detected the unmistakable signature of methane gas in the spectrum of Titan, revealing the existence of an atmosphere.

The Voyager flybys were a great leap forward in the human knowledge of Titan. Precise measurements pegged the moon's diameter at 3,200 miles, larger than the diameter of Mercury. But it is Titan's atmosphere, not its size, that makes the moon of such enormous interest. The Voyager spacecraft revealed that atmosphere to be thicker than Earth's--its column mass, the total mass of gas in a column extending from the surface to the top of the atmosphere, is some ten times that of our own planet's atmosphere. (Titan's relatively small surface gravity, compared with the Earth's, makes its pressure at "sea" level, as measured by Voyager 2, about 1.5 times that of the Earth's surface atmospheric pressure.) Furthermore, the composition of Titan's atmosphere is complex: it is made up primarily of nitrogen, but a variety of hydrocarbons, such as methane, ethane, ethyne, and propane, are also present.
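The two figures quoted above are linked by simple hydrostatic balance: surface pressure equals surface gravity times the atmospheric column mass. A rough check, assuming standard Earth values and a Titan surface gravity of about 1.35 m/s² (a value supplied here, not from the article):

```python
# Hydrostatic check: P_surface = g * column_mass.
EARTH_P = 101_325.0   # Pa, standard sea-level pressure
EARTH_G = 9.81        # m/s^2
TITAN_G = 1.35        # m/s^2 (assumed value for Titan's surface gravity)

earth_column = EARTH_P / EARTH_G     # ~1.03e4 kg per square meter
titan_column = 10.0 * earth_column   # "some ten times" Earth's, per the text
titan_p = TITAN_G * titan_column     # implied Titan surface pressure, Pa
ratio = titan_p / EARTH_P            # comes out near 1.4
```

The result, roughly 1.4 times Earth's surface pressure, is consistent with the "about 1.5 times" measured by Voyager 2: a tenfold column mass is largely offset by Titan's much weaker gravity.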

Titan is the only satellite in the solar system to possess an atmosphere, though trace amounts of gases are present on a few other satellites. But Titan's atmosphere is unique in the solar system, primarily because of its mixture of hydrocarbons. Theoretical models predict that methane gas exposed to ultraviolet light (even the small amount that reaches Titan from the distant Sun) undergoes a series of chemical reactions. Those reactions could give rise to the approximate concentrations of the various gases other than methane that the Voyager flybys observed. Furthermore, the theoretical models predict that, with time, the ultraviolet radiation would supply the energy needed to form increasing quantities of complex, long-chain hydrocarbon molecules. The most intriguing conclusion of this line of reasoning is that among the long-chain molecules could be precursors of molecules needed for life.

Monday, April 02, 2007

WEATHER WORRIES

Brokers report market & pricing problems within 100 miles of coast; softness elsewhere

By Phil Zinkewicz

Property and casualty insurance agents and brokers these days are probably feeling a bit Dickensian in that, in terms of the availability of carrier markets, it appears as though these are the best of times and the worst of times.

As for the best of times, consider a recent property and casualty insurance market index survey conducted by the Council of Insurance Agents and Brokers (CIAB). For the second quarter of this year, the survey found that 51% of brokers responding said average premium rates for small accounts were down between 1% and 20%. An additional 28% of brokers registered no changes in small account rates compared with renewals in the first quarter of the year.

The drop in renewal rates was even steeper for medium and large accounts, according to the survey. Nearly six in 10 of the brokers reported that medium and large accounts were down 1% to 20%. An analysis of the Council's survey data by Lehman Brothers said commercial premium rates declined by an average of 3% for all sizes of accounts during the second quarter of 2006. Among individual property/casualty lines, all experienced a decrease except commercial property, which increased 9.3% during the second quarter.

So, now we come to the worst of times. It appears that the reason insurers are chasing after this segment of the marketplace is that they have to make up for premium dollars lost as the result of their overall shunning of the areas most susceptible to catastrophe property losses.

The CIAB survey showed brokers and agents reporting that premium rates for coastal properties were up 300% to 500%, and some even by 600%, and that the impact was being felt as far as five miles inland.

Brokers said that higher property rates and deductibles and lower coverage limits were the industry standard during this year's late spring and early summer months, with significant differences in the way catastrophe-exposed risks were being underwritten. "The market is changing daily," said a broker from the Southwest. "Capacity is scarce, and it's a great concern that later in the year, there may not be any capacity left." The broker was referring to the Southeast Gulf Region and Texas in particular.

Another broker from that region said, "Rates are up 300% to 500% on commercial property and builders risk. Deductibles increased 200%, and it is also deductible by location, not by occurrence."

That broker also reported, "Carriers are fighting insureds on all aspects of storm claims, business interruption, property, equipment and marine. Any large claim gets delayed through carrier claims reviews and sign-offs for advance payment."

A significantly larger number of agents and brokers cited concerns about capacity as one of their top three market worries in this survey. More than half (55%) listed capacity, compared with 40% who identified it as a top concern in the first quarter survey.

Capacity and pricing problems were not confined just to at-risk properties along the coast, the survey showed. Commercial earthquake insurance is increasing 50% to 100% for renewals, several brokers reported, and there are also significant increases in deductibles.

One independent insurance wholesaler based in New York has experienced firsthand the problems of brokers in the current catastrophe insurance market. In an interview with Rough Notes, J. C. Sparling, executive vice president of Mercator Risk Services, said that risk managers and business owners recently had to swallow a bitter pill during annual insurance renewals for properties that are exposed to catastrophe, including steep price increases and new limitations on policies. He pointed to a severe shortage in the total amount of commercial and multi-family property insurance available.

"Property coverage for tier-one wind and flood is near capacity," Sparling said. "Ratings agencies are scrutinizing many insurers for 'cat' exposure, auditing their surpluses to make sure they can cover possible losses from this year's hurricane season."

Sparling said that catastrophe capacity shortages no longer affect just Florida. "Underwriters for the Gulf Coast, especially Houston and the Carolinas are also feeling the pinch," he said. "As a result, some underwriters are now redefining 'coastal' as any property within 100 miles of the coast. The standard benchmark is within 50 miles of the water."

The wholesale broker said also that revised catastrophe modeling systems are showing higher probable maximum losses from wind and flood, causing property underwriters to reassess their exposures on renewals and perhaps increase their attachment points on cat risks.

"In the midst of this crisis, retail brokers for owners of smaller middle-market coastline properties are facing a steep challenge in placing coverage," said Sparling. "The smart retail brokers are explaining this market climate to their insureds far in advance, putting the wholesaler and the client together in terms of expectations and goals. In an environment where risk managers and business owners are firing retail brokers because they were unprepared for the capacity shortage fallout, and hiring new ones, retail brokers that triple-check valuations and design submissions that underwriters will take seriously are going to keep their clients and probably attract some new ones," he said.

Unconventional exploration technologies: take another look: some have been around for many years, others are new. They can all prosper in a boom market

There's something about human nature that wants it both ways. We like it when some simple technology, something supposedly overlooked, succeeds wildly. It's a bit cultural too: The less educated especially like it. It's like poking a stick in the eye of megabucks PhD research and development.

Conversely, we are suspicious of anything that's too cheap, too easy. Surely, we think, the "big boys" with all their money and know-how, didn't overlook this simple idea. They probably looked into it, and deemed it unworthy.

All too often, the inventor or practitioner of the technology is unwilling to allow the technique to be critiqued, examined or make any attempt to prove its utility. "Why should I? I'll find all the oil and make all the money!" they would say. (But that doesn't stop them from asking me to publish them!) In such cases, it is fitting that their technology should remain largely unused.

Finally, there's the disruptive effect that such technology, especially when it's cheap, could potentially have on the status quo. Sometimes, the disruptive effect is real, such as when railroads and automobiles replaced wagons and horses. Sometimes, it's just logic with a touch of paranoia, such as when people believe that technology breakthroughs are being bought and squelched to prevent them from encroaching upon billions of dollars of current investment. Such conspiracy theories are almost always wrong. What follows are technologies that the author neither endorses nor ridicules, but out of the large number of unconventional, even maverick, technologies, these, in the author's opinion, have the potential to reduce exploration risk.

SURFACE EXPRESSION

In many ways, the expression of an oil or gas reservoir can trigger anomalous readings across many technologies. Structurally tilted strata that form deep traps can become shallow or outcrop, possibly resulting in anomalous readings that relate to the formation, such as mineralogy, radioactivity or electrical conductivity, and only coincidentally relate to hydrocarbon pore fluids.

Surface expression of seepage along transmissive faults, bedding planes or directly upward (microseepage) is often related to a deeper reservoir. This expression, in turn, can be revealed in alteration of microbial communities and the presence of soil gases, such as methane, ethane, butane, etc.

Sometimes, you can see the surface expression with your eyes, either as an early or late seasonal color change caused by stress in vegetation, plant species distribution (Fig. 1), crown density or vigor (dwarfs or giants). More subtle changes due to seepage are shown in spectral reflectivity, sometimes called hyperspectral analysis. Even early versions of Landsat, with a relatively small number of channels, showed field outlines.

[FIGURE 1 OMITTED]

Landsat continues to be used in exploration work today (Fig. 2), as do several other newer satellite systems, although more sophisticated airborne platforms yield much better spectral information on seeps, vegetation, mineralogy, and so on. An extensive library of spectral signatures is held by government agencies such as NASA (ASTER) and USGS. In all cases, ground truthing is needed for fine calibration.

[FIGURE 2 OMITTED]

Furthermore, seepage effects can result in mineralogical changes from oxidation/reduction reactions that might be revealed in changes in some attribute of the overburden, including electrical properties, such as capacitance and conductivity, magnetic properties and radiological properties. These can take the form of anomalous concentrations, deficits or halos. For example, consider the following case of radiometric anomalies.

An interesting study was done over Helez and Kochav oil fields in Israel. These fields have halo-type radioactivity anomalies associated with depletion of eU, eTh and K-40 in sediments overlying these fields, Fig. 3. Since their aqueous chemistries are quite different, it is difficult to account for such a uniform depletion--with concomitant flanking highs--by a process of vertical aqueous transfer of daughter nuclides; especially from a reservoir in which redox changes have led, primarily, to uranium accumulation. Nor can continuous gaseous-transfer processes, requiring Rn daughters of Th and U, be responsible for the surface elemental distribution (K has no gaseous precursor). (2)

[FIGURE 3 OMITTED]

"However, these elements do tend to behave similarly when entering the crystal structure of rock-forming minerals, and when adsorbed onto clay surfaces, mainly due to similarity in the ionic radius of these large cations. It thus appears that an upward flux of radionuclides leaking from the reservoir does not explain the observed features. Rather, it may be the hydrocarbon flux itself that can be corrosive, which leads to mineral alteration in the overlying sedimentary rock and the release of associated cations. These would tend to migrate laterally, away from the altered area, and become immobile at the periphery by adsorption on clays." (2)
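The signature described here, a central radiometric low flanked by highs, lends itself to a simple numerical check. The toy function below (hypothetical names and a placeholder 10% threshold, not taken from the study) scores a 1-D transect across a prospect for that halo shape:

```python
def halo_score(profile, background):
    """Score a 1-D radiometric transect for a halo pattern: a central
    low flanked by highs, measured as fractions of regional background.
    Splits the transect into outer thirds (flanks) and a middle third."""
    third = len(profile) // 3
    flanks = profile[:third] + profile[-third:]
    center = profile[third:len(profile) - third]
    center_deficit = 1.0 - (sum(center) / len(center)) / background
    flank_excess = (sum(flanks) / len(flanks)) / background - 1.0
    return center_deficit, flank_excess

def looks_like_halo(profile, background, threshold=0.10):
    """Flag the transect when both the central deficit and the
    flanking excess exceed the (hypothetical) threshold."""
    deficit, excess = halo_score(profile, background)
    return deficit > threshold and excess > threshold
```

A flat transect near background scores zero on both measures, while a profile like the Helez/Kochav pattern scores positive on both; in practice any such screen would need smoothing and a statistically derived background, not a single scalar.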

Acquisition logistics excellence: an Internet listing tailored to the professional acquisition workforce

Shared systems and tools to help the federal acquisition community and the government's business partners conduct business efficiently.

Acquisition Community Connection (ACC)

http://acc.dau.mil

Policies, procedures, tools, references, publications, Web links, and lessons learned for risk management, contracting, system engineering, total ownership cost.

Advanced Concept Technology Demonstrations (ACTDs)

www.acq.osd.mil/actd/

ACTD accomplishments, articles, speeches, guidelines, and POCs.

Aging Systems Sustainment and Enabling Technologies (ASSET)

http://asset.okstate.edu/asset/index.htm

A government-academic-industry partnership. ASSET program-developed technologies and processes increase the DoD supply base, reduce time and cost associated with parts procurement, and enhance military readiness.

www.safaq.hq.af.mil/

Policy; career development and training opportunities; reducing TOC; library; links.

Air Force Materiel Command (AFMC) Contracting Laboratory's FAR Site

http://farsite.hill.af.mil/

FAR search tool; Commerce Business Daily announcements (CBDNet); Federal Register; electronic forms library.

Army Acquisition Support Center

http://asc.army.mil

News; policy; Army AL & T Magazine; programs; career information; events; training opportunities.

Assistant Secretary of the Army (Acquisition, Logistics & Technology)

https://webportal.saalt.army.mil/

ACAT Listing; ASA(ALT) Bulletin; digital documents library; ASA(ALT) organization; links to other Army acquisition sites.

Association for the Advancement of Cost Engineering International (AACE)

www.aacei.org

Promotes planning and management of cost and schedules; online technical library; bookstore; technical development; distance learning; etc.

Association of Old Crows (AOC)

www.crows.org

News; conventions, courses; Journal of Electronic Defense.

Committee for Purchase from People Who are Blind or Severely Disabled

www.jwod.gov

Information and guidance to federal customers on the requirements of the Javits-Wagner-O'Day (JWOD) Act.

Defense Acquisition University (DAU)

www.dau.mil

DAU Course Catalog; Defense AT & L magazine and Defense Acquisition Review Journal; course schedule; policy documents; guidebooks; training and education news for the AT & L workforce.

DAU Alumni Association

www.dauaa.org

Acquisition tools and resources; government and related links; career opportunities; member forums.

DAU Distance Learning Courses

www.dau.mil/registrar/enroll.asp

DAU online courses.

Defense Advanced Research Projects Agency (DARPA)

www.darpa.mil

News releases; current solicitations; "Doing Business with DARPA."

Defense Electronic Business Program Office (DEBPO)

www.acq.osd.mil/scst/index.htm

Policy; newsletters; Central Contractor Registration (CCR); assistance centers; DoD EC partners.

Defense Information Systems Agency (DISA)

www.disa.mil

Structure and mission of DISA; Defense Information System Network; Defense Message System; Global Command and Control System.

Defense Modeling and Simulation Office (DMSO)

www.dmso.mil

DoD Modeling and Simulation Master Plan; document library; events; services.

Defense Systems Management College (DSMC)

www.dau.mil

DSMC educational products and services; course schedules; job opportunities.

Defense Technical Information Center (DTIC)

www.dtic.mil/

DTIC's scientific and technical information network (STINET) is one of DoD's largest available repositories of scientific, research, and engineering information. Hosts over 100 DoD Web sites.

Director, Defense Procurement and Acquisition Policy (DPAP)

www.acq.osd.mil/dpap

Procurement and acquisition policy news and events; reference library; DPAP organizational breakout; acquisition education and training policy, guidance.

DoD Defense Standardization Program

www.dsp.dla.mil

DoD standardization; points of contact; FAQs; military specifications and standards reform; newsletters; training; nongovernment standards; links.

DoD Enterprise Software Initiative (ESI)

www.esi.mil

Joint project to implement true software enterprise management process within DoD.

DoD Inspector General Publications

www.dodig.osd.mil/pubs/

Audit and evaluation reports; IG testimony; planned and ongoing audit projects of interest to the AT & L community.

DoD Office of Technology Transition

www.acq.osd.mil/ott/

Information about and links to OTT's programs.

DoD Systems Engineering

www.acq.osd.mil/ds/se

Policies, guides and other information on SE and related topics, including developmental T & E and acquisition program support.

Earned Value Management

www.acq.osd.mil/pm

Implementation of earned value management; latest policy changes; standards; international developments.

Electronic Industries Alliance (EIA)

www.eia.org

Government relations department; links to issues councils; market research assistance.

Federal Acquisition Institute (FAI)

Ozone hole: a longer recovery

The ozone hole over the South Pole is taking longer to recover than previously thought, according to a recent study by the National Aeronautics and Space Administration (NASA) and two other federal agencies. The study, carried out using a new computer modeling system, projects that the protective ozone layer high over Antarctica (in the stratosphere) will take nearly 20 years longer to fully mend than scientists had estimated. Earlier projections indicated that the ozone layer would recover by 2050; the new findings put recovery in 2068.

Ozone depletion occurs throughout the stratosphere but is pronounced at the poles, where the ozone layer becomes so thin at times that it is nearly nonexistent--that is, a "hole" forms. The largest hole appears over Antarctica during the Southern Hemisphere's spring and exposes nearly the entire continent to higher doses of the sun's ultraviolet radiation. Human activity takes most of the blame: Air currents carry the anthropogenic chemical compounds chlorofluorocarbons (CFCs) and halons into the stratosphere, which produce chlorine and bromine gases that, in turn, destroy ozone.

The modeling system combines several data sources. These include estimates of future Antarctic chlorine and bromine levels based on current amounts (from observations by NASA satellite, National Oceanic and Atmospheric Administration ground-level stations, and National Center for Atmospheric Research aircraft); likely future emissions; the time it takes for the transport of those emissions into the Antarctic stratosphere; and assessments of future weather patterns over Antarctica. The 1987 Montreal Protocol and other international agreements banned the production of CFCs and other chemicals linked to ozone depletion, but these compounds and their effects still persist in the environment. Scientists with the NASA project say that the hole over Antarctica has not shrunk significantly since the ban went into effect, and the ozone layer may not begin to show real progress toward recovery until 2018.

The stratospheric ozone layer (not to be confused with harmful ground-level ozone, a primary ingredient of smog) prevents 90-99 percent of the sun's ultraviolet radiation from reaching the Earth's surface. Without such protection, such radiation can cause skin cancer, genetic damage, and eye damage and can harm marine life.

Insuring benevolence: if enough rain doesn't fall in the Horn of Africa this year, drought victims in Ethiopia could benefit from the global insurance market

The United Nations' World Food Programme is the first international humanitarian agency to tap into the evolving weather derivatives market with a $1 million policy from AXA Re, which will pay out if the rainfall in areas of Ethiopia falls below a prescribed level.

"The idea is to make sure the family doesn't deplete its assets and sell off their livestock," says Ulrich Hess, chief of business risk planning at the global food agency's headquarters in Rome. "We want to give people enough to buy food and keep their children in school and keep going."

Analysts say the development of the weather derivatives market and parametric risk modeling over the past decade has opened up a window of opportunity for humanitarian agencies and developing countries, as insurers get a chance to spread their risks to new parts of the globe.

"It's a way to manage the risk of humanitarian problems prospectively," says Warren Isom, a senior vice president at Willis Re, "to look at these risks through the risk management lens rather than a humanitarian lens." And as relief agencies slowly transform their outlook on how to secure funding for handling disasters from droughts to earthquakes, insurers and reinsurers can shift their catastrophic risks outside traditional areas such as the hurricane-prone East Coast of the United States or the tumultuous earthquake zones of Japan.

"Spread is important in catastrophic insurance, and right now these low-frequency, high-severity risks occur in a relatively small number of places," says Isom, citing Northern Europe, the West Coast and Mexico as other locales. "This gives the market some more spread."

And this diffusion is exactly what enticed AXA Re.

"We saw this as a way to diversify our risk into an area where we have no exposure," says Jean-Christophe Garaix, who's in charge of weather coverage at AXA Re, part of the Paris-based AXA Group. Right now, most of the French reinsurer's risk lies in the United States, Europe and Japan.

At the same time, developing countries--if armed with insurance coverage against perennial natural disasters like drought, floods, windstorms or earthquakes--could benefit by gaining more economic stability.

"This can provide a more stable environment for investment and economic growth," says Brian Tobben, vice president of weather at PartnerRe Ltd. in Greenwich, Conn. "Aid can save lives, but this type of mechanism can also help build an overall economy."

Analysts agree the index-based products aren't able to cover the cost of food or housing for people displaced by manmade disasters, such as civil conflicts that turn citizens into refugees in their own country or send them fleeing across borders. The absence of measurable, objective criteria in such scenarios--such as low rainfall or rising water levels--would create an underwriting nightmare.

"There's no index for civil conflicts ... so there would be no market for that type of insurance," says Hess of the WFP.

But the development of data-sensitive insurance products has created a small success story for the U.N. food agency as it fights poverty and hunger in Ethiopia.

"It's a new concept ... parametric-based instruments that use risk-modeling techniques. You can understand what the risk is and how to price it," says Eugene Gurenko, lead financial services specialist for the World Bank in Washington, D.C. "It provides objective verification."

That's exactly what persuaded AXA Re to assume the risk.

"In Africa, it's not very easy to have good data. But this let us define what the risk is and gave us precise data," says Garaix, adding that the company reviewed decades of Ethiopia's weather history to determine no trends existed. "We have no expectations ... negative or positive."
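Reviewing decades of weather history to price a cover like this is commonly done, as a first pass, with a "burn" analysis: compute the payout the contract would have produced in each historical season, then average and load the result. The sketch below is a generic illustration of that technique, not AXA Re's actual pricing, and the loading factor is a placeholder:

```python
def burn_premium(historical_payouts, load=0.0):
    """First-pass 'burn' price for an index contract: the average payout
    the contract would have made over the historical record, grossed up
    by an expense/risk loading. Assumes the past record is representative
    (i.e., no trend), which is what the reinsurer checked before writing."""
    if not historical_payouts:
        raise ValueError("need at least one historical season")
    expected = sum(historical_payouts) / len(historical_payouts)
    return expected * (1.0 + load)
```

The appeal for the underwriter is exactly what Garaix describes: with a long, trend-free rainfall record, the expected payout is an objective number rather than a judgment call.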

So 26 weather stations spread around this northeastern African nation of nearly 75 million people (except for southern areas close to Somalia) will monitor rainfall during the two rainy seasons that run from March 10 to Oct. 31. If rainfall dips below a certain level at the end of the covered period, the policy will pay out $100, in the form of cash or food, to each affected household in the drought-stricken area.

The annual premium of nearly $1 million was financed with a $930,000 grant from the U.S. Agency for International Development. The payout could tally $7.1 million if all 67,000 households were impacted by drought. The local branches of USAID and WFP will oversee the distribution of any claims by the Ethiopian government's food security bureau.
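The payout mechanics described above can be sketched as a simple trigger function. The $100 per-household figure and the $7.1 million aggregate cap come from the article; the trigger level and the function itself are illustrative, since the real contract terms are more detailed:

```python
def drought_payout(rainfall_mm, trigger_mm, affected_households,
                   per_household=100.0, aggregate_cap=7_100_000.0):
    """Parametric drought trigger (illustrative): if measured end-of-season
    rainfall at the index stations falls below the trigger, pay a fixed
    amount per affected household, subject to an aggregate cap."""
    if rainfall_mm >= trigger_mm:
        return 0.0  # index not breached: no claim, regardless of losses
    return min(aggregate_cap, affected_households * per_household)
```

The design choice worth noting is that nothing in the payout depends on assessing actual losses in the field; the rainfall index alone settles the claim, which is what makes distribution within days feasible.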

By having an insurance mechanism in place, Ethiopia can distribute relief to farmers in a smooth manner and avoid waiting for humanitarian agencies to gather funds from donor countries, which sometimes supply aid with unwanted strings attached, Hess says.

"This is a way for countries like Ethiopia to get out of the food-aid business and to have the money in people's hands within a few days," Hess adds.