
Crazy RedHeaded BITCH.

_Kitana_

Angel of Death
4,674
16
0
#1
Come on guys, let's see how bad you can make yourself/your hair look.

My pic, took 5 minutes.

Laughed my ass off...

Downside... hair won't lay down.

Giggles
 

Jung

???
Premium
13,981
1,399
487
#5
1. Background, Current State, and Categorization of Proposed Technology

Earthquakes in urban centers are capable of causing enormous damage. The recent January 16, 1995 Kobe, Japan, earthquake was only a magnitude 6.9 event and yet produced an estimated $200 billion loss. Despite an active earthquake prediction program in Japan, this event was a complete surprise. The 1989 Loma Prieta and 1994 Northridge earthquakes were also unexpected and caused billions of dollars of damage as well as loss of life. Future similar scenarios are possible in Los Angeles, San Francisco, Seattle, and other urban centers around the Pacific plate boundary.

Two new technologies, one providing the data and the other the means, make it possible to pursue the development of complex, sophisticated models for predicting the behavior of fault systems. Comprehensive, real-time datasets are being collected using NASA-developed space systems, demanding new technologies for storage, handling, transmission, visualization, and analysis. These include surface geodetic data, primarily space-based GPS and InSAR data collected under the auspices of the NASA Earth Science Enterprise, augmented by more traditional land-based datasets such as seismicity, strong motion, and other remotely sensed data. These data provide the necessary constraints for carrying out realistic simulations of fault interactions. Information technology provides the means for clearly defined, accessible data formats and code protocols as inputs to the simulations. It provides a framework for documentation of codes and standards as well as visualization and data analysis tools. Without such tools it will be impossible to construct the more complex models and simulations necessary to reduce future losses from major earthquakes.

NASA has recently committed a great deal of effort and funding to develop the means to observe and characterize the movements of the earth's crust that arise from plate tectonics and lead to catastrophic earthquakes. Recent research indicates that the phenomena associated with earthquakes occur over many scales of space and time. Understanding the dynamic processes responsible for these events will require not only a national commitment to develop the necessary observational datasets, but also the technology required to use these data in the development of sophisticated, state-of-the-art numerical simulations and models. The models can then be used to develop an analytical and predictive understanding of these large and damaging events, thus moving beyond the more descriptive approaches now routinely employed. Approaches emphasizing the development of predictive models and simulations for earthquakes will be similar to methods now used to understand global climate change, the onset of El Niño-Southern Oscillation events, and the evolution of the polar ozone depletion zones.

We specifically propose to standardize and integrate the wealth of space geodetic results and modeling codes produced under the auspices of NASA/ESE's geodynamics program to allow rapid and effortless sharing of information, retrieval of data, and model development and validation. While we highlight General Earthquake Models (GEM), the data and codes will be in a format usable by researchers in numerous disciplines related to solid earth science and natural hazards. We propose to investigate and evaluate the use of existing technologies such as XML and web servlets to develop a standardized GEM data and computational environment. The GEM computational environment will make use of distributed object technology using a three-tiered computing model in which a web server provides the interface between a user and an object-based program.
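
As a concrete flavor of the proposed XML standardization, the sketch below shows one way a single GPS station velocity might be encoded and read back. The element and attribute names (gem_velocity, east_mm_yr, and so on) and the numerical values are purely illustrative assumptions, not an adopted GEM schema.

# Illustrative sketch only: the <gem_velocity> layout below is hypothetical,
# not an adopted GEM standard; values are placeholders.
import xml.etree.ElementTree as ET

SAMPLE_RECORD = """
<gem_velocity station="JPLM" network="SCIGN">
  <position lat="34.2048" lon="-118.1733" height_m="424.0"/>
  <velocity east_mm_yr="-27.1" north_mm_yr="12.3" up_mm_yr="0.4"/>
  <sigma east_mm_yr="0.3" north_mm_yr="0.3" up_mm_yr="1.1"/>
  <epoch start="1995.0" end="1999.5"/>
</gem_velocity>
"""

record = ET.fromstring(SAMPLE_RECORD)
vel = record.find("velocity")
print(record.get("station"),
      float(vel.get("east_mm_yr")),
      float(vel.get("north_mm_yr")))

A self-describing layout of this kind would let a servlet-based middle tier validate and transform records without knowledge of any individual group's internal file formats.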

The proposed program addresses categories 3, 4, and 6 in the ESE Request for Information. 3) Data and Information Production. We propose to organize high-value Earth Science data products, such as GPS crustal velocities and InSAR displacement maps, for XML-based Internet lookup and efficient delivery of precisely specified data subsets. The goal is real-time transfer of machine-readable data products in response to both interactive and automated requests. 4) Analysis, Search, and Display. We propose to develop browsable web-based catalogues for content-based search and retrieval, visualization tools, and programs for mining the data. 6) Infrastructure. We will develop the infrastructure for a distributed computational framework that employs open systems interfaces, identified protocols, and metadata structures and standards. These resources will be accessible on the Internet via the web.
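
To illustrate what "efficient delivery of precisely specified data subsets" could mean in practice, here is a minimal standard-library sketch of a toy web service that returns only the stations falling inside a requested bounding box. The endpoint, query parameters, port, and station values are hypothetical placeholders for whatever interfaces the GEM environment eventually standardizes.

# Hypothetical sketch of subset delivery over HTTP: a client asks for the
# stations inside a bounding box and receives only those records.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs
import json

# Placeholder catalogue; a real service would read the standardized XML store.
STATIONS = [
    {"station": "JPLM", "lat": 34.20, "lon": -118.17, "east_mm_yr": -27.1},
    {"station": "CAND", "lat": 35.94, "lon": -120.43, "east_mm_yr": -22.5},
]

class SubsetHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        q = parse_qs(urlparse(self.path).query)
        lat_min, lat_max = float(q["lat_min"][0]), float(q["lat_max"][0])
        lon_min, lon_max = float(q["lon_min"][0]), float(q["lon_max"][0])
        subset = [s for s in STATIONS
                  if lat_min <= s["lat"] <= lat_max
                  and lon_min <= s["lon"] <= lon_max]
        body = json.dumps(subset).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Example request: GET /velocities?lat_min=34&lat_max=36&lon_min=-121&lon_max=-118
    HTTPServer(("localhost", 8080), SubsetHandler).serve_forever()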

We also expect to have a major impact on the autonomous operations component of category 5) Systems Management. Autonomous hazard-monitoring systems in the future will require a sophisticated combination of onboard and ground-based processing capabilities to generate the basic information required by an automated decision support system. One of the six major questions in the NASA Administrator's strategic outlook asks how we can develop predictive natural disaster models. These events (e.g., volcanic eruptions) are potentially catastrophic if not flagged quickly and reliably. The datamining and pattern recognition software to be developed in this program will be directly applicable to early warning systems. Our approach is based on the development of general pattern recognition and datamining methods. By integrating these concepts within an object-oriented framework based on open standards, we can provide a blueprint for the design of fully autonomous, continuously monitoring systems of heterogeneous assets that will underpin much of NASA/ESE's needs in the future.
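
The actual pattern recognition software is far beyond a short example, but the flavor of an automated flagging step in a monitoring pipeline can be sketched very simply. Everything here (window length, threshold, synthetic signal) is an arbitrary illustrative choice, not the method the program would develop.

# A deliberately simple stand-in for an automated flagging step: mark samples
# in a monitoring stream that depart strongly from the recent baseline.
from statistics import mean, stdev

def flag_anomalies(samples, window=20, n_sigma=4.0):
    """Return indices whose value departs from the trailing window by > n_sigma."""
    flagged = []
    for i in range(window, len(samples)):
        baseline = samples[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(samples[i] - mu) > n_sigma * sigma:
            flagged.append(i)
    return flagged

# Synthetic example: a quiet deformation signal with one abrupt step.
signal = [0.1 * i for i in range(40)] + [12.0 + 0.1 * i for i in range(40, 60)]
print(flag_anomalies(signal))  # indices near the onset of the step are flagged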

Capitalizing on these information technologies will bring data collection and modeling efforts into a cohesive program within ESE's Natural Hazards Program. Surface geodetic data, together with seismicity data, are providing the major constraints necessary to construct GEM simulations. The global GPS network provides the tectonic framework and boundary conditions for plate rates. The Southern California Integrated GPS Network is providing the detailed surface velocities necessary to constrain fault slip rates and geometries, and crustal structure and rheologies. GPS measurements have provided invaluable time-dependent deformation data following several recent earthquakes. Interferometric Synthetic Aperture Radar (InSAR) data provide spatially detailed information about earthquakes and interseismic deformation. The velocities and displacements determined from these methods are used as inputs to various models. Digital elevation models (DEMs) provide information on gravitational forces, and uplift and erosion history. At present, researchers laboriously transform the data formats into ones usable by their own individual programs. The current methods include FTP of datasets larger than needed, parsing of data with such UNIX commands as awk, sed, and grep, or transcription of information directly from printed papers.
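
For contrast, the following sketch shows the kind of ad-hoc parsing that standardized GEM formats are intended to replace: extracting station velocities from a plain-text solution file whose column layout (assumed here) differs from group to group and paper to paper.

# Sketch of today's ad-hoc parsing; the column layout is hypothetical.
import io

def parse_velocity_table(lines):
    """Yield (station, east_mm_yr, north_mm_yr) from whitespace-delimited rows."""
    for line in lines:
        if line.startswith("#") or not line.strip():
            continue  # skip comments and blank lines
        fields = line.split()
        # Columns 0, 3, and 4 are assumed; every group's format differs.
        yield fields[0], float(fields[3]), float(fields[4])

sample = io.StringIO(
    "# site  lat      lon        veast  vnorth\n"
    "JPLM    34.2048  -118.1733  -27.1  12.3\n"
)
print(list(parse_velocity_table(sample)))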

The modeling and simulation programs are similarly fragmented; however, most of the pieces exist to construct powerful GEM simulations. Object-oriented programs served over web interfaces and accessing standardized data will join the programs into simulations that assimilate data and are self-consistent on many scales. Kinematic models use surface deformation data and DEMs to estimate plate and microplate motions. Elastic models use surface observations to estimate fault geometries and slip rates. Viscoelastic calculations incorporate surface deformation, DEMs, and lab, heat-flow, stress, seismic, and geologic data into models that produce estimates of crustal rheology and structure, fault geometry, fault slip and stress rates, heat flow, gravity, and refined estimates of surface deformation. These provide the inputs for quasi-static dynamic models of single faults and systems of faults. The quasi-static dynamic models provide space-time patterns, correlations, and information about fault interactions.
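
A rough sense of how object-oriented model stages sharing a common interface could be chained is sketched below. The class names and the trivial placeholder "physics" are illustrative assumptions; they stand in for the actual kinematic, elastic, viscoelastic, and quasi-static dynamic codes.

# Illustrative sketch only: each stage consumes and extends a shared results
# dictionary, mimicking standardized data flowing between GEM codes.
class ModelStage:
    def run(self, inputs: dict) -> dict:
        raise NotImplementedError

class KinematicModel(ModelStage):
    def run(self, inputs):
        return {**inputs, "plate_motions": "..."}        # from velocities and DEMs

class ElasticModel(ModelStage):
    def run(self, inputs):
        return {**inputs, "fault_slip_rates": "..."}     # from surface observations

class ViscoelasticModel(ModelStage):
    def run(self, inputs):
        return {**inputs, "crustal_rheology": "..."}     # adds structure and rheology

class QuasiStaticDynamicModel(ModelStage):
    def run(self, inputs):
        return {**inputs, "synthetic_catalog": "..."}    # space-time fault interactions

def run_pipeline(stages, inputs):
    """Feed the output of each stage into the next, accumulating results."""
    for stage in stages:
        inputs = stage.run(inputs)
    return inputs

result = run_pipeline(
    [KinematicModel(), ElasticModel(), ViscoelasticModel(), QuasiStaticDynamicModel()],
    {"gps_velocities": "...", "insar_displacements": "...", "dem": "..."},
)
print(sorted(result))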

Simulations are critical to understanding the behavior of fault systems, a major problem in earth science, because earthquakes in the real earth occur at irregular intervals on timescales of hundreds to thousands of years. The simulations generate arbitrarily long seismicity catalogs and provide a numerical laboratory in which the physics of earthquakes can be investigated from a systems viewpoint. The simulations will provide invaluable feedback for the planning and design of future data collection efforts under the NASA Natural Hazards program. Developing information technology to handle the data and analysis will revolutionize the way earth scientists explore and analyze data, while greatly decreasing the cost and increasing the utility of the data. The feedback loop between the simulations and the design of data collection efforts will further increase the value and utility of the data.

This project is well suited to a modular implementation, increasing its utility in both the near and far term. We discuss here the establishment of the GEM Computational Infrastructure (GEMCI). On a timescale of 1–2 years the space geodetic results can be standardized into easily accessible formats, pilot computational projects can be launched, and visualization tools can be standardized. By the end of the two years a user will be able to retrieve data, feed the data to a program, and visualize and perform statistical analysis on the output. The volume of space geodetic data is rapidly expanding; the volume of legacy codes is fairly small, which allows a clean implementation of the GEMCI.
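
A minimal end-to-end flavor of that two-year milestone, assuming the hypothetical subset service sketched earlier is running locally, might look like the following: retrieve a data subset, feed it to an analysis step, and summarize the result.

# Assumes the toy subset service above is running on localhost:8080;
# URL, parameters, and fields are hypothetical.
import json
from statistics import mean
from urllib.request import urlopen

URL = ("http://localhost:8080/velocities"
       "?lat_min=34&lat_max=36&lon_min=-121&lon_max=-118")

with urlopen(URL) as resp:
    stations = json.load(resp)

east_rates = [s["east_mm_yr"] for s in stations]
print(f"{len(stations)} stations, mean east rate {mean(east_rates):.1f} mm/yr")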

In the mid-term (2–7 years) we expect to see a significant transformation in the analysis and interpretation of solid earth science data. Scientists who are not experts in fields such as GPS or InSAR will be able to mine the data to the fullest extent possible. All of the data formats will be standardized and accessible to the broader solid earth science community. We expect to see seamless flow between data and codes and between codes, along with statistical analysis and visualization of the results of the simulations. We propose to employ datamining, machine learning, and automated pattern recognition and analysis methodologies for recognizing coherent space-time structures and correlations in the data. We expect new data and codes to follow the established standards and protocols and to be documented with examples and test cases.
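
As one very small example of the intended space-time structure analysis, the sketch below computes pairwise correlations between station displacement time series to expose coherent regional signals. The station names and series are synthetic placeholders; the real system would apply far richer datamining and machine-learning methods.

# Toy space-time correlation analysis over synthetic displacement series.
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

series = {
    "JPLM": [0.0, 0.5, 1.1, 1.4, 2.1, 2.4],
    "CAND": [0.1, 0.6, 1.0, 1.5, 1.9, 2.5],
    "NOIS": [0.3, -0.2, 0.4, -0.1, 0.2, 0.0],
}

names = sorted(series)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        print(f"{a}-{b}: r = {pearson(series[a], series[b]):+.2f}")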

Beyond the scope of this program we expect that new experiments will be designed and deployed to capture essential missing components identified by the simulations. The computational infrastructure will provide the framework for new space-based experiments whose results will be fed back into improved simulations.

2. Applicability of Proposed Technology to Earth Science Systems

Several new geodetic missions will be launched in the next decade. The Shuttle Radar Topography Mapper (SRTM) and the LightSAR satellite will provide unprecedented DEMs over much of the world. LightSAR will also provide coseismic displacement fields resulting from earthquakes and detailed surface deformation between them. The Southern California Integrated GPS Network (SCIGN) will be fully operational within two years, providing a wealth of continuous surface deformation data. A proposed follow-on to SCIGN, the Plate Boundary Observatory, is rapidly gaining momentum and will provide GPS data for over 1000 sites across the entire western region of the United States. The Gravity Recovery and Climate Experiment (GRACE), scheduled to launch in 2001, will provide information about mass changes within the earth from which the structure and dynamics of the mantle can be inferred; both are important inputs to GEM.

To fully mine the data from these missions, a seamless data handling/modeling environment must be established. GEMCI would result in numerous investigators utilizing solid earth data and models produced by NASA/ESE rather than a handful. The cost per scientific output would drop dramatically. Instead of spending long hours manipulating data, investigators will be exploring and interpreting the data. The value will further increase because more sophisticated models can be easily constructed under GEMCI. For example, data assimilation tools will allow investigators to constrain and compare models against large volumes of data rather than small subsets; three-dimensional adaptive meshing technology for constructing finite element meshes makes three-dimensional finite element modeling of complex interacting fault systems practicable.

GEMCI provides the outlet for millions of end users to explore the data. Between earthquakes, the general public can visualize tectonic motions observed with NASA technology. Displacement fields highlighting the region and extent of deformation due to earthquakes can be rapidly posted on the web. Potential aftershock zones can be identified with GEM tools. Utilities and emergency services will know where they should focus their efforts following earthquakes. Hazard maps will be greatly improved with the assimilation of space geodetic data into GEM even before earthquakes occur in a region. Retrofit priorities can be established, mitigating future damage and loss of life.

3. Timeline, Cost to Develop, and Commercial Application

The cost to develop the GEM Computational Infrastructure (GEMCI) is $7.5M over a ten-year period. By the end of the second year, at a cost of $1M in workforce, data and code standards and prototype modeling tools will exist. Toward the end of this period, a series of workshops and classes will begin to educate the broader community on the value and utility of the tools. During the next five years the GEMCI will be developed into a fully operational state at a cost of about $4.5M. Beyond the scope of this program the environment will be improved as new missions and experiments are launched and deployed, costing about $2M over three years. Hardware exists to carry out many of the models, but we anticipate about $1M in additional hardware costs, such as workstations for development and visualization, over the life of the project.

The ancillary benefits of the GEMCI to the commercial sector are manifold. Insurance companies will be able to establish sound rate structures given better hazard maps and information. Utilities will be more efficient in handling the aftermath of earthquakes, knowing where major damage has occurred and where aftershocks are likely to occur. Businesses can mitigate potential damage by determining the risk they face. Lives will be saved through properly constructed buildings and highways. None of this can be accomplished without the computational infrastructure to mine, assimilate, and analyze the great wealth of data that is rapidly accumulating.
 

_Kitana_

Angel of Death
4,674
16
0
#11
lighten up, guys.

it was meant for a little laugh while I was drunk

hehe
 

dustinzgirl

Banned - What an Asshat!
26,094
178
0
#13
_Kitana_ said:
lighten up, guys.

it was meant for a little laugh while I was drunk

hehe
You do realize the crowd that you are talking to?

oh, and our hair looks almost the same, but mine is brown and completely unmanageable. LOL