Computer simulation

Computer simulations reproduce the behavior of a system using a mathematical model. Computer simulations have become a useful tool for the mathematical modeling of many natural systems in physics (computational physics), astrophysics, climatology, chemistry and biology; human systems in economics, psychology and social science; and engineering systems. Simulation of a system is represented as the running of the system’s model. It can be used to explore and gain new insights into new technology and to estimate the performance of systems too complex for analytical solutions. [1]

Computer simulations are computer programs that can range from small programs that run almost instantly on small devices to large-scale programs that run for hours or days on network-based groups of computers. The scale of events being simulated by computer simulations has far exceeded anything possible (or perhaps even imaginable) using traditional paper-and-pencil mathematical modeling. Over 10 years ago, a desert-battle simulation of one force invading another involved the modeling of 66,239 tanks, trucks and other vehicles on simulated terrain around Kuwait, using multiple supercomputers in the DoD High Performance Computer Modernization Program. [2] Other examples include a 1-billion-atom model of material deformation; [3] a 2.64-million-atom model of the complex protein-producing organelle of all living organisms, the ribosome, in 2005; [4] a complete simulation of the life cycle of Mycoplasma genitalium in 2012; and the Blue Brain project at EPFL (Switzerland), started in May 2005 to create the first computer simulation of the entire human brain, right down to the molecular level. [5]

Because of the computational cost of simulation, computer experiments are used to perform inference such as uncertainty quantification. [6]

Simulation versus model

A computer model comprises the algorithms and equations used to capture the behavior of the system being modeled. By contrast, computer simulation is the actual running of the program that contains these equations or algorithms. Simulation, therefore, is the process of running a model. Thus one would not “build a simulation”; instead, one would “build a model”, and then either “run the model” or equivalently “run a simulation”.
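
To make the distinction concrete, here is a minimal sketch in Python (the decay equation and every name in it are illustrative, not drawn from the sources above): the model is the function encoding the governing equation, while the simulation is the loop that actually runs it.

    # The "model": one Euler step of the decay equation x' = -k*x.
    def decay_model(x, k=0.5, dt=0.1):
        """One time step of the model's governing equation."""
        return x - k * x * dt

    # The "simulation": actually running the model forward in time.
    def run_simulation(x0, steps):
        x = x0
        history = [x]
        for _ in range(steps):
            x = decay_model(x)
            history.append(x)
        return history

    print(run_simulation(1.0, 10))  # trajectory of the decaying state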

History

Computer simulation developed hand-in-hand with the rapid growth of the computer, following its first large-scale deployment during the Manhattan Project in World War II to model the process of nuclear detonation. It was a simulation of 12 hard spheres using a Monte Carlo algorithm. Computer simulation is often used as an adjunct to, or substitute for, modeling systems for which simple closed-form analytic solutions are not possible. There are many types of computer simulations; their common feature is the attempt to generate a sample of representative scenarios for a model in which a complete enumeration of all possible states of the model would be prohibitive or impossible.

Data preparation

The external data requirements of simulations and models vary widely. For some, the input might be just a few numbers (for example, simulation of a waveform of AC electricity on a wire), while others may require terabytes of information (such as weather and climate models).

Input sources also vary widely:

  • Sensors and other physical devices connected to the model;
  • Control surfaces used to direct the progress of the simulation;
  • Current or historical data entered by hand;
  • Values extracted as a by-product from other processes;
  • Values output for the purpose by other simulations, models, or processes.

Lastly, the time at which data is available varies:

  • “invariant” data is often built into the model code, either because the value is truly invariant (e.g., the value of π) or because the designers consider the value to be invariant for all cases of interest (a small sketch of built-in versus startup data follows this list);
  • data can be entered into the simulation when it starts up, for example by reading one or more files, or by reading data from a preprocessor ;
  • data can be provided during the simulation run, for example by a sensor network.
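
As a small sketch of the first two cases (the file name and parameter keys are hypothetical), a truly invariant value can be built into the code while run-specific values are read from a file at startup:

    # Startup-data sketch: an invariant value is built into the model code,
    # while run-specific parameters are read from a file when the simulation
    # starts up. The file name and keys here are hypothetical.
    import json, math

    PI = math.pi                                 # invariant, built into the code

    with open("run_parameters.json", "w") as f:  # create a toy parameter file
        json.dump({"time_step": 0.01, "steps": 1000}, f)

    with open("run_parameters.json") as f:       # startup: read run parameters
        params = json.load(f)

    print(f"running {params['steps']} steps of size {params['time_step']}")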

Because of this variety, and because diverse simulation systems have many common elements, there are a large number of specialized simulation languages. The best-known may be Simula (sometimes called Simula-67, after the year 1967 when it was proposed). There are now many others.

Systems that accept data from external sources must be very careful in knowing what they are receiving. While it is easy for computers to read in values from text or binary files, what is much harder is knowing what the accuracy (compared to measurement resolution and precision) of the values is. Often the values are expressed as “error bars”, a minimum and maximum deviation from the value range within which the true value is expected to lie. Because digital computer mathematics is not perfect, rounding and truncation errors can multiply this error, so it is useful to perform an “error analysis” [7] to confirm that values output by the simulation will still be usefully accurate.

Even small errors in the original data can accumulate into substantial error later in the simulation. While all computer analysis is subject to the “GIGO” (garbage in, garbage out) restriction, this is especially true of digital simulation. Indeed, observation of this inherent, cumulative error in digital systems was the main catalyst for the development of chaos theory.
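
A tiny illustration of how quickly such error can accumulate (the logistic map here is a textbook chaotic system chosen for illustration, not one of the simulations cited above): two runs whose initial values differ by one part in ten billion diverge completely within a few dozen steps.

    # Cumulative-error sketch: the logistic map, a classic chaotic system.
    # Two runs with initial conditions differing by 1e-10 soon bear no
    # resemblance to each other -- the GIGO effect in miniature.
    def logistic(x, r=4.0):
        return r * x * (1.0 - x)

    a, b = 0.3, 0.3 + 1e-10
    for step in range(60):
        a, b = logistic(a), logistic(b)
        if step % 10 == 9:
            print(f"step {step + 1}: a={a:.6f} b={b:.6f} |a-b|={abs(a - b):.2e}")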

Types

Computer models can be classified according to several independent pairs of attributes, including:

  • Stochastic or deterministic – see below for examples of stochastic vs. deterministic simulations;
  • Steady-state or dynamic
  • Continuous or discrete (and as an important special case of discrete, discrete event or DE models);
  • Dynamic system simulation, e.g. electric systems, hydraulic systems or multi-body mechanical systems (described primarily by DAEs), or dynamic simulation of field problems, e.g. CFD or FEM simulations (described by PDEs);
  • Local or distributed.

Another way of categorizing the models is to look at the underlying data structures. For time-stepped simulations, there are two main classes:

  • Simulations which store their data in regular grids and require only next-neighbor access are called stencil codes. Many CFD applications belong to this category (a minimal stencil sketch follows this list).
  • If the underlying graph is not a regular grid, the model may belong to the meshfree method class.
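
As an illustration of the grid-based class, here is a minimal stencil code (a sketch assuming a one-dimensional heat-diffusion problem; all numbers are illustrative): each interior cell is updated from its two immediate neighbors, a 3-point stencil.

    # Minimal stencil-code sketch: explicit 1D heat diffusion on a regular grid.
    alpha, dx, dt = 0.01, 0.1, 0.1           # diffusivity, grid spacing, time step
    u = [0.0] * 50
    u[25] = 100.0                            # initial hot spot in the middle

    for _ in range(500):                     # time-stepped simulation loop
        new_u = u[:]
        for i in range(1, len(u) - 1):       # 3-point stencil update
            new_u[i] = u[i] + alpha * dt / dx**2 * (u[i-1] - 2*u[i] + u[i+1])
        u = new_u

    print(max(u))                            # peak temperature after diffusion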

Steady-state models use equations defining the relationships between elements of the modeled system and attempt to find a state in which the system is in equilibrium. Such models are often used in simulating physical systems, as a simpler modeling case before dynamic simulation is attempted.

  • Dynamic simulations model changes in a system in response to (usually changing) input signals.
  • Stochastic models use random number generators to model chance or random events;
  • A discrete event simulation (DES) manages events in time. Most computer, logic-test and fault-tree simulations are of this type. In this type of simulation, the simulator maintains a queue of events sorted by the simulated time they should occur. The simulator reads the queue and triggers new events as each event is processed (a minimal event-queue sketch follows this list). It is not important to execute the simulation in real time. It is often more important to be able to access the data produced by the simulation and to discover logic defects in the design or the sequence of events.
  • A continuous dynamic simulation performs numerical solution of differential-algebraic equations or differential equations (either partial or ordinary). Periodically, the simulation program solves all the equations and uses the numbers to change the state and output of the simulation. Applications include flight simulators, construction and management simulation games, chemical process modeling, and simulations of electrical circuits. Originally, these kinds of simulations were actually implemented on analog computers, where the differential equations could be represented directly by various electrical components such as op-amps. By the late 1980s, however, most “analog” simulations were run on conventional digital computers that emulate the behavior of an analog computer.
  • A special type of discrete simulation that does not rely on a model with an underlying equation, but can nevertheless be represented formally, is agent-based simulation. In agent-based simulation, the individual entities (such as molecules, cells, trees or consumers) in the model are represented directly (rather than by their density or concentration) and possess an internal state and set of behaviors or rules that determine how the agent’s state is updated from one time-step to the next.
  • Distributed models run on a network of interconnected computers, possibly through the Internet. Simulations dispersed across multiple host computers like this are often referred to as “distributed simulations”. There are several standards for distributed simulation, including Aggregate Level Simulation Protocol (ALSP), Distributed Interactive Simulation (DIS), the High Level Architecture (HLA) and the Test and Training Enabling Architecture (TENA).
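
Here is the event-queue idea from the discrete event entry above as a minimal sketch (the single-server queue being modeled, and its arrival and service rates, are purely illustrative):

    # Minimal discrete-event simulation: a priority queue of (time, event)
    # pairs; the simulator repeatedly pops the earliest event and may
    # schedule new ones.
    import heapq, random

    events = [(random.expovariate(1.0), "arrival")]   # first customer arrival
    clock, waiting, busy_until = 0.0, 0, 0.0

    while events and clock < 100.0:
        clock, kind = heapq.heappop(events)           # earliest pending event
        if kind == "arrival":
            waiting += 1
            heapq.heappush(events, (clock + random.expovariate(1.0), "arrival"))
        if waiting and clock >= busy_until:           # server free: start service
            waiting -= 1
            busy_until = clock + random.expovariate(2.0)
            heapq.heappush(events, (busy_until, "departure"))

    print(f"simulated time: {clock:.1f}, customers still waiting: {waiting}")

Note that simulated time jumps directly from one event to the next; nothing forces the loop to run in real time.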

Visualization

Formerly, the output data from a computer simulation was sometimes presented in a table or a matrix showing how data were affected by numerous changes in the simulation parameters. The use of the matrix format was related to the traditional use of the matrix concept in mathematical models. However, psychologists and others noted that humans could quickly perceive trends by looking at graphs or even moving images or motion pictures generated from the data, as displayed by computer-generated imagery (CGI) animation. Although observers could not necessarily read out numbers or quote math formulas, from observing a moving weather chart they might be able to predict events much faster than by scanning tables of rain-cloud coordinates. Such intense graphical displays, which transcended the world of numbers and formulae, sometimes also led to output that lacked a coordinate grid or omitted timestamps, as if straying too far from numeric data displays. Today, weather forecasting models tend to balance the view of moving rain or snow clouds against a map that uses numeric coordinates and numeric timestamps of events.

Similarly, CGI computer simulations of CAT scans can simulate how a tumor might shrink or change during an extended period of medical treatment, presenting the passage of time as a spinning view of the visible human head, as the tumor changes.

Other applications of CGI computer simulations are being developed to graphically display large amounts of data, in motion, as changes occur during a simulation run.

Computer simulation in science

Computer simulation of the osmosis process

Generic examples of types of computer simulations in science, which are derived from an underlying mathematical description:

  • A numerical simulation of differential equations that cannot be solved analytically. Theories that involve continuous systems, such as phenomena in physical cosmology, fluid dynamics (e.g., climate models, roadway noise models, roadway air dispersion models), continuum mechanics and chemical kinetics, fall into this category.
  • A stochastic simulation, typically used for discrete systems where events occur probabilistically and which cannot be described directly with differential equations (this is a discrete simulation in the above sense). Phenomena in this category include genetic drift and biochemical or gene regulatory networks with small numbers of molecules (see also: Monte Carlo method, and the drift sketch after this list).
  • A multiparticle simulation of the response of nanomaterials at multiple scales to an applied force for the purpose of modeling their thermoelastic and thermodynamic properties. Techniques used for such simulations are molecular dynamics, molecular mechanics, the Monte Carlo method, and multiscale Green’s functions.
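
As a minimal example of a stochastic simulation in the above sense, here is a sketch of genetic drift under the Wright–Fisher model (the population size and starting frequency are arbitrary illustrative choices):

    # Stochastic-simulation sketch: genetic drift of one allele in a fixed
    # population (Wright-Fisher model). Each generation is a random binomial
    # draw, so repeated runs give different trajectories -- no differential
    # equation describes a single run.
    import random

    N, p = 100, 0.5               # population size, initial allele frequency
    generation = 0
    while 0.0 < p < 1.0:          # run until the allele is fixed or lost
        copies = sum(random.random() < p for _ in range(N))  # binomial sampling
        p = copies / N
        generation += 1

    print(f"allele {'fixed' if p == 1.0 else 'lost'} after {generation} generations")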

Specific examples of computer simulations follow:

  • statistical simulation based upon an agglomeration of a large number of input profiles, such as the forecasting of equilibrium temperature of receiving waters, allowing the gamut of meteorological data to be input for a specific locale. This technique was developed for thermal pollution forecasting.
  • agent-based simulation has been used effectively in ecology, where it is often called “individual-based modeling” and is used in situations for which individual variability in the agents cannot be neglected, such as population dynamics of salmon and trout (most purely mathematical models assume all trout behave identically).
  • time-stepped dynamic model. In hydrology there are several such hydrologic transport models, such as the SWMM and DSSAM models developed by the US Environmental Protection Agency for river water quality forecasting.
  • Computer simulations have also been used to formally model theories of human cognition and performance, e.g., ACT-R.
  • computer simulation using molecular modeling for drug discovery . [8]
  • computer simulation for studying the selective sensitivity of molecules by mechanochemistry during grinding of organic molecules. [9]
  • Computational fluid dynamics simulations are used to simulate the behavior of flowing air, water and other fluids. One-, two- and three-dimensional models are used. A one-dimensional model might simulate the effects of water hammer in a pipe. A two-dimensional model could be used to simulate the drag forces on the cross-section of an airplane wing. A three-dimensional simulation might estimate the heating and cooling requirements of a large building.
  • An understanding of statistical molecular thermodynamics is fundamental to the appreciation of molecular solutions. Development of the Potential Distribution Theorem (PDT) allows this complex subject to be simplified to down-to-earth presentations of molecular theory.

Notable, and sometimes controversial, computer simulations used in science include: Donella Meadows’ World3 used in the Limits to Growth, James Lovelock’s Daisyworld and Thomas Ray’s Tierra.

In social sciences, computer simulation is an integral component of the five angles of analysis fostered by the data percolation methodology, [10] which also includes qualitative and quantitative methods, reviews of the literature, and interviews with experts, and which forms an extension of data triangulation.

Simulation environments for physics and engineering

Graphical environments to design simulations have been developed. Special care was taken to handle events (situations in which the simulation equations are not valid and have to be changed). The open project Open Source Physics was started to develop reusable libraries for simulations in Java, together with Easy Java Simulations, a complete graphical environment that generates code based on these libraries.

Computer simulation in practical contexts

Computer simulations are used in a wide variety of practical contexts, such as:

  • analysis of air pollution dispersion using atmospheric dispersion modeling
  • design of complex systems such as aircraft and also logistics systems.
  • design of noise barriers to effect roadway noise mitigation
  • modeling of application performance [11]
  • flight simulators to train pilots
  • weather forecasting
  • forecasting of risk
  • simulation of electrical circuits
  • simulation of other computers, which is emulation
  • forecasting of financial markets (for example Adaptive Modeler )
  • behavior of structures under stress and other conditions
  • design of industrial processes, such as chemical processing plants
  • strategic management and organizational studies
  • reservoir simulation in petroleum engineering to model the subsurface reservoir
  • process engineering simulation tools.
  • robot simulators for the design of robots and robot control algorithms
  • urban simulation models that simulate dynamic patterns of urban development and responses to urban land use and transportation policies. See a more detailed article on Urban Environment Simulation .
  • traffic engineering to plan or redesign parts of the street network, from single junctions up to a national highway network, for transportation planning, design and operations. See the more detailed article on Simulation in Transportation.
  • modeling car crashes to test safety mechanisms in new vehicle models.
  • crop-soil systems in agriculture, via dedicated software frameworks (e.g. BioMA, OMS3, APSIM)

The reliability and the trust people put in computer simulations depend on the validity of the simulation model; therefore, verification and validation are of crucial importance in the development of computer simulations. Another important aspect of computer simulations is reproducibility of the results, meaning that a simulation model should not provide a different answer for each execution. Although this might seem obvious, it is a special point of attention in stochastic simulations, where random numbers should actually be semi-random numbers. An exception to reproducibility are human-in-the-loop simulations such as flight simulations and computer games. Here a human is part of the simulation and thus influences the outcome in a way that is hard, if not impossible, to reproduce exactly.
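
In practice, reproducibility of a stochastic simulation is usually obtained by seeding the pseudo-random (“semi-random”) number generator, as in this small sketch (the measurement function is hypothetical):

    # Reproducibility sketch: seeding the pseudo-random number generator
    # makes a stochastic simulation return the same answer on every run.
    import random

    def noisy_measurement(seed):
        rng = random.Random(seed)           # dedicated, seeded generator
        return sum(rng.gauss(0.0, 1.0) for _ in range(1000))

    print(noisy_measurement(42) == noisy_measurement(42))  # True: identical runs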

Vehicle manufacturers make use of computer simulation to test safety features in new designs. By building a simulation environment, they can save the money that would otherwise be required to build and test a unique prototype. Engineers can step through the simulation milliseconds at a time to determine the exact stresses being put upon each section of the prototype. [12]

Computer graphics can be used to display the results of a computer simulation. Animations can be used to experience a simulation in real time, e.g., in training simulations. In some cases animations may also be useful in faster-than-real-time or even slower-than-real-time modes. For example, faster-than-real-time animations can be useful in visualizing the buildup of queues in the simulation of humans evacuating a building. Furthermore, simulation results are often aggregated into static images using various ways of scientific visualization.

In debugging, simulating a program execution under test (rather than executing natively) can detect far more errors than the hardware itself can detect and, at the same time, log useful debugging information. This technique can also detect buffer overflow and similar “hard to detect” errors as well as produce performance information and tuning data.

Pitfalls

Although sometimes ignored in computer simulations, it is very important to perform a sensitivity analysis to ensure that the accuracy of the results is properly understood. For example, the probabilistic risk analysis of factors determining the success of an oilfield exploration program involves combining samples from a variety of statistical distributions using the Monte Carlo method. If, for instance, one of the key parameters (e.g., the net ratio of oil-bearing strata) is known to only one significant figure, then the result of the simulation might not be more precise than one significant figure, although it might (misleadingly) be presented as having four significant figures.
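
A minimal sketch of this point (every figure below is hypothetical): propagating the uncertainty of one key input through a Monte Carlo run shows how many digits of the output are actually meaningful.

    # Sensitivity-analysis sketch (all numbers hypothetical): if a key input
    # such as a net-to-gross ratio is only known to within +/-20%, the spread
    # of the output shows how many digits of the result are meaningful.
    import random, statistics

    def recoverable_volume(net_to_gross):
        return 1_000_000 * net_to_gross        # toy model of the output quantity

    samples = [recoverable_volume(random.uniform(0.4, 0.6)) for _ in range(10_000)]
    mean, sd = statistics.mean(samples), statistics.stdev(samples)
    print(f"estimate: {mean:,.0f} +/- {sd:,.0f}")   # quoting four significant
                                                    # figures here would mislead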

Model calibration techniques

The following steps should be used to produce accurate simulation models: calibration, verification, and validation. Computer simulations are good at portraying and comparing theoretical scenarios, but in order to accurately model actual case studies a base model should be created and calibrated so that it matches the area being studied. The calibrated model should then be verified to ensure that the model is operating as expected based on the inputs. Once the model has been verified, the final step is to validate the model by comparing the outputs to historical data from the study area. This can be done by using statistical techniques and ensuring an adequate R-squared value. Unless these techniques are employed, the simulation model created will produce inaccurate results and will not be a useful prediction tool.

Model calibration is achieved by adjusting the available parameters that control how the model operates. For example, in traffic simulation, typical parameters include look-ahead distance, car-following sensitivity, discharge headway, and start-up lost time. These parameters influence driver behavior such as how long it takes a driver to change lanes, how much distance a driver leaves between his car and the car in front of it, and how quickly a driver starts to accelerate through an intersection. Adjusting these parameters has a direct effect on the amount of traffic volume that can traverse the modeled roadway network. These are examples of calibration parameters that can be fine-tuned to match field conditions at the study location. Most traffic models have typical default values, but they may need to be adjusted to better match the driver behavior at the specific location being studied.

Model verification is achieved by obtaining output data from the model and comparing it to what is expected from the input data. For example, in traffic simulation, traffic volume can be verified to ensure that actual volume throughput in the model is reasonably close to the traffic volume input into the model. Ten percent is a typical threshold used in traffic simulation to determine if output volumes are reasonably close to input volumes. Simulation models handle model inputs in different ways, so traffic that enters the network, for example, may or may not reach its desired destination. Additionally, traffic that wants to enter the network may not be able to, if congestion exists. This is why model verification is a very important part of the modeling process.
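
A sketch of the verification check just described (the ten-percent threshold is the typical value mentioned above; the volume figures are illustrative):

    # Verification sketch: confirm that the model's output volume is
    # reasonably close to the input volume (10% is a typical threshold
    # in traffic simulation).
    def verify_volume(input_volume, output_volume, tolerance=0.10):
        deviation = abs(output_volume - input_volume) / input_volume
        return deviation <= tolerance

    print(verify_volume(1200, 1135))   # True: within 10% of the input volume
    print(verify_volume(1200, 950))    # False: flags the model for review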

The final step is to validate the model by comparing the outputs to historical data from the study area. Ideally, the model should produce results similar to what has happened historically. This is usually verified by nothing more than quoting the R-squared statistic from the fit. This statistic measures the fraction of variability that is accounted for by the model. A high R-squared value does not necessarily mean the model fits the data well, so another tool used to validate models is graphical residual analysis. If model output values are drastically different from historical values, it probably indicates an error in the model. Before using the model as a base for producing additional models, it is important to verify it for different scenarios to ensure that each one is accurate. If the outputs do not reasonably match historic values during the validation process, the model should be reviewed and updated. It is an iterative process that helps to produce more realistic models.
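
A minimal validation sketch computing the R-squared statistic between model outputs and historical observations (the data values are hypothetical):

    # Validation sketch: R^2 = 1 - SS_res / SS_tot measures the fraction of
    # variability in the observations accounted for by the model. Residuals
    # should also be inspected graphically, since a high R^2 alone does not
    # guarantee a good fit.
    def r_squared(observed, predicted):
        mean_obs = sum(observed) / len(observed)
        ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
        ss_tot = sum((o - mean_obs) ** 2 for o in observed)
        return 1.0 - ss_res / ss_tot

    historical = [410, 520, 610, 700, 860]   # e.g. observed traffic counts
    model_out  = [400, 540, 600, 720, 850]   # corresponding model outputs
    print(f"R^2 = {r_squared(historical, model_out):.3f}")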

Validating traffic simulation models requires comparing traffic estimated by the model to observed traffic on the road and transit systems. Initial comparisons are for trip interchanges between quadrants, sectors, or other broad areas of interest. The next step is to compare traffic estimated by the models to traffic counts, including transit ridership, crossing contrived barriers in the study area. These are typically called screenlines, cutlines, and cordon lines and may be imaginary or actual physical barriers. Cordon lines surround particular areas such as a city’s central business district or other major activity centers. Transit ridership estimates are commonly validated by comparing them to actual patronage crossing cordon lines around the central business district.

Three sources of error can cause weak correlation during calibration: input error, model error, and parameter error. In general, input error and parameter error can be easily adjusted by the user. Model error, however, is caused by the methodology used in the model and may not be as easy to fix. Simulation models are typically built using several different modeling theories that can produce conflicting results. Some models are more generalized while others are more detailed. If model error occurs as a result, it may be necessary to adjust the model methodology to make results more consistent.

In order to produce good results, the steps of calibration, verification, and validation are all necessary. Simulation models can be used as a tool to verify engineering theories, but they are only valid if calibrated properly. Once satisfactory estimates of the parameters for all models have been obtained, the models must be checked to ensure that they adequately perform their intended functions. The validation process establishes the credibility of the model by demonstrating its ability to replicate actual traffic patterns. The importance of model validation underscores the need for careful planning, thoroughness and accuracy of the data collection program that has this purpose. Efforts should be made to ensure that collected data is consistent with expected values. The resulting models and forecasts will be no better than the data used for model estimation and validation.

References

  1. Strogatz, Steven (2007). “The End of Insight”. In Brockman, John. What Is Your Dangerous Idea? HarperCollins. ISBN 9780061214950.
  2. “Researchers stage largest military simulation ever”, Jet Propulsion Laboratory, Caltech, December 1997.
  3. “Molecular Simulation of Macroscopic Phenomena”.
  4. “Largest computational biology simulation mimics life’s most essential nanomachine” (news release), Nancy Ambrosiano, Los Alamos National Laboratory, Los Alamos, NM, October 2005.
  5. “Mission to build a simulated brain begins”, project of the institute at the Swiss Federal Institute of Technology in Lausanne (EPFL), Switzerland, New Scientist, June 2005.
  6. Santner, Thomas J; Williams, Brian J; Notz, William I (2003). The Design and Analysis of Computer Experiments. Springer Verlag. Retrieved 12 June 2013.
  7. Taylor, John Robert (1999). An Introduction to Error Analysis: The Study of Uncertainties in Physical Measurements. University Science Books. pp. 128–129. ISBN 0-935702-75-X.
  8. Atanasov AG, Waltenberger B, Pferschy-Wenzig EM, Linder T, Wawrosch C, Uhrin P, Temml V, Wang L, Schwaiger S, Heiss EH, Rollinger JM, Schuster D, Breuss JM, Bochkov V, Mihovilovic MD, Kopp B, Bauer R, Dirsch VM, Stuppner H. “Discovery and resupply of pharmacologically active plant-derived natural products: A review.” Biotechnol Adv. 2015. doi:10.1016/j.biotechadv.2015.08.001. PMID 26281720.
  9. Mizukami, Koichi; Saito, Fumio; Baron, Michel. “Study on grinding of pharmaceutical products with an aid of computer simulation”. Archived 2011-07-21 at the Wayback Machine.
  10. Mesly, Olivier (2015). Creating Models in Psychological Research. United States: Springer Psychology. 126 pages. ISBN 978-3-319-15752-8.
  11. Wescott, Bob (2013). The Every Computer Performance Book, Chapter 7: Modeling Computer Performance. CreateSpace. ISBN 1482657759.
  12. Baase, Sara. A Gift of Fire: Social, Legal, and Ethical Issues for Computing and the Internet. 3rd ed. Upper Saddle River: Prentice Hall, 2007. pp. 363–364. ISBN 0-13-600848-8.