# Computer simulation

A **computer simulation** or a **computer model** is a computer program that attempts to simulate an abstract model of a particular system. Computer simulations have become a useful part of the mathematical modelling of many natural systems in physics (computational physics), chemistry and biology; of human systems in economics, psychology and social science; and of the process of engineering new technology, to gain insight into the operation of those systems. Traditionally, the formal modeling of systems has been via a mathematical model, which attempts to find analytical solutions to problems that enable the prediction of the behaviour of the system from a set of parameters and initial conditions. Computer simulations build on, and are a useful adjunct to, purely mathematical models in science, technology and entertainment.^{[1]}^{[2]}


## History

Computer simulation developed hand-in-hand with the rapid growth of the computer, following its first large-scale deployment during the Manhattan Project in World War II to model the process of nuclear detonation; that was a simulation of 12 hard spheres using a Monte Carlo algorithm. Computer simulation is often used as an adjunct to, or substitute for, modeling systems for which simple closed-form analytic solutions are not possible. There are many different types of computer simulation; the common feature they all share is the attempt to generate a sample of representative scenarios for a model in which a complete enumeration of all possible states of the model would be prohibitive or impossible. Computer models were initially used as a supplement for other arguments, but their use later became rather widespread.

## Types of computer simulation

Computer models can be classified according to several criteria including:

- Stochastic or deterministic (and as a special case of deterministic, chaotic)
- Steady-state or dynamic
- Continuous or discrete (and as an important special case of discrete, discrete event or DE models)
- Local or distributed.

For example:

- Steady-state models use equations defining the relationships between elements of the modelled system and attempt to find a state in which the system is in equilibrium. Such models are often used in simulating physical systems, as a simpler modelling case before dynamic simulation is attempted.
- Dynamic simulations model changes in a system in response to (usually changing) input signals.
- *Stochastic* models use *random number generators* to model chance or random events; they are also called Monte Carlo simulations.
- A *discrete event simulation* (DE) manages events in time. Most computer, logic-test and fault-tree simulations are of this type. In this type of simulation, the simulator maintains a queue of events sorted by the simulated time they should occur. The simulator reads the queue and triggers new events as each event is processed. It is not important to execute the simulation in real time; it is often more important to be able to access the data produced by the simulation, to discover logic defects in the design or the sequence of events.
- A *continuous dynamic simulation* performs numerical solution of differential-algebraic equations or differential equations (either partial or ordinary). Periodically, the simulation program solves all the equations and uses the numbers to change the state and output of the simulation. Applications include flight simulators, racing-car games, chemical process modeling, and simulations of electrical circuits. Originally, these kinds of simulations were actually implemented on analog computers, where the differential equations could be represented directly by various electrical components such as op-amps. By the late 1980s, however, most "analog" simulations were run on conventional digital computers that emulate the behavior of an analog computer. Ordinary differential equations can be solved numerically by analog computers, but partial differential equations cannot, which was very important for von Neumann when building the first computer of the so-called von Neumann architecture.^{[3]}
- A special type of discrete simulation which does not rely on a model with an underlying equation, but can nonetheless be represented formally, is *agent-based simulation*. In agent-based simulation, the individual entities (such as molecules, cells, trees or consumers) in the model are represented directly (rather than by their density or concentration) and possess an internal *state* and set of behaviors or *rules* which determine how the agent's state is updated from one time-step to the next.
- Distributed models run on a network of interconnected computers, possibly through the Internet. Simulations dispersed across multiple host computers like this are often referred to as "distributed simulations". There are several military standards for distributed simulation, including Aggregate Level Simulation Protocol (ALSP), Distributed Interactive Simulation (DIS) and the High Level Architecture (HLA).
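
The event-queue mechanism of a discrete event simulation can be sketched in a few lines of Python; the event names, handler rules and times below are purely illustrative, not a standard:

```python
import heapq

def run_discrete_event_simulation(initial_events, handlers, until=100.0):
    """Process events in simulated-time order from a priority queue.

    initial_events: list of (time, name) tuples seeding the queue.
    handlers: maps an event name to a function that, given the current
              simulated time, returns follow-up (time, name) events.
    """
    queue = list(initial_events)
    heapq.heapify(queue)                    # keep events sorted by simulated time
    log = []
    while queue:
        time, name = heapq.heappop(queue)   # next event in simulated time
        if time > until:
            break
        log.append((time, name))
        for follow_up in handlers.get(name, lambda t: [])(time):
            heapq.heappush(queue, follow_up)  # processing an event may trigger new ones
    return log

# A toy model: each "arrival" schedules a "departure" 2.0 time units later.
handlers = {"arrival": lambda t: [(t + 2.0, "departure")]}
log = run_discrete_event_simulation([(0.0, "arrival"), (1.0, "arrival")], handlers)
```

Note that the loop advances simulated time by jumping from event to event; nothing ties it to wall-clock time, which is why such simulations need not run in real time.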

## Computer simulation in science

Generic examples of types of computer simulations in science, which are derived from an underlying mathematical description:

- A numerical simulation of differential equations that cannot be solved analytically. Theories involving continuous systems, such as phenomena in physical cosmology and fluid dynamics (e.g. climate models, roadway noise models, roadway air dispersion models), fall into this category.
- A stochastic simulation, typically used for discrete systems where events occur probabilistically and which cannot be described directly with differential equations (this is a *discrete* simulation in the above sense). Phenomena in this category include genetic drift, and biochemical or gene regulatory networks with small numbers of molecules (see also: Monte Carlo method).
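
As an illustration of such a stochastic simulation, genetic drift in a small population can be sketched with a Wright-Fisher-style step, in which each generation's allele copies are resampled at random from the previous generation. The population size, seed and generation count here are arbitrary choices for the example:

```python
import random

def wright_fisher(n_individuals, allele_count, generations, seed=1):
    """Simulate genetic drift: each generation, every one of the 2N allele
    copies is drawn independently from the previous generation's frequency."""
    rng = random.Random(seed)
    copies = 2 * n_individuals            # diploid population: 2N allele copies
    history = [allele_count]
    for _ in range(generations):
        p = history[-1] / copies          # current allele frequency
        # binomial resampling: each copy inherits the allele with probability p
        history.append(sum(rng.random() < p for _ in range(copies)))
    return history

# Start at frequency 0.5 in a population of 10; drift alone moves the count around.
history = wright_fisher(n_individuals=10, allele_count=10, generations=50)
```

No differential equation appears anywhere: the trajectory is a random walk determined by sampling, which is exactly why such systems call for stochastic rather than continuous simulation.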

Specific examples of computer simulations follow:

- Statistical simulations based upon an agglomeration of a large number of input profiles, such as the forecasting of equilibrium temperature of receiving waters, allowing the gamut of meteorological data to be input for a specific locale. This technique was developed for thermal pollution forecasting.

- Agent-based simulation has been used effectively in ecology, where it is often called *individual based modeling* and has been used in situations for which individual variability in the agents cannot be neglected, such as population dynamics of salmon and trout (most purely mathematical models assume all trout behave identically).

- Time-stepped dynamic models. In hydrology there are several such hydrology transport models, such as the SWMM and DSSAM models developed by the U.S. Environmental Protection Agency for river water quality forecasting.

- Computer simulations have also been used to formally model theories of human cognition and performance, e.g. ACT-R.

- A computer simulation using molecular modelling for drug discovery (Quantum 3.1) was developed to achieve a level of accuracy of simulation equal to chemical or in-vitro tests.

- Computational fluid dynamics simulations are used to simulate the behaviour of flowing air, water and other fluids. One-, two- and three-dimensional models are used. A one-dimensional model might simulate the effects of water hammer in a pipe. A two-dimensional model might be used to simulate the drag forces on the cross-section of an aeroplane wing. A three-dimensional simulation might estimate the heating and cooling requirements of a large building.
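
A full CFD solver is far beyond a short example, but the basic pattern of a one-dimensional field simulation can be sketched with an explicit finite-difference step for the 1-D diffusion equation; the grid size, diffusivity and boundary conditions here are arbitrary illustrative choices:

```python
def diffuse_1d(u, alpha=0.1, steps=100):
    """Advance u_t = alpha * u_xx with an explicit finite-difference scheme
    on a unit grid; the end values are held fixed (Dirichlet boundaries).
    The explicit scheme is stable for alpha <= 0.5."""
    u = list(u)
    for _ in range(steps):
        # interior update: u_i += alpha * (u_{i-1} - 2*u_i + u_{i+1})
        u = [u[0]] + [
            u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
            for i in range(1, len(u) - 1)
        ] + [u[-1]]
    return u

# A hot spot in the middle of a cold bar gradually spreads out and decays.
profile = diffuse_1d([0.0] * 10 + [100.0] + [0.0] * 10, steps=200)
```

Each pass over the grid is one simulated time step; two- and three-dimensional field simulations follow the same update-the-whole-grid pattern, only with more neighbours per cell.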

Notable, and sometimes controversial, computer simulations used in science include Donella Meadows' World3 used in *The Limits to Growth*, James Lovelock's Daisyworld and Thomas Ray's Tierra.

## Pitfalls in computer simulation

Although generally ignored in computer simulations, in strict logic the rules of significance arithmetic still apply. For example, the probabilistic risk analysis of factors determining the success of an oilfield exploration program involves combining samples from a variety of statistical distributions using the Monte Carlo method. These include the normal, lognormal, uniform and triangular distributions. However, a sample from a distribution cannot sustain more significant figures than were present in the data or estimates that established those distributions. Thus, abiding by the rules of significance arithmetic, no result of a simulation can sustain more significant figures than were present in the input parameter with the least number of significant figures. If, for instance, the net/gross ratio of oil-bearing strata is known to only one significant figure, then the result of the simulation cannot be more precise than one significant figure, although it may be presented as having three or four significant figures.
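
The point can be made concrete with a small Monte Carlo sketch; the distributions and values below are invented for illustration. The computed mean carries around sixteen digits of floating-point precision, but if the net/gross ratio was known to only one significant figure, only one of those digits is justified:

```python
import random

rng = random.Random(42)

# Hypothetical inputs: the net/gross ratio is known to ONE significant figure.
net_gross = 0.3
samples = [
    rng.uniform(50.0, 150.0) * net_gross   # e.g. gross volume times net/gross ratio
    for _ in range(100_000)
]
mean = sum(samples) / len(samples)

# The float carries ~16 digits, but only the first is meaningful here;
# "%.1g" rounds the report to the precision the inputs can support.
print(f"raw mean = {mean:.6f}, reportable = {mean:.1g}")
```

Rounding only the *reported* value, while keeping full precision internally, is the usual compromise: intermediate arithmetic should not be truncated, but the final figure should not claim precision the inputs never had.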

Another consideration, which can be a weakness for production use of a simulation although quite useful in testing, is that pseudorandom number generators, such as the UNIX/Linux `random()` function, produce the same sequence of pseudorandom numbers when given the same initialization value, called a seed. If Version 1 of the simulation produced verifiable results for some function `foo` that should not be changed by Version 2, then during development of the new version initial tests should use the same seed. If the expected result changes, the programmer has made an error that affects `foo`, and that error may have other undesired effects. Further changes should not be made until the reason `foo` changed has been found and corrected.
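
A minimal sketch of that regression-testing discipline in Python, where `foo` is a stand-in name for whatever the simulation computes:

```python
import random

def foo(seed):
    """Stand-in for a simulation step whose output must stay stable
    across versions when run with the same seed."""
    rng = random.Random(seed)             # same seed -> same number stream
    return sum(rng.random() for _ in range(1000))

# Record the result once with a fixed seed...
baseline = foo(seed=12345)

# ...then, after any change to the code, rerun with the same seed:
assert foo(seed=12345) == baseline        # a difference signals an unintended change
```

The key design choice is using a local `random.Random(seed)` instance rather than the module-level generator, so that other code cannot perturb the stream and break the comparison.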

## Computer simulation in practical contexts

Computer simulations are used in a wide variety of practical contexts, such as:

- Analysis of air pollutant dispersion using atmospheric dispersion modeling
- Process design of chemical plants, petroleum refineries and natural gas processing plants
- Design of complex systems such as aircraft and also logistics systems
- Design of noise barriers to effect roadway noise mitigation
- Flight simulators to train pilots
- Weather forecasting
- Strategic management and organizational studies
- Instruction set simulators for debugging and automatic error detection

The reliability and the trust people put in computer simulations depend on the validity of the simulation model; therefore, verification and validation are of crucial importance in the development of computer simulations. Another important aspect of computer simulations is reproducibility of the results, meaning that a simulation model should not provide a different answer for each execution. Although this might seem obvious, it is a special point of attention in stochastic simulations, where the random numbers should actually be pseudo-random numbers generated reproducibly. An exception to reproducibility is human-in-the-loop simulation, such as flight simulations and computer games. Here a human is part of the simulation and thus influences the outcome in a way that is hard, if not impossible, to reproduce exactly.

Computer graphics can be used to display the results of a computer simulation. Animations can be used to experience a simulation in real time, e.g. in training simulations. In some cases animations may also be useful in faster-than-real-time or even slower-than-real-time modes. For example, faster-than-real-time animations can be useful in visualizing the buildup of queues in the simulation of humans evacuating a building. Furthermore, simulation results are often aggregated into static images using various ways of scientific visualization.

In debugging, simulating a program execution under test (rather than executing natively) can detect far more errors than the hardware itself can detect and, at the same time, log useful debugging information such as instruction trace, memory alterations and instruction counts. This technique can also detect buffer overflow and similar "hard to detect" errors as well as produce performance information and tuning data.

## References

- ↑ Richard McHaney (1991). *Computer Simulation: A Practical Perspective*, 1st Edition. Academic Press. ISBN 0-12-484140-6.
- ↑ Naim A. Kheir (Editor) (1996). *Systems Modeling and Computer Simulation*, 2nd Edition. CRC Press. ISBN 0-8247-9421-4.
- ↑ "1.2 An automatic computing system is a (usually highly composite) device, which can carry out instructions to perform calculations of a considerable order of complexity—e.g. to solve a non-linear partial differential equation in 2 or 3 independent variables numerically." Quoted from: "First Draft of a Report on the EDVAC" by John von Neumann, IEEE Annals of the History of Computing, Vol. 15, No. 4, pp. 27-75, 1993.

Some content on this page may previously have appeared on Citizendium.