BU CAS CS 680 and CS 591: Readings in Computer Graphics

Class commentary on articles: Natural Phenomena



Jeremy Biddle

Particle Systems --- A Technique for Modeling a Class of Fuzzy Objects
William T. Reeves                                      Lucasfilm, Ltd.

This paper discusses the use of particle systems to model difficult-to-render
objects such as fire, clouds, and water.  The bulk of the paper
is devoted to the discussion of fire and explosions, particularly with
respect to the wall of fire in Star Trek II:  The Wrath of Khan.  The problem
with modeling such "fuzzy" objects as fire, clouds, and water is that
these objects do not lend themselves well to typical geometric
representations, for instance texture-mapped polygons.

A particle-based system was devised, whereby the particles have
attributes such as position, velocity, size, color, etc.  By applying
stochastic methods when computing movement and change to the particles,
the system appears varied and chaotic, simulating the natural movement
of such systems in nature.  Particle systems are composed of either
particles or more particle systems, as in the concentric rings of fire
used in the modeling of the Genesis effect.  The particles can then be
rendered as anti-aliased, transparent, motion-blurred circles in order
to create a realistic image.  Obviously these effects were not meant
for real-time rendering.

The main problem I found with this paper is that it does not address
the issue of interaction between particles.  This issue would most
likely become much more relevant in a discussion of modeling clouds or
water, but these are not the focal points of the paper.  Interesting
aspects of the paper are that the authors provided a working model (and
one that many people have seen!), that it was quite advanced work for
thirteen years ago, and that particle-based systems seem useful in many
different areas rather than being limited to a single application.

----------------------------------------------------------------------

Particle Animation and Rendering Using Data Parallel Computation.
Karl Sims

This is another paper dealing with particle animation, this time a bit
more advanced than the previous one (although, coming seven years later,
it should be).  Karl Sims uses parallel computing techniques to quickly
and efficiently render complex organizations of particles.  The paper
is broken into three sections: a description of the particle animation
language, the system for rendering, and specific applications.

The particles are controlled with several different mechanisms.
Position (explicitly setting a particle), velocity, and acceleration
control particle movement in obvious ways.  A vortex operation is
introduced that rotates particles around to simulate a vortex,
depending on the parameters used.  Some interesting acceleration
operations include damp, undamp, spiral, and bouncing off planes and
spheres.  Rendering is straightforward as far as the graphics techniques
used.  To distribute the particles across the processors for computation
without being limited by the number of physical processors, a virtual
processor is created to serve each particle and is mapped onto a
physical processor as one becomes free.  To render the particles, the
processor splits each particle into spans (scan lines) and breaks those
up into fragments (pixels), which are then rendered using anti-aliasing
and motion blurring.  Depth sorting is also used, which makes it
possible to seamlessly integrate the graphics generated by the particle
system with other types of rendering systems.

Roberto Downs


Bob Gaimari


"Particle Systems - A Technique for Modeling a Class of Fuzzy Objects"

This paper discusses using groups of particles to represent "fuzzy"
objects, i.e., objects such as water or fire which have no fixed shape that
can easily be modeled with polygons, and which change in a random fashion
over time.  Objects are represented by a "cloud" of particles modeling their
volume.  These particles are placed and move about according to stochastic
procedures, constrained by certain parameters.  These particles can also be
"born" and "die".

There are five steps to using this technique (a rough sketch of the loop
appears after the list).  They are as follows:

  1) Generate new particles, the number of which can be determined either
randomly or according to the screen area the object covers.

  2) Assign attributes to the new particles.  These attributes determine
properties such as initial position, velocity, size, color, transparency,
shape, and lifetime.

  3) Remove particles that have reached the end of their lifetime.

  4) Move and transform the particles according to their attributes.  Not
only can position and velocity change, but so can attributes such as color
or size.

  5) Render the particles in the screen buffer, to display the image.
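
As a rough illustration of these five steps, here is a minimal sketch of the
per-frame loop in Python.  This is my own reconstruction, not code from the
paper; the attribute names, the 30 frames-per-second assumption, and the
particular stochastic choices are all made up for illustration.

    import random

    class Particle:
        def __init__(self, pos, vel, color, size, lifetime):
            self.pos, self.vel = pos, vel
            self.color, self.size = color, size
            self.lifetime = lifetime                 # frames remaining

    def update_frame(particles, mean_births, var_births,
                     gravity=(0.0, -9.8, 0.0), fps=30.0):
        # 1) Generate new particles (count chosen stochastically).
        n_new = int(mean_births + (2.0 * random.random() - 1.0) * var_births)
        for _ in range(max(n_new, 0)):
            # 2) Assign initial attributes to each new particle.
            particles.append(Particle(
                pos=[random.uniform(-1, 1), 0.0, random.uniform(-1, 1)],
                vel=[0.0, random.uniform(1, 3), 0.0],
                color=[1.0, random.random(), 0.0],
                size=random.uniform(0.5, 1.5),
                lifetime=random.randint(20, 60)))
        # 3) Remove particles that have reached the end of their lifetime.
        particles[:] = [p for p in particles if p.lifetime > 0]
        # 4) Move and transform the surviving particles.
        for p in particles:
            for i in range(3):
                p.vel[i] += gravity[i] / fps
                p.pos[i] += p.vel[i] / fps
            p.lifetime -= 1
        # 5) Rendering the particles into the screen buffer would happen here.
        return particles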

They also discuss using hierarchies (using particle systems as particles in
a larger system), to give more high-level control to the designer.

Section 3 discusses the application of this technique in a movie called
"Star Trek II: The Wrath of Khan", in transforming a dead planet into a
life-filled one, in a way which looks like a wildfire spreading.  Section 4
discusses other applications for this technique, such as fireworks,
explosions, and grass beds.



"Particle Animation and Rendering using Data Parallel Computation"

This paper addresses one of the problems I had with the previous paper.  In
the previous one, step 4 is to move and transform the particles.  This must
be done for each frame.  However, in Section 3, they mention that at one
point there are 400 particle systems with 750,000 particles.  This would
take a lot of computing power to do 30 times per second.

In the current paper, they discuss using a CM-2 parallel computer to do the
animation and rendering.  Each particle could have its own "virtual
processor" to do the appropriate calculations.

Section 3 of this paper describes a set of operations developed for moving
particles: set the position, set the velocity, apply the velocity
(including a "vortex" operation which gives rotation around a vortex axis,
like an orbit), and apply accelerations, such as constant or random
acceleration, attraction, damping, spiraling, and bouncing.

Section 4 describes rendering the particles.  Each particle has a head (the
particle) and a tail (showing where the particle is coming from, to create
a blurring effect).  Each particle (head and tail) is "diced" into
fragments; each fragment corresponds to a particular pixel that is affected.  Each
fragment gets its own "virtual processor" in the CM-2.  The images often
have to be done in patches (subsections), so that the computer doesn't run
out of memory.  Each fragment has its color, opacity, and depth calculated.
Then, the fragments representing the same pixel are combined, leaving a
single set of values to be displayed at that screen position, and these
pixels can be displayed.
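
To make the last step concrete, here is a minimal sketch (my own construction
in Python, not code from the paper) of how the fragments landing on one pixel
might be depth-sorted and combined front to back using their color and
opacity:

    def composite_pixel(fragments):
        """fragments: list of (depth, (r, g, b), opacity) covering one pixel."""
        fragments.sort(key=lambda f: f[0])    # nearest fragment first
        out_color = [0.0, 0.0, 0.0]
        remaining = 1.0                       # transmittance left for farther fragments
        for depth, color, alpha in fragments:
            for i in range(3):
                out_color[i] += remaining * alpha * color[i]
            remaining *= (1.0 - alpha)
            if remaining < 1e-3:              # effectively opaque: stop early
                break
        return out_color

The exact combination rule the paper uses may differ; the point is only that
each pixel ends up with a single set of values after its fragments are merged.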

Section 5 discusses results and application areas, such as snow and wind,
water, and fire.

Daniel Gentle


John Isidoro

	The papers we had to read this week have to do with particle systems.
It's somewhat ironic that in reading these papers, I have gained a greater
appreciation for the kinematics of hot chocolate!  Let me explain.

	Particle systems are a way to model non-rigid / non-solid phenomena
such as water, smoke, and fire.  In a way, particle systems are an
approximation to the molecular dynamics behind such systems: each particle
has certain properties which cause it to behave like an atom.  Unfortunately,
the particle systems described ignore the interactions between the particles
in the system.  This interaction is quite necessary in some cases.

	One thing that particle systems are particularly useful for modeling is
fluids.  However, most of the fluids being modeled have a low viscosity,
meaning that atoms within the fluid are not bound to each other very strongly
and have motion defined mostly by their own intrinsic kinematics.  However,
as fluids get more viscous, the interactions between the molecules become
stronger and stronger.  The question that arises is: how can this interaction
between particles be modeled in a particle system?  Some related questions
are:

	How can this be done without calculating distances between each pair of
particles in the world?  Perhaps, for an N-dimensional particle system, by
indexing all particles in an N-dimensional hash table (with chaining), indexed
by position in the N dimensions, so that any particle can find and be affected
by neighbors within a certain number of regions.  (This technique has been used
before for color quantization in three dimensions.)  It's quite an interesting
problem.
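
To make the hash-table idea concrete, here is a minimal sketch in Python for
the three-dimensional case (my own code and names; the cell size would be
chosen to match the interaction radius):

    from collections import defaultdict

    def build_hash(positions, cell):
        """Bucket particle indices by integer grid cell (chaining via lists)."""
        table = defaultdict(list)
        for i, p in enumerate(positions):          # p is an (x, y, z) tuple
            key = tuple(int(c // cell) for c in p)
            table[key].append(i)
        return table

    def neighbors(p, table, cell):
        """Indices of particles in the 27 cells around p -- the only candidates
        that can lie within roughly one cell size of p."""
        cx, cy, cz = (int(c // cell) for c in p)
        found = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    found.extend(table.get((cx + dx, cy + dy, cz + dz), []))
        return found

This avoids the all-pairs distance computation: each particle only checks the
particles indexed in its own and adjacent cells.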
	
	Getting back to the hot chocolate bit...
	While reading the papers I was drinking hot chocolate, and thought how
good particle systems would be for modeling the movement of the thin layer of
foam on top of it.  The foam seems to have a high viscosity, and it tends to
be attracted to itself to a certain degree.  When I move a spoon through the
hot chocolate, not only does the area near the spoon get affected, but the
entire foam surface moves because of this viscosity (and also because of the
underlying currents of the hot chocolate).


Oh yeah, if you have a PC and want to see a demo of some real-time particle
systems (~2000 particles), get
ftp.cdrom.com    /demos/incoming/TG96/in64/korso100.zip

Dave Martin


		   Particle Systems --- A Technique for
		     Modeling a Class of Fuzzy Objects

			     William T. Reeves


This paper describes a simple stochastic particle system model and how it
was used to generate the moving wall of fire in the "Genesis" sequence from
Star Trek II: The Wrath of Khan.  Individual particles are created, execute
deterministic motion, and die after a certain number of frames.  Parameters
of the particle system determine the distribution of particle creation
rate, particle position, velocity, color, transparency, etc., according to
linear formulas involving a uniformly distributed random variable.  The
particle system also
specifies a generation shape, which can be seen as the space over which the
initial position is chosen.  Particle systems may also be hierarchical; a
level n system may produce many level n+1 systems.

In the Genesis sequence, each particle is a point light source.  This makes
rendering particularly easy, since no shadows or hidden surfaces are
possible: the color values of coincident pixels are simply added.  The
author later states that the points are streaked according to their current
dynamics in order to create a motion-blurring effect.  It is not completely
clear how these two features interact, but I imagine that pixel values are
summed after the antialiased lines have been rasterized according to the
streaking process.
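
A hedged sketch of what I imagine the compositing amounts to (my own Python
reconstruction, not the paper's code): rasterize each streaked, antialiased
particle into coverage-weighted samples and simply add them into the frame
buffer, clamping only at the end:

    def splat_particles(width, height, samples):
        """samples: (x, y, (r, g, b), weight) tuples from rasterizing the
        antialiased streaks; coincident samples are summed, never depth-tested."""
        frame = [[(0.0, 0.0, 0.0) for _ in range(width)] for _ in range(height)]
        for x, y, color, w in samples:
            if 0 <= x < width and 0 <= y < height:
                r, g, b = frame[y][x]
                frame[y][x] = (r + w * color[0], g + w * color[1], b + w * color[2])
        # Clamp to the displayable range after all particles have accumulated.
        return [[tuple(min(c, 1.0) for c in px) for px in row] for row in frame]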

In fact, the individual particles are not real light sources-- they do not
project any light.  The developers had to use suitable hacks to produce
glowing effects on nearby surfaces where they were desired.

The author briefly mentions some other applications of their particle
system: explosion dynamics in the movie Return of the Jedi and an image
rendered from particle system grass.

This paper is interesting mostly because it goes all the way from the model
specification to the movie sequence, which many of the intended audience
members would have had opportunity to see.  Particle systems as defined
here seem very easy, and I would not have believed that they would be good
enough to produce the kind of effects that they do. 

On the other hand, I never found the fire particularly convincing; it
looked like something very cool was happening on the planet surface, but I
wasn't so sure it looked like fire.  (Not that I know what a planet on fire
looks like, of course!)  In a recent TV commercial, a well-groomed and
happy young woman drives a sportscar through vast cornfields, and the
crusty old midwestern farmer is confounded when all of the corn pops into
popcorn as the car whizzes by.  This commercial had the same sort of look
as the Genesis bomb: it's a neat effect, but kind of weird-looking too.

    Particle Animation and Rendering Using Data Parallel Computation

				 Karl Sims


This paper describes another approach to particle system animation in terms
of a particle behavior language and its implementation on the
(idiosyncratic) CM-2 parallel computer.  The supported dynamics modules are
very rich, providing velocity, acceleration, vortex, collision with
nonparticles, damping, and spirals.  Each particle executes its dynamics
according to a program and is rendered in terms of its position, radius,
color, opacity, and global blurring parameters.

The images provided are convincing, and the frame times cited---"several
seconds to several minutes"--- are impressive.  However, the implementation
is specific to the CM-2 architecture.  One could hardly imagine a more
appropriate application for that machine: massively parallel, independent,
and simple calculations are precisely its forte.

The abstract claims a "particle behavior language" as a main contribution
of the paper, but the paper neither discusses the language nor gives any
examples of programs!  The subsections of section 3 describe what must be
the built-in functions of this language; apparently the author felt that
this was enough.  As a referee, I would recommend that the author remove
his claims about the language altogether.

John Petry


PARTICLE SYSTEMS -- A TECHNIQUE FOR MODELING A CLASS OF FUZZY OBJECTS,
by William T. Reeves

This paper describes the use of particles to visually model fuzzy objects.
These objects can't be described by traditional, whole-object methods.
For instance, it is not reasonable to try to render a flame as a surface
composed of triangles. 

Particles have a tremendous advantage in that the object is defined
procedurally and can be controlled by random variations.  This is very
similar to fractals, except that there is no requirement for self-similarity
across scales.  It permits great detail without a complex model.

When groups of particles are rendered, they do not necessarily occlude each 
other even if they lie along the same vector from the virtual camera.  
Instead, each particle contributes a certain amount of light and color to 
the shared pixel, depending on its transparency, color and intensity.

Particles are controlled using velocity vectors and sometimes an acceleration
factor to modify the velocity.  The system can be tweaked by adding time-
or position-dependent changes in its characteristics, e.g. color fading.

I like this basic method of creating images.  The problem with more complex
images, such as clouds, is that it is necessary for users to understand
something of the constraints controlling actual cloud formation and motion
in order to recreate their appearance.  Not that the understanding needs
to be too detailed in terms of the physical process, but some type of higher-
order control is needed to achieve a realistic pattern controlling the motion
of the low-level particles.  It is not just a matter of choosing good initial
conditions so that a cloud-like object appears; it must move as well.


PARTICLE ANIMATION AND RENDERING USING DATA PARALLEL COMPUTATION,
by Karl Sims

The underlying topic of this paper is almost identical to that of the
preceding paper.  It differs chiefly in that it describes a data parallel
implementation of a graphical particle simulator.

A parallel computer is an obvious choice for running such an algorithm,
since the particle model typically contains tens or hundreds of thousands
of small (in terms of data) particle objects that move under a very simple
set of controls.  The author points out that he is not trying to create an
accurate model of a physical process (fire, waterfall, etc...), but a
realistic image of one.  For that reason, he can take shortcuts with the
implementation, such as using Euler's method to integrate the velocity and
acceleration of particles.  While this may lead to some discrepancies at
the level of individual particles, the number of particles conceals such
abnormalities while providing a great incentive for a fast algorithm.
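
For reference, the shortcut in question is just the forward Euler step (a
minimal sketch of my own; a physically accurate simulator might prefer a
higher-order integrator):

    def euler_step(pos, vel, acc, dt):
        """Advance one particle by one frame: v += a*dt, then x += v*dt."""
        vel = [v + a * dt for v, a in zip(vel, acc)]
        pos = [x + v * dt for x, v in zip(pos, vel)]
        return pos, vel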

This same philosophy can be seen in the way he handles blurring when an
object bounces off of a surface.  There, the tail is truncated so that
it does not bend back at the surface.  If a viewer were only watching one
particle, this sudden contraction of the tail would seem bizarre, but it
doesn't stand out when thousands of particles are moving.  Actually, there
are a number of such hacks in the physics he implements, such as eliminating
friction for small particles, but these seem unimportant as long as the
overall appearance is convincing.

The author is much more precise in his programming of the algorithm.
He takes a number of steps that optimize it for the machine he is using, 
a CM-2.  For instance, when particle fragments  lie along the same viewing 
vector, they can be aligned in the machine and sorted by depth in order to
use a built-in command to add or multiply a vector of numbers.  

In another optimization move, when particles are larger than one pixel,
they are diced into pixel-sized fragments for rendering.  However, the
number of such fragments can exceed the memory capacity of the machine.
Therefore, similar fragments are grouped into patches.  The area of the
patches is computed dynamically, so that their number uses system resources
as fully as possible without exceeding their limits.

The author then gives several examples of scenes which can be generated with
this approach.  Most sound quite convincing, though a video sequence would
be much more interesting.  A couple of minor problems I had with the approach
involve particle physics.  For instance, in the waterfall demo particles
start as blue.  When they bounce off of rocks, they turn white for a period
until they fall back down.  It would seem more realistic to have the particles
fragment into multiple mist particles when they bounce up.  Then rather than
having the color and motion changes built in as a high-level addition, simple
low-level rules could be used to control this, e.g. saying that particle
color is a function of particle size (large=blue, small=white, interpolate
between), and that the forces acting on particles are a function of size
(gravity affects large; gravity + air resistance & air currents affect small).
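
To illustrate the kind of low-level rule I have in mind (purely my own sketch
in Python, not anything from the paper), particle color and the forces acting
on a particle could both be simple functions of its size:

    def color_from_size(size, small=0.1, large=1.0):
        """Interpolate from white (mist) at `small` to blue (bulk water) at `large`."""
        t = max(0.0, min(1.0, (size - small) / (large - small)))
        white, blue = (1.0, 1.0, 1.0), (0.1, 0.3, 0.9)
        return tuple(w + t * (b - w) for w, b in zip(white, blue))

    def acceleration_from_size(size, vel, gravity=-9.8, drag_coeff=2.0):
        """Large drops feel mostly gravity; small mist also feels air resistance."""
        drag = drag_coeff / max(size, 1e-3)   # stronger drag on smaller particles
        return [-drag * vel[0], -drag * vel[1], gravity - drag * vel[2]]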
 
Related to this, particles themselves don't seem to interact.  This gets to
be computationally prohibitive, I know.  But it would allow the above
mentioned mist particles to recombine, for instance.  The trick is not to
manage the extra small particles, but to determine when they intersect.
This might be possible at rendering time, though, when they are sorted by
depth and position.  It's an idea, anyhow.

Robert Pitts


Particle Systems--A Technique for Modeling a Class of Fuzzy Objects
by Reeves
=========

This article presents a paradigm for objects that are fuzzy (or
non-rigid) in their structure using a particle-volume-based approach.
It allows such objects to be modeled more easily than with traditional
methods, such as polygons or surfaces.  These models are used to
produce frames in animations of the fuzzy object.

Previous work has also used particle systems to model objects; however, those
models differ in that they were either not dynamic (the fuzzy objects did not
change over time) or not stochastic, which is what gives fuzzy objects
"believable" behavior.

The evolution of a system of particles is modeled in a small number of
algorithmic steps. Particles are created with initial values for their
attributes: position, velocity, size, color, transparency, shape and
lifetime. Particles are removed when their lifetime has expired.  Those
that survive change according to the dynamic model that the particle
system implements.  Because this dynamic behavior is left unspecified,
implementors have many choices, including using physical models of the
system being simulated.

When a stochastic model is used for particle parameters in this system,
it is used in a simple way--to control initial parameter values.  These
initial values are chosen for each particle and parameter randomly with
the following formula:

     InitialParameter = MeanParameter + Rand() * VarianceParameter

This is a simple way to add variation (that does not vary too far from
a particular mean value).  The text suggests that particle parameters
change deterministically from their initial conditions onward.  Other
parameter changes are sometimes applied externally during particle
evolution to simulate certain forces (such as gravity).
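
In code, the initialization rule above amounts to the following sketch (my
own; I am assuming Rand() is uniform on [-1, 1], which matches my reading of
the text):

    import random

    def initial_parameter(mean, variance):
        """InitialParameter = MeanParameter + Rand() * VarianceParameter."""
        return mean + random.uniform(-1.0, 1.0) * variance

    # Example: an initial speed clustered around 2.0 +/- 0.5.
    speed = initial_parameter(2.0, 0.5)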

Methods are supported that allow the number of particles generated to be
chosen based on the amount of screen area they will cover and on their
frame in the animation sequence.  The former allows the systems to scale
nicely and the latter allows the fuzzy object to change "intensity"
over time (a growing fire, e.g.).
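
A hedged sketch of what screen-size-based generation might look like (the
structure follows my reading of the description; the names and the linear
ramp over frames are my own):

    import random

    def particles_this_frame(mean_per_area, var_per_area, screen_area,
                             frame, ramp_frames):
        """New-particle count: a stochastic per-screen-area rate, scaled by the
        object's projected screen area and by how far into the sequence we are."""
        rate = mean_per_area + random.uniform(-1.0, 1.0) * var_per_area
        intensity = min(1.0, frame / float(ramp_frames))   # e.g. a growing fire
        return max(0, int(rate * screen_area * intensity))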

There are other parameters that control the initial shape from which
particles emerge, initial directions in which particles move, and
shapes for particles.  In addition, the author allows hierarchies of
particle systems, where subsystems are spawned and inherit the
properties of their parents.  Clearly, he allows all these
possibilities so that particle systems may model a rich set of
phenomena.

Since the goal of the simulation is to produce rendered frames, the
author had to deal with many rendering issues that would be present in
any case, i.e., in non-particle rendering techniques.  A major
simplification he makes is that particles emit their own light (i.e.,
they are the light source).  Furthermore, he assumes these systems do
not intersect other objects (objects are composited in a later stage).
With these simplifications, intensities can be assigned to pixels
additively, without worrying at what depths various particles exist.

The author places the model in context by describing how initial
generation shapes, particle trajectories, and hierarchies of particle
systems were used to generate a particular visual effect from Star Trek
II.  He discusses how he introduced "motion blur" in the frame-to-frame
movement of particles to better approximate what the human eye would
detect in the real world.  This reduces some aliasing effects that
would be present otherwise.

An interesting aspect of this model is that an animation sequence can
be saved simply by saving the random numbers used to initiate the
sequence.  This is a big space savings, and also suggests that
rendering times for a sequence are not too long.

The author presents a couple other applications for which he has used
his particle system model.  Fireworks show a good use of particle
subsystems and color evolution over the lifetime of a particle.  In
addition, he uses his model for grass, in which particles are
essentially "drawing" the grass as they move along.  I found this to be
an interesting use since grass is not a fuzzy object at all.  I would
like the author to suggest what other non-fuzzy objects might be
appropriate for particle simulation.  Finally, suggestions for other
fuzzy objects to be modeled are given.  These new objects are not
modeled well by the initial simplification that each particle is a
point light source.

The concluding section praises the use of procedural models (such as
the stochastic model in this paper).  One of the strong points
mentioned is that stochastic models can generate varying levels of
detail.  This is one point that I would have liked to see delved into
in the paper.

This paper presents a useful paradigm for particle system models, and
there are plenty of theoretical and computational issues to explore in
extensions to this model (e.g., to handle non-point-light-source
particles).

Particle Animation and Rendering Using Data Parallel Computation
by Sims
=======

This article describes how to parallelize particle system simulation on
an MPP architecture.  The parallel model used to generate animations is
a data parallel model in which the units of parallelism have been
chosen as _particles_ in the simulation stage, and _pixels_ in the
frame generation stage.  Therefore, a common set of operations are
performed on each particle or each pixel simultaneously.

The author chose simulating physical aspects of particles as his model,
i.e., he models things like velocity and acceleration.  He states that
while _physical_ simulations produce particle behavior that looks
correct, such particles can be difficult to control (to achieve a
certain effect, e.g.).  He chose a particular set of physical phenomena
to model, which I believe was chosen to produce a flexible set of
behavior, but to still allow enough control by the user.  For
convenience, particles are modeled with both a head and a tail, which
allows easy generation of motion blur (by rendering a particle from its
head to its tail).

The general operations that can be performed involve setting/adjusting
positions and velocities. Initialization of these parameters can be
done in a number of constant and random ways, typically generating
particles from within a certain shape.  These are reasonable because it
is easy to see evolving particles that emerge from a particular object
(e.g., fire) or that are initially contained within a certain general,
volumetric shape (e.g., clouds).

The operations that alter positions or velocity often involve
translations, rotations, and scalings.  The author describes a class of
these adjustments to position and/or velocity that implement certain
physical effects.  These effects are: simple acceleration, in which an
acceleration applied to a particle's velocity causes it to accelerate
towards a source of gravity, a point (orbit), or other objects; vortex,
a rotation about an axis (the angular velocity of rotation is
parameterized by the distance from the axis); damping (undamping), a
deceleration (acceleration) in the current direction; spiral, in which
particle velocities are rotated around an axis; and bounce, to simulate
bouncing off a surface (friction is a parameter of how much to bounce).
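
As a concrete, purely illustrative example of two of these operations (my own
Python and conventions; in particular, the 1/distance falloff for the vortex
is just one plausible parameterization), a vortex rotates each particle's
position about an axis by an angle that depends on its distance from the
axis, and damping scales the velocity toward zero:

    import math

    def vortex(pos, axis_point, axis_dir, strength, dt):
        """Rotate `pos` about the line through axis_point along unit vector
        axis_dir; the angular velocity falls off with distance from the axis."""
        rel = [p - a for p, a in zip(pos, axis_point)]
        along = sum(r * d for r, d in zip(rel, axis_dir))
        par = [along * d for d in axis_dir]               # component along the axis
        perp = [r - q for r, q in zip(rel, par)]          # component off the axis
        dist = math.sqrt(sum(c * c for c in perp)) or 1e-9
        angle = strength * dt / dist                      # slower farther from the axis
        cross = [axis_dir[1] * perp[2] - axis_dir[2] * perp[1],
                 axis_dir[2] * perp[0] - axis_dir[0] * perp[2],
                 axis_dir[0] * perp[1] - axis_dir[1] * perp[0]]
        rot = [perp[i] * math.cos(angle) + cross[i] * math.sin(angle)
               for i in range(3)]
        return [a + q + r for a, q, r in zip(axis_point, par, rot)]

    def damp(vel, k, dt):
        """Exponentially decelerate along the current direction (k < 0 undamps)."""
        factor = math.exp(-k * dt)
        return [v * factor for v in vel]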

The animation of particle systems is performed in a loop that adjusts
particle states (position, color, etc.) and then renders them.  Because
of the parallel methodology, systems can be "previewed" in real-time.
Since the author uses the term "preview," I assume some aspects of the
rendering are not implemented in a preview, but I was not sure what
those aspects were.

As mentioned above, motion blur is simulated by rendering particles
between a head and a tail.  Parameters of the particle, such as color,
are interpolated between the head and tail (i.e., the interpolation is
the temporal change in the parameter from one moment, when the particle
would be at the tail, to another moment, when the particle would be at
the head).  Since the CM-2 on which the parallelism was achieved uses a
"virtual processor" model, the rendering of particles was also able to
be parallelized beyond the particle level.  Particle streaks (from head
to tail) are broken up horizontally and vertically into fragments, such
that each fragment covers a single pixel in the display (although
fragments from different particles may cover the same pixel).  Pixels
are then each processed by "virtual processors."  The rendering
algorithm uses some of the parallel-optimized algorithms to order
fragments that correspond to a single pixel and determine how much each
contributes to color, etc.  In contrast to the last paper, this
algorithm renders mixed images (e.g., an image that might also have an
object made with polygons) at the same stage with particles by using
the same fragment rendering scheme described for particles.
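
A minimal sketch (mine, not the paper's) of the head-to-tail interpolation the
dicing relies on: each fragment along the streak gets the parameter values the
particle had at that fraction of the frame interval (pixel coverage and
antialiasing details are omitted here):

    def streak_fragments(head, tail, n_fragments):
        """head/tail: dicts with 'pos' (x, y) and 'color' (r, g, b) at the end
        and the start of the frame interval.  Returns one fragment per step
        along the streak, linearly interpolated between tail and head."""
        frags = []
        for k in range(n_fragments):
            t = (k + 0.5) / n_fragments        # 0 near the tail, 1 near the head
            pos = tuple(a + t * (b - a) for a, b in zip(tail['pos'], head['pos']))
            color = tuple(a + t * (b - a)
                          for a, b in zip(tail['color'], head['color']))
            # Each fragment carries a 1/n weight so the summed streak has the
            # same total brightness as an unblurred particle.
            frags.append({'pos': pos, 'color': color, 'weight': 1.0 / n_fragments})
        return frags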

The article describes what type of phenomena the author was able to
animate using these particle systems.  It becomes apparent at this
point that many of the choices of initial particle states and
transformations were appropriate.  They describe simulating snow and
wind, falling water, and fire.  The waterfall example illustrates the
usefulness of the bouncing primitive, as water bounces off of rocks.
The fire example uses groups of particles to simulate separate
"flicker" and hotter/cooler regions.  To achieve a natural affect, the
parameters of different groups are varied slightly--this sounds like
the kind of organization that was implemented well in a hierarchy of
particle systems in the last paper.  Recall it allows particle
subgroups to inherit their parent's properties with slight variations.
Perhaps then, this model would benefit from the same hierarchical
extension.

Some interesting suggestions for future work are given in the end,
including modeling more complicated particle systems in which particles
interact.

More notable is the issue of the parallel implementation.  Since, for
example, symmetric multiprocessing machines and networks of
workstations are becoming a more popular model of parallel/distributed
processing than MPPs, one may raise the question of how to fit such an
algorithm into these models, which don't necessarily use data
parallelism.


Stan Sclaroff
Created: Mar 13, 1996
Last Modified: Apr 3, 1996