The moving image
The synthesized image, especially when it is interactive, is essentially in motion.
The famous modeling-animation-rendering trilogy is nonsense:
in anyflo there is no database; any object (be it a graphic primitive, a volume, a trajectory, a camera, a light, ...) is first an abstract
description, an actor with programmable data and programs.
Just as the modeling of a wave is inseparable from its movement, the entities built in anyflo are inconceivable without their fate in time.
This means that the notion of dynamics (in the broad sense) is important.
We will consider different types of animation, from the most traditional to the most recent:
First, kinematic animation by trajectories and laws of motion (very popular among professional animators, but still
a manual method using only a little of the machine's potential).
Then procedural animation, which is animation synthesis par excellence
(widely used in production, but requiring teams of programmers).
Then dynamic animation, allowing a high degree of realism for inanimate objects.
Then behavioral animation which, via the notion of actor, allows managing
numerous complex beings without having to control all their parameters.
Then connectionist and evolutionary methods inspired by biological models and artificial life,
such as neural networks and genetic algorithms.
Finally, the optimization of these methods allows real time.
Animation by trajectories and laws of motion
Principle
Trajectories and laws of motion allow controlling objects and their properties over time:
Geometry: forms, positions and orientations of volumes, of cameras,
of lights.
More generally, any parameter:
A particular vertex of an object.
Colors.
Illumination models.
Extension curve coefficients.
etc.
For this, trajectories (objects of type 'traj') are associated with the objects to animate, and laws of motion are associated
with these trajectories.
The command traj gives the syntax.
Practically
Just:
define encapsulated trajectories on the parameters of the entities to animate, or objects of type trajectory
assigned to these parameters;
possibly define trajectories on these trajectories, etc.
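As a conceptual illustration (a minimal Python sketch, not anyflo code; all names here are hypothetical), a trajectory is a curve, and a law of motion maps time to a position along that curve:

from math import cos, sin, pi

def trajectory(u):
    # Hypothetical trajectory: a point on a circle, u in [0, 1].
    return (cos(2 * pi * u), sin(2 * pi * u), 0.0)

def law_of_motion(t):
    # Hypothetical law of motion: ease-in/ease-out mapping of
    # normalized time t to the curve parameter u.
    return 3 * t * t - 2 * t * t * t

frames = 25
for frame in range(frames + 1):
    t = frame / frames       # normalized time in [0, 1]
    u = law_of_motion(t)     # progress along the trajectory
    x, y, z = trajectory(u)  # the animated parameter (here a position)

The same scheme applies to any parameter (a color, an illumination coefficient, ...), not only to positions.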
Procedural animation
Principle
The previous methods simulate conventional animation techniques; they have
their advantages (exhaustive control) but also their disadvantages (inability to manage complex or
unpredictable events). Procedural animation consists in generating an animation not by geometric
descriptions but by procedures.
Practically
Just run, at each frame, a function that analyzes the scene and deduces actions
(e.g. test collisions between objects and make them bounce).
For this, callbacks can intervene at different
levels of the animation.
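As a conceptual illustration (a minimal Python sketch, not anyflo code), here is a per-frame procedure that analyzes a one-dimensional scene and deduces an action, namely bouncing on the floor:

g, dt = -9.81, 1 / 25    # gravity and frame duration (25 frames per second)
y, vy = 10.0, 0.0        # height and vertical speed of an object

def on_frame():
    # Procedure executed at each frame: integrate, then test collisions.
    global y, vy
    vy += g * dt
    y += vy * dt
    if y < 0.0:          # collision with the floor y = 0
        y = -y           # push the object back above the floor
        vy = -0.8 * vy   # bounce, with some energy loss

for frame in range(100):
    on_frame()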
Dynamic animation
Principle
The previous technique is more systematic and can simulate an infinite number of effects;
among these are the laws of nature (e.g. mechanics), for which it suffices to write the equations in the functions called at each frame.
To simplify this task, classical algorithms are written 'hard' in anyflo; this is the case, for example, for the laws of
dynamics of heavy bodies, of springs,
of shocks, etc.
Forces, masses, speeds and accelerations
A mass can be assigned to a volume using the command:
mass vol(id)=m;
Similarly, a force can be assigned to a volume:
force vol(id)=f;
Mass and force can also be individually assigned to one or more vertices of a volume:
mass vertex(s) vol(id)=m;
force vertex(s) vol(id)=f;
Force fields can be defined globally for all objects or be assigned to certain objects or vertices.
They can be uniform, central, linear, surface-based or even procedural.
Dynamic animation
During the animation, geometry, topology, masses and forces of all objects can be changed.
For example, one can define a force field by f = function(p) for a given point p.
Speed and acceleration are readable and writable by:
speed vol(id)
speed vertex(s) vol(id)
acc vol(id)
acc vertex(s) vol(id)
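As a conceptual illustration (a minimal Python sketch, not anyflo code), the dynamics loop reads as: evaluate the force field at a point, deduce the acceleration from the mass, then integrate speed and position at each frame:

def force(p):
    # Hypothetical central force field attracting toward the origin.
    return [-0.5 * c for c in p]

m = 2.0                  # mass assigned to the vertex
pos = [4.0, 0.0, 0.0]    # position
speed = [0.0, 1.0, 0.0]  # speed (readable and writable)
dt = 1 / 25              # frame duration

for frame in range(100):
    acc = [f / m for f in force(pos)]                 # acceleration = force / mass
    speed = [s + a * dt for s, a in zip(speed, acc)]  # integrate speed
    pos = [p + s * dt for p, s in zip(pos, speed)]    # integrate position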
Collision detection
Collision detection is handled by the command collision.
Springs
Springs can be assigned between vertices of a volume.
When masses have been assigned to these vertices, their interactions are managed automatically if
yes dynamic is active.
The command spring vertex(s) vol(id) = r,v specifies the stiffness r and viscosity v of the
spring associated with the vertex s of volume id.
Any movement of the vertices produces an adequate response of the springs.
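As a conceptual illustration (a minimal Python sketch, not anyflo code), a spring of stiffness r and viscosity v between two masses produces a Hooke force plus a viscous damping term:

r, v = 8.0, 0.5      # stiffness and viscosity of the spring
rest = 1.0           # rest length
m, dt = 1.0, 1 / 25  # mass of each vertex and frame duration

x1, x2 = 0.0, 2.0    # 1D positions of the two vertices
v1, v2 = 0.0, 0.0    # their speeds

for frame in range(100):
    stretch = (x2 - x1) - rest       # elongation of the spring
    f = r * stretch + v * (v2 - v1)  # Hooke force plus viscosity
    v1 += (f / m) * dt               # the force pulls the vertices together
    v2 -= (f / m) * dt
    x1 += v1 * dt
    x2 += v2 * dt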
Particle systems
Volumes of type particle can be defined; their vertices can receive
geometrical, colorimetric, dynamic and other properties.
Behavioral animation
The notion of actor
The above techniques apply well to the animation of physical objects but fail with living beings, which obey,
in addition, other laws called 'behavioral' (referring to the behavior of a being in a given situation).
An actor is an object with a behavior, i.e. a set of local functions and internal memories.
Several actors can interact with each other by running each other's local functions,
and the user can interact with the actors by executing some of their local functions.
Managing the behaviors of such actors cannot be done with traditional procedural languages;
one must appeal to so-called object-oriented languages, in which data structures include not only
data (the physical) but also code (the intelligence).
For this, anyflo offers an object-oriented language (manuel.object.htm)
allowing any object to be assigned, in addition to its physical properties, a program (code and memory) that can run independently of the programs
included in the other objects.
The commands local and memory
allow creating such actors.
Local functions of an object
local(0) vol(id) = "f":
assigns to volume id a new local function as a copy of the function named f.
The function f is then duplicated and compiled, and static memories are possibly reserved; all these
elements are local to the volume id, that is, unknown to the outside.
local(n)vol(id): Returns the text of the local function number n
of volume id.
local(n)vol(id) = "ttt": changes the local function number n
of volume id.
local("foo")vol(id): Retutns the text of ethe local function named "toto"
od volume id.
Any number of local functions can be associated with a volume, each of which can call another by the mere
invocation of its name (even if another function with the same name exists elsewhere).
When displaying a volume, if yes local is active, the first of its local functions will be executed.
It is possible to execute a local function, from the outside, by:
exec local(n) vol(id) var(p1) var(p2) ...
n = number or name (in quotes) of the local function.
id = volume identifier.
p1, p2, ... = parameters passed to the local function.
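As a conceptual illustration (a minimal Python sketch, not anyflo code; the class and function names are hypothetical), an actor is an object carrying its own memories and its own executable local functions, which other actors can invoke:

class Actor:
    def __init__(self, name):
        self.name = name
        self.memory = {}   # local memories, unknown to the outside
        self.locals = []   # local functions of the actor

    def add_local(self, f):
        self.locals.append(f)

    def exec_local(self, n, *params):
        # n = number or name of the local function, as with exec local.
        for i, f in enumerate(self.locals):
            if n == i or n == f.__name__:
                return f(self, *params)

def greet(actor, other):
    # One actor interacts with another by running its local function.
    actor.memory["met"] = other.name
    return other.exec_local("answer")

def answer(actor):
    return "answer from " + actor.name

a, b = Actor("a"), Actor("b")
a.add_local(greet)
b.add_local(answer)
print(a.exec_local("greet", b))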
Local functions of other objects
Local functions can also be assigned to:
Lights: local(0)light(id) = "text"
Views: local(0)view(id) = "text" (allows defining
adaptive perspectives)
Local memories
In a local function of an object, static memories ('static') can be reserved; they are permanent variables but unknown to the outside.
Global memories of an object can also be declared by:
memory object
Connectionist animation
The previous methods are very artificial. Paradoxically, a way to rediscover natural movement is to use the techniques
of Artificial Life,
and more particularly neural networks.
The network manual describes a set of commands for
building and using such networks.
One method is to build virtual actors having:
1) A body as a hierarchical structure of volumes that are assigned dynamic properties.
2) Perceptions as sensors connected to the brain.
3) A brain in the form of neural networks whose inputs are
connected to the sensors and whose outputs are connected to the actuators acting on the dynamic component of the body volumes.
4) Learning procedures (supervised or not) to train the networks to respond correctly to certain configurations of the environment
(see neural network manual).
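As a conceptual illustration (a minimal Python sketch, not anyflo code), the brain can be as simple as one layer of neurons whose inputs are the sensor values and whose outputs drive the actuators:

import math, random

n_sensors, n_actuators = 3, 2
weights = [[random.uniform(-1, 1) for _ in range(n_sensors)]
           for _ in range(n_actuators)]

def brain(sensors):
    # One neuron per actuator: weighted sum of the sensors,
    # squashed by tanh to get a bounded actuator command.
    return [math.tanh(sum(w * s for w, s in zip(row, sensors)))
            for row in weights]

sensors = [0.2, -0.7, 0.5]  # perceptions of the environment
commands = brain(sensors)   # commands sent to the actuators of the body

Learning then consists in adjusting the weights so that the outputs answer correctly to given sensor configurations.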
See:
Une méthode comportementale de modélisation et d´animation du corps humain.
Interaction avec un danseur virtuel intelligent.
Evolutionary animation
Another artificial way to find the natural is to use evolutionary techniques,
particularly genetic algorithms. The genetic manual describes a
set of commands for building and using such algorithms. The file
mouv_gen.func gives an example of such a technique.
An interesting way to build optimal neural networks is to create an arbitrary population of randomly defined networks
(and therefore inefficient), and submit this population to Darwinian evolution.
To do this, we define a bijective mapping of the set of networks onto a set of genetic codes,
on which the genetic algorithms work, optimizing some adaptation function.
For example, to make the actors move, we can optimize the distance they travel.
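As a conceptual illustration (a minimal Python sketch, not anyflo code; the adaptation function is stubbed), a Darwinian loop over a population of genetic codes looks like:

import random

def fitness(genome):
    # Hypothetical adaptation function; in the text's example it would
    # be the distance traveled by the actor decoded from this genome.
    return -sum((g - 0.5) ** 2 for g in genome)

def mutate(genome, rate=0.1):
    return [g + random.gauss(0, rate) for g in genome]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

population = [[random.random() for _ in range(8)] for _ in range(20)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]  # selection of the fittest
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(10)]
    population = parents + children  # next generation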
Real time
Principle
A real-time module (interaction) can manage animations
in real time with the rendering allowed by OpenGL.
Certain types of maps, Phong rendering and adaptive perspectives are not supported. Almost all types
of animation are possible.
generate interaction time allows reducing
computation times.
Types of animation
Actors.
Bitmaps.
Trajectories.
see demo1_traj.func.
Dynamic models.
Procedural models.
With callbacks giving access to each frame, at the interpreter level.
Behavioral models.
With local functions.
Neural networks.
Genetic algorithms.
Sensors
In any number, they can be accessed via shared memory, files, serial ports, USB, etc.
Debugging tools
Allowing interactive development, in particular:
displ displays all kinds of information
(volumes, trajectories, dynamics, neural networks, genetics, etc.).
interaction debug provides access to graphic scales
on which parameters can be entered (also usable in C).
All these methods allow building interactive installations, defining actors equipped with physical properties
(managed by dynamics), with behaviors (local functions), "smart" (neural networks), evolutionary (genetic algorithms),
interacting with each other, with their environment and with virtual reality (sensors).
To understand these principles one can read the articles
and the descriptions of articles.
Managing real time animation
Method
The command interaction puts anyflo in real-time interactive mode. The OpenGL MainLoop has priority, but callbacks can intervene at all levels.
Storage
Writing an image to disk can slow down the real-time interaction and desynchronize the inputs
(sound, movement, ...) from their interpretation by the program.
To solve this problem, the work can be decomposed into two processes that are run one after the other:
First, the interaction is performed with simplified images that run well in real time (wireframe display,
without extension, ...).
Then, in a second step, the sensors are recorded:
interaction stock device
interaction stock(ni)device name("nn")
Initializes the storage of the sensor data over ni frames.
interaction device(val)
Called at every frame, this command stores the current value val of the sensor in memory.
At the end of the animation the stored values of the sensor are saved in the file nn.cap
and the interventions (<... and ! ...) are saved in the file nn.eff.
To synchronize the start of storage with a sensor event, one can:
1) Give a delay of na frames to wait:
interaction stock(ni)device name("nn")wait(na)
2) Start when the modulus of the sensor exceeds a threshold mod by:
interaction stock(ni)device name("nn")module(mod)
Play
Then, in a third step, these files are read back: the recorded values are assigned to the sensor outputs and the events are reproduced;
we then have all the time needed to compute complex images and store them on disk with the command:
interaction play device.
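As a conceptual illustration (a minimal Python sketch, not anyflo code; file name and format are hypothetical), the two phases amount to recording sensor values per frame during a light pass, then replaying them while computing heavy images without any real-time constraint:

import json

def record(sensor, n_frames, path):
    # Light real-time pass: only sample and store the sensor.
    values = [sensor(frame) for frame in range(n_frames)]
    with open(path, "w") as f:
        json.dump(values, f)

def play(path, render):
    # Heavy offline pass: replay the recorded values frame by frame.
    with open(path) as f:
        values = json.load(f)
    for frame, val in enumerate(values):
        render(frame, val)

record(lambda frame: frame * 0.1, 25, "sensor.cap")
play("sensor.cap", lambda frame, val: None)  # rendering stubbed here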
Bitmaps
interpol texture allows animating bitmaps stored in
3D images read by texture directory.
Delayed animation
I started writing anyflo at a time when machines were very slow and there were no
graphics cards with advanced features, which means that real time was not possible.
As I still wanted to make films, I implemented procedures to automatically manage a
simulated interaction (which I called "endogenous") between, on the one hand, the system
and the real interactor (keyboard, mouse, ...) and, on the other hand, between two virtual interactors
(two actors, two processes, ...) or, more generally, between different types of interactors.
I used this model for many films.
Command displ(n) can specify the level of complexity of the rendered image:
n=0: no display.
n=1: OpenGL complexity (real time).
n=2: z-buffer complexity (in development).
n=3: ray tracing complexity (in development).
interaction write vol name("nnn")
will store the volume descriptions in files: nnn1.vol nnn2.vol ...
interaction write image name("nnn.eee")
will store images in files: nnn1.eee nnn2.eee ... (eee is bmp, jpg or tga).
interaction stock image number(im,ni)
will store the images in the images numbered im to im+ni-1. These images can be played back by
play image number(im,ni).
Remark:
Files are named: A1 A2 A3 A4 ... A10 A11 ... A100 A101 ...
To name them A0001 A0002 ... A0010 ... A0100 do:
nn="A",(string("4D",im))
Post production
Tools allow intervening on the image files:
smooth image read("...")number(nb)write("...")
in case of parasitic images.
image(id1,id2)image(id3)interpol coe(c1,c2): mixing
of image sequences.
image(id1,id2)image(id3)interpol coe(1-c,c): crossfade.
write image NP interpol: smoothing an animation.
read image directory write directory interpol:
to concatenate directories with interpolation.
HIGH DEFINITION IMAGES (IN DEVELOPMENT)
Launch anyflo with option:
anyflo hau=2
The image in main memory is 4 times the size of the video image. Upon a 'write image', this image will be
convolved down to video size.
You can specify the size of the anti-aliased image (dimx, dimy) and the antialiasing factor k:
anyflo x=dimx y=dimy gra=0 mem=1 hau=k
Higher order animation
The idea is to consider each frame of an animation (of order n) as the trace
(i.e. not displayed on screen) of an animation of order n-1.
Simply use the fact that a trajectory, as a
standard anyflo object, can be animated
(by other trajectories, by dynamics or by any other method).
Command
interaction stock image
manages this process.
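As a conceptual illustration (a minimal Python sketch, not anyflo code), an order-2 animation: a curve whose own parameters are animated, each frame displaying the full trace of the evolving curve:

from math import cos, sin, pi

def radius(t):
    # The order 1 animation: the radius of the curve evolves in time.
    return 1.0 + 0.5 * sin(2 * pi * t)

def curve(u, t):
    # The trajectory is itself an animated object: its shape at time t.
    r = radius(t)
    return (r * cos(2 * pi * u), r * sin(2 * pi * u))

for frame in range(100):
    t = frame / 100
    # Each frame of the order 2 animation is the full trace
    # of the evolving order 1 curve.
    trace = [curve(k / 50, t) for k in range(51)]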