by Kelly Murdock
During your last look at 3D animations, you covered camera fly-bys, animated models and textures, and keyframes. There are still many ways to enhance your animations, both inside and outside the 3D package. This chapter covers several of these advanced techniques:
Once again, you'll be introduced to many different products to give you a well-rounded base from which to create.
Animation shows how things change over time; it would be pretty boring to have an animation where nothing changes! Most of the time, things change by moving their position or orientation, but another type of movement is when the surface changes. Think of how the ocean surface changes. An animation of the beach scene wouldn't show the ocean model moving back and forth, but the ocean surface rippling along.
To animate surface movement, you create a deformation object or matrix that defines how the surface moves. You can then move the object through this deformation object or move it over the stationary object. Take a look at an example of how trueSpace does this:
By using deformation objects, you can make objects move in different ways without going to the trouble of changing their surfaces. This works by moving the deformation object or moving the model. You can also use the Stop Deforming button, which makes the object unaffected by the deformation object. These two buttons together allow you to turn a deformation on and off as the animation progresses. In Figure 9.1, I've used a deformation object to bend a fish as it moves through the water.
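To see the idea behind a deformation object, here is a minimal sketch in plain Python (not trueSpace's actual mechanism) that bends a row of vertices with a traveling sine wave, the same principle behind the swimming fish:

```python
import math

def bend_deform(vertices, amplitude=0.5, wavelength=4.0, phase=0.0):
    """Bend a mesh side-to-side with a sine wave along its length.

    Each vertex is an (x, y, z) tuple; the x axis runs nose-to-tail,
    and the sideways offset is applied on z. Advancing `phase` each
    frame makes the bend travel along the body like a swimming fish.
    """
    return [(x, y, z + amplitude * math.sin(2 * math.pi * x / wavelength + phase))
            for (x, y, z) in vertices]

# A straight "spine" of five points along x:
spine = [(float(i), 0.0, 0.0) for i in range(5)]

# Two successive animation frames of the ripple:
frame0 = bend_deform(spine, phase=0.0)
frame1 = bend_deform(spine, phase=0.5)
```

Moving the model through a stationary deformation like this one, or moving the deformation past the model, produces the same rippling result.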
When a car goes speeding by and you catch it out of the corner of your eye, it looks like a blur whizzing past. Motion blur is an advanced animation technique in which you tell the computer to blur objects that are moving quickly in the scene. This blur helps you understand the speed of objects and adds to the movement's realism.
Creating motion blur in trueSpace is very simple. It's a render option that you specify when you render an animation. Here's an example:
Note: Field Rendering is how you set the focus of the rendered image. If you want the nearby and faraway objects blurred, turn the Field Rendering effect on and set it to the focus distance you want.
Tip: To create a time-exposure effect, set the Blur Length setting equal to the total number of frames.
In Figure 9.2, one frame of a motion-blurred animation is shown. To really see the effect, you need to run the entire animation. Motion blur applies to all moving items, whether they are models, textures, lights, or reflections.
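Conceptually, motion blur amounts to averaging the scene at several sub-frame times. This toy sketch shows the idea; it is illustrative only (a real renderer samples while rendering, and `render_frame` here is a hypothetical stand-in for your renderer):

```python
def motion_blur(render_frame, t, blur_length=3, samples=4):
    """Approximate motion blur by averaging sub-frame renders.

    `render_frame(t)` returns a frame as a flat list of pixel values
    for time t; we average `samples` renders spread across the
    `blur_length` frames ending at t.
    """
    acc = None
    for i in range(samples):
        sub_t = t - blur_length * i / samples
        frame = render_frame(sub_t)
        acc = frame if acc is None else [a + b for a, b in zip(acc, frame)]
    return [v / samples for v in acc]

# Toy "scene": a single pixel whose brightness tracks a moving object.
blurred = motion_blur(lambda t: [t * 10.0], t=8.0, blur_length=4, samples=4)
```

A longer blur length averages over more of the object's path, which is why setting it to the whole animation's length gives the time-exposure look.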
Have you ever driven a car that suddenly left the road and floated above it, or thrown a ball into the air that never came down, or held a machine that suddenly fell apart into its separate parts? Controlling the movements of 3D objects can be difficult because they don't have to obey the laws of nature. That is, not unless you tell them to; that's where constrained motion and behaviors come in.
Constrained motion tells a model the limits of its motion: it can move this far but no farther, or rotate only halfway around. For example, you can make a head turn, but not all the way around. After all, it's not very realistic if your human figures have their heads on backwards. You can also link parts together so that they move in relation to one another.
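At its simplest, a rotation constraint is just a clamp on a joint's allowed range. Here is a minimal sketch, with hypothetical limits chosen for a turning head:

```python
def clamp_rotation(angle, lo=-90.0, hi=90.0):
    """Constrain a joint angle (in degrees) to its allowed range."""
    return max(lo, min(hi, angle))

# A head can turn roughly 90 degrees either way, but never backwards:
print(clamp_rotation(120.0))   # clamped to 90.0
print(clamp_rotation(-45.0))   # within limits: -45.0
```

The package applies a rule like this every time you drag the part, so no keyframe can ever put the head on backwards.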
With defining behaviors, you can, for example, tell your models to sit, roll over, and beg. By giving your models behaviors, they can interact with other models and the scene without being directed along every step. For example, if you give a ball a bounce behavior, then it automatically bounces off an object when it hits one. In the following example, Ray Dream Studio uses both constrained motion and defined behaviors:
Note: Here are the other types of constrained motion:
Note: Other behaviors include Point At, Bounce, Inverse Kinematics, Track, and Alignment.
With these settings, it becomes much easier to animate your models without worrying about parts flying away from the rest of the model. Behaviors can automate simpler motions to ease the animating process. Figure 9.3 shows the three-wheeler model with these settings.
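A behavior such as Bounce boils down to a small per-frame rule the package applies for you. Here is a rough sketch of what a bounce behavior might do internally; the names and numbers are illustrative, not Ray Dream's actual code:

```python
def step_bounce(y, vy, dt=1/30, gravity=-9.8, floor=0.0, restitution=0.8):
    """Advance a bouncing ball one frame; reflect its velocity at the floor.

    Returns the new (height, vertical velocity). `restitution` is the
    fraction of speed kept after each bounce.
    """
    vy += gravity * dt
    y += vy * dt
    if y < floor:               # hit the floor: reflect and damp
        y = floor
        vy = -vy * restitution
    return y, vy

# Dropped from 1 meter, the ball falls, hits, and rebounds automatically:
y, vy = 1.0, 0.0
for _ in range(60):             # two seconds at 30 frames per second
    y, vy = step_bounce(y, vy)
```

Because the rule runs every frame, the ball bounces off anything it hits without you placing a single keyframe.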
Imagine you want to animate a scene in which a simple 2D animation you've created is playing on a television screen in a 3D scene. The question is: would you have to re-create the simple 2D animation using 3D models, or is there some way to use the existing one? The answer is that you can use the existing 2D animation, with rotoscoping. Rotoscoping is the technique of adding 2D animations into a 3D scene.
Not only will rotoscoping work for the television example, but it's also useful as another way to animate textures. Think of the ocean example; you could rotoscope a simple repeating animation of waves to create a realistic ocean scene.
Rotoscoping isn't limited to models, either. Many packages let you rotoscope background images, which is useful for creating the sense of motion without having to move models. Look at how Ray Dream handles rotoscoping:
The example in Figure 9.4 shows how rotoscoping displays an advertising message across a new line of furniture.
Figure 9.4: Advertising a new furniture line with rotoscoping.
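Under the hood, rotoscoping just means choosing which frame of the 2D clip to use as the texture for each rendered 3D frame. A small sketch of that bookkeeping, with made-up file names:

```python
def rotoscope_frame(scene_frame, animation_frames, fps_3d=30, fps_2d=15):
    """Pick the 2D animation frame to use as a texture for a 3D frame.

    The 2D clip loops if the 3D animation is longer, and the frame
    rates are converted so the clip plays at its own speed.
    """
    t = scene_frame / fps_3d                     # time in seconds
    idx = int(t * fps_2d) % len(animation_frames)
    return animation_frames[idx]

clip = ["wave0.png", "wave1.png", "wave2.png", "wave3.png"]
texture = rotoscope_frame(scene_frame=45, animation_frames=clip)
```

The same indexing works whether the clip is mapped onto a television model or used as a looping background.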
Inverse kinematics is a buzzword in 3D graphics. A package that supports it can more easily animate complex systems of interrelated parts than a package that doesn't. The human body is an excellent example of a complex system of interrelated parts.
To make a human model do something as simple as walking, these are the steps you'd have to use in a 3D package without inverse kinematics: First move the body forward, then move the thigh forward, then the calf, then the foot, then the toes, then the toenails. Since the 3D package doesn't know the relationship between the different parts, each part has to move independently.
Kinematics helps define how these parts are connected. The connection helps determine how the parts move as a system; therefore, when the leg moves, the foot follows, and when the arm moves, the hand follows. Furthermore, you can apply constraints to these parts so that the system won't move in unnatural ways, like bending a knee backward (unless you're modeling an ostrich).
Any package that supports a hierarchy structure has the advantage of kinematics; that is, child parts attached to a parent move along with the parent. The inverse part is where inverse kinematics differs: the connections also work in reverse, so moving a child part pulls its parents along with it. This means you can pull a character's big toe and its leg will follow. Inverse kinematics makes it very easy to position animation keyframes.
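For the curious, here is what an inverse kinematics solver actually computes in the simplest case: a two-bone limb in 2D, solved analytically with the law of cosines. This is the generic textbook solution, not any particular package's implementation:

```python
import math

def two_bone_ik(target_x, target_y, l1, l2):
    """Analytic inverse kinematics for a two-segment limb in 2D.

    Given a target point and the two bone lengths, return the
    (shoulder, elbow) angles in radians that put the limb's tip on
    the target: move the hand, and the arm follows.
    """
    d = math.hypot(target_x, target_y)
    d = min(d, l1 + l2 - 1e-9)          # clamp unreachable targets
    # The law of cosines gives the elbow bend:
    cos_elbow = (d * d - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # The shoulder aims at the target, corrected for the elbow bend:
    shoulder = math.atan2(target_y, target_x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# Forward check: with both bones 1 unit long, the tip lands on target.
s, e = two_bone_ik(1.2, 0.8, 1.0, 1.0)
tip_x = math.cos(s) + math.cos(s + e)
tip_y = math.sin(s) + math.sin(s + e)
```

A full character rig chains many such joints together and adds the constraints discussed earlier so elbows and knees bend only the right way.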
Until recently, inverse kinematics was available only in high-end 3D packages, but now it's common to find it in low-end packages, such as Ray Dream Studio and Martin Hash's 3D Animation (MH3D). Take a look at how MH3D uses inverse kinematics to animate human characters:
Note: The scene starts with a default light, camera, and camera target, but these can't be moved. Adding new elements replaces the default ones with ones you can move.
Figure 9.5: A duck waddling across the screen, produced by using inverse kinematics.
Note: The figure will probably be displayed in Bound mode, which makes it look like a group of boxes. This is the fastest draw mode. You can change the draw mode by opening the Attributes panel with the command Window | Attributes Panel. Under the Type box are buttons with a circle, a diamond, and a square; they represent the Curved, Vector, and Bound draw modes.
In Figure 9.5, you can see a frame of the duck waddling across the screen. Although this motion is simple, using inverse kinematics is invaluable when you're animating complex motions.
Remember a few sections back when you learned how to use constrained motion to link objects together? Well, sometimes you don't want your models to be organized-you might want to have parts fly off chaotically into millions of pieces. These millions of pieces are called particles, and when grouped together are known as a particle system. These particle systems can be controlled and given behaviors.
Three common types of particle system movements are shatter, explode, and atomize. These special effects are so common that some 3D packages have them built in, and the results are, well, chaotic.
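An explode effect is conceptually simple: spawn pieces at a point with random outward velocities, then move them under gravity each frame. A minimal sketch, with illustrative names and values:

```python
import random

def explode(origin, count=100, speed=5.0, seed=1):
    """Spawn particles flying outward from a point in random directions."""
    rng = random.Random(seed)
    particles = []
    for _ in range(count):
        # A normalized Gaussian vector gives a uniform random direction:
        dx, dy, dz = (rng.gauss(0, 1) for _ in range(3))
        mag = (dx * dx + dy * dy + dz * dz) ** 0.5 or 1.0
        vel = (speed * dx / mag, speed * dy / mag, speed * dz / mag)
        particles.append({"pos": list(origin), "vel": vel})
    return particles

def step(particles, dt=1/30, gravity=-9.8):
    """Advance every particle one frame under gravity."""
    for p in particles:
        p["vel"] = (p["vel"][0], p["vel"][1] + gravity * dt, p["vel"][2])
        for axis in range(3):
            p["pos"][axis] += p["vel"][axis] * dt

pieces = explode(origin=(0.0, 1.0, 0.0))
for _ in range(30):          # one second of chaos
    step(pieces)
```

Shatter and atomize work the same way; they differ mainly in how the pieces are generated from the original model and how many of them there are.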
In Studio Pro, these three particle motions can be used to create some spectacular effects, like the following:
Figure 9.6: Images from an animation of a bowling pin atomized in Studio Pro.
3D packages sometimes play leapfrog with features, so to get the latest effects and features, you often have to buy the newest product released. To stop this insanity, 3D packages are beginning to support plug-ins. This way, additional functions can be added to a package without buying a whole new version.
In much the same way that Netscape and Photoshop have added functionality to their core products, plug-ins can add special effects or streamline tedious tasks. Although it's a new concept for some 3D packages, others have used plug-ins for a while.
The king of these 3D package plug-ins is the granddaddy of 3D animation for the PC: 3D Studio. 3D Studio uses what are called IPAS routines, which are separate programs that run inside the 3D Studio environment.
Note: 3D Studio is made by Autodesk, the same people who make AutoCAD, a popular PC-based CAD package. Autodesk recently renamed its multimedia tools group and now calls it Kinetix.
There are several different types of IPAS routines available for 3D Studio; most of them are made by third-party vendors.
3D Studio isn't the only 3D package to support plug-ins. Lightwave and SoftImage also support plug-ins, and even Ray Dream Studio ships with an extension kit that helps third-party vendors build plug-in modules.
Because of the wide variety of plug-ins available, it wouldn't do justice to show one as an example. Over time, you can expect the more popular plug-ins to begin to show up within the actual program.
Remember in the last chapter how you used Photoshop to touch up some of your 3D images? You could do the same thing with animations if you edited each frame individually, but that would be a lot of work. A better solution is using a video-editing package like Adobe's Premiere; Adobe has created several great editing tools.
So what can you do with video-editing software? You can move the frames around and reorder them, you can add transitions, such as dissolves or fade-outs, and you can add sound to your animations.
Most animations move the camera or objects consistently from start to finish. To produce an animation with several different viewpoints, you need to re-render each animation with a different camera selected. These different segments can then be combined into a single animation by using a video-editing package like Premiere. Take a look at how this is done for an accident reconstruction sequence:
The CD-ROM shows the finished product, and Figure 9.7 shows the Premiere environment with the line-up compiled. There's a lot more you can do with Premiere, such as loading several animations and overlaying them. Before you leave Premiere, you'll look at another valuable addition that can liven up your 3D animations.
Adding sound to your animations is more of a requirement these days-nothing makes the experience more real than sound that's well synchronized with the action. Adding a sound track with Premiere is easy:
Tip: Because sound data takes up so much space, limit any sound destined for the Web to 11 kHz, 8-bit mono.
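The arithmetic behind that tip: uncompressed audio size is the sample rate times bytes per sample times the number of channels. A quick calculation comparing the Web-friendly setting to CD quality:

```python
def audio_bytes(seconds, rate_hz=11025, bits=8, channels=1):
    """Uncompressed audio size: sample rate x bytes per sample x channels."""
    return int(seconds * rate_hz * (bits // 8) * channels)

# Ten seconds at the Web-friendly setting vs. CD quality (44.1 kHz, 16-bit stereo):
small = audio_bytes(10)                                      # 110,250 bytes, about 108 KB
large = audio_bytes(10, rate_hz=44100, bits=16, channels=2)  # 1,764,000 bytes, about 1.7 MB
```

That's a sixteen-fold difference for the same ten seconds of sound, which mattered a great deal on a dial-up connection.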
I'd show you the results of this example in a figure, but a still image can't convey sound, so go to the CD-ROM to hear it.
Premiere lets you draw right on top of the individual frames, but similar packages are beginning to appear that allow you to add special effects like bubbles, fire, and lightning to your animations in what is called post-production, or the work done after the animation is rendered.
One such package is Strata's MediaPaint. The core product functions much like an image-editing package, but add-in packages, such as Special Effects, Volume 1, can easily extend its features. This special effects pack lets you easily add otherwise difficult effects, such as bubbles, laser beams, fire, fireworks, lens flares, and lightning, among others. Now for an example.
Caution: MediaPaint requires that your computer be set to 24-bit true color mode for the program to work properly.
Adding special effects after an animation is complete offers you another way to enhance your 3D animations for your Web site. For this example, the sparks flying off the Apollo command module as it re-enters the atmosphere would have been difficult to create in a 3D package. Figure 9.8 shows the MediaPaint environment.
Figure 9.8: Using Strata's MediaPaint to add special effects to your animation.
You've now expanded on what you've already learned and have seen a sampling of what tools and technologies the real 3D pros use. To create that special scene you see in your mind's eye, you may need to use some of these advanced tricks to unfold your vision to the world. Not everything you need to be a professional 3D animator was covered, but at least now you know where to look.
So where do you go after learning about creating advanced rendered images? Well, anywhere you want actually, but I would recommend the following:
Q: I used constrained motion and behaviors to animate my models, but their motions still don't look very realistic. Is there any way to improve my models' motions?
A: Most of the high-end 3D packages support motion capture data. Motion capture is the process of placing reflective tape or sensors on an actual person and recording the motions of these sensors as the person performs some action. These recorded motions are saved in a format that 3D packages can read. You then line up the points of motion data with your model, and the model is controlled by the motion data.

These systems for recording motion data are very expensive, but the resulting motions are extremely realistic. Viewpoint Datalabs also sells a limited number of motion capture datasets. Contact Viewpoint if you're interested.
Q: I'm interested in the plug-ins that add more functions to a 3D package. Is it possible to create your own plug-ins?
A: Several companies make their living from creating IPAS routines. Some of the more popular are 4Division, Pyros Partnership, The Yost Group, and Digimation. These teams have serious graphics programmers, and their products represent the state of the art.

If you're a programmer and would like to try your hand at creating plug-ins, contact the company that produced your 3D package and ask about an SDK, or software developer's kit. These kits have the information you need to begin building your own plug-ins.
Q: I noticed that you covered video-editing packages as a means of editing 3D animations. Can these products be used to integrate real-life video with 3D-generated models?
A: Yes, both Premiere and MediaPaint can be used to integrate real-life video with 3D-rendered models, scenes, and elements. Premiere does this by loading the 3D animation into one channel and the video into another, then marking how they overlay one another.