AMZ Main

News, Reviews, & More





Science of Avatar

3D Tech

Avatar Game

Virtual Tour of Avatar looks at the 3D technology, virtual performance, shooting in simulcam and more
Excerpt: Variety offers one of the best analytical online articles to date on Cameron's sci-fi adventure: The Virtual Tour of Avatar.

In this Flash-driven feature, Variety looks at several aspects of Avatar's film technology, including virtual performance, shooting in simulcam, reinventing lighting, virtual collaboration, rendering flesh, and much more.

There is more information there than can be highlighted here, but here is an excerpt from the introductory article, titled Avatar team reinvents filmmaking:

One of the decade's last movies is likely to be one of the next decade's biggest game-changers. In James Cameron's Avatar, every living being on the alien world of Pandora turns out to be physically connected to those around it, forming a huge data network, a single giant brain, with a consciousness the natives call Eywa, their mother goddess.

It may not be a coincidence that in making the movie, Cameron has reinvented moviemaking in Pandora's image, leaving behind the more-or-less linear, industrial process of prep-production-post and replacing it with a nonlinear, networked process where everything happens at once. It's also where, like Pandora, everything flows through a single Godlike consciousness: Cameron.

The director didn't want to shoot, wait months for visual effects and animated characters, then cobble the whole thing together. He wanted to see what all the CG elements would look like right now, during shooting.

That had been tried for pieces of movies, but never a whole picture. So the process had to be invented, much of it by Lightstorm, Cameron's company, and Weta Digital, the lead visual effects shop.

There's a ton of info to explore in Variety's Virtual Tour of Avatar, so click the source link above to begin.

Dolby 3D and Barco Projection Take Center Stage at Avatar World Premiere

Dolby Laboratories, Inc., and Barco announced today that the Empire Leicester Square will show the world premiere of the James Cameron movie Avatar on Thursday, December 10, using the Dolby® 3D solution for large screens. An invited audience of more than 1,300 guests will fill the Empire Leicester Square, which will act as the lead theatre for this much-anticipated premiere.

“Empire Leicester Square boasts one of the largest screens in the UK, which is why it regularly plays host to world premieres,” said Julian Pinn, Business Development Manager, Production Services Group, Dolby Laboratories. “Avatar promises to deliver state-of-the-art 3D special effects to audiences around the world.

“To ensure every seat in Empire Leicester Square enjoys the fully immersive 3D experience, Dolby Production Services worked with Barco, Empire, and Twentieth Century Fox® to deliver the best combination of brightness, image sharpness, and color accuracy to display this spectacular new movie as director James Cameron intended.”

To meet Cameron’s exacting standards for Avatar’s premiere, Dolby and Barco worked to provide a customized solution comprising two Dolby 3D large-screen systems, featuring four Barco DP-3000 ultra-bright digital cinema projectors. Using one dual stack of Barco projectors for the left eye and another dual stack for the right eye, each set features Dolby's 3D color filter technology.

Six Full Sail University Graduates Work on James Cameron's Blockbuster Film, AVATAR

Full Sail University would like to congratulate a total of six graduates who worked on the critically acclaimed James Cameron film, AVATAR.

This film was written and developed by Cameron himself over a decade ago, but he was told the technology to make it was not yet available. Now, in 2009, with the help of Full Sail University's graduates, the film has been completed. The graduates worked on the production from the beginning of November 2006 through November 2009.

The graduates are Malcolm Thomas Gustave, Devin Fairbairn, Joe Harkins, Jeff Unay, Jason Stellwag, Ronnie Menahem, and Juan Peralta. "I chose Full Sail over many other schools mainly for its reputation, track record and amazing facility," said Jeff Unay, Full Sail graduate and Facial Lead for AVATAR.

Pete Bandstra, Program Director for Computer Animation and Game Art further added, "Having a total of six graduates play key positions in an epic motion picture such as AVATAR is truly a testament to Full Sail's commitment to teaching our students to go beyond industry standards. It is the best possible way for them to achieve their dreams."

Can Sensio Bring Avatar 3D to Your Living Room?
Company picked by Ubisoft to encode 3D for the Avatar game
By Susan Krashinsky

If the big names in home entertainment have their way, TV viewers everywhere are going to need glasses. It's already happened in the theatres: Audiences have donned Elvis Costello-esque specs and forked out $3 extra per ticket to see 3-D releases such as Up, Monsters vs. Aliens and Ice Age. Soon, the industry plans to bring the trend home with new televisions, DVD players and video games.

But there's a technological challenge to bringing three-dimensional images into the living room, one that a Montreal company hopes it can make a small fortune from – if it can get electronics manufacturers to buy into its solution.

Sensio Technologies Inc. has spent 10 years preparing for this moment, developing compression technology that squeezes two feeds into one and splits them again at the other end, to make home entertainment work in all dimensions.
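Sensio's actual codec is proprietary, but the general idea it describes (squeezing two feeds into one frame-sized signal and splitting them back at the other end) can be sketched as a simple side-by-side packing scheme, a common frame-compatible 3D format. The function names below are illustrative, not Sensio's API:

```python
import numpy as np

def pack_side_by_side(left, right):
    """Squeeze two full frames into one by halving horizontal resolution."""
    half_l = left[:, ::2]    # keep every other column of the left view
    half_r = right[:, ::2]   # same for the right view
    return np.concatenate([half_l, half_r], axis=1)  # width of one frame

def unpack_side_by_side(packed):
    """Split the packed frame and stretch each half back to full width."""
    w = packed.shape[1]
    half_l, half_r = packed[:, :w // 2], packed[:, w // 2:]
    left = np.repeat(half_l, 2, axis=1)   # naive upscale; real decoders filter
    right = np.repeat(half_r, 2, axis=1)
    return left, right
```

Real encoders use filtered downscaling and smarter reconstruction than the naive column duplication shown here; the point is only that the packed frame has the bandwidth of a single 2D feed.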

The company has already scored a deal to encode 3-D for the Avatar video game. Ubisoft Entertainment SA is launching the game alongside the December release of James Cameron's 3-D epic, the most expensive movie ever made. Sensio is eyeing DVD releases next. “These guys want to do the releases for next year, 2010.”

Electronic Image / Transfer

In November 1995, Modern Videofilm opened a specially designed Glendale facility to service studio clients in the preparation of product for electronic distribution.

Today, the Glendale facility not only sets the pace in feature mastering for D-Cinema and home distribution, but also provides a wide array of support services like film scanning, film recording, down conversions, QC, and duplication to almost any video or electronic file format.

Modern also provides a 33-seat D-Cinema theatre with a 24-foot Stewart micro-perf screen. Its scanning and recording hardware is network-optimized for seamless integration into its Digital Intermediate pipeline.

James Cameron: "The experience of color correcting Ghosts of the Abyss, in real time on the big screen in HD, is a dream come true for me. After two decades of cursing the vagaries of photochemistry while color timing my films, it feels like we have rocketed out of medieval times into the 21st century.

I really enjoyed working in Modern VideoFilm's big theater, which exactly reproduces the size and feel of a movie theater, and correcting the color on my film at the touch of a trackball. And not only changing color, but hue and black values and contrast, and so on, and with power windows, actually changing the color in certain areas of the shots.

Working in the data environment, going from Inferno to I-Q to Da Vinci, we have performed hundreds of subtle manipulations to the shots without loss or noise being added, and the end result is as crisp and clean as the camera original.

And to top it off, we've found that what we see on the big screen at Modern VideoFilm translates perfectly to film without requiring scene to scene corrections at the printing stage, even in IMAX. So it has been a slam dunk success of an experience."

DLP Spearheads 3D Development
By Robert Fure

You may have noticed a lot of 3D related news lately, and with good reason. 3D is poised to gobble up a sizable portion of the market. OK, well that may be an exaggeration now, but there is no denying that by 2010, 3D will be a fixture in features and, hopefully, in your homes and gaming systems.

DLP makes 3D possible with the aid of the digital micro-mirror device. The DMD, often called “the chip” to make it simpler on the non-technological geniuses like myself, contains several hundred thousand tiny mirrors that are used to reflect light.

Your TV only has to display images at 60 Hz for your eye to perceive a full-quality picture. So in the same amount of time, this chip sends twice that amount of information, which is, not coincidentally, exactly the amount of information needed to present a full-quality, sharp 3D picture. Film, for comparison, is projected at 144 Hz.
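The refresh-rate arithmetic quoted above is easy to verify directly (plain arithmetic using the figures in the article):

```python
# Refresh-rate arithmetic behind DLP 3D, using the figures quoted above.
PER_EYE_HZ = 60   # refresh rate one eye needs for a full-quality image
EYES = 2

# A 3D-ready chip alternates left/right images, so it must run at twice
# the single-eye rate to keep each eye at a full 60 Hz.
required_chip_hz = PER_EYE_HZ * EYES

# Digital 3D cinema triple-flashes each of 24 frames per second, per eye.
FILM_FPS = 24
FLASHES_PER_FRAME = 3
cinema_hz = FILM_FPS * FLASHES_PER_FRAME * EYES

print(required_chip_hz, cinema_hz)  # 120 144
```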

Any TV you purchase that has the DLP logo on it is already prepared for 3D viewing; it’s just a matter of the content becoming available. You can play your favorite video games in 3D, provided developers continue to step up. The same chip can also drive dual screens, or present two completely separate images to two different viewers.

I mentioned 3D gaming; well, it's definitely coming. The Avatar game, to complement James Cameron’s movie, is being created in 3D. Nvidia has recently partnered with Mitsubishi to bring 3D into the home and to the home PC.

Soon you’ll be able to pick up 3D ready monitors off of the shelf and blaze away at your favorite games in 3 beautiful dimensions. What’s the hold up? Primarily, content, though that may quickly become a non-issue. For 3D really to take hold in the world, it needs to generate money. To generate money you need product.

"Movie News, Movie Reviews,
and Unsolicited Attitude"


All of the articles on this page are excerpts; click on the source link for the complete article.

Anatomy of a Motion-Capture Scene in Avatar
For most of Avatar, the performance-capture stage, known as the volume, was the center of the action. Here’s how James Cameron turned stage performances into rich 3D CG scenes.
By Anne Thompson | Source:

Stage render by Axel De Roy; Diagrams by Locografix

Most of James Cameron’s space epic, Avatar, was shot on a performance-capture stage, known as the volume, in Playa Vista, Calif. The volume was rimmed by 120 stationary video cameras, which could record the movements of all actors at once in 3D, with submillimeter precision. Data from the cameras was streamed into Autodesk software, which translates actors’ movements into digital characters in real time within a low-resolution computer-generated environment.

So riding a fake banshee mockup onstage instantly translated to CG footage. Multiple cameramen were used on set for reference video, but because the volume essentially captures performances from every angle at once, Cameron could digitally render whatever angles and shots he wanted after the performance, adjusting the camera movements while viewing playback.
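Each camera on its own sees only a marker's 2D image position; recovering a 3D point takes at least two views. The volume software fuses data from all 120 cameras, but the core idea can be sketched as two-ray triangulation: take the midpoint of the closest approach between two viewing rays (a minimal illustration, not the production pipeline):

```python
import numpy as np

def triangulate_marker(origin_a, dir_a, origin_b, dir_b):
    """Locate a marker seen by two cameras: midpoint of the closest
    approach between the two viewing rays (exact when the rays meet)."""
    da = dir_a / np.linalg.norm(dir_a)
    db = dir_b / np.linalg.norm(dir_b)
    w0 = origin_a - origin_b
    # Standard closest-points-on-two-lines solution.
    a, b, c = da @ da, da @ db, db @ db
    d, e = da @ w0, db @ w0
    denom = a * c - b * b          # zero only if the rays are parallel
    s = (b * e - c * d) / denom    # parameter along ray A
    t = (a * e - b * d) / denom    # parameter along ray B
    p_a = origin_a + s * da
    p_b = origin_b + t * db
    return (p_a + p_b) / 2
```

With two cameras at (0, 0, 0) and (10, 0, 0) both aimed at a marker at (5, 5, 0), the midpoint recovers the marker exactly; with noisy real cameras, the two rays miss slightly and the midpoint is a least-squares compromise.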

Like many actors in Avatar, Zoë Saldana plays a fully computer-generated character, Na’vi princess Neytiri. To map her movements to her digital doppelgänger, Saldana wore a motion-capture bodysuit with reference markers and stripes. She also wore a head rig designed by Cameron that aimed a small video camera at her face. That camera tracked green ink dots, painted on Saldana’s face, throughout the scene, giving Cameron closeup-level detail of changes in expression to map to Neytiri’s CG face. Click the source link above to read the complete article about Avatar's motion capture.
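Tracking painted dots like these boils down to isolating a distinctive color and reducing each blob of pixels to a point. Here is a toy version of that idea, a drastic simplification of the real head-rig tracker, with an arbitrary threshold:

```python
import numpy as np

def green_mask(frame, margin=40):
    """Boolean mask of pixels where the green channel clearly dominates
    red and blue -- a crude stand-in for isolating painted green dots."""
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    return (g - np.maximum(r, b)) > margin

def dot_centroid(mask):
    """Centroid (row, col) of the marked pixels -- enough to follow a
    single dot from frame to frame."""
    ys, xs = np.nonzero(mask)
    return ys.mean(), xs.mean()
```

A production tracker would label each dot separately, handle occlusion, and run at camera frame rate; this sketch shows only the color-isolation and centroid steps.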

How James Cameron's Innovative
New 3D Tech Created Avatar
By Anne Thompson | Excerpt:

Director James Cameron is known for his innovations in movie technology and his ambition to make CG look and feel real. His next film, Avatar, will put his reputation to the test. Can Cameron make blue alien creatures look real on the big screen?

With all the attention focused on the film, anything short of perfection may not be good enough. Here is how Cameron plans to make movie history with a host of new technologies and years of development.

The 280,000-square-foot studio in Playa Vista, Calif., has a curious history as a launching pad for big, risky ideas. In the 1940s, Howard Hughes used the huge wooden airplane hangar to construct the massive plywood H-4 Hercules seaplane—famously known as the Spruce Goose. Two years ago, movie director James Cameron was in the Playa Vista studio at a crucial stage in his own big, risky project.

He was viewing early footage from Avatar, the sci-fi epic he had been dreaming about since his early 20s. Cameron’s studio partner, Twentieth Century Fox, had already committed to a budget of $200 million (the final cost is reportedly closer to $300 million) on what promised to be the most technologically advanced work of cinema ever undertaken. But as Cameron looked into his computer monitor, he knew something had gone terribly wrong.

The film—although “film” seems to be an anachronistic term for such a digitally intense production—takes place on a moon called Pandora, which circles a distant planet. Cameron wrote his first treatment for the movie in 1995 with the intention of pushing the boundaries of what was possible with cinematic digital effects. In his view, making Avatar would require blending live-action sequences and digitally captured performances in a three-dimensional, computer-generated world.

Part action–adventure, part interstellar love story, the project was so ambitious that it took 10 more years before Cameron felt cinema technology had advanced to the point where Avatar was even possible. The scene on Cameron’s screen at Playa Vista—an important turning point in the movie’s plot—showed Na’vi princess Neytiri as she first encounters Sully’s Avatar in the jungles of Pandora. Everything in the forest is luminous. Glowing sprites float through Pandora’s atmosphere, landing on Sully as Neytiri determines if he can be trusted. Click on the source link above for the complete article.

Avatar: 21st Century Motion Capture
By Jody Duncan | Excerpt: Cinefex Magazine
Transcribed by SFMZ

As James Cameron originally envisioned the Avatar project, he would film live-action plates for the film in rainforests, and then add the CG Na'vi and avatar characters and creatures to those plates, just as Steven Spielberg had populated his Jurassic Park, in part, with digital dinosaurs.

But, performance-wise, there was a huge chasm separating an ambling, expressionless CG Brachiosaur, on screen for a few minutes, and emotive, humanoid characters that would have to carry a feature-length film.

What Cameron needed to make Avatar a reality was a way to motion capture not just the gross body movement of an actor, but every nuance of that actor's facial performance.

"Our goal in using performance capture," noted Avatar producer and longtime Cameron collaborator Jon Landau, "was not to replace the actor with our computer animated character, but to preserve the actor - because what a great actor does and what a great animator does are antithetical to each other. Dustin Hoffman can sit there and do nothing in All the President's Men, and you are riveted. But no animator would ever animate a character to just sit there and do nothing. We wanted performance capture to be the 21st-century version of prosthetics, something that would allow actors to play fantastic characters that they could not otherwise play."

Gross body motion capture had become a relatively routine filmmaking technique by the time Cameron and Landau began considering the Avatar project; and, in fact, Cameron had used motion capture to create digital background passengers aboard Titanic. "Even back on Titanic," commented Cameron, "we knew how to put big marker balls on people and get a skeleton from motion capture. But we didn't know how to capture faces. Faces were still hand-animated. The thing that had not been cracked, the thing that would have to be cracked to do this movie, was real human facial response in CG. Nobody had done that yet."

A thorough examination of existing facial capture technology led Cameron and Landau to the conclusion that it was still in its embryonic stages, and that Avatar's time had not yet come. "I went off and did Titanic, figuring I would look at Avatar again afterward. There was no rush to make it because it was a timeless project. I knew it would still be a good idea in two years - or 20 years - and so I was content to stick it in a drawer for a while."

The new Avatar of technology in movies
The James Cameron blockbuster uses performance capture technology, which creates computerized images from real human action
By Akanksha Prasad | Excerpt:

Since last Friday, the world of cinema has suddenly been talking about technology and movies. And the reason? No doubt the latest blockbuster from James Cameron - Avatar. Introducing the 3D experience on a larger scale, the movie is said to turn a new leaf in the history of movie making.

Remember, in an earlier story on the significance of technology behind the movie experience, Harry Potter and the software mystery, we mentioned the upcoming 3D experience and the new technology called stereoscopic 3D vision, which is catching on really fast in the market.

And now Avatar underscores that. According to reports, the movie uses performance capture technology, which creates computerized images from real human action, so every action is more real and palpable. Around 70 per cent of the footage is Computer-Generated Imagery (CGI): the cast wore motion-capture suits that carried sensors, which enabled the computers to register the movements of the body for performance capture.

It is mentioned in a report that for Avatar, the makers of the movie designed a new hi-def 3D camera system. It weighs around 50 pounds and pairs two Sony HDC-F950 HD cameras 2.5 inches apart, matching the stereoscopic separation of human eyes. The two camera lenses converge on a focal point with the help of a computer, which allows smooth camera moves and action sequences.
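The convergence described here is simple trigonometry: with the lenses a fixed distance apart, the inward "toe-in" angle of each lens follows from the distance to the convergence point. A sketch using the 2.5-inch interaxial from the report (the subject distance is a made-up example):

```python
import math

def toe_in_angle_deg(interaxial_in, convergence_dist_in):
    """Inward rotation of each lens in a symmetric toe-in stereo rig,
    so that both optical axes meet at the convergence point."""
    return math.degrees(math.atan((interaxial_in / 2) / convergence_dist_in))

# Lenses 2.5 in apart, converging on a subject 120 in (10 ft) away
# (the distance is a hypothetical example):
angle = toe_in_angle_deg(2.5, 120)
print(round(angle, 2))  # about 0.6 degrees: tiny angles, hence the computer assist
```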

In an interview on the making of the movie, actors Sam Worthington and Zoe Saldana said that the experience of working on such a movie is entirely different. The challenge was that, since motion capture records a 360-degree view, the actors have more freedom and need to concentrate on the character rather than on the camera.

ILM steps in to help finish 'Avatar' visual effects
By Daniel Terdiman | Source:

About a year ago, with James Cameron's science-fiction epic "Avatar" well under way, it became clear that Weta Digital, the visual effects studio doing much of the computer generated imagery, or CGI, on the project, was a bit in over its head. At that point, the movie, which opened Friday, was about 40 minutes longer than it ended up being, and what was needed to finish the project was another company that could come in and lend a helping hand--and do so at the same, very high level, that Weta was working at.

And that's where Industrial Light & Magic came in, recalled John Knoll, the Oscar-winning visual effects supervisor tasked with parachuting in to help finish what was, more than on most films, the crucial job of crafting the "Avatar" CGI work.

What followed was months of coordination between ILM, Weta, and Cameron's production company, Lightstorm Entertainment, with a primary goal of ensuring that the two visual effects teams, one in San Francisco and the other in New Zealand, avoided any unnecessary duplication of effort, even as both sometimes found themselves working on effects for the same movie sequences. For ILM, this wasn't the first time it had been called in to help rescue another effects house, but it may well have been the first time it did so for one as big and as accomplished as Weta.

And while ILM's overall contribution to the finished film was minor compared to Weta's, the fact that "Avatar" came out on time and is being seen as a visual tour de force is certainly due, in part, to ILM's ability to come in and, if not save the day, at least contribute mightily to the day turning out well. For Knoll, the challenge of working alongside Weta was about identifying a body of work that limited the number of assets the ILM team had to develop and which would allow them to be the most helpful.

Ultimately, they were handed the keys to creating the visual effects for many of the specialized vehicles in the film, including the Valkyrie, a large shuttle used to move people and equipment, and several different types of helicopters, as well as the landscapes those vehicles lived in. ILM also did the effects work on the film's final battle scene, taking responsibility for the shots of all the vehicles taking off, as well as the sequence's cockpit interior shots.

Intensity3D to Debut Under the Dome at Euromax
By Erin McCarthy | Excerpt:

The forthcoming Euromax conference will see the launch of Intensity3D, a radical new concept in multi-dimensional video projection. Intensity3D combines the best in digital cinema and curved screen technology to provide new levels of image quality on a curved screen.

Developed by creative design house 7thSense in conjunction with Danish-based systems integrator and production services provider Sirius 3D, Intensity3D integrates a DCI-compliant playback server, 7thSense's Delta stereoscopic media server, and two digital cinema projectors (one image laid over the other) to produce a geometrically correct, extremely bright image up to 24m wide.

This allows content producers to show both 2D and 3D footage to its best effect, and takes the end-customer experience to a completely new level. Intensity3D uses a patent-pending, real-time image mixing algorithm that creates an encrypted path for content producers to access, meaning that the images created by its unique architecture are not just uniquely bright and clear, they are also protected from copying. The creation of such a secure workflow also opens up the possibility of a new business model, with venue operators saving costs by sharing secure libraries of 2D and 3D clips across multiple sites.

The venue that hosts this year's Euromax conference is Copenhagen's Tycho Brahe Planetarium, which is also upgrading its systems to the Intensity3D standard. "We are delighted to be showing Intensity3D formally to the industry for the first time at Euromax," says 7thSense co-founder and director Ian Macpherson. "The conference brings together some of the world's most influential and creative giant-screen theatre operators, 3D film producers, distributors and marketeers, and we will be using the newly upgraded Tycho Brahe Planetarium as a showcase for what our system can do."

"Ever since Tycho Brahe became the first theatre in the world capable of projecting digital 3D onto the dome in 2006, we have been working to increase the size, resolution, and brightness of the dome projection to unprecedented levels, while maintaining the robustness and ease of maintenance of the system," Iversen explains.

"The installation of Intensity3D is the culmination of this research, and it is fitting that in this, the International Year of Astronomy, people of all ages will be able to benefit from the system's fabulous imagery at Tycho Brahe, as they explore our universe and increase their knowledge of it. Together with our partners at 7thSense, we believe that many more Intensity3D installations will soon follow so that this experience can be shared by audiences all over the world." The next Euromax conference takes place on 14-15 June 2009 at Tycho Brahe Planetarium, Copenhagen, Denmark.

The Tech Behind 3D's Big Revival
By Erin McCarthy | Excerpt:

3D has been around for a century, but only now are we seeing 3D in Super Bowl ads and in big Hollywood 3D releases like Coraline and Monsters vs. Aliens. So what has convinced Hollywood that 3D is finally ready for its closeup? The short answer is that technology has finally caught up with the concept.

Hollywood is buzzing about 3D. Dreamworks Animation CEO Jeffrey Katzenberg has compared it to the introduction of color. Director James Cameron delayed the release of his stereoscopic epic Avatar in part to give theaters more time to convert to 3D capability.

A dozen or more stereoscopic films will be released in 2009, and more than 30 movies are in production. But stereoscopic films are not a revolutionary concept; in fact, audiences have been paying for them since The Power of Love in 1922. The golden age of 3D was in the 1950s, with a brief resurgence in the 1980s. Each time experts heralded the format as the next big thing in filmmaking, and each time, the surge quietly subsided.

So what has convinced Hollywood that 3D is finally ready for its closeup? The short answer may be that technology has caught up with the concept. Stereoscopic films of the past were plagued by problems. “People always liked 3D,” says David Cohen, an associate editor at Variety. “It just didn’t work very well.” The 3D format is now much more reliable, thanks to the introduction of digital technology and products developed by companies such as RealD, based in Beverly Hills, Calif., which spent 18 months combining stereoscopic science with digital projection.

How it Works: 3D Animation

A stereoscopic movie consists of two slightly different views of the same action—one for each eye—which creates the illusion of depth. An audience wearing special glasses is able to see the heroine below jump into “personal space” (action that appears to occur in front of the screen), while the pursuing robot remains in “world space” (action that seems to take place behind the screen). Most studios plan to release upcoming animated films in 3D; here’s how it’s done.
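Whether a point reads as personal space or world space comes down to its parallax: the horizontal offset between its left-eye and right-eye projections at the screen plane. A small similar-triangles sketch (units and numbers are illustrative):

```python
def screen_parallax(eye_separation, screen_distance, object_distance):
    """Horizontal offset between the two eye projections at the screen
    plane, by similar triangles (all distances measured from the viewer,
    in the same units).

    > 0  -> uncrossed parallax: point appears in 'world space' (behind screen)
    < 0  -> crossed parallax: point appears in 'personal space' (in front)
    == 0 -> point sits exactly on the screen plane
    """
    return eye_separation * (object_distance - screen_distance) / object_distance
```

With a 6.5 cm eye separation and a screen 200 cm away, a point at 400 cm shows +3.25 cm of parallax (world space), while a point at 100 cm shows -6.5 cm (personal space).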

Animation: Animators capture the action with stereo camera rigs they create in a computer. To ensure that no elements in a scene appear flat, filmmakers shoot with as many as three dual rigs. Commercial software and proprietary tools allow animators to precisely control what appears in personal space and world space, and where the different camera views coincide.

Projection: A single digital projector, outfitted with a photo-optical device, superimposes the two films—each with a different polarization—on a special silver movie screen that is capable of showing both 3D and 2D films. The projector triple-flashes each frame for each eye for a total of 144 frames per second—double the rate of regular cinema.

The View: Audience members wear glasses with circular polarization, which allows only one image into each eye by restricting light; the brain combines those images into 3D. The glasses also maintain stereoscopic fusion.

Newly Developed 3D Digital
Real Image System
FUJIFILM FinePix Real 3D System

FUJIFILM Corporation today announces a radical departure from current imaging systems with the development of a completely new, real image system (3D digital camera, 3D digital photo frame, 3D print) that marks a complete break from previous attempts to introduce this technology.

The arrival of digital photography over a decade ago opened up so many new ways of enjoying images, not only through capture, but also through manipulation, printing and display. Sales of digital cameras, and other devices like camera phones or webcams have raced ahead of what experts had expected because of the sheer scope of what has become possible in digital imaging. So many more consumers are enjoying photography through their cameras, PCs and prints than was the case in the heyday of film.

The Technology Behind the 3D Camera

The 3D camera depends heavily on a newly developed chip called the ‘Real Photo Processor 3D’ which synchronizes the data passed to it by both sensors, and instantaneously blends the information into a single high quality image, for both stills and movies.

‘Built-in 3D auto’ determines optimal shooting conditions from both sensors. 3D auto means that as soon as the shutter is depressed, key metrics for the image, such as focus, zoom range, exposure, etc, are synchronized.

Viewing with the FinePix Real 3D System

A new 8.4-inch “FinePix Real 3D Photo Frame” with over 920,000 pixels has also been developed. The LCD monitor on the camera and the stand-alone display panel share similar technologies: the screen flickering and image ghosting that beset earlier developments have been solved, giving crisp, high-resolution viewing of images in glorious 3D or standard 2D. A newly developed “light direction control module” behind the LCD steers light toward the right eye and the left eye. This light direction control mechanism enables easy, high-quality 3D viewing without special 3D glasses.

Using know-how gained through years of development of the Fujifilm Frontier, Fujifilm has developed a 3D printing system using a fine-pitch lenticular sheet, giving high-precision, fine-quality, multiple-viewpoint 3D like never before.

The FinePix Real 3D System is also paving the way for new possibilities in 2D photo enjoyment. At the heart of the system is a new concept camera fitted with dual lenses. Each lens can capture stills or movies from a slightly different position, producing the basis of the 3D image. Combining the two lenses also makes entirely new functions possible.

One example is an image-quality-improvement function (Simultaneous Dual-Image Shooting: Multi-Expression), and that is just one possibility from a dual-lens camera. Other fascinating possibilities include Simultaneous Dual-Image Shooting (Telephoto and Wide Angle), Simultaneous Dual-Image Shooting (film modes), Ultra-Wide Panoramic Shooting, and simultaneous movie-still capture. New dimensions in imaging mean a wealth of new possibilities which will revolutionise the way consumers enjoy imaging.


Site design by SFMZone. Copyright 2010 All Rights Reserved.