By John Blyler
Disney’s original science-fiction classic, “TRON,” is credited with ushering in the age of computer-graphics (CG) animation. Today, improved CG animation quality makes it difficult for viewers to distinguish live-action elements from CG-generated ones, as in “Avatar.” In the recent sequel to “TRON,” called “TRON LEGACY,” CG animation was used to create a convincing younger version of the now-older protagonist, Kevin Flynn (played by Jeff Bridges).
The advances in CG filmmaking reflect 30 years of progress in semiconductors, high-end PC-server hardware, and computer programming. Largely unnoticed are the equal advances made in the world of embedded systems. Let’s look at just one aspect of the improvements brought by the embedded and sensor technology revolution: illumination control.
By today’s standards, the original “TRON” movie had very little in the way of computer effects, though it was considered state-of-the-art at the time. Many of the computer-generated scenes looked little better than the graphics-limited video games of the day, and the suits in “TRON” weren’t actually illuminated. The “TRON look” was achieved by shooting the movie in black and white and then enlarging every frame onto 8-by-10, black-and-white film-positive cels for rotoscoping. This technique allowed animators to trace over live-action movement, frame by frame, for use in animated films.
According to Alan McFarland, CTO for Nila—specialists in film and television lighting—the “TRON” animators would then colorize each one of those cels by hand, using tint dyes, airbrushing, and so forth. Each cel was then re-photographed on a backlit animation stand with colored gels. Selective double-exposure techniques gave everything that was supposed to glow its particular aura. Making a feature-length film in that fashion was very expensive by the standards of the day and would be cost-prohibitive today.
“For ‘TRON LEGACY,’ the suits actually illuminated on their own,” explains McFarland (see Figure 1). “Motion-picture cameras have more than enough sensitivity to capture the illumination as is, making digital rotoscoping of the suits in post-production mostly unnecessary.”
|Figure 1: Self-luminescent suits with embedded electronics were used in “TRON LEGACY.” (Courtesy of Disney)|
Most suits also contained a detachable Identity Disc or Light Disc that was a key element of the story. Each disc contained all of the memories—everything seen, heard, or experienced—by the anthropomorphized programs in the virtual world of “TRON.” Although a program’s glowing disc could be detached from the body for use as a weapon, it was normally mounted on the wearer’s upper back.
Light-emitting-diode (LED)-based light discs and self-illuminating suits need power and control—even in the virtual world of “TRON.” This is where tiny, low-power embedded systems can win out over costlier post-production CG animation.
The light discs used LEDs controlled by XBee modules for lighting, recalls Wade Patterson, CEO of Synapse Wireless. “The only time that Nila controlled the disc lighting was when the disc was attached to the costume. The studio’s prop department handled the disc lighting when it was in the actor’s hand.”
The suits were another matter, because they had to be flexible and tailored to the shape of the actor. To achieve flexibility, each suit was illuminated with custom electroluminescent (EL) material that could be shaped into various patterns. McFarland points out that some of the suits, such as Sam Flynn’s costume and Clu’s outfit, had more than 50 individual pieces of EL material. That material totaled some 1100 square inches per costume—almost half of the entire area of a typical human body.
The EL material required 290 V AC at 1100 Hz—albeit at very low current. This unusual power requirement necessitated the creation of a custom 150-watt inverter to convert the direct current (DC) from the battery pack into alternating current (AC). Sam’s costume required two of these inverters, while Kevin Flynn’s costume—with fewer but much larger pieces of electroluminescent material—required four. The guards, Quorra, and the other suits typically needed only one of the custom inverters.
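As a rough sanity check on those figures, the worst-case RMS current on the 290 V AC side of a fully loaded 150-watt inverter works out to only about half an amp—and because EL panels are largely capacitive loads, the real current draw would typically be lower still. A minimal sketch of the arithmetic:

```python
def inverter_current_a(power_w: float, voltage_v: float) -> float:
    """Worst-case RMS current if the inverter delivers its full rated power."""
    return power_w / voltage_v

# 150-watt inverter driving the 290 V AC electroluminescent material
print(round(inverter_current_a(150.0, 290.0), 3))  # → 0.517
```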
These inverters were usually located in the Identity Disc hub, which was mounted to the back of the suit (see Figure 2). The available space inside the disc hub was about the same volume as a softball. Each hub held the suit’s 150-watt EL inverters—at least two on the lead costumes—plus a daughtercard for the wireless-network lighting control and monitoring module. Power for this embedded system was supplied by batteries that were typically located on the waist of the actor and disguised to look like part of the costume.
|Figure 2: The light disc was mounted on a hub assembly, which contained all of the lighting control and monitoring electronics plus the batteries. (Courtesy of Disney)|
The new, super-high-energy-density batteries were developed especially for the Tesla Roadster. On the Sam Flynn and Clu costumes, these batteries provided about 11 minutes of runtime. They could be fully recharged in 15 minutes.
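Taken together with the inverter ratings above, those runtimes imply a pack of roughly 55 watt-hours for the two-inverter Sam Flynn costume. This is only an upper bound—the inverters would rarely run at their full rating—but the back-of-the-envelope arithmetic is simple:

```python
def pack_energy_wh(inverters: int, watts_each: float, runtime_min: float) -> float:
    """Upper-bound battery energy: full inverter rating sustained for the runtime."""
    return inverters * watts_each * runtime_min / 60.0

# Sam Flynn's costume: two 150-watt inverters, about 11 minutes of runtime
print(pack_energy_wh(2, 150.0, 11.0))  # → 55.0
```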
The limited disc space meant that all of the electronic components had to be as small as possible. Unfortunately, smaller inverters generate more heat, which limited the maximum runtime of a single movie take to about 8 minutes before the suit would overheat, notes McFarland. “Usually, this wasn’t a problem, as Joe Kosinski—the director—set up shots that ran considerably shorter.”
Keeping their cool
Heat-generating inverters degrade the performance of the electronics—and of the actors who wear them. How did the performers, already overheating in rubberized fabric suits, keep their cool?
Coolness was achieved through the use of four 20-ton air conditioners, which chilled the set down to about 40 degrees Fahrenheit. As McFarland remembers, “Those of us off-camera had to wear parkas to keep warm on the set.” To further ensure that the electronics and actors didn’t overheat, he used thermal epoxy to attach a National Semiconductor LM34 precision temperature sensor to the main inductor on each inverter. The sensor’s output was digitized by the wireless monitoring-and-control module’s analog-to-digital converter (ADC) and transmitted to the off-set control computer for display.
Writing the code for temperature monitoring was also pretty cool. “Only one line of code was needed to read temperature from the analog-to-digital converter,” notes Patterson. Because the wireless monitoring system was bidirectional, the code could be changed on the fly if it wasn’t working as desired. This ability to read sensor values quickly is important—especially as the number of nodes grows into the hundreds.
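On the node, that one-liner returns a raw ADC count; converting the count to a temperature is nearly as simple, because the LM34 outputs 10 mV per degree Fahrenheit. The sketch below assumes a hypothetical 10-bit ADC with a 3.3 V reference—the actual reference on the Synapse module isn’t given here:

```python
LM34_MV_PER_DEG_F = 10.0  # LM34 output scale: 10 mV per degree Fahrenheit
ADC_VREF_MV = 3300.0      # assumed ADC reference voltage (hypothetical)
ADC_FULL_SCALE = 1023     # 10-bit converter

def adc_to_fahrenheit(counts: int) -> float:
    """Convert a raw ADC reading of the LM34 output to degrees Fahrenheit."""
    millivolts = counts * ADC_VREF_MV / ADC_FULL_SCALE
    return millivolts / LM34_MV_PER_DEG_F

print(adc_to_fahrenheit(310))  # → 100.0
```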
Wireless control and monitoring
An integral part of the embedded system used in “TRON LEGACY” was the wireless-network lighting control and monitoring module, which was developed by Synapse Wireless. By using the company’s “SNAP” network, Nila’s McFarland was able to turn the suit lighting on and off instantly. Furthermore, the SNAP wireless software returned data to the control computer screen, showing battery levels, runtime, and inverter temperature. This real-time data enabled the movie’s director to maximize the use of special effects by monitoring the suit battery life and inverter temperature.
An aptly named “Sleepy Mesh” state in the SNAP network let the radio-frequency (RF) transceivers on the nodes sleep and be awakened as needed. “Sleepy Mesh” didn’t merely put one node to sleep; it could issue a network-wide sleep command. (Think of the first encounter with the Borg on Star Trek.) The power savings from putting the entire network to sleep were significant—far greater than a traditional mesh network can offer. In fact, each node’s battery life was extended up to the shelf life of the battery.
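The battery-life claim follows directly from duty cycling: if the whole network sleeps except for brief wake windows, the average current approaches the sleep current. The numbers below (25 mA awake, 2 µA asleep, a 0.1-second wake window every 10 seconds, a 1000-mAh cell) are purely illustrative, not Synapse specifications:

```python
def avg_current_ma(awake_ma: float, sleep_ua: float,
                   awake_s: float, period_s: float) -> float:
    """Average current of a node that wakes for awake_s out of every period_s."""
    duty = awake_s / period_s
    return awake_ma * duty + (sleep_ua / 1000.0) * (1.0 - duty)

def battery_life_days(capacity_mah: float, avg_ma: float) -> float:
    """Ideal battery life, ignoring self-discharge and capacity derating."""
    return capacity_mah / avg_ma / 24.0

i_avg = avg_current_ma(25.0, 2.0, 0.1, 10.0)    # ~0.25 mA average
print(round(battery_life_days(1000.0, i_avg)))  # → 165 (days)
```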
Although 104 suits were built to incorporate the wireless monitor-controller system, no more than 25 were ever used on the same day, and far fewer were used simultaneously. This was not a limitation of the technology, but rather a reality of the demands of shooting a movie. “I doubt I would’ve survived the show if we’d had any days that intense,” said McFarland, referring to the prospect of controlling all 104 suits at once.
Each Synapse system contained eight ADC inputs; 10 to 20 digital-output interface control ports; and a low-power, 2.4-GHz, IEEE 802.15.4 personal-area-network (PAN) RF module. Data from each wireless module (see Figure 3) could be protected with AES-128 encryption. “You can even get the SNAP software on a microcontroller the size of a fingernail,” explained Patterson. “We have one customer that has embedded such modules into clothing.”
|Figure 3: The Synapse RF Engine with antenna, part of the overall wireless control and monitoring system.|
Wireless networks—especially in crowded unlicensed bands like 2.4 GHz—are notorious for interference. Yet McFarland noticed little interference, even though the typical range from the controlling PC to the costumes was about 300 ft. Some of the bidirectional SNAP channels actually performed better than others, despite the presence of ZigBee-controlled studio props. To ensure the optimum coverage, McFarland used a USB extension cable to hang the SMA-equipped SNAP USB stick high above the set.
Another advantage of the bidirectional, peer-to-peer control and monitoring network was the improvement in communication coverage as more costumes arrived on the set. Peer-based wireless networks often improve in performance and extend their coverage as more nodes are added to the system.
How did Hollywood learn about Synapse Wireless, a small but growing, Alabama-based wireless control and monitoring company? That’s where Nila fits into our story.
Nila’s Alan McFarland was tasked with engineering the illumination and control of the suits for “TRON LEGACY.” McFarland had seen a Synapse technology demonstration in which the SNAP modules and language were used to radio-control a small tank. After comparing products from XBee, Red Pine, and W-DMX of Sweden, McFarland found that only the Synapse module met the requirements of limited space, bidirectional communication, and ready access to technical support. Synapse engineers David Ewing and Mark Guagenti helped McFarland throughout the production of “TRON LEGACY” with both the development of Python-based SNAP software and engineering support.
A testament to the modules’ ease of use is that the code to control the suit lighting was developed in less than two weeks, according to Patterson. The coding was relatively easy, thanks to a Python virtual machine that separates application development from the underlying network-protocol details. (The combination of SNAP and Python is referred to as SNAPpy.) “End-user wireless applications are compiled into processor-independent ‘byte code’ that is run on the virtual machine. This means that the same application can be run on any processor without the need for recompilation,” explained Patterson.
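CPython offers a convenient analogy for what Patterson describes—and it is only an analogy, since SNAPpy uses its own compiler and a far smaller VM. Source is compiled once into byte code, and that byte code runs unmodified anywhere the same virtual machine exists:

```python
import marshal

source = "def double(x):\n    return 2 * x\n"

# Compile once on the host into processor-independent byte code.
code = compile(source, "<snappy-analogy>", "exec")
blob = marshal.dumps(code)  # the bytes that would travel over the air

# "On the node": reload and run the byte code -- no recompilation needed.
namespace = {}
exec(marshal.loads(blob), namespace)
print(namespace["double"](21))  # → 42
```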
Virtual machines running on wirelessly connected embedded processors? Doesn’t that sound vaguely like the cyberspace of “TRON”? Perhaps the writers of “TRON LEGACY” should’ve included at least one snappily attired program element in the movie to acknowledge the advances in the real world of embedded systems.
* Read the follow-on story at Entertainment Engineering magazine