Sophisticated systems create digital environments where designs can be built, used, and serviced before a single part is made.
By Jack Thornton
Say your project is to design a new car or jet fighter. What will it look like? How will people operate it? Maintain it? Assemble it? Experienced design engineers often have a good idea of what the answers will be. But of course, when entering new territory, it’s impossible to foresee everything.
Immersive engineering, which combines motion capture, virtual reality, and other technologies, is intended to answer the questions in detail—and discover potential problems—long before anything is actually built. Fundamentally, immersive engineering integrates virtual reality with motion tracking technologies, computer-aided design, simulation and analysis, and solid modeling.
Immersive engineering is a new way of probing engineering issues—all together and all at once—from product design to manufacturability, assembly, quality, productivity, lifecycle costs, maintainability, and ergonomics. Companies that use the technology say it saves them money on a number of fronts.
Technicians at Lockheed Martin wear motion tracking sensors (above) as they mime aircraft carrier deck tasks. The information captured animates digital avatars (below) in simulations.
Its proponents say that immersive engineering can shorten development time and reduce production and maintenance costs.
Ford Motor Co. and Lockheed Martin Corp. are pioneers of immersive engineering, and each has three installations. Management at both companies sees immersive engineering as a strategic and sustainable competitive advantage.
Immersive engineering is also in use at Bell Helicopter Textron near Dallas and at United Technologies Corp.’s Sikorsky helicopter unit in Stratford, Conn.
According to Jeff D. Smith, director of special projects at Lockheed Martin Space Systems Co., immersive engineering has “the means to fundamentally change the way we work.”
The value of the technology lies in uncovering potentially troublesome engineering issues early in the design phase, often months before designs are nailed down for approval. That allows modifications to be made in the digital model, well before product launches, when modifications are far less costly than they will be once production starts.
Troublesome engineering issues typically have several contributing factors and may have more than one root cause. Such problems are rarely easy to comprehend, and often they are far too large to display easily on desktop computer monitors. To help engineers and everyone else cope, all those who can help solve a problem are brought into immersive engineering theaters. Technicians, production workers, managers, and engineers are immersed in real-time, life-size digital displays. The challenges are presented to them with motion-tracked, ergonomically accurate avatars: digital humans.
The fundamentals of immersive engineering were developed at the Lockheed Martin Aeronautics Co. in Fort Worth, Texas, with a system that it called the Ship/Air Integration Lab, or SAIL. The company also has an immersive engineering installation at its Center for Innovation in Suffolk, Va., to evaluate technology for various parts of its business.
Its newest system is the Collaborative Human Immersive Lab, or CHIL, which opened at the end of 2010 at Lockheed Martin Space Systems in Littleton, Colo. All three installations are linked so analyses done in one can be shared in the others.
Allison Stephens directs a study of the physical exertion of an assembly task—installing a console between a vehicle’s two front seats—at Ford’s Dearborn Ergonomics Laboratory in Michigan.
To prepare an immersive engineering simulation, tasks under study are performed or mimed by technicians and workers who wear a couple of dozen spherical reflective markers that enable the digital replication of their body movements: motion tracking. Whether attached to tight-fitting body suits or just mounted on heads and hands, the markers let the system locate and orient each joint in 3-D space and capture its movement for replay in the virtual world of an immersive simulation.
The movement of up to six people working in sync on a complicated task is captured and tracked by digital video cameras. Software sorts out and analyzes the overlaps in the digital images, then calculates object positions and human motion for use in additional simulations.
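The core geometric step behind this kind of marker tracking is triangulation: each camera sees a marker as a 2-D pixel, and because the cameras are calibrated, the overlapping views pin down the marker's 3-D position. The sketch below illustrates the idea with the standard direct linear transform (DLT) method and two toy cameras; it is a simplified illustration of the principle, not the actual Cortex algorithm, whose internals are proprietary.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3-D marker position from its pixel coordinates in two
    calibrated cameras via the direct linear transform (DLT).
    P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) pixel coords."""
    # Each camera view contributes two linear constraints on the
    # homogeneous marker position X, stacked into A X = 0.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Least-squares solution: right singular vector of the smallest
    # singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]                      # de-homogenize to (x, y, z)

# Two toy cameras: one at the origin looking down +z, one offset along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

marker = np.array([0.3, -0.2, 4.0, 1.0])     # true marker position
u1 = P1 @ marker; u1 = u1[:2] / u1[2]        # its image in camera 1
u2 = P2 @ marker; u2 = u2[:2] / u2[2]        # its image in camera 2

print(triangulate(P1, P2, u1, u2))           # recovers [0.3, -0.2, 4.0]
```

With real mocap data the same solve is repeated per marker per frame, using however many cameras currently see the marker, which is why overlapping camera coverage matters.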
The Lockheed, Ford, and Sikorsky cameras and software, called Cortex, are provided by Motion Analysis Corp. of Santa Rosa, Calif. The company’s systems are also used in medicine and sports, for industrial measurement and control, and for animation in movie making.
Simulations can support production-engineering decisions on how a task can best be done, given workspace clearances, workers’ reaches, sequencing, and pacing. All these variables can affect manufacturing variability and quality. Simulations can also guide ergonomic decisions to minimize the risk of injuries due to heavy lifts, off-balance or awkward work positions, and so on. Resolving ergonomic problems and spatial conflicts can reduce process variance, boost quality, and speed up field service and maintenance.
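A widely used quantitative screen for the lift-risk decisions described above is the revised NIOSH lifting equation, which converts a task's geometry into a recommended weight limit. The sketch below is a simplified version; the frequency and coupling multipliers are normally read from NIOSH tables and are assumed to be 1.0 here, and the example task geometry is hypothetical.

```python
def recommended_weight_limit(h_cm, v_cm, d_cm, a_deg, fm=1.0, cm=1.0):
    """Revised NIOSH lifting equation (simplified sketch).
    h_cm: horizontal hand distance from the ankles; v_cm: starting hand
    height; d_cm: vertical travel of the lift; a_deg: trunk twist angle.
    fm, cm: frequency and coupling multipliers, normally taken from
    NIOSH tables (assumed 1.0 here)."""
    LC = 23.0                               # load constant, kg
    HM = min(1.0, 25.0 / h_cm)              # horizontal multiplier
    VM = 1.0 - 0.003 * abs(v_cm - 75.0)     # vertical multiplier
    DM = 0.82 + 4.5 / d_cm                  # distance multiplier
    AM = 1.0 - 0.0032 * a_deg               # asymmetric multiplier
    return LC * HM * VM * DM * AM * fm * cm

def lifting_index(load_kg, **geometry):
    """LI > 1.0 flags a lift that may pose an injury risk."""
    return load_kg / recommended_weight_limit(**geometry)

# Hypothetical task: a 12 kg console installed with hands 40 cm out,
# lifted from 30 cm to 70 cm, with a 30-degree trunk twist.
li = lifting_index(12.0, h_cm=40, v_cm=30, d_cm=40, a_deg=30)
print(f"lifting index = {li:.2f}")          # > 1 suggests redesigning the task
```

In an immersive simulation, the hand positions and travel distances feeding this kind of formula come straight from the motion-tracked avatar, so a risky lift can be flagged before any physical workstation exists.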
Solving some immersive engineering problems requires technicians to alternate between motion tracking and watching their movements displayed through avatars in real-time 3-D video. This back-and-forth, real-time iterative capability requires technicians to wear both reflective markers and head-mounted displays, and it is a key capability of immersive engineering.
Engineers and others can collaborate, locally or remotely, in a lifelike virtual environment.
Lockheed Martin’s Ship/Air Integration Lab in Fort Worth supported and verified key engineering solutions and helped the company win the contract for the F-35 Lightning II aircraft, originally known as the Joint Strike Fighter.
Versions of the aircraft will be flown by the U.S. Navy, Air Force, and Marines, and the militaries of several allied countries. After the aircraft’s design was complete, SAIL’s mission was broadened to include safety analyses and facility reviews. SAIL is now called the Human Immersive Lab.
The CHIL in Littleton was created to find potential design and assembly difficulties and develop solutions while everything is still in digital formats.
“The purpose of CHIL is to ensure flawless execution, that is, getting it done right the first time,” said Mark D. Stewart, Lockheed Martin Space Systems’ vice president for assembly, test, and launch operations. That goal is especially important for spacecraft since nearly every one is unique, so each one presents new challenges to engineers and assembly technicians.
Smith, the director of Space Systems special projects, added, “CHIL is about the virtual creation of our products and associated processes in digital form before we build the physical products.”
Immersive engineering at Ford (above) demonstrates body positions during installation of major components. The development of the Orion crew capsule was guided by immersive engineering applications at Lockheed Martin Space Systems.
The immersive labs permit every assembly step and every connection in both its spacecraft and its military aircraft—tens of thousands of electronic and electro-mechanical components—to be verified and documented.
According to Lockheed Martin literature: “Using motion tracking and VR [virtual reality], the CHIL creates a unique collaborative virtual environment for exploring and solving problems quickly, and where hardware designs and manufacturing processes can be fine tuned before production or development begins.”
Engineers can identify bottlenecks, collisions, and worker issues before they happen. The company adds that immersive engineering also improves resource utilization and material flow, increases producibility, and reduces rework and scrap.
According to the company, its first immersive engineering lab in Fort Worth generated a 15-fold internal return. That calculation was based on the company’s total investment in applying immersive engineering to detailed aircraft design. SAIL is also credited with helping the company save more than $100 million, which would have been spent for modifications to the aircraft-carrier version of the F-35.
Ford operates an immersive engineering facility at its Dearborn Ergonomics Laboratory in Michigan. It also has immersive systems in Merkenich, Germany, near Cologne, and in Dunton, England.
On its Web site, Ford notes that it uses immersive engineering to pair “advanced motion-capture technology—commonly used in animated movies and digital games—with human modeling software to design jobs that are less physically stressful on workers. The benefits include fewer injuries, lower cost of tooling changes, higher quality, and faster time to market, and Ford is seeing improvement in every one of those metrics.”
Ford engineers and designers use “industry-exclusive virtual tools to shave months off the product development process, while improving the quality, comfort and appeal of Ford vehicles,” the automaker adds. “Ford product development is anywhere from eight to 14 months faster than it was five years ago.”
Allison A. Stephens, Ford’s global technical leader in assembly ergonomics, manages the Ford immersive engineering installation in Dearborn.
Ford says its U.S. workers’ compensation premiums fell to less than $15 million for 2007 from an average of $40 million in the early 1990s. By far the biggest portion of the drop was in repetitive-stress injuries, which ergonomic analysis plays a big role in preventing. The company said its medical records show significant reductions in injuries related to spinal compression, back and upper body strains, and shoulder and rotator cuff injuries.
Stephens notes that in addition to helping avoid repetitive stress injuries, she strives to reduce what Ford calls “parts churn”: the number of parts that have to be modified, replaced, or retooled as a new vehicle goes into production.
Immersive engineering doesn’t really change the basics of engineering, which is still about getting the product, whatever it may be, made exactly to the customer’s specification, on or under budget, and delivered on time with no glitches or snags.
Instead, immersive engineering lets everyone involved see and understand what’s going on, spot problems, and recommend solutions to issues involving quality, productivity, manufacturability, lifecycle costs, and ergonomics across the broad spectrum that ranges from development to manufacturing and maintenance. The advantage is that those issues can be discovered and addressed early, when changes are less costly than they are after a company has invested in tooling or made other commitments.
Jack Thornton is a technology consultant in Santa Fe, N.M., and a frequent contributor to Mechanical Engineering.