immersive projection environments
A modular multi-sensory touch tabletop for interactive 2D and 3D visual data exploration.
This powerful virtual-reality device will enable scientific communities to view, share and interact with large-scale 2D and 3D data at the same time, and will enable computer scientists to study the integration of multi-sensory touch and gestural interaction techniques for seamlessly manipulating both 2D and 3D data.
So this sounds like a LambdaTable with added multi-touch interaction. I'm not certain exactly how they propose to do this with a wall of LCDs, although other groups have come up with different solutions to similar problems. Mitsubishi Electric Research Laboratories' (MERL's) DiamondTouch has antennas embedded in the touch surface that connect to a receiver in each user's chair via a capacitive loop through the user's body. This lets the table distinguish two-handed touch interactions as well as multiple users interacting at once.
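To make the user-attribution trick concrete, here's a toy sketch of the idea (not MERL's actual implementation; all names and thresholds are illustrative). Each row/column antenna in the table transmits a distinct signal, and the receiver in a given user's chair only picks up the antennas that user is touching, so telling users apart comes for free:

```python
# Hypothetical sketch of DiamondTouch-style touch attribution.
# The table's row and column antennas each transmit a unique signal;
# the capacitive loop through a user's body carries the signals from
# the antennas under that user's fingers to the receiver in their chair.

THRESHOLD = 0.5  # assumed normalized signal-strength cutoff

def touches_for_user(row_signals, col_signals, threshold=THRESHOLD):
    """Return candidate (row, col) touch points for one user, given the
    signal strength that user's chair receiver measured for each row
    and column antenna."""
    rows = [i for i, s in enumerate(row_signals) if s > threshold]
    cols = [j for j, s in enumerate(col_signals) if s > threshold]
    # Every active row/column pair is a candidate touch point; a single
    # finger gives exactly one point, while two hands give a set of
    # candidates the real system must disambiguate ("ghost" points).
    return [(r, c) for r in rows for c in cols]

# Two users touching simultaneously: each chair receiver only hears the
# antennas under *that* user's fingers, so attribution is unambiguous.
alice = touches_for_user([0.1, 0.9, 0.2], [0.8, 0.1, 0.1])
bob   = touches_for_user([0.1, 0.1, 0.7], [0.1, 0.1, 0.9])
```

The key design point is that identity is sensed physically (through the chair), not inferred from touch geometry, which is why DiamondTouch handles multiple simultaneous users so cleanly.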
The "gestural interaction" is also interesting, since it implies some sort of real-time hand and/or object tracking. This is work being done by many, many groups around the world. Here's one example: a movie showing real-time tracking of hand gestures from Carnegie Mellon's Robotics Institute. The Swiss CVLab has been working on face and object tracking for augmented-reality applications; see some nifty examples here, here and here.
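The basic loop behind most of these trackers is the same: segment the hand (or object) out of each frame, then follow some summary of it, such as its centroid, over time. Here's a deliberately minimal sketch of that pipeline, with the segmentation step replaced by a precomputed binary mask; real systems use skin-color models, background subtraction, or learned detectors, plus far more robust trackers like mean-shift:

```python
# Toy sketch of a frame-to-frame tracking loop: segment, then follow
# the centroid. Masks here are lists of 0/1 rows standing in for the
# output of a real segmentation step.

def centroid(mask):
    """Centroid (row, col) of the 'on' pixels in a binary mask,
    or None if the mask is empty."""
    pts = [(r, c) for r, row in enumerate(mask)
                  for c, v in enumerate(row) if v]
    if not pts:
        return None
    n = len(pts)
    return (sum(r for r, _ in pts) / n, sum(c for _, c in pts) / n)

def track(frames):
    """Return the tracked centroid for each frame's segmentation mask."""
    return [centroid(f) for f in frames]

# Two 4x4 "frames" in which a 2x2 blob moves one pixel to the right:
f1 = [[0, 0, 0, 0],
      [0, 1, 1, 0],
      [0, 1, 1, 0],
      [0, 0, 0, 0]]
f2 = [[0, 0, 0, 0],
      [0, 0, 1, 1],
      [0, 0, 1, 1],
      [0, 0, 0, 0]]
path = track([f1, f2])  # the centroid shifts one column per frame
```

Turning a centroid path like this into recognized gestures (swipes, pinches, grabs) is where the groups above differ, and where most of the hard research lies.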
Scientific American also published an article last month reviewing how various multi-touch technologies work, including Jeff Han's Perceptive Pixel interface, Microsoft's Surface, and MERL's DiamondTouch.