How it works

The Process

1. Raw 3-D graphic data of almost any kind — including LIDAR, aerial photographs, CAD, and CAM models — can be used to make a hologram.

2. The model data is processed and rendered by our proprietary rendering engine. Each digital hologram is composed of thousands of hogels (three-dimensional pixels). The model data is broken down into a subset for each hogel.
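The hogel decomposition in step 2 can be sketched as a simple spatial partition of the model data. This is only an illustrative toy — random points and a 4×4 hogel grid, both made-up parameters, not Zebra Imaging's proprietary engine:

```python
import numpy as np

def partition_into_hogels(points, grid=(4, 4), extent=1.0):
    """Group (x, y, z) points by the hogel cell their (x, y) lands in."""
    nx, ny = grid
    cell_w = extent / nx
    cell_h = extent / ny
    subsets = {}
    for p in points:
        # Clamp so points on the far edge fall in the last cell.
        ix = min(int(p[0] // cell_w), nx - 1)
        iy = min(int(p[1] // cell_h), ny - 1)
        subsets.setdefault((ix, iy), []).append(p)
    return subsets

# Toy model data: 1000 random points in a unit cube.
rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 1.0, size=(1000, 3))
subsets = partition_into_hogels(pts, grid=(4, 4))

# Every point lands in exactly one hogel subset.
total = sum(len(v) for v in subsets.values())
print(total)  # -> 1000
```

Each subset would then be rendered into the view that its hogel records, so neighbouring hogels see the model from slightly different angles.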

3. A hologram of a 3-D model is formed by recording the interference pattern of two laser beams. One laser beam is encoded with the data using an LCD screen, which scatters the image onto the recording medium. The second beam serves as a reference beam. The two beams are brought together and interfere on the recording medium (a photopolymer film). Each point in the object acts as a point source of light, and each of these point sources interferes with the reference beam, giving rise to an interference pattern. The interference pattern of light and dark areas, similar to zebra stripes, is recorded in the photopolymer. This process is repeated for each hogel to build the entire hologram.
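The light-and-dark fringe pattern the film records can be sketched numerically. A minimal one-dimensional model, assuming two plane waves — a 633 nm laser and a 10° angle between the beams, both made-up parameters for illustration:

```python
import numpy as np

x = np.linspace(0.0, 1e-4, 2000)   # a 0.1 mm strip of film
wavelength = 633e-9                # red HeNe laser (assumed)
k = 2 * np.pi / wavelength
theta = np.deg2rad(10)             # assumed angle between the beams

# Data (object) beam arriving at an angle; reference beam at normal incidence.
obj = np.exp(1j * k * np.sin(theta) * x)
ref = np.ones_like(x, dtype=complex)

# The film responds to intensity, not field — this is the zebra-stripe
# fringe pattern: bright where the beams add, dark where they cancel.
intensity = np.abs(obj + ref) ** 2
```

The fringes oscillate between 0 (fully destructive) and 4 (fully constructive, for unit-amplitude beams), with a spacing of roughly wavelength / sin(theta) — a few microns in this toy case.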

4. After recording and processing the film, the hologram is illuminated by a light source positioned similarly to the reference beam used during recording. Each hogel's recorded interference pattern diffracts part of the reference beam to reconstruct the data beam. These individual reconstructed data beams add together to reconstruct the whole 3-D model. The viewer perceives a 3-D image, reconstructed in reflected light, that is identical to the 3-D model data.
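The reconstruction in step 4 can be sketched in the same spirit: multiplying the recorded fringe pattern by the reference beam produces a term proportional to the original data beam, which shows up as a spectral peak at the data beam's spatial frequency. This uses a one-dimensional plane-wave toy model (633 nm laser, 10° beam angle; both made-up parameters):

```python
import numpy as np

n = 4096
x = np.linspace(0.0, 4e-4, n, endpoint=False)
wavelength = 633e-9
k = 2 * np.pi / wavelength
theta = np.deg2rad(10)

obj = np.exp(1j * k * np.sin(theta) * x)   # original data beam
ref = np.ones_like(x, dtype=complex)       # reference beam

hologram = np.abs(obj + ref) ** 2          # recorded fringe pattern
transmitted = hologram * ref               # re-illuminate with the reference

# Expanding |o + r|^2 * r = (|o|^2 + |r|^2) r + o |r|^2 + o* r^2 shows the
# middle term is a copy of the data beam o. In the spectrum it appears as a
# peak at the data beam's spatial frequency, separated from the undiffracted
# (DC) light and the conjugate term.
spectrum = np.abs(np.fft.fft(transmitted))
freqs = np.fft.fftfreq(n, d=x[1] - x[0])
f_obj = np.sin(theta) / wavelength         # data beam's spatial frequency

# Look only above half the expected frequency, masking out the DC term.
peak = freqs[np.argmax(np.where(freqs > f_obj / 2, spectrum, 0.0))]
```

The recovered peak sits at the data beam's spatial frequency (up to the FFT's bin resolution), which is the 1-D analogue of the data beam being reconstructed.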

For more about this tech, see ZEBRA IMAGING TECHNOLOGY

Posted in 3D, Holographic, Robotic at October 12th, 2012.

“‘Little Magic Stories’ is the latest project by Chris O’Shea, with the aim of encouraging children to use their creativity to bring stories to life. The installation allows them to create a performance from within their imagination, on stage, in front of an audience of family and friends.” Text by Creative Applications Network

Chris achieved all this using the Musion Eyeliner holographic projection system, an Xbox Kinect camera, and software written in C++ with openFrameworks, OpenCV, and Box2D.

Little Magic Stories from Chris O'Shea on Vimeo.

Posted in Body Tracking, Holographic, projection at September 19th, 2011.

Hatsune Miku (初音ミク) is a singing synthesizer application with a female persona, developed by Crypton Future Media. It uses Yamaha Corporation’s Vocaloid synthesizing technology. The name of the character comes from a fusion of the Japanese for first (初 hatsu) and sound (音 ne), plus Miku (ミク), which sounds like a nanori reading of future (未来, normally read as “mirai”) — referring to her position as the first of Crypton’s “Character Vocal Series”. She was the second Vocaloid to be sold using the Vocaloid 2 engine and the first Japanese Vocaloid to use the Japanese version of that engine. Her voice is sampled from Japanese voice actress Saki Fujita. Hatsune Miku has performed onstage as a projection. (Text from Wikipedia)

CV01 Hatsune Miku live on stage

Her voice

Posted in Holographic, projection at July 4th, 2011.

3D holograms and real models mixed together at the Burberry winter collection show in Beijing.

Posted in 3D, Fashion, Holographic at April 26th, 2011.