«A fundamental rule in technology says that whatever can be done will be done».
[EV]3 inventions & know-how — AR studio
Interactive augmented reality studio by EligoVision (AR studio) is a new 3D solution for conducting large-scale presentations using «living 3D markers» on the big stage!
Background of the AR studio creation
AR studio from EligoVision is a new way to present illustrative material to a large audience on the basis of the augmented reality technology.
The purpose of the system is to give the presenter or host much more freedom in using new 3D visual aids, letting the speaker enliven the presentation with interactive 2D and 3D illustrations, models and images.
Traditionally, for presentations on a large stage, the presenter’s image is broadcast to a display system using one or more front cameras. When only the front camera is used, the presenter has to hold the augmented reality marker vertically in front of him/her and point it straight at the camera.
Only then will the camera see the marker completely and transmit the image to the workstation, where special software «attaches» the virtual 3D model right over the marker’s image. However, this method slows down the presentation and limits the presenter’s movements and control over the marker.
The interactive augmented reality studio — AR studio from EligoVision — employs a system of several cameras of different standards, calibrated in a specific way so that they can work together.
This system solves the problem of the limited space in which markers can interact with a single working camera. It gives the presenter more freedom on the stage during the presentation, which is a big advantage for working with augmented reality markers in both the horizontal and vertical planes.
How AR studio works
The signal from camera (1) is transmitted to the EligoVision software, which recognizes the augmented reality marker and determines the coordinates of the assigned model. The HD (high definition) camera (2) is connected to the video capture card (3).
Through the video adapter (4), the image taken by the video capture card from the camera is sent to the display system. The software also uses the video card to «attach» the 3D model over the camera image.
If the marker is not recognized by the upper camera, it is searched for in the image received from the front camera.
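This camera-fallback behavior can be sketched as a simple priority loop. A minimal illustration only: the `frames` structure and `detect` callback are hypothetical stand-ins, not part of EligoVision's actual software.

```python
def locate_marker(frames, detect):
    """Try each camera's frame in priority order (e.g. overhead first, then front).

    `frames` maps camera name -> image, in priority order; `detect` returns the
    marker pose for one image, or None if the marker is not visible in it.
    """
    for camera_name, image in frames.items():
        pose = detect(image)
        if pose is not None:
            return camera_name, pose  # first camera that sees the marker wins
    return None, None  # marker not visible to any camera

# Toy usage: this stand-in "detector" succeeds only if the frame contains "marker".
frames = {"overhead": "empty scene", "front": "scene with marker"}
camera, pose = locate_marker(
    frames, lambda img: "found" if "marker" in img else None
)
```

Here the overhead frame is tried first and fails, so the search falls back to the front camera, mirroring the behavior described above.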
The maximum angle of the marker’s position relative to the camera at which recognition is possible is about 70°; beyond this angle the marker cannot be recognized.
Using several cameras removes this limitation and significantly improves marker recognition.
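The benefit of multiple viewpoints can be illustrated with a small geometric check: a marker is considered visible to a camera when the angle between the marker’s surface normal and the direction toward the camera stays within the roughly 70° limit stated above. The vectors and the check itself are illustrative assumptions, not EligoVision’s actual recognition criterion.

```python
import math

MAX_ANGLE_DEG = 70.0  # approximate recognition limit from the text

def viewing_angle_deg(marker_normal, cam_dir):
    """Angle between the marker's surface normal and the direction from the
    marker toward the camera (both given as 3-vectors)."""
    dot = sum(a * b for a, b in zip(marker_normal, cam_dir))
    norm = math.sqrt(sum(a * a for a in marker_normal)) * \
           math.sqrt(sum(b * b for b in cam_dir))
    return math.degrees(math.acos(dot / norm))

def recognizable_by_any(marker_normal, camera_dirs):
    """True if at least one camera views the marker within the angle limit."""
    return any(viewing_angle_deg(marker_normal, d) <= MAX_ANGLE_DEG
               for d in camera_dirs)

# A marker held flat (normal pointing straight up) sits at 90 degrees to a
# horizontal front camera, but at 0 degrees to an overhead camera.
flat_marker = (0.0, 0.0, 1.0)
front_cam = (0.0, 1.0, 0.0)     # direction from marker toward the front camera
overhead_cam = (0.0, 0.0, 1.0)  # direction from marker toward the overhead camera
```

With the front camera alone the flat marker fails the 70° test; adding the overhead camera makes it recognizable again, which is exactly why the multi-camera setup allows markers in both horizontal and vertical planes.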
Mutual calibration of the cameras makes it possible for several cameras to recognize a marker at the same time and to correct the «attachment» of the models, scenes and animation to the markers.
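Mutual calibration boils down to knowing the rigid transform between camera coordinate frames, so a marker pose detected by one camera can be expressed in another camera’s frame and the detections fused. A minimal sketch, assuming a known 4×4 extrinsic matrix from calibration (the pure-translation values here are invented for illustration):

```python
def mat_vec(T, p):
    """Apply a 4x4 homogeneous transform T to a 3D point p."""
    x, y, z = p
    v = (x, y, z, 1.0)
    return tuple(sum(T[r][c] * v[c] for c in range(4)) for r in range(3))

# Hypothetical extrinsics obtained from mutual calibration: the pose of
# camera 2 expressed in camera 1's frame (a 2 m translation along x, for
# illustration; a real calibration would also include a rotation part).
T_cam1_from_cam2 = [
    [1.0, 0.0, 0.0, 2.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
]

# A marker detected at (0, 0, 3) in camera 2's frame, mapped into camera 1's
# frame so both detections describe the same point and the 3D model can be
# «attached» consistently.
p_cam1 = mat_vec(T_cam1_from_cam2, (0.0, 0.0, 3.0))
```

In practice such extrinsics come from a calibration procedure run once before the show; after that, every per-frame detection can be moved into a common coordinate system with one matrix multiplication.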
The system can be extended with additional cameras, widening the possibilities of interaction with the augmented reality «living 3D markers», and can transmit the signal to any kind of display or screen, as well as broadcast it directly over TV networks.
Technology release in Russia
The system, as well as the technology, was first used on a big stage in Russia during the celebration of the JSC Russian Space Systems' 65th anniversary at Rossiya Central State Concert Hall, Luzhniki Palace of Sports, on May 26, 2011.
In this project, the augmented reality studio was used for the real-time demonstration of «living 3D markers» on a large stage by Yuri Urlichich, Chief Engineer of Russian Space Systems, together with the host Leonid Parfyonov, a famous TV journalist.
Specifically, the AR studio from EligoVision used at the «65th Parallel» event was arranged as follows:
the video stream from a Canon XH A1 professional camera is fed to a Blackmagic Design Mini Converter Analog to SDI, from where the SDI signal is sent to the graphics station with an NVIDIA SDI input card;
the received video stream (1920x1080p) is shown in the display system by means of OpenGL;
marker recognition is performed on the CPU (the graphics card host), so the video stream is copied to the host after a number of conversions on the GPU.
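The per-frame flow of this arrangement can be sketched as a small loop. This is an illustration only: the capture, conversion, copy and render callbacks are hypothetical stand-ins for the actual SDI input and OpenGL APIs.

```python
def process_frame(capture_sdi, gpu_convert, copy_to_host, detect_marker, render):
    """One iteration of the capture -> recognize -> overlay pipeline.

    All five callbacks are stand-ins: capture_sdi grabs a 1920x1080p frame from
    the SDI input card, gpu_convert performs the GPU-side conversions,
    copy_to_host moves the frame to CPU memory, detect_marker runs marker
    recognition on the CPU, and render draws the frame with the 3D model
    overlaid via OpenGL.
    """
    frame_gpu = capture_sdi()
    frame_host = copy_to_host(gpu_convert(frame_gpu))
    pose = detect_marker(frame_host)   # recognition runs on the CPU host
    render(frame_gpu, pose)            # display system shows frame + 3D model
    return pose

# Toy usage with string/length stand-ins, just to show the data flow.
pose = process_frame(
    capture_sdi=lambda: "raw",
    gpu_convert=lambda f: f + "|conv",
    copy_to_host=lambda f: f + "|host",
    detect_marker=lambda f: len(f),
    render=lambda f, p: None,
)
```

The key design point the text describes is that the full-resolution frame stays on the GPU for display, while a converted copy travels to the CPU solely for recognition.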
More detailed information about the «65th Parallel» celebration can be found on the web page describing this landmark project: www.eligovision.com/casestudy/31.
Examples of applications in Europe
To understand how impressive a big-stage project using the augmented reality technology can be, watch one of the popular videos from Total Immersion, a French provider of augmented reality solutions.
Previously, to present something like this in Russia, you had to turn to Western technologies; now it can all be done here in Moscow, at EligoVision, using our own licensed technologies.
If you are interested and you would like to impress your audience, just get in touch with us and we will invite you to see the operation of this unique interactive system with your own eyes.