Brainstorm real-time 3D graphics

We had Ricardo over from Spain to give us an update on what the Brainstorm real-time 3D graphics system has been doing recently. Brainstorm supplied some touch-screen animations for the BBC's coverage of last year's US elections, so we wanted some feedback on how easy they found it. As luck would have it, they used the same UTouch IR screens that we have, so Ricardo could demo the apps working in real time. They ran well, even though they were only running on a laptop; Brainstorm doesn't seem to suffer the same performance issues as other 3D systems.

We had a crowd ranging from presenters and designers through to animators and programmers; their interest was held for a few hours, with plenty of questions, so perhaps there is something here for us to test in more depth.

The main demos were the touch-screen controls, which are basically the same as vizRT's: simple click-and-drag actions. The BBC work was professional, but the designers didn't like it as much as the scenes we did (of course), and the BBC weren't doing huge amounts of scripting in the background as we were. It worked, though, so it should be reliable. The football controller was very basic, but again functional, if limited in design: it allowed control of a video clip and simple drawing on the pitch.
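To make the click-and-drag idea concrete, here is a minimal sketch of how a horizontal drag gesture might be mapped to scrubbing a video clip. It is plain Python with invented names (`TouchScrubber`, `on_down`, `on_move`); it is not Brainstorm's API, just an illustration of the event-to-timeline mapping.

```python
class TouchScrubber:
    """Map a horizontal touch drag to a position in a video clip.

    Hypothetical illustration only -- not the Brainstorm API.
    """

    def __init__(self, screen_width_px, clip_duration_s):
        self.screen_width_px = screen_width_px
        self.clip_duration_s = clip_duration_s
        self.start_x = None
        self.start_time = 0.0   # clip time when the drag began
        self.current_time = 0.0

    def on_down(self, x, y):
        # Remember where the finger landed and the clip time at that moment.
        self.start_x = x
        self.start_time = self.current_time

    def on_move(self, x, y):
        if self.start_x is None:
            return self.current_time
        # A full-width drag sweeps the whole clip; clamp to [0, duration].
        dt = (x - self.start_x) / self.screen_width_px * self.clip_duration_s
        self.current_time = min(max(self.start_time + dt, 0.0),
                                self.clip_duration_s)
        return self.current_time

    def on_up(self, x, y):
        self.start_x = None


# Usage: a 1920 px wide screen scrubbing a 10 s clip.
scrub = TouchScrubber(1920, 10.0)
scrub.on_down(200, 540)
print(scrub.on_move(680, 540))  # ~2.5 s into the clip
```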

Ricardo then took us through some of the cooler aspects of the system. The easy creation of control interfaces was a big feature: you can click and drag buttons and pictures into a virtual set. Very easy. The system has always been on the leading edge as far as new techniques are concerned, and there were a few I hadn't seen. All the shaders were fun; just having a pixel shader interface is impressive. I've played with the one on the viz machine, but it's an extra cost and harder to use. I need to test this interface properly, but there are a number of built-in scripts that work well: fire, water, reflections, surface finishes. Really good quality.
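For anyone who hasn't met one, a pixel shader is just a small function run for every pixel. As a rough illustration, and nothing to do with Brainstorm's actual built-in scripts, here is a water-style ripple written in plain NumPy rather than shader code:

```python
import numpy as np

def ripple(image, amplitude=4.0, wavelength=24.0):
    """Displace each pixel radially by a sine wave -- a crude water effect.

    Runs the same tiny computation per pixel, which is all a pixel
    shader really is (a GPU would just do this in parallel).
    """
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    cy, cx = h / 2.0, w / 2.0
    r = np.hypot(xs - cx, ys - cy)
    # Offset each sample along its radius, modulated by distance from centre.
    offset = amplitude * np.sin(2 * np.pi * r / wavelength)
    scale = np.where(r > 0, (r + offset) / np.maximum(r, 1e-6), 1.0)
    sample_x = np.clip(cx + (xs - cx) * scale, 0, w - 1).astype(int)
    sample_y = np.clip(cy + (ys - cy) * scale, 0, h - 1).astype(int)
    return image[sample_y, sample_x]

# Usage: ripple a simple gradient test image.
test = np.tile(np.linspace(0, 255, 256, dtype=np.uint8), (256, 1))
out = ripple(test)
print(out.shape)  # (256, 256)
```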

The system also has refraction and distortion effects, and these have made virtual set calibration a bit easier. Because the system can show the result in real time, calibration is done by viewing a straight line in the physical set and adjusting the distortion until it matches a similar line in the scene. Sounds easy. The effects can all be matched to the position of models and planes in the scene, so we saw blur effects used to simulate depth-of-field shots. Again, it needs a test to see whether our quality of set could be built without too much of a performance hit. A couple of effects impressed the presenters. One was a girl in a live shot generating graphics attached to her finger: they had used an image-processing camera to work out where she was in the scene, and where her hand was, from her silhouette. When she raised her hand to point, the graphics were added to the tip of her finger.
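I don't know exactly how Brainstorm implemented the finger tracking, but a plausible sketch of the silhouette idea, using the open-source OpenCV library (my stand-in, not their system), is: threshold the camera image into a silhouette, take the largest contour, and treat its topmost point as the raised fingertip.

```python
import cv2
import numpy as np

def find_fingertip(frame_gray):
    """Guess the raised fingertip from a silhouette.

    One plausible approach, not Brainstorm's actual method: threshold
    the (keyed or background-subtracted) frame, take the largest blob,
    and return its topmost point.
    """
    # Binarise: assume the subject is brighter than the background
    # (in practice you would key or background-subtract first).
    _, mask = cv2.threshold(frame_gray, 127, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    body = max(contours, key=cv2.contourArea)
    # Topmost contour point = smallest y; a raised, pointing finger
    # is usually the highest part of the silhouette.
    top = min(body.reshape(-1, 2), key=lambda p: p[1])
    return int(top[0]), int(top[1])

# Usage: a synthetic silhouette with an arm raised towards x=80.
frame = np.zeros((240, 320), np.uint8)
cv2.rectangle(frame, (100, 120), (220, 239), 255, -1)  # body
cv2.line(frame, (110, 130), (80, 40), 255, 8)          # raised arm
print(find_fingertip(frame))  # roughly (80, 40)
```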

The second demo, which might be of interest to CJ, was the stereo output. It can be done either in the red/green anaglyph style or as a twin pair of stereo images. I wonder how easy it is to create the graphic animations using that system; might strain the brain a little. Viz didn't mention any work on this when I asked the other day, but I suppose they'll be doing something in R&D; only for version 3 though, no doubt. We must try and push to upgrade, even if they are not fully networked.
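The red/green style is simple to reproduce: take one colour channel from the left-eye render and the rest from the right-eye render. A minimal NumPy sketch, shown here for red-cyan (the common variant; the red-green version just drops the blue channel):

```python
import numpy as np

def anaglyph(left_rgb, right_rgb):
    """Compose a red-cyan anaglyph from a left/right stereo pair.

    Red channel comes from the left eye, green and blue from the
    right eye; red-green glasses work the same way minus blue.
    """
    out = right_rgb.copy()
    out[..., 0] = left_rgb[..., 0]  # red from the left-eye image
    return out

# Usage: two dummy 4x4 RGB frames standing in for rendered eyes.
left = np.full((4, 4, 3), (255, 0, 0), np.uint8)     # pure red
right = np.full((4, 4, 3), (0, 255, 255), np.uint8)  # pure cyan
print(anaglyph(left, right)[0, 0])  # [255 255 255]
```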

We then went through some of the programming and modelling methods. They claim to have FBX format import now, and a plugin to bring in Photoshop layers; that would make life easier for us. The animators were not too keen on all the windows that pop up everywhere; they are used to the standard window layout of XSI, vizRT and other 3D systems. We would need to test how easy it is to use for daily graphics. At least it would be controllable from several places and easily programmable, so that journalists can create graphics and SGOs can play them out. I'd like to see the iNews interface in action. The programming side is Python, so it's flexible if not totally state of the art. There are plenty of libraries for Python, all open source, but it's yet another language to learn.
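I haven't seen the Photoshop plugin itself, but to show why layer import matters, here is a sketch using the open-source psd-tools Python package (my choice, not what Brainstorm uses) to walk a PSD and pull each layer out as a separate image, ready to animate independently. The filename is just a placeholder.

```python
import os
from psd_tools import PSDImage  # pip install psd-tools

def export_layers(psd_path, out_dir="layers"):
    """Save each visible Photoshop layer as its own PNG.

    This mirrors what a layer-import plugin buys you: one editable
    element per layer instead of a single flattened image.
    """
    os.makedirs(out_dir, exist_ok=True)
    psd = PSDImage.open(psd_path)
    for i, layer in enumerate(psd):
        if not layer.visible:
            continue
        image = layer.composite()  # render the layer to a PIL image
        if image is None:          # skip empty layers
            continue
        image.save(os.path.join(out_dir, f"{i:02d}_{layer.name}.png"))
        print(layer.name, layer.offset)  # position within the document

export_layers("lower_third.psd")  # hypothetical example file
```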

So, in summary, we should perhaps look at what could be useful for the interactive screen and the virtual sets, and then test against similar graphics in the studios to see if there are performance gains. The particles and shaders certainly seemed to put less of a strain on the machine than on other systems. Perhaps we should look at Aston again; I missed the demo when they were over last time. I shall test some more, when I get a moment…
