Real-Time Raytracing

Like many programmers, my first exposure to ray tracing was on my venerable Commodore Amiga. It's an iconic system demo every Amiga user has seen at some point: behold the robot juggling silver spheres!


This is a companion discussion topic for the original blog entry at: http://www.codinghorror.com/blog/2008/03/real-time-raytracing.html

Erm,

RenderMan isn’t really a raytracer or a scanline renderer. It’s a REYES renderer that has a ray tracer bolted onto it.

http://en.wikipedia.org/wiki/Reyes_rendering

Great article! I remember an Apple keynote a while back where Steve showed Pixar’s Luxo Jr rendering in “real time” as he swung the camera around.

I’d love to be able to watch an animated movie and choose ANY angle in real time. I have no idea why that fascinates me, but it does.

Very interesting post. It’s nice to see there is some use for these quad-core CPUs, if anyone writes the code for it (which I imagine will be done).

Raytracing “Quake” is one thing; raytracing “Juggler” is much simpler (a simple scene, containing raytracing-friendly spheres and a checkered plane).
I’m sure it can be done in real time on a regular modern computer nowadays!
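
For the curious, here is a rough sketch of why spheres are so raytracing-friendly: the whole intersection test boils down to solving a single quadratic per ray, just a handful of multiplies and one square root. The struct and function names below are purely illustrative, not taken from any particular renderer.

```c
/* Minimal ray-sphere intersection sketch: substitute the ray o + t*d into the
   sphere equation |p - c|^2 = r^2 and solve the resulting quadratic in t.
   Illustrative only; not code from any particular renderer. */
#include <math.h>

typedef struct { double x, y, z; } vec3;

static double dot(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static vec3 sub(vec3 a, vec3 b)   { vec3 r = {a.x-b.x, a.y-b.y, a.z-b.z}; return r; }

/* Ray: origin o, unit-length direction d. Sphere: center c, radius r.
   Returns the distance t along the ray to the nearest hit, or -1.0 on a miss. */
double hit_sphere(vec3 o, vec3 d, vec3 c, double r)
{
    vec3 oc = sub(o, c);
    double b = dot(oc, d);              /* half of the usual B term (d is unit length) */
    double cc = dot(oc, oc) - r * r;
    double disc = b * b - cc;           /* negative discriminant means no intersection */
    if (disc < 0.0) return -1.0;
    double t = -b - sqrt(disc);         /* nearer of the two roots */
    return (t > 0.0) ? t : -1.0;        /* hits behind the ray origin count as misses */
}
```

A checkered plane is even cheaper: one division finds where the ray crosses the plane, and the parity of the hit point’s coordinates picks the colour.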

omg you did this post for me didn’t you…

I’ve been a POV-Ray user for years (I even made my own render farm), and now I’m looking into unbiased raytracers: forward raytracers, tracing rays from the light, slow as hell (let’s say as slow as backwards raytracers were many years ago) but with impressively realistic results. I was browsing the Luxrender website when I noticed this post in the RSS feed…

By the way, why does the “POV-Ray, which [produces some impressive results] as well” link point to the IMDB entry for the Cars movie? I’m pretty sure Cars didn’t involve POV-Ray…

In 2000 I wrote a real-time raytracer that could render a scene similar to the Amiga juggler at 320x240 and 30 fps on a PII-450, using various scene optimisation techniques, so I’m not sure how true that last paragraph is…

Oh yeah Jeff, try out RaVi on your own computer: http://www.winosi.onlinehome.de/Ravi.htm

While Pixar’s PhotoRealistic RenderMan (prman) is the industry benchmark in rendering, it has ray tracing as an add-on. Blue Sky Studios (http://blueskystudios.com/) has been doing ray tracing for over 20 years. The people who started Blue Sky got their start at MAGI/Synthavision and worked on TRON.

Blue Sky has ray traced all of their movies, shorts, and commercials for many years. They have a little more on their renderer here: http://blueskystudios.com/content/process-tools.php. They even won the 1998 Oscar for Best Animated Short for Bunny (http://blueskystudios.com/content/shorts-bunny.php), where the only things not rendered with radiosity were Bunny and the moth. All environments were radiosity renders.

Who’s Ray Tracing?

Arauna real time ray tracing

http://igad.nhtv.nl/~bikker/

Interactive k-D Tree GPU Raytracing

http://graphics.stanford.edu/papers/i3dkdtree/

Nicolas, thanks for this link! Very cool real time raytracing demo:

http://www.winosi.onlinehome.de/Ravi.htm

  • default window size (which is 320 x 240)
  • default settings

(e.g., do not change anything after launching the executable)

I get ~90 fps on a 3.0 GHz Intel Core 2 Duo.

“Egg” under the Demo menu is by far the slowest at around 11 fps.

[It’s essentially calculating the result of every individual ray of light in a scene.]

Well, that’s not exactly true. Your diagram shows it more accurately, i.e. it’s casting a ray from the “eye” or “camera” point for every pixel that makes up the viewport. The goal for each pixel is to calculate its colour, plain and simple. The more complex the scene and the more complex the effects desired, the more colour calculations there are to perform.

If an individual ray intersects any “object”, then a vector is calculated from the point of intersection to every light source in the scene (in your diagram this second vector is called the “shadow ray”). The angle between the surface normal at the intersection point and this vector provides the basis for a number of different colour accumulation calculations. When the second vector intersects yet another object rather than reaching the light source directly, it can result in yet more calculations, if you so choose (e.g. reflection). Otherwise, you simply do not add any colour from that light source. This can get more complex if you want soft shadows with proper penumbra, etc., but for the basic case you simply “do not add colour”.

The diagram itself is somewhat misleading in that it shows the shadow ray “passing through” the object yet still reaching the light source, but it fails to explain that the result is simply “do not add colour”. It would be more accurate to at least explain that the original ray had to intersect with something, in this case a plane; otherwise, the colour added would simply be the background colour. Then, since the “shadow ray” intersects the sphere, the light cannot reach that point on the plane, and hence you do not add any new colour to that point.

For any programmers out there, it is actually quite easy to write your own ray tracer in any language you choose. The old stand-by book “Computer Graphics in C” has a pretty good description of what’s involved. And as was stated, the result of a ray-traced scene has nothing to do with your graphics hardware; it’s all CPU crunching (unless you’re bold enough to try to hijack the GPU to do some crunching for you).
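
As a rough illustration of the loop described above, here is a toy sketch in C: one primary ray per pixel, one shadow ray per hit toward the light, and no diffuse contribution when the shadow ray is blocked. The single-sphere-plus-ground-plane scene, the constants, and the grayscale PGM output are all made up for illustration; a real tracer would add reflections, multiple lights, anti-aliasing and so on.

```c
/* Toy tracer sketch for the description above: one primary ray per pixel,
   one shadow ray per hit. The scene (a sphere over a ground plane, one point
   light) and all constants are made up for illustration. Output is a
   grayscale PGM image on stdout: ./a.out > out.pgm */
#include <math.h>
#include <stdio.h>

typedef struct { double x, y, z; } vec3;
static vec3 V(double x, double y, double z) { vec3 v = {x, y, z}; return v; }
static vec3 add(vec3 a, vec3 b)     { return V(a.x + b.x, a.y + b.y, a.z + b.z); }
static vec3 sub(vec3 a, vec3 b)     { return V(a.x - b.x, a.y - b.y, a.z - b.z); }
static vec3 scale(vec3 a, double s) { return V(a.x * s, a.y * s, a.z * s); }
static double dot(vec3 a, vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }
static vec3 norm(vec3 a)            { return scale(a, 1.0 / sqrt(dot(a, a))); }

static const vec3 sphere_c = {0.0, 1.0, 3.0};   /* sphere centre        */
static const double sphere_r = 1.0;             /* sphere radius        */
static const vec3 light = {5.0, 5.0, -2.0};     /* point light position */

/* Distance along the ray (origin o, unit direction d) to the sphere, -1 on a miss. */
static double hit_sphere(vec3 o, vec3 d) {
    vec3 oc = sub(o, sphere_c);
    double b = dot(oc, d), c = dot(oc, oc) - sphere_r * sphere_r;
    double disc = b * b - c;
    if (disc < 0.0) return -1.0;
    double t = -b - sqrt(disc);
    return t > 1e-4 ? t : -1.0;
}

/* Distance to the ground plane y = 0, -1 on a miss. */
static double hit_plane(vec3 o, vec3 d) {
    if (fabs(d.y) < 1e-9) return -1.0;
    double t = -o.y / d.y;
    return t > 1e-4 ? t : -1.0;
}

int main(void) {
    const int W = 320, H = 240;
    printf("P2\n%d %d\n255\n", W, H);        /* plain-text grayscale PGM header */
    for (int py = 0; py < H; py++) {
        for (int px = 0; px < W; px++) {
            /* Primary ray from the eye through this pixel of the viewport. */
            vec3 eye = V(0.0, 1.0, -4.0);
            vec3 dir = norm(V((px - W / 2.0) / H, -(py - H / 2.0) / H, 1.0));
            double ts = hit_sphere(eye, dir), tp = hit_plane(eye, dir);
            double shade = 0.15;             /* background / ambient level */
            double t = -1.0;
            int hit_sph = 0;
            if (ts > 0.0 && (tp < 0.0 || ts < tp)) { t = ts; hit_sph = 1; }
            else if (tp > 0.0) t = tp;
            if (t > 0.0) {
                vec3 p = add(eye, scale(dir, t));                        /* hit point */
                vec3 n = hit_sph ? norm(sub(p, sphere_c)) : V(0.0, 1.0, 0.0);
                /* Shadow ray: if the sphere blocks the light, add no diffuse colour. */
                vec3 to_light = sub(light, p);
                double dist = sqrt(dot(to_light, to_light));
                vec3 l = scale(to_light, 1.0 / dist);
                double block = hit_sphere(p, l);
                int lit = !(block > 0.0 && block < dist);
                double diff = dot(n, l);
                if (lit && diff > 0.0) shade += 0.85 * diff;
            }
            printf("%d\n", (int)(shade * 255.0));
        }
    }
    return 0;
}
```

Note that there is no graphics API anywhere in it; as said above, it is pure CPU arithmetic, one pixel at a time.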

I don’t see much point in moving exclusively to ray tracing, considering the nice results you can get with rasterization + shaders. If anything, a hybrid approach (like Pixar’s RenderMan…) is the best solution, imho.

Nvidia is obviously going to be biased :), but this article is good nevertheless: http://www.pcper.com/article.php?aid=530

The quiz made me grin. I’ve worked for years with a 3D artist who was doing 3D CG decades ago using custom software written for NASA and running on VAXen. Since he was doing TV work, his frames were all 720x486x24, and in 1980 they took about 20-30 minutes each to render on the VAX and custom accelerator.

Over the years he moved to Softimage on Irix, then on PCs, then added a render farm that grew to 60+ processors, using both scanline and Mental Ray renderers. In 2007 he finished a large project with frames that took… 20-30 minutes each to complete. He has some sort of internal yardstick when designing that keeps him in that range no matter what resources he has at his disposal.

f0dder, great link. If that is NVIDIA’s voice for the status quo of traditional 3D hardware rendering, here’s Intel’s response:

Ray Tracing and Gaming - 1 Year Later (Jan 2008)
http://www.pcper.com/article.php?aid=506

It does seem the hybrid rendering approaches work best, and that’s what Pixar’s RenderMan does. I’m really surprised they never got into ray tracing until Cars, though. Do check out that presentation I linked, it’s outstanding.

http://www.cs.ucy.ac.cy/ayia-napa06/presentations/ayianapa06per.ppt

Actually, it’s the other way around: the NVIDIA interview is the response to Intel’s article :) (check the dates). Intel obviously wants to push raytracing since they’re mainly CPU people, and considering their experiments with 80+ core CPUs, they want something that can take advantage of them; most problems are notoriously hard to parallelize, but raytracing is almost the exception to the rule…
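
To illustrate that last point: each pixel’s primary ray is computed independently of every other pixel’s, so the render loop is embarrassingly parallel. A rough sketch with OpenMP follows; trace_pixel here is just a hypothetical stand-in for the real per-ray work, and you’d compile with -fopenmp to actually spread it across cores.

```c
/* Sketch of why ray tracing parallelizes so well: each pixel is computed
   independently, so the outer loop splits across cores with no shared
   mutable state and no locks. trace_pixel() is a hypothetical stand-in
   for the real per-ray work a tracer would do. Build with: cc -fopenmp ... */
#include <stdlib.h>

static float trace_pixel(int x, int y)
{
    /* Placeholder: a real tracer would shoot a primary ray here. */
    return (float)((x ^ y) & 255) / 255.0f;
}

static void render(float *fb, int width, int height)
{
    /* Each iteration writes only its own pixel, so no synchronization is needed. */
    #pragma omp parallel for schedule(dynamic, 8)
    for (int y = 0; y < height; y++)
        for (int x = 0; x < width; x++)
            fb[y * width + x] = trace_pixel(x, y);
}

int main(void)
{
    const int w = 320, h = 240;
    float *fb = malloc(w * h * sizeof *fb);
    render(fb, w, h);
    free(fb);
    return 0;
}
```

Rasterization is massively parallel too, of course, but it leans on fixed-function GPU hardware; the appeal of raytracing for a CPU vendor is that plain cores scale it almost linearly, which is presumably why Intel is so keen on it.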

I’d actually be surprised if Pixar did Cars fully raytraced, considering the nice results their hybrid RenderMan has produced in the past, and, I assume, how much less CPU time it has taken.

No! Please don’t perpetuate this myth! Raytracing is NOT the holy grail.

For a reasoned argument, check out this article by Deano Calver. He worked at Ninja Theory on Heavenly Sword.
http://www.beyond3d.com/content/articles/94

Executive Summary:

  • anti-aliasing is hard
  • moving scenes are VERY expensive
  • it is almost fully procedural, so the artist has near zero stylistic control over the final look of a shot

Synchronicity! I was just listening to this podcast on POV-Ray this morning: http://www.twit.tv/floss24

There’s some good history in there as well as discussion about the performance issues associated with ray tracing techniques.