As I don't want to run the risk of drawing the ire of George Lucas's lawyers, I work hard not to post anything on this blog that has anything to do with my work, that could be useful for my work, or even anything that could have any useful application whatsoever. But earlier this year I presented a little bit of work publicly, and as the presentation is available online there can be no harm in linking to it.
First, some background: Renderman is a "production quality" 3D renderer developed by Pixar and used by many visual effects houses, including my employer, ILM. One of its neatest features is that lighting and shading computations (i.e. the calculation of the colour of every pixel once the geometry you're looking at has been determined) are performed in the RenderMan Shading Language. It's a C-like DSL that's compiled to a bytecode that runs on a SIMD virtual machine.
Buried within the Renderman documentation is another, less well known, virtual machine that is used to compute "blobbies". A blobby is simply an isosurface, i.e. the set of points that form the solution to f(x,y,z)=c for some function f. The virtual machine is the one on which the function f is computed. Unlike shading, however, there are bytecodes for this VM but no programming language that compiles to them.

So this presentation was about my solution to that problem: embedding a tiny DSL in Python that compiles down to those bytecodes. You write ordinary-looking Python code, but the arithmetic operators and other functions are overloaded to build an AST, which is then converted to an "opcode" stream that is sent to Renderman. From a computer science standpoint it really is basic stuff, but as the author of the blobby code (Tom Duff, of Duff's Device fame) said to me, he'd been waiting for someone to write a compiler for his opcodes for years, and mine was apparently the first. So here's a link to the presentation. There isn't much detail in it, but it might give at least a tiny flavour of what I do in my day job.
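To give at least a flavour of the operator-overloading trick, here's a minimal sketch. The node classes and the 'push'/'load'/'add'/'mul' opcode names are illustrative stand-ins for Renderman's actual opcodes, not the real thing:

```python
class Expr:
    """Arithmetic on Expr values builds a tree instead of a number."""
    def __add__(self, other):
        return Op('add', self, wrap(other))
    def __mul__(self, other):
        return Op('mul', self, wrap(other))
    __radd__ = __add__
    __rmul__ = __mul__

class Const(Expr):
    def __init__(self, value):
        self.value = value

class Var(Expr):
    def __init__(self, name):
        self.name = name

class Op(Expr):
    def __init__(self, name, *args):
        self.name, self.args = name, args

def wrap(x):
    """Promote plain Python numbers to AST leaves."""
    return x if isinstance(x, Expr) else Const(x)

def compile_expr(e, code):
    """Post-order walk of the AST, emitting a stack-machine opcode list."""
    if isinstance(e, Const):
        code.append(('push', e.value))
    elif isinstance(e, Var):
        code.append(('load', e.name))
    else:
        for a in e.args:
            compile_expr(a, code)
        code.append((e.name,))
    return code

# Ordinary-looking Python builds the AST for x*x + y*y + z*z:
x, y, z = Var('x'), Var('y'), Var('z')
f = x * x + y * y + z * z
print(compile_expr(f, []))
```

The real thing would then serialise that opcode list into whatever form the blobby VM expects, but the AST-building part really is this simple.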
Wow, it's crazy this hasn't been done before. I've been playing around with RenderMan from Ruby; perhaps I'll try this in Ruby...
I wrote a Python Procedural a while back that would take a .geo file from Houdini and instance blobbies onto the particle system, and it was always a pain to fiddle with the opcode generation manually.
Any chance we can get the source? (Doubtful, I know, but worth a shot at least...)
aschwo,
There's no chance ILM will let me release the source. But it's not too hard - mainly just a matter of going through the documentation and seeing how it maps to Python expressions. After you've done it once you'll never have to fiddle with those opcodes again :-)
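For what it's worth, here's a rough sketch of what that mapping can look like: flattening a small expression tree into the code and float arrays that the Blobby call takes. The opcode values used here (0 for additive blending, 1001 for an ellipsoid leaf) follow my reading of the RiBlobby documentation, so double-check them against the spec before relying on this:

```python
def ellipsoid(transform16):
    """A leaf blob: a 4x4 transform stored as 16 floats."""
    return ('ellipsoid', list(transform16))

def blob_add(*children):
    return ('add', list(children))

def compile_blobby(node, codes, floats, state):
    """Emit children first, then the node; return this blob's index."""
    kind, payload = node
    if kind == 'ellipsoid':
        codes += [1001, len(floats)]   # opcode, offset into float array
        floats += payload
    else:  # 'add': opcode, operand count, then child blob indices
        ids = [compile_blobby(c, codes, floats, state) for c in payload]
        codes += [0, len(ids)] + ids
    idx, state['n'] = state['n'], state['n'] + 1
    return idx

# Two unit spheres, one shifted along x, blended additively:
identity = [1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1]
shifted  = [1,0,0,0, 0,1,0,0, 0,0,1,0, 1,0,0,1]
codes, floats = [], []
compile_blobby(blob_add(ellipsoid(identity), ellipsoid(shifted)),
               codes, floats, {'n': 0})
print(codes)    # [1001, 0, 1001, 16, 0, 2, 0, 1]
print(floats)   # the two transforms, back to back
```

Those two arrays (plus a leaf count and an empty string array) are what you'd hand to the Blobby call instead of writing them out by hand.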
Very good. Thanks for sharing; this is going to be a fun little project.
I'll be sure to link back to this post when I've got something to show.
Grrrr.
RenderMan is NOT a renderer. It is an interface standard. Photorealistic RenderMan is a renderer which implements the standard.
Pseudonym,
When I say 'Photorealistic RenderMan' people look at me funny. In fact, Pixar now call it Pixar's Renderman, and even they say on their "What's Renderman?" web page, "RenderMan is *also* an industry standard interface specification for photorealistic renderers and is available here," making it clear that they interpret the word Renderman in two different ways.
Of course if you've worked on an alternative implementation things may look a little different... :-)
https://renderman.pixar.com/products/whatsrenderman/index.htm
You give a couple of tantalizing hints about supporting general implicit surfaces.
I did a fair amount of work on a DSO procedural that would take a function described in almost-C (text that was lightly preprocessed into actual C), compile it on the fly with an embedded C compiler, and then extract isosurfaces using marching cubes. I created a Maya node that was an implicit-function type, so you could construct implicit surfaces as a DAG. It was more than a proof of concept, but not quite there as a production-ready tool.
But lately I've been thinking of resurrecting it, since I'm working on something that really needs implicit surfaces, and I would really like to use Renderman. I've tried a ray tracer with built-in support for implicits, but renders are brutally long.
Any hints on how far you got with general implicits, or how you proceeded?
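(For anyone wanting to experiment with the sample-and-polygonize pipeline described in this comment without an embedded C compiler, here is a minimal sketch assuming numpy and scikit-image; the blobby-ish field function is just an example, not anything from the post:)

```python
import numpy as np
from skimage import measure

def field(x, y, z):
    # Sum of two Gaussian blobs; the isosurface f = c is the "blobby".
    g1 = np.exp(-((x - 0.5)**2 + y**2 + z**2) * 4)
    g2 = np.exp(-((x + 0.5)**2 + y**2 + z**2) * 4)
    return g1 + g2

# Sample f on a regular grid...
n = 64
xs = np.linspace(-2, 2, n)
x, y, z = np.meshgrid(xs, xs, xs, indexing='ij')
volume = field(x, y, z)

# ...and extract the isosurface f(x,y,z) = 0.5 as a triangle mesh.
verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)
print(len(verts), "vertices,", len(faces), "triangles")
```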
Beautiful!
What kind of frame rate do you get on these examples?
Conal,
For that image, about 0.001 fps!
Some background: it's a software render, and it uses subsurface scattering. The subsurface scattering approach here requires three stages. First the surface is discretised and the lighting at each point is computed (including ray-traced shadows) - but rather than being used to illuminate the surface directly, that lighting is cached in a 3D data structure. Then what is, in effect, a low-pass 3D filter is applied to this lighting data to simulate the subsurface transport of light. Finally, the illumination is computed from the filtered light. That's what gives the waxy look, but the filtering pass takes forever.
There's probably some GPU hack you can do to fake it at 60fps but I haven't yet thought too hard about that. I just plumbed together Pixar's standard Renderman tools. The standard subsurface scattering hack would be hard to use with the arbitrary surface topology here.
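(As a toy illustration of the filtering stage described above: Gaussian-blurring per-point radiance over 3D neighbourhoods with a k-d tree. The kernel and radius here are illustrative stand-ins, not what Pixar's tools actually do:)

```python
import numpy as np
from scipy.spatial import cKDTree

def filter_point_cloud(points, radiance, radius):
    """Gaussian-weighted average of nearby lighting samples, per point."""
    tree = cKDTree(points)
    out = np.empty_like(radiance)
    for i, p in enumerate(points):
        idx = tree.query_ball_point(p, radius)
        d = np.linalg.norm(points[idx] - p, axis=1)
        w = np.exp(-(d / radius) ** 2)
        out[i] = np.average(radiance[idx], axis=0, weights=w)
    return out

# 10,000 random surface samples with random cached "lighting":
pts = np.random.rand(10000, 3)
light = np.random.rand(10000, 3)
soft = filter_point_cloud(pts, light, radius=0.1)
```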
You can interactively create "blobby" models with ShapeShop (http://www.shapeshop3d.com). You don't need a GPU to get interactive frame rates, just good caching data structures. (Although there have also been some recent SIGGRAPH papers about real-time blobbies using the GPU.)