

Render Development Update

on March 9th, 2010, by Brecht

As mentioned in my last blog post, there were four big changes planned: a shading system refactor, indirect lighting, per tile subdivision, and tiled/mipmapped image caching. As usual, in practice those plans can change quickly in production. So here’s an update on the plans and some results.

For indirect lighting, the intention was to use a point based method. I found the performance of these quite disappointing. The micro-rendering method works well only with low resolution grids, but that is noisy, and when increasing the resolution, performance suddenly is not as good any more. That leads to the Pixar method, which is somewhat more complicated. This can scale better, but the render times in their paper aren’t all that impressive; in fact, some raytracers can render similar images faster.

Indirect lighting test with sky light and emitting materials

So, I decided to try out raytracing low resolution geometry combined with irradiance caching. With the new and faster raytracing code this seems to be feasible performance-wise, and the code for this is now available in a separate render branch. Right now we’re not entirely sure yet whether performance is good enough, though we may be close to about 1 hour per 2k frame on average. This is with an irradiance cache, 2k rays per sample, low resolution geometry, one bounce, one or a few lights, and tricks to approximate bump mapping. These are the restrictions that we’ll probably work with.
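
For context, irradiance caching along these lines generally follows the classic Ward-style scheme: the expensive hemisphere gather is only done at sparse cache points, and everywhere else the result is interpolated from nearby cached samples when a weighted error estimate allows it. Below is a minimal, illustrative C++ sketch of that lookup; the names (IrradianceSample, IrradianceCache, max_error) are hypothetical and not taken from the render branch.

    // Minimal sketch of a Ward-style irradiance cache lookup. Illustrative only,
    // not Blender's actual render code.
    #include <algorithm>
    #include <cmath>
    #include <vector>

    struct Vec3 { float x, y, z; };

    static float dot(const Vec3 &a, const Vec3 &b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
    static float dist(const Vec3 &a, const Vec3 &b) {
        float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
        return std::sqrt(dx*dx + dy*dy + dz*dz);
    }

    struct IrradianceSample {
        Vec3 P;   // shading point where the sample was gathered
        Vec3 N;   // surface normal at that point
        Vec3 E;   // irradiance gathered there (e.g. from ~2k rays)
        float R;  // harmonic mean distance to surrounding geometry
    };

    struct IrradianceCache {
        std::vector<IrradianceSample> samples;
        float max_error;  // lower value = more cache points = less blotchiness

        // Try to interpolate irradiance at (P, N) from nearby cached samples.
        // Returns false if nothing is close enough; the caller must then gather
        // a fresh sample by raytracing and insert it into the cache.
        bool lookup(const Vec3 &P, const Vec3 &N, Vec3 &E_out) const {
            Vec3 sum = {0.0f, 0.0f, 0.0f};
            float wsum = 0.0f;
            for (const IrradianceSample &s : samples) {
                // Ward's weight: falls off with distance relative to the sample's
                // validity radius, and with the difference in surface orientation.
                float denom = dist(P, s.P) / s.R +
                              std::sqrt(std::max(0.0f, 1.0f - dot(N, s.N)));
                float w = 1.0f / std::max(denom, 1e-6f);
                if (w > 1.0f / max_error) {
                    sum.x += w * s.E.x; sum.y += w * s.E.y; sum.z += w * s.E.z;
                    wsum += w;
                }
            }
            if (wsum <= 0.0f)
                return false;
            E_out = {sum.x / wsum, sum.y / wsum, sum.z / wsum};
            return true;
        }
    };

The expensive part, gathering a new sample with a couple of thousand rays against the low resolution geometry, only happens where the lookup fails, which is what keeps this affordable compared to a full gather at every shading point.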

Indirect lighting test in alley scene, with sun and sky light

A prototype implementation of per tile subdivision is also available for testing now. It’s very early though; it doesn’t work with shadows, texture coordinates, raytracing, etc. yet, but it’s already possible to do some tests.
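
To make the idea concrete: per tile subdivision means that geometry overlapping the tile currently being rendered is diced to roughly pixel-sized quads (with the dicing rate driven by projected size and a shading rate), displaced and shaded, and then freed again before the next tile is started, so memory stays bounded by one tile’s worth of dense geometry. The sketch below only illustrates that control flow; the names are hypothetical, not the prototype’s actual code.

    // Simplified sketch of per-tile dicing. Hypothetical names, not actual code.
    #include <algorithm>
    #include <cmath>

    struct Patch {
        float screen_size_u;  // projected size of the patch in pixels (u direction)
        float screen_size_v;  // projected size of the patch in pixels (v direction)
    };

    // Choose how many segments to dice a patch into so that each resulting
    // micro-quad covers roughly 'shading_rate' pixels on screen.
    static void dicing_rate(const Patch &p, float shading_rate, int &nu, int &nv) {
        nu = std::max(1, (int)std::ceil(p.screen_size_u / shading_rate));
        nv = std::max(1, (int)std::ceil(p.screen_size_v / shading_rate));
    }

    /* Per-tile render loop, in outline:
     *
     * for each tile:
     *     for each patch overlapping the tile:
     *         dicing_rate(patch, shading_rate, nu, nv);
     *         grid = tessellate(patch, nu, nv);   // displacement applied here
     *         shade_and_sample(grid, tile);
     *         free_grid(grid);                    // nothing kept between tiles
     */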

Displaced monkeys with per tile subdivision

Regarding the shading system, I worked on node-based shaders for a while; however, it became clear that the benefits for Durian wouldn’t be that big and that it was taking too much time. Being able to define materials as nodes in a more correct and flexible way would be nice, but it wouldn’t help us render better pictures much quicker; in fact, converting our existing materials would probably take more time.

New shading nodes – you’ll have to wait a bit longer for this

The code for this is not committed yet, and will probably have to wait until after Durian. However, the core render engine code has been and is being refactored further so that we can easily plug in different material systems, which is a big chunk of the work.

Image caching I haven’t really started on yet. I’m looking into using the OpenImageIO library, though it’s not clear yet if this would be a good decision, as it would introduce another image system into Blender and duplicate much of the existing code. Besides that, images aren’t the only thing that would benefit from this; there are also, for example, deep shadow buffers, which are not covered by OpenImageIO. Reusing code could save us a lot of time though.
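
Purely as an illustration of the direction being considered, this is roughly what pulling texture data through OpenImageIO’s tiled/mipmapped ImageCache looks like. The exact signatures and namespace differ between OpenImageIO versions, the file path is made up, and none of this is actual Blender code.

    // Rough sketch of OpenImageIO's ImageCache usage; API details vary by version.
    #include <OpenImageIO/imagecache.h>
    #include <vector>

    using namespace OIIO;  // older releases use a different namespace name

    int main() {
        // One shared cache for the whole renderer; it keeps only the tiles and
        // mipmap levels actually touched, within a fixed memory budget.
        ImageCache *ic = ImageCache::create(true /* shared */);
        ic->attribute("max_memory_MB", 256.0f);

        // Pull a small region from one mip level; the cache loads just the tiles
        // that cover it instead of the whole image. Hypothetical file, assumed RGBA.
        ustring filename("//textures/wall_diffuse.tif");
        std::vector<float> pixels(64 * 64 * 4);
        ic->get_pixels(filename, 0 /* subimage */, 2 /* miplevel */,
                       0, 64, 0, 64, 0, 1,  // x, y and z ranges
                       TypeDesc::FLOAT, pixels.data());

        ImageCache::destroy(ic);
        return 0;
    }

As noted above, the catch is that deep shadow buffers would still need their own tile cache, since they fall outside what OpenImageIO handles.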

Single plane subdivided with voronoi and clouds texture for displacement

Further plans are:

  • Try to get indirect light rendering faster, and make it work better with strands, bump mapping, and displacement.
  • Work further on the per tile subdivision implementation to get it beyond a prototype.
  • Find out if we can use OpenImageIO, and either use it or implement our own tiled/mipmap cache system.


96 Responses to “Render Development Update”

  1. Jimmy Christensen Says:

    Jupiter file cache

    Saw this recently. Not sure if it could help with image caching or if you knew about it already.

  2. Shinobi Says:

    @ MTracer:
    I think so, and there is a grille as a hint at the back of the alley also… maybe a tribute to the very first open source movie in the world, does it sound familiar? 😉

  3. Shinobi Says:

    @ Brecht:
    I’m almost speechless in front of the astonishing and incredibly fast-progressing improvements of the render department in Blender.

    That very alley render is the realization of a big, big part of the dreams of all the blenderheads out there: GI coming true, and in the good old (and greatly improved, of course) internal Blender raytracer!!!! 😀 🙂

  4. loafie Says:

    I find this phenomenon interesting: somehow both the users and, by extension, the developers have overlooked the largest gaping hole in Blender as a production-ready tool, and that is 3d motion blur and depth of field.
    I saw some posts for the GSoC 2010 ideas on Blenderartists, and it’s completely mind-boggling that only one person has posted 3d motion blur and 3d depth of field (micropolygon or raytraced) as a proposed GSoC project.
    I believe the user base is at fault here, for not understanding the requirements of a production-worthy tool, and for instead forcing the developers to focus on other flashy bells-and-whistles like GI, fluid simulations or sculpting.
    I hope the Durian folks bring this up; I saw it on the requests page but it didn’t have a priority icon.

  5. Juan Romero Says:

    Great work!
    In the alley image, the walls painted red and green should emit much more color onto the surrounding objects; the effect of color bleeding is barely noticeable.

  6. brecht Says:

    The alley render takes about 8 minutes on a quad core, by the way; my estimation/hope of 1 hour a frame is for final renders with full detail, hair, textures, displacement, …

    @Irve, it’s automatic subdivision of the mesh for displacement, without using too much memory.

    @MTracer: the method in that paper seems to be mutually exclusive with irradiance caching. Also, Pixar isn’t using this themselves as far as I know.

    @Guybrush Threepwood, the irradiance cache already adapts itself to detail. There will be options to control the number of samples. Simply using the irradiance cache from a 25% render does not work well.

    @FishB8: if it’s doing 50% raytracing you can only get a 2x speedup, no matter how fast the GPU.

    @George Steel, I might commit it to the render branch commented out when I get the time.

    @Jimmy Christensen, that’s only a small part of what we need, though it could be useful for inspiration. But it looks like they haven’t uploaded the actual source code, only examples.

    @loafie: every project has different requirements; not many people render animations, but that doesn’t make them wrong.

    @Juan Romero: I don’t think so; most of the light there is coming from the sky.

  7. loafie Says:

    Brecht: That’s true, but I would think they would still want their favourite tool to be as complete and widely accepted as possible, even if their area is still images.
    The DoF point still applies here.
    Congratulations on the fantastic work on the render branch so far by the way!

  8. vilda Says:

    Brecht, some great progress, although I’m a bit disappointed that development has to go in some hacky way again because of the time frame of the movie project, rather than the other way around…
    I LOVE the approach of low res raytracing with displacement. If you consider that soft shadows will be heavily used, maybe the displacement doesn’t have to work with shadows at all?
    Did you check the luxrays library already? I think it could considerably speed up the raytracing part, especially with the approach of raytracing low poly geometry (luxrays should be able to handle scenes up to 10,000,000 polygons). Also, the library already produces some really nice results and is now completely independent from luxrender.
    This is the link to the development branch:

  9. Sleeper Says:

    In the interest of not misrepresenting Blender’s abilities, is per tile subdivision a form of micropolygons? It’s just that, when asked on other boards, you don’t want to make erroneous claims and then get accused of fan-boy-ism.

  10. Shinobi Says:

    @ Juan Romero:
    color bleeding is more than noticeable in that image, and it is the way it should be; it works this way in the real world. You could barely see your hand turning blue just because you’re under a clear sky. Color bleeding has to be delicate to avoid being ridiculous.

    @ Sleeper:
    wait for Brecht’s official word, but I think I can say “oh sure!”, it’s real micropolygons you have there 🙂 Answer on your boards and don’t be afraid to appear a fanboy 😉

  11. Juan Romero Says:

    @Shinobi, you are not under any sky, you are inside it.
    If you put a saturated green card like that wall close to your hand, with the card in direct sunlight, you’ll see how much green spill light there is.

  12. brecht Says:

    @vilda: already answered about using GPU in an earlier comment.

    @Sleeper: micropolygons are usually associated with REYES rendering, which is a more specific way of also doing shading and motion blur, which this doesn’t do. If you just define micropolygons as really small polygons, then yes, that’s what you have here.

  13. D Says:

    Juan, I think you severely overestimate the amount of light that spills from normal materials.

    To illustrate, below is an image from LuxRender depicting a fully saturated green cube on a grayish floor under a somewhat late afternoon sun. Please keep in mind that LuxRender’s light calculations are PHYSICALLY BASED, meaning it takes no artistic license whatsoever; it depicts exactly what would be expected in real life.

    I had to fully increase the saturation of the original render twice to be able to show the green spill on the floor. THIS IS WHAT REAL LIFE COLOR SPILL LOOKS LIKE. It’s subtle!

  14. kram1032 Says:

    Very nice progress 😀

    D: In an open scene, colour spill will be less noticeable than in a closed one. Most of the rays there exit to the infinity of the sky…

    However, even in an indoor scene which is heavily biased towards colour spill, it remains rather subtle (but still way more noticeable than in your example) – the Cornell Box in all its variations is the most famous example 🙂

  15. tweakingknobs Says:

    Just go GPU’d pleeeeeeeeeease!!!!

  16. Shinobi Says:

    @ Juan Romero:
    if you don’t trust my words, please trust your eyes. Open a color chooser of your choice and move the cursor over the different areas of the image to see how their colors fade into the surrounding objects’ colors.

    Do you really think that crate should be more green than this?

  17. Juan Romero Says:

    @D and Brecht:
    I don’t know what you are trying to prove with that example; it has no resemblance whatsoever to the alley. The shadow sides of the cube are obviously going to spill almost no light. I’m referring to a dark wall (I mean one in shadow) parallel to a very bright one (under direct sunlight) painted with highly saturated colors.
    Any professional photographer can confirm this. I don’t know exactly how much color bleed, but it’s surely more than in the alley image. The first time I saw it I didn’t notice any red spill on the objects nearest to the red wall.

  18. Juan Romero Says:

    sorry, my comment was for D only!

  19. Juan Romero Says:

    Yes, I think it should be more green, given that the color of the crate is white.
    I’ll try to upload a photograph (no CG will be the same, not even the most physically correct renderer) of what I mean when I get home.
    Regards to all of you guys, and Brecht, congratulations on your work!

  20. Shinobi Says:

    @ Juan Romero:
    I really think not, unless you want to obtain a patchwork.
    Just in case, in your own projects you can simply increase this value:

  21. 4museman Says:

    @Shinobi, D, Juan Romero…:
    I’m sure that the amount of reflected light depends highly on the reflectivity characteristics of the lit material. Your discussion misses that point. 😉

  22. Dan Says:

    Great work!

    Indirect lighting is one of the features I have been missing most in Blender. Doing visualization work, this makes a lot of difference. I used to add a bunch of small lamps in my scenes to avoid black shadows. AO can help in some situations but often gives a dirty look instead of the clean look from indirect lighting. Any indirect lighting solution in Blender is a huge step forward. Most of us make more images than animations anyway, and render time is not the biggest concern. I’m confident though that we will end up with a good and fast solution.

    Actually… I handed over a few concept sketches of a product to a client yesterday, rendered with the new IL feature. So I guess we can say it has already been used professionally.

  23. brecht Says:

    @francois, I’ve seen the paper, but it only does low frequency effects, and seems like it would suffer from flickering in some cases too.

    Further, for anyone who has suggestions for different algorithms: I appreciate the suggestions, but I’ve seen pretty much all the important papers published about this in the last 10 years, and it’s very unlikely that we’ll switch to something else or do GPU acceleration or whatever; there is simply no time for that.

  24. bigbad Says:

    This is great stuff. Excellent work.

    What papers did you follow for tile based subdivision, if I may ask?

  25. Matt Says:

    You’re doing a kick-arse job, Brecht, keep it up 🙂

  26. Pavel Says:

    Talking about external renderers – I know this project is dead, but Pixie has all the features you mentioned, and since it is open source…

  27. FishB8 Says:

    @brecht: I see what you’re saying. I thought you were talking about time spent raytracing rather than time spent on the complete rendering process.

  28. Guybrush Threepwood Says:

    Thank you very much for taking the time to answer each person’s question.
    Please do not hesitate to continue posting, even though so many questions have been asked.
    Even if it takes time to answer sometimes silly questions, it gives us (the community) great respect to hear your opinion on our simple questions.

  29. Shinobi Says:

    @ 4museman:
    you’re completely right, I agree with your point here. That’s a wall, so I don’t know how much it can spill colors, but more important is how to translate that into Blender.
    I know the Emit value influences color spill; I don’t know if there is another parameter that is taken into account for the indirect lighting computation.

  30. 4museman Says:

    I think the most probable parameter for this is Diffuse Intensity. But of course, it changes the material visually as well, which is only natural. From my quick test in Blender it really does affect the indirect lighting.

    But theoretically, the light reflection should also be affected by the surface roughness (smoothness), which results in diffusion of the light. In Blender it is probably the Specular Hardness parameter that is closest to that characteristic (?). Or maybe it could be controlled by the different Diffuse Shader Models.

    But I’m not sure how much difference it could make, since at the moment the only method for IL is point based (Approximate), not raytraced.

    What is the viewpoint of the competent person, Brecht?

  31. Shinobi Says:

    Interesting what you’re saying, but has it effectively been coded in Blender’s IL yet? I mean, are Diffuse Intensity and Specular Hardness influencing color bleeding yet? Or is only the Emit value being taken into account in the computation…?

    From what I understood, IL in Blender is now at the level of the raytracer, so it is not Approximated, or not only. The Approximated approach was from before Brecht coded raytraced IL.

  32. Lamhaidh Says:

    I hope that the parameters aren’t linked permanently! I’m an engineer and have to use Inventor to do renders of my work for the sales guys. They ooh and ahh over them and I go out of my mind because they’re crap! In Inventor the controls for Specularity and Reflectivity are linked, and the only way that I can make something shiny, like the plastic pipes which I have to do a lot, is to give them a mirror finish! Please don’t link them just because they should be related to each other; some of us might have other ideas!

  33. 4museman Says:

    According to my (rather quick) tests, the reflected light color and strength are taken from the resulting color at a specific vertex of the model (after all changes to color values are applied, such as Diffuse Intensity etc.).

    The Emit value is not related to light rebound; it is the amount of light the surface is actively emitting.

    My remarks about the Specular Hardness and Diffuse Shader Model parameters are meant more or less as a question. That it would be a good idea. Maybe. 🙂 I’d love to hear an opinion from Brecht. Is that thought bullshit?

    And as I’ve understood so far, the current working IL approach in Blender (at least in SVN trunk) is just Approximated, which is a somewhat reduced version of raytracing followed by approximation of the results between vertices.

  34. oin Says:

    Many congrats, Brecht!
    Outstanding work.

  35. Ralph Says:


    fantastic work!

    Will you be integrating functionality in indirect lighting that bounces the texture rather than the diffuse color? It would be great to have.


  36. toontje Says:


    I felt I needed to bump this thread and I hope that it is still read.
    @ Brecht:
    I know you’ve said that you’ve explored many different papers and solutions already, so I apologize in advance for pointing you to this thread:
    I’ll be posting on Blenderartists too, just in case this one goes unnoticed.

  37. shul Says:

    This kind of work blows my mind, Brecht; compiling and running SVN is such a joy because of all the advances I see each time.

    Continue doing your thing man, and don’t forget to enjoy it 🙂

  38. MTracer Says:

    In what build is the GI? It doesn’t seem to be in any of the latest builds on graphicall…

  39. Victor Says:

    it’s in a branch, there’s no GI yet

  40. Philas Says:

    render25 branch, not in trunk yet

  41. Shinobi Says:

    “Will you be integrating functionality in indirect lighting that bounces the texture rather than the diffuse color? It would be great to have.”

    Definitely, it should be a must-have feature; it would be really great! We could avoid things like this:
    We could avoid things like this:

    What do you think Brecht?

  42. Satish 'iluvblender' Goda Says:

    Hi Brecht.

    I came across this paper.. Maybe it would be useful.

    Temporally Coherent Irradiance Caching for High Quality Animation Rendering

  43. Satish 'iluvblender' Goda Says:

    One more..

    An Approximate Global Illumination System for Computer Generated Films


    Lighting models used in the production of computer generated feature animation have to be flexible, easy to control, and efficient to compute. Global illumination techniques do not lend themselves easily to flexibility, ease of use, or speed, and have remained out of reach thus far for the vast majority of images generated in this context. This paper describes the implementation and integration of indirect illumination within a feature animation production renderer. For efficiency reasons, we choose to partially solve the rendering equation. We explain how this compromise allows us to speed-up final gathering calculations and reduce noise. We describe an efficient ray tracing strategy and its integration with a micro-polygon based scan line renderer supporting displacement mapping and programmable shaders. We combine a modified irradiance gradient caching technique with an approximate lighting model that enhances caching coherence and provides good scalability to render complex scenes into high-resolution images suitable for film. We describe the tools that are made available to the artists to control indirect lighting in final renders. We show that our approach provides an efficient solution, easy to art direct, that allows animators to enhance considerably the quality of images generated for a large category of production work.

  44. Prodigal Son Says:

    To answer the couple of BURP-related posts here: we’re just in the final stages of getting the Blender 2.5A2 clients out for Win/Linux/Mac for the publicly distributed rendering project.

    As far as I have understood, the main issue with using BURP for rendering a scene in Sintel is the fact that the scenes require massive amounts of memory.

    We’re also testing 64-bit builds of the clients, which might just mean (in the future) that we could move beyond the current 1.5 GB scene memory requirement limit that we must enforce. Unfortunately, I think that our efforts on making the service more accessible and stable will not be ready in time for the final renders of Durian/Sintel. Perhaps for the next open movie project though? 🙂

    – Julius Tuomisto /