
[HDRI, why bother?]

Discussion in 'Archive: Scifi 3D Forum' started by theN00_Jedi, Nov 27, 2005.

Thread Status:
Not open for further replies.
  1. theN00_Jedi

    theN00_Jedi Jedi Padawan star 4

    Registered:
    May 26, 2005
    [image=http://img382.imageshack.us/img382/5181/atmo14nb.jpg]

    [image=http://img382.imageshack.us/img382/5278/atmo21dy.jpg]

    Simple in-atmosphere composition test. One render took about 2 and a half minutes, the other took about 12 and a half hours. I bet you can't even tell the difference[face_laugh]

    There has to be a better way[face_not_talking]
     
  2. Jedi2016

    Jedi2016 Jedi Padawan star 4

    Registered:
    Jun 3, 2000
    Of course I can tell the difference.. it's called occlusion shading. Unfortunately, that model doesn't really make much use of it. For that particular model, it might not be worth it, no. But there are plenty of instances where HDRI or radiosity (real or simulated) can add a veritable assload of realism to an image.

    If you want something similar without the render times, all you have to do is add a bunch of lights around the object. It's usually referred to as a "skydome". Fifty or sixty soft spotlights should do the trick just as well, depending on the model.
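    Just to make that concrete, here's a rough Python sketch of how you might lay out such a skydome. It only computes positions and intensities for lights spread over a hemisphere; no particular 3D package's API is assumed, you'd recreate the lights in whatever app you use.

        import math

        def skydome_lights(count=60, radius=50.0, total_intensity=1.0):
            """Distribute 'count' soft lights on a hemisphere of the given radius,
            aimed at the origin, splitting the total intensity between them."""
            lights = []
            golden = math.pi * (3.0 - math.sqrt(5.0))   # golden-angle spiral for even spacing
            for i in range(count):
                y = (i + 0.5) / count                    # height above the ground plane, 0..1
                r = math.sqrt(max(0.0, 1.0 - y * y))     # radius of the hemisphere slice at that height
                theta = golden * i
                pos = (radius * r * math.cos(theta), radius * y, radius * r * math.sin(theta))
                lights.append({"position": pos, "intensity": total_intensity / count})
            return lights

        for light in skydome_lights(count=8):            # 8 just to keep the printout short
            print(light)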

    Sometimes, however, there just isn't any better way than real HDRI or radiosity (which are two different things, incidentally), depending on what you want out of the shot.
     
  3. Man_from_Naboo

    Man_from_Naboo Jedi Master star 1

    Registered:
    Apr 18, 2004
    HDRI makes the most sense, IMO, if you apply it to a scene that needs a very complex light setup.

    This is usually the case if you do interior design stuff. There you have a very complex light situation, with windows lighting the room, and probably various lights in the room. Look at the HDRI images that you can get on the web. They

    Since the situation you have is quite simple, having only the diffuse light of the sky as a light source (it seems that the sun or a sun is absent), it is only natural that you have very uniform light.

    In addition, HDRI is very useful for combining CGI and real images, since you can very easily emulate the lighting situation that was present when you took the real footage.

    Michael
     
  4. PadawanNick

    PadawanNick Jedi Grand Master star 4

    Registered:
    Jun 6, 2001
    Yeah. The main thing HDRI is good for is when the model is being integrated/composited into a fairly complex real-world location. In these situations, the difference between HDRI-based lighting and "manual" lighting is typically VERY dramatic.

    Just putting a model into the sky with some distant clouds really doesn't call for the use of HDRI. For this, you could light with a "sun" and a radiosity sphere with a solid color or maybe a gradient.
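    For the gradient version, a tiny illustrative sketch: just the math of blending a sky color by ray direction, with made-up colors, not any package's actual environment shader.

        def sky_gradient(direction_y, horizon=(0.70, 0.78, 0.88), zenith=(0.25, 0.45, 0.80)):
            """Blend from a horizon colour to a zenith colour based on how far up
            the ray points (direction_y is the y component of a normalized direction)."""
            t = max(0.0, min(1.0, direction_y))   # rays at or below the horizon get the horizon colour
            return tuple(h + (z - h) * t for h, z in zip(horizon, zenith))

        print(sky_gradient(0.0))   # looking at the horizon
        print(sky_gradient(1.0))   # looking straight up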

    Have fun.
     
  5. malducin

    malducin Jedi Padawan star 4

    Registered:
    Oct 23, 2001
    We need a few clarifications around here:

    Of course I can tell the difference.. it's called occlusion shading. ... But there are plenty of instances where HDRI or radiosity (real or simulated) can add a veritable assload of realism to an image.

    HDRI and ambient occlusion are two totally different things.

    HDRI means High Dynamic Range Imagery. It just refers to how much information an image stores. Typical images (8 bits per component) assume lighting intensity goes from 0 to 1, but in the real world and in photography you can have, for example, a difference of 1:100,000 between the brightest and darkest spots. This means that in normal formats a lot would be clipped. HDR images allow you to capture all that dynamic range. Several formats have been developed to capture that, and they usually fall into 3 broad categories: extended bit depths (16 or 32 bit), floating point, and log formats.
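    A tiny Python illustration of the clipping point (the radiance values are made up):

        # Toy illustration of clipping: scene radiances well above 1.0 (a bright sky, a light
        # source) survive in a float buffer but collapse to the same 255 in an 8-bit image.
        scene_radiances = [0.02, 0.5, 1.0, 8.0, 120.0]     # made-up linear values

        ldr_8bit = [min(255, round(v * 255)) for v in scene_radiances]   # clipped at white
        hdr_float = list(scene_radiances)                                # stored as-is

        print("8-bit:", ldr_8bit)    # [5, 128, 255, 255, 255] -> the two brightest values are gone
        print("float:", hdr_float)   # full range preserved, exposure can still be changed later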

    HDR makes more sense when working on film, since film captures a high dynamic range, so using HDR makes handling lighting much more realistic. For TV and video formats it's not as necessary, but it might still be useful for two reasons: you can adjust the exposure of the image to better suit your needs, and you can apply tone mapping.
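    A sketch of those two uses, exposure adjustment plus a simple global Reinhard-style tone map. This is just the textbook L/(1+L) curve on a made-up value, not any particular package's tone mapper.

        def adjust_exposure(value, stops):
            """Re-expose a linear HDR value by a number of f-stops (each stop doubles or halves it)."""
            return value * (2.0 ** stops)

        def reinhard(value):
            """Simple global Reinhard-style tone map: compresses 0..infinity into 0..1."""
            return value / (1.0 + value)

        hdr_value = 6.0                              # made-up linear radiance, way above "white"
        for stops in (-2, 0, +1):
            exposed = adjust_exposure(hdr_value, stops)
            print(stops, "stops ->", round(reinhard(exposed), 3))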

    Ambient occlusion is a way to determine how occluded (how much coverage) each point on a model is, basically by shooting rays from the point over the hemisphere above it. It's an alternative to using an ambient term in shading, and thus you can achieve subtler shading and approximate global illumination.
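    In rough Python, the idea looks like this; the is_blocked ray-cast is a stand-in for whatever your renderer provides, not a real API.

        import math, random

        def sample_hemisphere(normal):
            """Pick a random direction in the hemisphere around 'normal' (rejection sampling)."""
            while True:
                d = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
                length = math.sqrt(sum(c * c for c in d))
                if 0.0 < length <= 1.0:
                    d = tuple(c / length for c in d)
                    if sum(a * b for a, b in zip(d, normal)) > 0.0:   # keep it above the surface
                        return d

        def ambient_occlusion(point, normal, is_blocked, samples=64):
            """Fraction of hemisphere rays that escape; is_blocked(point, direction)
            is a stand-in for the renderer's ray cast."""
            hits = sum(1 for _ in range(samples) if is_blocked(point, sample_hemisphere(normal)))
            return 1.0 - hits / samples        # 1.0 = fully open, 0.0 = fully occluded

        # Toy usage: a "scene" where rays pointing into -x are always blocked (prints roughly 0.5).
        print(ambient_occlusion((0, 0, 0), (0, 1, 0), lambda p, d: d[0] < 0))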

    But there are plenty of instances where HDRI or radiosity (real or simulated) can add a veritable assload of realism to an image.

    The term you're looking for is global illumination (GI), which is a generic term. Radiosity is a specific technique used to do GI that involves meshing and simulating energy transfers. But classic radiosity has fallen into disuse lately because of its shortcomings.

    HDRI makes IMO the most sense if you apply it to a scene that needs a very complex lightsetup. This is usually the case if you do interior design stuff.

    I think there is some confusion between HDRI and image-based lighting (IBL). You can certainly do IBL with HDR images, but one doesn't imply the other; you could still do IBL with regular images. But IBL indeed does help in complex lighting situations.
     
  6. PadawanNick

    PadawanNick Jedi Grand Master star 4

    Registered:
    Jun 6, 2001
    Are you then grouping formats like Cineon and OpenEXR with HDRI, malducin?

    While a strict definition of HDR might include formats like these, I thought that "HDR" generally indicates REALLY high dynamic range gathered through multiple exposures with bracketed shutter speeds.

    So, while a log format like Cineon or a floating point format like OpenEXR certainly have much greater dynamic range than an 8-bit image, I didn't think they were generally considered to be "HDR".

    Also, doesn't the use of HDRI sources for IBL (which is what the initial post seems to refer to) still make sense, even for video output, since it produces an extremely accurate lighting model?
    The trouble with a traditional LDR image is that a lot of the lighting variation in the brightest and darkest areas of a lighting map is clipped/lost to white and black, and this could be especially important for creating accurate specular hits.
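    (For reference, the bracketed-exposure merge mentioned above boils down to something like the sketch below. It assumes a linear camera response for simplicity, whereas real merges first recover the response curve, Debevec-style, and the pixel values here are made up.)

        def merge_brackets(pixels_by_exposure):
            """Merge one pixel across several bracketed shots into a single HDR value.
            'pixels_by_exposure' is [(pixel_0_to_255, exposure_seconds), ...].
            Simplified: assumes a linear camera response; real pipelines recover the curve."""
            def weight(z):
                return min(z, 255 - z) / 127.5      # trust mid-tones, distrust near-clipped values
            num = sum(weight(z) * (z / 255.0) / t for z, t in pixels_by_exposure)
            den = sum(weight(z) for z, t in pixels_by_exposure)
            return num / den if den > 0 else 0.0

        # The same scene point captured at three shutter speeds (made-up, roughly consistent numbers);
        # the clipped 1/100 s frame gets zero weight, and the result comes out around 140.
        print(merge_brackets([(255, 1 / 100), (143, 1 / 250), (36, 1 / 1000)]))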
     
  7. malducin

    malducin Jedi Padawan star 4

    Registered:
    Oct 23, 2001
    Are you then grouping formats like Cineon and OpenEXR with HDRI, malducin?

    Yes. Actually, Cineon is probably the oldest HDR format out there. It was created by Kodak and widely used in the film industry for scanning film, which of course has a high dynamic range. Cineon is 10 bits per component and logarithmic. Usually during the process you determine your white and black points (say 685 and 90 on a scale from 0 to 1023) so you can have whiter than white and darker than black.
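    For the curious, the usual Cineon-style log-to-linear conversion looks roughly like the sketch below, using the commonly quoted defaults of 95/685 and the standard 0.002 density per code value with a 0.6 negative gamma; treat it as illustrative and check your own pipeline's numbers.

        def cineon_to_linear(code, black_point=95, white_point=685):
            """Convert a 10-bit Cineon code value (0-1023) to scene-linear, using the usual
            0.002 density per code value and 0.6 negative gamma. Values above the white
            point come out above 1.0 ("whiter than white"), values below the black point below 0."""
            def log_to_lin(c):
                return 10.0 ** ((c - white_point) * 0.002 / 0.6)
            offset = log_to_lin(black_point)
            return (log_to_lin(code) - offset) / (1.0 - offset)

        for code in (95, 445, 685, 1023):
            print(code, "->", round(cineon_to_linear(code), 3))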

    While a strict definition of HDR might include formats like these, I thought that "HDR" generally indicates REALLY high dynamic range gathered through mutliple exposures with bracketed shutter speeds.

    Not exactly. HDR just refers to the high dynamic range; it's a generic term (like GI). You can create HDR images via different methods. Yes, the currently popular method is through bracketed exposures, but there are others as well. You could scan film, say on an Imagica scanner, or you could make one in one go with a SpheroCamHDR camera. Yes, you can get a higher dynamic range via multiple exposures (for still images), but after all, with a film camera you set a certain exposure and the film still captures a high dynamic range that can be scanned.

    So, while a log format like Cineon or a floating point format like OpenEXR certainly have much greater dynamic range than an 8-bit image, I didn't think they were generally considered to be "HDR".

    Oh yes, they are HDR formats, although Cineon, being the oldest, probably has the least range compared to newer ones. What makes normal images low range is not that they only have 8 bits but the fact that in those 8 bits they assume the range is from 0 to 1, basically just one f-stop (one order of magnitude, 1:10); everything else gets clipped. With Cineon you have ranges that go under 0 and above 1 and can capture several f-stops of range. I think some people using OpenEXR have used it to capture up to 10 f-stops.

    Greg Ward has a good summary of this (although he doesn't include Cineon, probably because of some of its limitations, but you'll find other references to it):

    High Dynamic Range Image Encodings

    Also, doesn't the use of HDRI sources for IBL (which is what the initial post seems to refer to) still make sense, even for video output, since it produces an extremely accurate lighting model?

    Yes, it makes sense, but it depends on several factors: say, for example, whether he uses a renderer that can make use of HDR images and do high-precision lighting calculations (most modern packages do anyway), or, as I said, even just to adjust the exposure or do tone mapping to get an 8-bit image.

    The thing is, if you don't understand what the terms and techniques are, how can you apply them effectively? As others more or less correctly pointed out, maybe dome lighting would suffice, or, say, ambient occlusion could be used to save rendering time. It's difficult to see what his setup is going to be (is the spaceship going to be on the ground?). Conversely, how do you decide to use IBL with HDR if you don't really know when it might be worth doing? My only point is that if people use the right terminology it makes for easier communication, especially when looking for feedback or trying to debug something or find solutions. Not long ago there was a poster who had problems with his "HDRI render". He could never explain himself, but it seems he was actually using IBL with a normal image.

    The other thing, of course, is that sometimes you don't want a totally accurate lighting environment, especially if you are following art direction (though that would be rare here). Of course the user is encouraged to do tests to weigh the benefits of different approaches against the cost (setup time, rendering time, flexibility, etc.). You can really tell the difference between his two images, but it might be moot depending on how his final scene is set up.
     
  8. PadawanNick

    PadawanNick Jedi Grand Master star 4

    Registered:
    Jun 6, 2001
    Firstly, thanks for taking the time to share these insights.

    So what really defines an image as being HDR?
    I've worked with a few Cineon plates, and compared to bracketed-exposure-generated HDRIs, the Cineon files had much less dynamic range.
    I've also worked with shots from CineAlta cameras.
    These are 10bit, but linear, so they have even less latitude than Cineon even though they're more "modern" sources.

    Is just being "more than 8 bit" the definition of HDR?
    Or perhaps some latitude in stop values?

    THIS I'm totally on board with.
    It gets pretty interesting when a director or producer starts mixing the terminology. :p
    Of course, this just makes it all the more critically important that the TD/contractor understands both the proper and ... erm .... popular uses of these terms and can provide a clear translation.

    Thanks again.
    Have fun.
     
  9. malducin

    malducin Jedi Padawan star 4

    Registered:
    Oct 23, 2001
    I forgot my standard disclaimer: I don't claim at all to be any sort of expert in HDR or IBL, I've just read a bit of the basics. Anyway...

    So what really defines an image as being HDR? ... Is just being "more than 8 bit" the definition of HDR? Or perhaps some latitude in stop values?

    There's the rub. Since HDR (or HDRI) is a generic term, it isn't as well defined as other, more specific ones (which is actually a significant problem in CG terminology, but I digress ;-). So it sometimes depends on context, sometimes it's misused or applied to similar things, etc.

    Broadly speaking it has more to do with the latitude, as you mention. Nature doesn't go from 0 to 1. But all 8-bit formats (as far as I know) assume 0-1 and just one f-stop (although in technical terms I guess you could have a log format for 8-bit images, even if it's very restricted; more on this later). HDR goes beyond that. So in that respect I would consider Cineon HDR. Interestingly, Greg Ward doesn't mention it, and it seems that Paul Debevec also omits it from his discussions (I just checked my SIGGRAPH course notes).

    The thing that is tricky about Cineon is that even though it's 10-bit log, the data is stored in 32 bits (like a normal RGBA TIFF, say), except that it's packed differently, of course, and to use it you usually have to linearize it. So maybe it's just barely HDR. Certainly I've seen references to it being considered HDR: Idruna, makers of the HDR paint package Photogenics, call Cineon/DPX an HDR format.

    BTW, Morgan Kaufmann recently came out with a book specifically about HDRI (both Greg Ward and Paul Debevec are authors). It'll be interesting to see how they define it.

    I've also worked with shots from CineAlta cameras. These are 10bit, but linear, so they have even less latitude than Cineon even though they're more "modern" sources.

    Interesting. I guess I didn't suspect it, since the ones used on Ep. 3 were 10-bit log, but it makes sense for the different models. How exactly are those 10 bits used anyway?

    But yes, the actual acquisition/creation method or image format is irrelevant to whether an image is HDR.

    Personally I would consider Cineon HDR.

    It gets pretty interesting when a director or producer starts mixing the terminology.

    This reminds me more or less of what a friend mentioned to me: you always more or less just nod to the directors and producers ;-). Who you really want to cozy up to is the DP, since he is bound to understand this and have a more direct impact on your work. As long as you understand in your own head what the director is talking about (even if he totally screws up the terminology), no harm done ;-).
     
  10. PadawanNick

    PadawanNick Jedi Grand Master star 4

    Registered:
    Jun 6, 2001
    Where did you hear that the EP 3 cameras were 10bit log?
    Can we go private with a discussion on that?
    My email is in my profile.

    We were capturing using a KONA 2 card on a G5 with a pair of XRAIDs (no tape).
    The resulting files are Quicktime clips encoded to 10 bit linear data.
    Both the F950 and the F900 are 10 bit linear. (I've worked on shots from a Panavision-ized CineAlta like the EPII cameras. Definitely 10bit linear.)
    The big advantage of the 950 is that it doesn't compress color.
    Using a pair of SDI feeds you get true 4:4:4 color data.
    I'm pretty sure this is true of the Lucas Digital version of the F950 as well.
    Theirs is pretty special, not a stock model, but I'm pretty sure it's linear.

    I guess, even though Cineon has a lot of latitude and the log format gives it much more dynamic range than linear 8-bit or 10-bit, I never really considered Cineon to be "HDR".
    Because it's log, you really don't get a lot of detail in the bright end of the data, whereas multi-exposure "HDR" techniques are specifically FOR capturing detail in those areas. Ultimately, Cineon is designed for (and limited to) the range of information that is typically captured in a single exposure of a frame of film.

    Anyway, we're digging deep now, and I'll need to review my notes before posting more detail. I have a pile of information on Cineon from Steve Wright, but haven't worked with it enough to be able to decode it easily from memory.

    More later..... :)

    (The Kaufman book looks like it could be interesting. Is there more information on it anywhere?)

    Have fun.
     
  11. malducin

    malducin Jedi Padawan star 4

    Registered:
    Oct 23, 2001
    Can we go private with a discussion on that?

    I'll email you later or tomorrow morning.

    Because it's log, you really don't get a lot of detail in the bright end of the data, whereas multi-exposure "HDR" techniques are specifically FOR capturing detail in those areas. Ultimately, Cineon is designed for (and limited to) the range of information that is typically captured in a single exposure of a frame of film.

    Yeah, exactly. I guess you could call it marginal HDR. Film captures a greater dynamic range than would be possible to represent in normal 8-bit files. The density variation in a film negative might be 2 or 3 orders of magnitude, and Cineon can capture that. Even if it's a single "exposure", it does have more dynamic range. That's why it's always a mess and you have to worry about the black and white points and set and manage your LUTs.

    I have a pile of information on Cineon from Steve Wright, but haven't worked with it enough to be able to decode it easily from memory.

    Is he at PixelCorps? Or are you referring to his book? The Brinkman book also has some info (since it's more geared towards film compositing), and I have some SIGGRAPH course notes as reference and faded memories from the vfx newsgroup.

    The Kaufman book looks like it could be interesting. Is there more information on it anywhere?

    Yes, they had a leaflet at SIGGRAPH and the lady wanted me to preorder it, but it's out now. From the proof copy that I quickly scanned, it seemed to be good. Of course the book is written from a theoretical perspective, so it won't have complete practical applications for neophytes. Here's some info:

    HIGH DYNAMIC RANGE IMAGING
     
  12. Cryptite_

    Cryptite_ Jedi Youngling star 3

    Registered:
    Jan 1, 2004
    RABBLE RABBLE RABBLE!
     
  13. PadawanNick

    PadawanNick Jedi Grand Master star 4

    Registered:
    Jun 6, 2001
    Yeah, Steve has been part of the PXC for a good part of this year.
    He was on our forums a lot for Q&A on his book (watch for a major revision to come out in Q1 of next year) and he taught a few on-site classes on compositing for us in San Francisco. (last class info page is still up from last month: http://www.padd.com/pxc/classes/classes.php)

    I only worked at the 2 day shoot with Marty for the last class, but I did get to meet Steve in a session he did back in August.
    Really great guy. TONS of knowledge and experience and very willing to share.

    Thanks for the book link.
    Have fun.
     
  14. Jedi2016

    Jedi2016 Jedi Padawan star 4

    Registered:
    Jun 3, 2000
    Don't patronize me, malducin.

    I know damn well what the differences are, probably better than you. I was using the "simple" terms so that the original thread starter would understand. It's obvious he doesn't have much understanding of how these things work, otherwise, he'd never have started the thread. I didn't want to confuse him, why did you?
     
  15. malducin

    malducin Jedi Padawan star 4

    Registered:
    Oct 23, 2001
    Don't patronize me, malducin. I know damn well what the differences are, probably better than you. I was using the "simple" terms so that the original thread starter would understand. It's obvious he doesn't have much understanding of how these things work, otherwise, he'd never have started the thread.

    I'm not patronizing you, and if you understood it that way, I'm sorry.

    But as you say, if the original poster didn't understand much and you were giving a "simple" explanation, why use incorrect terms, or at best a subset of the most correct term? Use the best term possible (GI as opposed to radiosity), lest you risk giving him a wrong idea that will make him more confused later on.

    I didn't want to confuse him, why did you?

    I didn't confuse him, I gave a more detailed answer. It puzzles me that people would be averse to more knowledge around here and in the Fan Films forums. How are we supposed to help people if they don't have even the most basic grasp of the terminology? How can people become more than button-pushers if they don't understand the concepts?
     
  16. Jedi2016

    Jedi2016 Jedi Padawan star 4

    Registered:
    Jun 3, 2000
    All I did was answer his question, his point about how his model didn't look that different using HDRI-based radiosity. I didn't bother spending half an hour telling him how his terminology was wrong, that's all.

    For me, it's easier to understand something at its most basic level, then learn all the little details that actually make it work. Sometimes getting technical just isn't the best way to get a point across.
     
  17. Brandeni

    Brandeni Jedi Padawan star 4

    Registered:
    Aug 15, 2002
    Take it to PMs, please.

    Would HDRI renders go a lot faster with a much smaller-res image?
     
  18. malducin

    malducin Jedi Padawan star 4

    Registered:
    Oct 23, 2001
    Not really (unless your initial image was very big). But you don't want your image to be too small; if it is, you might find the illumination to be splotchy. You want it as big as possible to get good sampling.
     
  19. darthviper107

    darthviper107 Jedi Master star 4

    Registered:
    Jun 26, 2003
    HDRI--Check it out:
    [image=http://www.cgfilms.x10hosting.com/g_grievoushdri.jpg]


    Did that image with HDRI from a model I made a while ago
     
  20. Brandeni

    Brandeni Jedi Padawan star 4

    Registered:
    Aug 15, 2002
    Nice, it looks like you're still working on the Lego thing then?
     
  21. darthviper107

    darthviper107 Jedi Master star 4

    Registered:
    Jun 26, 2003
    Yes, I'm still working on Lego--although I don't think that Grievous will actually be in the movie, or if he is he'll only be in the background.
     