
LIGHTING

  • DiffusionLight: HDRI Light Probes for Free by Painting a Chrome Ball

    Project page: https://diffusionlight.github.io/

    Code: https://github.com/DiffusionLight/DiffusionLight

    License (MIT): https://github.com/DiffusionLight/DiffusionLight?tab=MIT-1-ov-file#readme

    Colab: https://colab.research.google.com/drive/15pC4qb9mEtRYsW3utXkk-jnaeVxUy-0S

    “a simple yet effective technique to estimate lighting in a single input image. Current techniques rely heavily on HDR panorama datasets to train neural networks to regress an input with limited field-of-view to a full environment map. However, these approaches often struggle with real-world, uncontrolled settings due to the limited diversity and size of their datasets. To address this problem, we leverage diffusion models trained on billions of standard images to render a chrome ball into the input image. Despite its simplicity, this task remains challenging: the diffusion models often insert incorrect or inconsistent objects and cannot readily generate images in HDR format. Our research uncovers a surprising relationship between the appearance of chrome balls and the initial diffusion noise map, which we utilize to consistently generate high-quality chrome balls. We further fine-tune an LDR diffusion model (Stable Diffusion XL) with LoRA, enabling it to perform exposure bracketing for HDR light estimation. Our method produces convincing light estimates across diverse settings and demonstrates superior generalization to in-the-wild scenarios.”

     

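    The exposure-bracketing idea described above can be reproduced outside the paper's own pipeline with a simple weighted merge of bracketed LDR renders into a linear HDR radiance map. A minimal sketch follows, assuming three hypothetical chrome-ball exposures saved as 8-bit PNGs with known relative exposure times; this is a generic Debevec-style merge for illustration, not the authors' code:

```python
import numpy as np
import imageio.v3 as iio  # any 8-bit image loader works here

def merge_brackets(paths, exposure_times, gamma=2.2):
    """Merge exposure-bracketed 8-bit LDR images into a linear HDR map."""
    num = None
    den = None
    for path, t in zip(paths, exposure_times):
        ldr = iio.imread(path).astype(np.float32) / 255.0   # 0..1
        lin = ldr ** gamma                                   # rough inverse gamma -> linear
        w = 1.0 - np.abs(ldr - 0.5) * 2.0                    # hat weight: trust mid-tones
        w = np.clip(w, 1e-3, None)
        radiance = lin / t                                   # normalise by exposure time
        num = w * radiance if num is None else num + w * radiance
        den = w if den is None else den + w
    return num / den                                         # linear HDR, float32

# hypothetical bracketed chrome-ball renders, two stops apart
hdr = merge_brackets(
    ["ball_ev-2.png", "ball_ev0.png", "ball_ev+2.png"],
    exposure_times=[0.25, 1.0, 4.0],
)
```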
  • Arto T. – A workflow for creating photorealistic, equirectangular 360° panoramas in ComfyUI using Flux

    https://civitai.com/models/735980/flux-equirectangular-360-panorama

    https://civitai.com/models/745010?modelVersionId=833115

    The trigger phrase is “equirectangular 360 degree panorama”. I would avoid saying “spherical projection” since that tends to result in non-equirectangular spherical images.

    Image resolution should always use a 2:1 aspect ratio. 1024 x 512 and 1408 x 704 work quite well and were used in the training data; 2048 x 1024 also works.

    I suggest using a LoRA weight of 0.5 – 1.5. If the image comes out too flat instead of having the necessary spherical distortion, try increasing the weight above 1, though this can negatively impact small details. For Flux guidance, I recommend a value of about 2.5 for realistic scenes.

    Output is 8-bit only at the moment.
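    The same settings can also be tried outside ComfyUI. Below is a minimal sketch using the Hugging Face diffusers library instead of the linked ComfyUI workflow; the LoRA file path is a placeholder, while the trigger phrase, 2:1 resolution, LoRA weight and Flux guidance value follow the recommendations above:

```python
import torch
from diffusers import FluxPipeline

# Base Flux model (weights require access); the LoRA path is a placeholder
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
).to("cuda")
pipe.load_lora_weights("path/to/flux-equirectangular-360-panorama-lora.safetensors")

prompt = "equirectangular 360 degree panorama, photorealistic coastal village at sunset"

image = pipe(
    prompt,
    width=1408, height=704,                 # 2:1 aspect ratio, as recommended
    guidance_scale=2.5,                     # Flux guidance for realistic scenes
    num_inference_steps=28,
    joint_attention_kwargs={"scale": 1.0},  # LoRA weight in the suggested 0.5-1.5 range
    generator=torch.Generator("cpu").manual_seed(0),
).images[0]

image.save("panorama_equirect.png")         # 8-bit output, matching the note above
```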

  • domeble – Hi-Resolution CGI Backplates and 360° HDRI

    www.domeble.com/

    When collecting HDRIs, make sure the data includes basic capture metadata, such as:

    • ISO
    • Aperture
    • Exposure time (shutter speed)
    • Color temperature
    • Color space
    • Exposure value (what the sensor receives of the sun intensity, in lux)
    • 7+ brackets (with the 5th or 6th being the perceived balanced exposure)
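    As a worked example for the exposure-value and bracketing entries in the list above: EV is log2(N²/t), optionally normalised to ISO 100, and a 7-frame bracket is just a ladder of shutter times centred on the balanced exposure. A small sketch with made-up capture values:

```python
import math

def exposure_value(f_number, shutter_s):
    """Plain EV = log2(N^2 / t); one EV step doubles the light."""
    return math.log2(f_number ** 2 / shutter_s)

def ev100(f_number, shutter_s, iso):
    """EV normalised to ISO 100: EV100 = log2(N^2 / t) - log2(ISO / 100)."""
    return exposure_value(f_number, shutter_s) - math.log2(iso / 100)

def bracket_shutter_times(base_shutter_s, frames=7, step_stops=2):
    """Shutter-time ladder centred on the balanced exposure, step_stops apart."""
    half = frames // 2
    return [base_shutter_s * 2 ** (step_stops * i) for i in range(-half, half + 1)]

# hypothetical metadata for the balanced frame: f/8, 1/125 s, ISO 100
print(exposure_value(8, 1 / 125))        # ~12.97 EV
print(bracket_shutter_times(1 / 125))    # 7 exposures, 2 stops apart
```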

     

    In image processing, computer graphics, and photography, high dynamic range imaging (HDRI, or just HDR) is a set of techniques that allow a greater dynamic range of luminance (a photometric measure of luminous intensity per unit area of light travelling in a given direction; it describes the amount of light that passes through or is emitted from a particular area and falls within a given solid angle) between the lightest and darkest areas of an image than standard digital imaging or photographic methods. This wider dynamic range allows HDR images to represent more accurately the wide range of intensity levels found in real scenes, ranging from direct sunlight to faint starlight and the deepest shadows.

     

    The two main sources of HDR imagery are computer renderings and the merging of multiple lower dynamic range (LDR) or standard dynamic range (SDR) photographs. Tone mapping techniques, which reduce overall contrast to facilitate display of HDR images on devices with lower dynamic range, can be applied to produce images with preserved or exaggerated local contrast for artistic effect.
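    Tone mapping can be as simple as a global curve that compresses high luminance values into the displayable range. A minimal sketch of the classic global Reinhard operator (an illustrative choice, not any particular tool's implementation), applied to a synthetic linear HDR array:

```python
import numpy as np

def reinhard_tonemap(hdr, exposure=1.0, gamma=2.2):
    """Global Reinhard tone mapping: L / (1 + L), followed by display gamma."""
    x = hdr * exposure                 # linear HDR radiance, float32
    x = x / (1.0 + x)                  # compress highlights toward 1.0
    return (x ** (1.0 / gamma) * 255).clip(0, 255).astype(np.uint8)

# synthetic HDR values spanning deep shadow to a bright highlight
hdr = np.array([0.001, 0.18, 4.0, 100.0], dtype=np.float32)
print(reinhard_tonemap(hdr))   # everything fits into 8 bits without hard clipping
```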

     

    In photography, dynamic range is measured in exposure value (EV) differences, or stops, between the brightest and darkest parts of the image that show detail (in photography, exposure value denotes all combinations of camera shutter speed and relative aperture that give the same exposure; the concept was developed in Germany in the 1950s). An increase of one EV, or one stop, represents a doubling of the amount of light.
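    In code, the stop count between the darkest and brightest parts of a scene is simply a base-2 logarithm of their luminance ratio, and each extra stop doubles the light (illustrative numbers only):

```python
import math

def dynamic_range_stops(brightest, darkest):
    """Number of EV steps (stops) between two luminance values."""
    return math.log2(brightest / darkest)

# illustrative luminance values in cd/m^2
print(dynamic_range_stops(10_000.0, 0.1))   # ~16.6 stops of scene dynamic range
print(dynamic_range_stops(2.0, 1.0))        # doubling the light adds exactly one stop
```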

     

    The human response to brightness is well approximated by Stevens's power law, which over a reasonable range is close to logarithmic, as described by the Weber–Fechner law; this is one reason that logarithmic measures of light intensity are often used as well.

     

    HDR is short for High Dynamic Range. It is a term used to describe an image which contains a greater exposure range than the “black” to “white” that 8 or 16-bit integer formats (JPEG, TIFF, PNG) can describe. Whereas these Low Dynamic Range (LDR) images can hold perhaps 8 to 10 f-stops of image information, HDR images can describe beyond 30 stops and are stored in 32-bit images.
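    The practical difference shows up as soon as values above display white appear: an 8-bit integer format has to clip them, while a 32-bit float HDR buffer keeps them for later exposure changes. A tiny numpy illustration with synthetic values, not a comparison of real file formats:

```python
import numpy as np

# linear scene radiance: shadow detail, mid grey, and a highlight 50x over white
radiance = np.array([0.002, 0.18, 50.0], dtype=np.float32)

# LDR path: clamp to 0..1 and quantise to 8 bits -> the highlight is lost at 255
ldr = np.round(np.clip(radiance, 0.0, 1.0) * 255).astype(np.uint8)

# HDR path: keep float32 values as-is -> the highlight survives re-exposure
hdr = radiance.copy()
reexposed = np.clip(hdr / 64.0, 0.0, 1.0)   # "stop down" 6 stops; highlight now visible

print(ldr)        # [  1  46 255]
print(reexposed)  # [~0.00003, ~0.0028, ~0.78]
```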

     

