COMPOSITION
DESIGN
- Principles of Interior Design – Balance
  https://www.yankodesign.com/2024/09/18/principles-of-interior-design-balance
  The three types of balance include:
  - Symmetrical Balance
  - Asymmetrical Balance
  - Radial Balance
 
COLOR
- RawTherapee – a free, open source, cross-platform raw image and HDRi processing program
  Version 5.10 of this tool includes excellent tools to clean up CR2 and CR3 files shot on set to support HDRI processing.
  For example, converting raw files to ACEScg 32-bit TIFFs with metadata.
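A minimal sketch of that conversion step, assuming the raw file has already been debayered to a scene-linear TIFF (e.g. by RawTherapee) and that OpenImageIO's Python bindings plus an OCIO config are available; the color-space names and file names below are placeholders, not values from the post:

import OpenImageIO as oiio

def to_acescg_tiff(src_path, dst_path,
                   from_space="lin_rec709",  # assumed name of the debayered linear space in the OCIO config
                   to_space="ACEScg"):       # assumed name of the ACEScg space in the OCIO config
    src = oiio.ImageBuf(src_path)
    dst = oiio.ImageBuf()
    # Color-space conversion through the active OCIO config; the result buffer
    # inherits the source metadata (EXIF, camera attributes).
    if not oiio.ImageBufAlgo.colorconvert(dst, src, from_space, to_space):
        raise RuntimeError(oiio.geterror())
    # Write as a 32-bit float TIFF.
    if not dst.write(dst_path, "float"):
        raise RuntimeError(dst.geterror())

# Example (hypothetical filenames):
# to_acescg_tiff("shot_debayered.tif", "shot_acescg.tif")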
- The Forbidden Colors – Red-Green & Blue-Yellow: The Stunning Colors You Can't See
  www.livescience.com/17948-red-green-blue-yellow-stunning-colors.html
  While the human eye has red-, green-, and blue-sensing cones, those cones are cross-wired in the retina to produce a luminance channel plus a red-green and a blue-yellow channel, and it is data in that opponent color space (known technically as "LAB") that goes to the brain. That is why we cannot perceive a reddish-green or a yellowish-blue, whereas such colors can be represented in the RGB color space used by digital cameras.
  https://en.rockcontent.com/blog/the-use-of-yellow-in-data-design
  The back of the retina is covered in light-sensitive neurons known as cone cells and rod cells. There are three types of cone cells, each sensitive to a different range of light. These ranges overlap, but for convenience the cones are referred to as blue (short-wavelength), green (medium-wavelength), and red (long-wavelength). The rod cells are primarily used in low-light situations, so we will ignore them for now.
  When light enters the eye and hits the cone cells, the cones get excited and send signals to the brain through the visual cortex. Different wavelengths of light excite different combinations of cones to varying levels, which generates our perception of color. The red cones are the most sensitive to light and the blue cones the least, and the sensitivity curves of the green and red cones overlap for most of the visible spectrum.
  Here is how the brain takes the signals of light intensity from the cones and turns them into color information. To see red or green, the brain finds the difference between the levels of excitement of the red and green cones; this is the red-green channel. To get "brightness", the brain combines the excitement of the red and green cones; this creates the luminance, or black-white, channel. To see yellow or blue, the brain then finds the difference between this luminance signal and the excitement of the blue cones; this is the yellow-blue channel.
  From the calculations made in the brain along those three channels, we get four basic colors: blue, green, yellow, and red. Seeing blue is what you experience when short-wavelength light excites the blue cones more than the green and red. Seeing green happens when light excites the green cones more than the red cones. Seeing red happens when only the red cones are excited by long-wavelength light.
  Here is where it gets interesting: seeing yellow is what happens when BOTH the green AND red cones are highly excited near their peak sensitivity. This is the biggest collective excitement your cones ever have, aside from seeing pure white, and it occurs near the peak of the combined sensitivity curves. Further, the lens and cornea of the eye happen to block shorter wavelengths, reducing sensitivity to blue and violet light.
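As a rough illustration of those three channels (a simplified sketch, not a colorimetric model; the unit weights and cone values are made up for the example), the luminance, red-green and blue-yellow signals can be derived from long (L), medium (M) and short (S) cone excitations like this:

# Simplified opponent-process sketch: derive luminance, red-green and
# blue-yellow channels from L ("red"), M ("green") and S ("blue") cone
# excitations. Unit weights are illustrative only.
def opponent_channels(L, M, S):
    luminance = L + M              # brightness: combined red + green excitation
    red_green = L - M              # positive -> reddish, negative -> greenish
    blue_yellow = S - luminance    # positive -> bluish, negative -> yellowish
    return luminance, red_green, blue_yellow

# Strong L and M together with weak S reads as yellow (strongly negative
# blue-yellow) and is close to the largest combined excitation short of white.
print(opponent_channels(L=0.9, M=0.85, S=0.1))  # approx (1.75, 0.05, -1.65)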
LIGHTING
- Light and Matter: The 2018 theory of Physically-Based Rendering and Shading by Allegorithmic
  academy.substance3d.com/courses/the-pbr-guide-part-1
  academy.substance3d.com/courses/the-pbr-guide-part-2
  Local copy:
 
- Simulon – a Hollywood production studio app in the hands of an independent creator with access to consumer hardware, LDRi to HDRi through ML
  Divesh Naidoo: The video below was made with a live in-camera preview and auto-exposure matching, no camera solve, no HDRI capture and no manual compositing setup, using the new Simulon phone app. LDR to HDR through ML.
  https://simulon.typeform.com/betatest
- Composition – cinematography Cheat Sheet
  Where is our eye attracted first? Why? Size. Focus. Lighting. Color.
  Size: Mr. White (Harvey Keitel) on the right.
  Focus: he is one of the two objects in focus.
  Lighting: Mr. White is large and in focus, and Mr. Pink (Steve Buscemi) is highlighted by a shaft of light.
  Color: both are black and white, but the red on Mr. White's shirt now really stands out.
  What type of lighting?
- Magnific.ai Relight – change the entire lighting of a scene
  A new Magnific spell that allows you to change the entire lighting of a scene and, optionally, the background with just:
  1. A prompt, OR
  2. A reference image, OR
  3. A light map (drawing your own lights)
  https://x.com/javilopen/status/1805274155065176489
- Tracing Spherical Harmonics and how Weta used them in production
  A way to approximate complex lighting in ultra-realistic renders. All SH lighting techniques involve replacing parts of standard lighting equations with spherical functions that have been projected into frequency space using the spherical harmonics as a basis.
  http://www.cs.columbia.edu/~cs4162/slides/spherical-harmonic-lighting.pdf
  Spherical harmonics as used at Weta Digital
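A minimal sketch of that projection step, restricted to the first two real SH bands and using Monte Carlo integration over the sphere (an illustration of the idea, not Weta's implementation):

import math, random

def sh_basis_l0_l1(x, y, z):
    # Real spherical-harmonic basis functions for bands 0 and 1 (standard constants).
    return [
        0.282095,      # Y(0, 0)
        0.488603 * y,  # Y(1,-1)
        0.488603 * z,  # Y(1, 0)
        0.488603 * x,  # Y(1, 1)
    ]

def project_sh(light_fn, samples=10000):
    # Estimate the SH coefficients of a spherical function by Monte Carlo:
    # c_i = integral over the sphere of light_fn(w) * Y_i(w) dw.
    coeffs = [0.0, 0.0, 0.0, 0.0]
    for _ in range(samples):
        # Uniform random direction on the unit sphere.
        z = random.uniform(-1.0, 1.0)
        phi = random.uniform(0.0, 2.0 * math.pi)
        r = math.sqrt(max(0.0, 1.0 - z * z))
        x, y = r * math.cos(phi), r * math.sin(phi)
        value = light_fn(x, y, z)
        for i, basis in enumerate(sh_basis_l0_l1(x, y, z)):
            coeffs[i] += value * basis
    return [c * 4.0 * math.pi / samples for c in coeffs]  # sphere area = 4*pi

# Example: a soft "sky" that gets brighter toward +Z.
sky = lambda x, y, z: max(0.0, z)
print(project_sh(sky))  # roughly [0.886, 0, 1.023, 0]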
- Debayer – a free command line tool to convert camera raw images into scene-linear exr
  https://github.com/jedypod/debayer
  The only required dependency is oiiotool; however, other "debayer engines" are also supported.
  - OpenImageIO – oiiotool is used for converting debayered tif images to exr.
  Debayer engines:
  - RawTherapee – powerful raw development software used to decode raw images. High quality, a good selection of debayer algorithms, and more advanced raw processing like chromatic aberration removal.
  - LibRaw – the dcraw_emu command-line utility included with LibRaw. An optional alternative for debayering. Simple, fast and effective.
  - Darktable – uses darktable-cli plus an xmp config to process.
  - vkdt – uses vkdt-cli to debayer. Pretty experimental still. Uses Vulkan for image processing. Stupidly fast. Pretty limited.
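For the oiiotool step described above (converting a debayered tif to a scene-linear exr), a minimal wrapper might look like the following; the color-space names and the half-float output are assumptions, not values taken from the debayer project:

import subprocess

def tif_to_exr(tif_path, exr_path,
               from_space="lin_rec709",  # assumed name of the debayered linear space
               to_space="ACEScg"):       # assumed scene-linear working space
    # Requires oiiotool on PATH and an OCIO config defining the named spaces.
    cmd = [
        "oiiotool", tif_path,
        "--colorconvert", from_space, to_space,  # convert into the working space
        "-d", "half",                            # half-float channels for the exr
        "-o", exr_path,
    ]
    subprocess.run(cmd, check=True)

# tif_to_exr("A001_0001_debayered.tif", "A001_0001.exr")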
 
 