COMPOSITION
- HuggingFace ai-comic-factory – a FREE AI Comic Book Creator
  https://huggingface.co/spaces/jbilcke-hf/ai-comic-factory
  The epic story of a group of talented digital artists trying to overcome daily technical challenges to achieve incredibly photorealistic projects of monsters and aliens.
DESIGN
COLOR
- OpenColorIO standard
  https://www.provideocoalition.com/color-management-part-11-introducing-opencolorio/
  OpenColorIO (OCIO) is an open source project from Sony Imageworks. Based on development started in 2003, OCIO enables color transforms and image display to be handled in a consistent manner across multiple graphics applications. Unlike other color management solutions, OCIO is geared towards motion-picture post production, with an emphasis on visual effects and animation color pipelines.
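  A minimal sketch of what "consistent across applications" looks like in practice, using OCIO's Python bindings; the config filename and the color-space names are assumptions that depend on the config you actually load:

    import PyOpenColorIO as OCIO

    # Load a shared config; every application pointing at the same file
    # agrees on the color transforms it defines.
    config = OCIO.Config.CreateFromFile("config.ocio")  # hypothetical path

    # Build a processor between two color spaces defined in that config.
    processor = config.getProcessor("ACEScg", "sRGB")   # names vary per config
    cpu = processor.getDefaultCPUProcessor()

    # Transform a single linear RGB value (e.g. 18% gray).
    print(cpu.applyRGB([0.18, 0.18, 0.18]))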
- Tim Kang – calibrated white light values in sRGB color space
  Calibrated RGB white values per source color temperature, in four display encodings:

  Source    8-bit sRGB      8-bit Rec709 G2.4   8-bit Display P3   10-bit Rec2020 PQ (100 nits)
  2000K     255 139  22     255 145  34         255 154  63        520 435 273
  2700K     255 172  89     255 177  97         255 185 109        520 466 358
  3000K     255 184 109     255 187 117         255 195 127        520 475 384
  3200K     255 190 122     255 193 129         255 201 138        520 480 399
  4000K     255 211 165     255 214 170         255 219 176        520 495 446
  4300K     255 219 178     255 221 182         255 225 187        520 500 458
  D50       255 235 205     255 236 208         255 239 212        520 510 482
  D55       255 243 224     255 243 226         255 245 228        520 514 497
  D5600     255 244 227     255 245 229         255 246 231        520 514 500
  D6000     255 249 240     255 250 241         255 251 242        520 517 509
  D65       255 255 255     255 255 255         255 255 255        520 520 520
  D10000    202 221 255     204 222 255         208 223 255        479 489 520
  D20000    166 196 255     170 199 255         175 199 255        448 464 520
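  The tabulated values also work as a quick lookup in code. A minimal sketch (not from the source) that clamps and linearly interpolates the 8-bit sRGB column for arbitrary Kelvin temperatures:

    import bisect

    # 8-bit sRGB-encoded whites from the table above (Kelvin rows only).
    SRGB_WHITES = {
        2000: (255, 139, 22),
        2700: (255, 172, 89),
        3000: (255, 184, 109),
        3200: (255, 190, 122),
        4000: (255, 211, 165),
        4300: (255, 219, 178),
    }

    def srgb_white(kelvin):
        """Piecewise-linear interpolation between tabulated white points."""
        ks = sorted(SRGB_WHITES)
        kelvin = max(ks[0], min(ks[-1], kelvin))  # clamp to the table range
        hi = bisect.bisect_left(ks, kelvin)
        if ks[hi] == kelvin:
            return SRGB_WHITES[ks[hi]]
        lo = hi - 1
        t = (kelvin - ks[lo]) / (ks[hi] - ks[lo])
        return tuple(round(a + t * (b - a))
                     for a, b in zip(SRGB_WHITES[ks[lo]], SRGB_WHITES[ks[hi]]))

    print(srgb_white(3100))  # -> (255, 187, 116)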
- If a blind person gained sight, could they recognize objects previously touched?
  Blind people who regain their sight may find themselves in a world they don't immediately comprehend. Learning to see is a developmental process, just like learning language, says Prof Cathleen Moore: "It would be more like a sighted person trying to rely on tactile information. As far as vision goes, a three-and-a-half-year-old child is already a well-calibrated system."
- Photography Basics: Spectral Sensitivity Estimation Without a Camera
  https://color-lab-eilat.github.io/Spectral-sensitivity-estimation-web/
  A number of problems in computer vision and related fields would be mitigated if camera spectral sensitivities were known. As consumer cameras are not designed for high-precision visual tasks, manufacturers do not disclose spectral sensitivities. Their estimation requires a costly optical setup, which triggered researchers to come up with numerous indirect methods that aim to lower cost and complexity by using color targets. However, the use of color targets gives rise to new complications that make the estimation more difficult, and consequently, there currently exists no simple, low-cost, robust go-to method for spectral sensitivity estimation that non-specialized research labs can adopt. Furthermore, even if not limited by hardware or cost, researchers frequently work with imagery from multiple cameras that they do not have in their possession.
  To provide a practical solution to this problem, we propose a framework for spectral sensitivity estimation that not only does not require any hardware (including a color target), but also does not require physical access to the camera itself. Similar to other work, we formulate an optimization problem that minimizes a two-term objective function: a camera-specific term from a system of equations, and a universal term that bounds the solution space. Different than other work, we utilize publicly available high-quality calibration data to construct both terms. We use the colorimetric mapping matrices provided by the Adobe DNG Converter to formulate the camera-specific system of equations, and constrain the solutions using an autoencoder trained on a database of ground-truth curves. On average, we achieve reconstruction errors as low as those that can arise due to manufacturing imperfections between two copies of the same camera. We provide predicted sensitivities for more than 1,000 cameras that the Adobe DNG Converter currently supports, and discuss which tasks can become trivial when camera responses are available.
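  The two-term objective is straightforward to prototype. A minimal sketch with numpy, where the camera-specific system (A, b) is a random placeholder for the equations built from the Adobe DNG colorimetric matrices, and a second-difference smoothness prior stands in for the paper's trained autoencoder term:

    import numpy as np

    n = 33                       # spectral samples, e.g. 400-720nm in 10nm steps
    rng = np.random.default_rng(0)
    A = rng.normal(size=(9, n))  # placeholder camera-specific system A s = b
    b = rng.normal(size=9)

    # Universal term: penalize curvature so solutions stay plausibly smooth.
    D = np.diff(np.eye(n), n=2, axis=0)
    lam = 0.1

    # min ||A s - b||^2 + lam ||D s||^2 solved via the normal equations.
    s = np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ b)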
- Stefan Ringelschwandtner – LUT Inspector tool
  https://mononodes.com/lut-inspector/
  It lets you load any .cube LUT right in your browser, see the RGB curves, and use a split view on the Granger Test Image to compare the original vs. LUT-applied version in real time — perfect for spotting hue shifts, saturation changes, and contrast tweaks.
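  For reference, the .cube files the tool inspects are plain text; a minimal reader sketch (handles only the LUT_3D_SIZE keyword plus data rows, ignoring DOMAIN_MIN/MAX, titles, and 1D LUTs):

    def read_cube(path):
        """Minimal .cube reader: returns (size, list of (r, g, b) floats)."""
        size, table = None, []
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#"):
                    continue                      # skip comments and blanks
                if line.upper().startswith("LUT_3D_SIZE"):
                    size = int(line.split()[1])
                elif line[0].isdigit() or line[0] in "+-.":
                    table.append(tuple(float(v) for v in line.split()))
        if size is None or len(table) != size ** 3:
            raise ValueError("unsupported or truncated LUT")
        return size, table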
- What causes color
  www.webexhibits.org/causesofcolor/5.html
  Water itself has an intrinsic blue color that is a result of its molecular structure and its behavior.
- SecretWeapons MixBox – a practical library for paint-like digital color mixing
  Internally, Mixbox treats colors as real-life pigments, using the Kubelka & Munk theory to predict realistic color behavior.
  https://scrtwpns.com/mixbox/painter/
  https://scrtwpns.com/mixbox.pdf
  https://github.com/scrtwpns/mixbox
  https://scrtwpns.com/mixbox/docs/
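  The library ships bindings for several languages; in Python (pip install mixbox) a pigment-style mix is one call. A short usage sketch; check the project docs above for the exact signatures in your version:

    import mixbox

    yellow = (254, 236, 0)
    blue = (25, 0, 89)

    # Kubelka-Munk-style mixing: yellow + blue gives green, where a naive
    # per-channel RGB lerp would drift toward a desaturated gray.
    print(mixbox.lerp(yellow, blue, 0.5))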
- Is a MacBeth Colour Rendition Chart the Safest Way to Calibrate a Camera?
  www.colour-science.org/posts/the-colorchecker-considered-mostly-harmless/
  "Unless you have all the relevant spectral measurements, a colour rendition chart should not be used to perform colour-correction of camera imagery but only for white balancing and relative exposure adjustments."
  "Using a colour rendition chart for colour-correction might dramatically increase error if the scene light source spectrum is different from the illuminant used to compute the colour rendition chart's reference values."
  "Other factors make using a colour rendition chart unsuitable for camera calibration:
  – Uncontrolled geometry of the colour rendition chart with the incident illumination and the camera.
  – Unknown sample reflectances and ageing, as the colour of the samples varies with time.
  – Low samples count.
  – Camera noise and flare.
  – Etc."
  "Those issues are well understood in the VFX industry, and when receiving plates, we almost exclusively use colour rendition charts to white balance and perform relative exposure adjustments, i.e. plate neutralisation." A sketch of that neutralisation step follows below.
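  A minimal sketch of plate neutralisation, assuming a linear-light plate and a sampled mean RGB of the chart's neutral gray patch (both hypothetical inputs):

    import numpy as np

    def neutralize(plate, gray_patch, target=0.18):
        """White balance + relative exposure: per-channel gains that put the
        sampled gray patch at neutral 18% gray. plate is HxWx3, linear light."""
        gains = target / np.asarray(gray_patch, dtype=np.float64)
        return plate * gains

    # usage: balanced = neutralize(plate, gray_patch=[0.21, 0.17, 0.15])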
LIGHTING
- About green screens
  hackaday.com/2015/02/07/how-green-screen-worked-before-computers/
  www.newtek.com/blog/tips/best-green-screen-materials/
  www.chromawall.com/blog//chroma-key-green
  Chroma Key Green, the color of green screens, is also known as Chroma Green and is valued at approximately 354C in the Pantone color matching system (PMS). Chroma Green can be broken down in many different ways. Here is green screen green as other values useful for both physical and digital production:
  RGB: 0, 177, 64
  CMYK: 81, 0, 92, 0
  Hex: #00b140
  Websafe: #009933
  Chroma Key Green is reasonably close to an 18% gray reflectance. Illuminate your green screen with a uniform source with less than 2/3 EV variation; the sketch after this entry checks exactly this. The level of brightness at any given f-stop should be equivalent to a 90% white card under the same lighting.
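  A sketch of that 2/3 EV uniformity check, assuming a linear-light plate and a boolean mask selecting the screen area (both hypothetical inputs):

    import numpy as np

    def ev_spread(plate, mask):
        """Max-min exposure variation, in stops, over the masked screen area."""
        g = plate[..., 1][mask]                  # green channel drives the key
        ev = np.log2(np.clip(g, 1e-6, None))
        return float(ev.max() - ev.min())

    # usage: evenly_lit = ev_spread(plate, mask) < 2.0 / 3.0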
- Rendering – BRDF – Bidirectional Reflectance Distribution Function
  http://en.wikipedia.org/wiki/Bidirectional_reflectance_distribution_function
  The bidirectional reflectance distribution function is a four-dimensional function that defines how light is reflected at an opaque surface.
  http://www.cs.ucla.edu/~zhu/tutorial/An_Introduction_to_BRDF-Based_Lighting.pdf
  In general, when light interacts with matter, a complicated light-matter dynamic occurs. This interaction depends on the physical characteristics of the light as well as the physical composition and characteristics of the matter. That is, some of the incident light is reflected, some of the light is transmitted, and another portion of the light is absorbed by the medium itself. A BRDF describes how much light is reflected when light makes contact with a certain material. Similarly, a BTDF (Bi-directional Transmission Distribution Function) describes how much light is transmitted when light makes contact with a certain material.
  http://www.cs.princeton.edu/~smr/cs348c-97/surveypaper.html
  It is difficult to establish exactly how far one should go in elaborating the surface model. A truly complete representation of the reflective behavior of a surface might take into account such phenomena as polarization, scattering, fluorescence, and phosphorescence, all of which might vary with position on the surface. Therefore, the variables in this complete function would be:
  – incoming and outgoing angle
  – incoming and outgoing wavelength
  – incoming and outgoing polarization (both linear and circular)
  – incoming and outgoing position (which might differ due to subsurface scattering)
  – time delay between the incoming and outgoing light ray
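  A minimal sketch of evaluating the simplest BRDF, a Lambertian surface (constant albedo/pi) under a single directional light, to make the "how much light is reflected" idea concrete:

    import numpy as np

    def lambert_brdf(albedo):
        """Lambertian BRDF is direction-independent: albedo / pi."""
        return np.asarray(albedo) / np.pi

    def shade(normal, light_dir, light_rgb, albedo):
        """Outgoing radiance = BRDF * incoming radiance * cos(theta_i)."""
        n = np.asarray(normal) / np.linalg.norm(normal)
        l = np.asarray(light_dir) / np.linalg.norm(light_dir)
        cos_theta = max(float(n @ l), 0.0)
        return lambert_brdf(albedo) * np.asarray(light_rgb) * cos_theta

    # usage: shade([0, 0, 1], [0, 1, 1], [1.0, 1.0, 1.0], [0.5, 0.4, 0.3])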
- 3D Lighting Tutorial by Amaan Kram
  http://www.amaanakram.com/lightingT/part1.htm
  The goals of lighting in 3D computer graphics are more or less the same as those of real-world lighting. Lighting serves the basic function of bringing out, or pushing back, the shapes of objects visible from the camera's view. It gives a two-dimensional image on the monitor an illusion of the third dimension: depth. But it does not just stop there. It gives an image its personality, its character. A scene lit in different ways can give a feeling of happiness, of sorrow, of fear, etc., and it can do so in dramatic or subtle ways. Along with personality and character, lighting fills a scene with emotion that is directly transmitted to the viewer. Trying to simulate a real environment in an artificial one can be a daunting task. But even if you make your 3D rendering look absolutely photo-realistic, it doesn't guarantee that the image carries enough emotion to elicit a "wow" from the people viewing it. Making 3D renderings photo-realistic can be hard; putting deep emotions in them can be even harder. However, if you plan out your lighting strategy for the mood and emotion that you want your rendering to express, you make the process easier for yourself. Each light source can be broken down into four distinct components and analyzed accordingly (see the sketch below):
  – Intensity
  – Direction
  – Color
  – Size
  The overall thrust of this writing is to produce photo-realistic images by applying good lighting techniques.
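  Those four components map directly onto how most renderers parameterize a light. A hypothetical descriptor, just to make the breakdown concrete:

    from dataclasses import dataclass

    @dataclass
    class Light:
        intensity: float      # overall brightness of the source
        direction: tuple      # unit vector from the light toward the subject
        color: tuple          # RGB tint, e.g. (1.0, 0.9, 0.8) for warm light
        size: float           # emitter area: larger sources cast softer shadows

    key = Light(intensity=2.0, direction=(0.5, -0.7, -0.5),
                color=(1.0, 0.9, 0.8), size=0.5)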