COMPOSITION
- 
SlowMoVideo – How to make a slow motion shot with the open source program
http://slowmovideo.granjow.net/
slowmoVideo is an open-source program that creates slow-motion videos from your footage.
Slow motion cinematography is the result of playing back frames for a longer duration than they were exposed. For example, if you expose 240 frames of film in one second, then play them back at 24 fps, the resulting movie is 10 times longer (slower) than the original filmed event.
Film cameras are relatively simple mechanical devices that allow you to crank up the speed to whatever rate the shutter and pull-down mechanism allow. Some film cameras can operate at 2,500 fps or higher (although film shot in these cameras often needs some readjustment in postproduction). Video, on the other hand, is always captured, recorded, and played back at a fixed rate, with a current limit around 60 fps. This makes extreme slow motion effects harder to achieve (and less elegant) on video, because slowing down the video results in each frame being held still on the screen for a long time, whereas with high-frame-rate film there are plenty of frames to fill the longer durations of time. On video, the slow motion effect is more like a slide show than smooth, continuous motion.
One obvious solution is to shoot film at high speed, then transfer it to video (a case where film still has a clear advantage, sorry George). Another possibility is to cross dissolve or blur from one frame to the next, as sketched below. This adds a smooth transition from one still frame to the next. The blur reduces the sharpness of the image, and compared to slowing down images shot at a high frame rate, this is somewhat of a cheat. However, there isn't much you can do about it until video can be recorded at much higher rates. Of course, many film cameras can't shoot at high frame rates either, so the whole super-slow-motion endeavor is somewhat specialized no matter what medium you are using. (There are some high-speed digital cameras available now that allow you to capture lots of digital frames directly to your computer, so technology is starting to catch up with film. However, this feature isn't going to appear in consumer camcorders any time soon.)
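As a rough illustration of the cross-dissolve idea above, the Python sketch below blends each pair of neighbouring frames with OpenCV to stretch a clip by a fixed factor. The file names and the 4x factor are placeholders, and this is only the simple blending cheat, not the optical-flow interpolation that slowmoVideo itself performs.

```python
# Minimal sketch of the cross-dissolve slow-motion cheat described above: each
# output frame is a weighted blend of two neighbouring source frames.
# "input.mp4", "slow.mp4" and SLOWDOWN are hypothetical placeholders.
import cv2

SLOWDOWN = 4  # 1 source frame -> 4 output frames

cap = cv2.VideoCapture("input.mp4")
fps = cap.get(cv2.CAP_PROP_FPS)
ok, prev = cap.read()
writer = None

while ok:
    ok, nxt = cap.read()
    if not ok:
        break
    if writer is None:
        h, w = prev.shape[:2]
        writer = cv2.VideoWriter("slow.mp4",
                                 cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    for i in range(SLOWDOWN):
        t = i / SLOWDOWN  # blend weight between the two source frames
        writer.write(cv2.addWeighted(prev, 1.0 - t, nxt, t, 0.0))
    prev = nxt

if writer is not None:
    writer.write(prev)  # flush the final source frame
    writer.release()
cap.release()
```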
DESIGN
COLOR
- 
Victor Perez – ACES Color Management in DaVinci Resolve
https://www.youtube.com/watch?v=i--TS88-6xA
- 
Types of Film Lights and their efficiency – CRI, Color Temperature and Luminous Efficacy
nofilmschool.com/types-of-film-lights
"Not every light performs the same way. Lights and lighting are tricky to handle. You have to plan for every circumstance. But the good news is, lighting can be adjusted. Let's look at different factors that affect lighting in every scene you shoot."
Use CRI, luminous efficacy and color temperature controls to match your needs.
Color Temperature
Color temperature describes the "color" of white light from a source, expressed as the temperature, in degrees Kelvin, of a perfect black body that would radiate light of the same hue (see the sketch after this entry).
https://www.pixelsham.com/2019/10/18/color-temperature/
CRI
"The Color Rendering Index is a measurement of how faithfully a light source reveals the colors of whatever it illuminates; it describes the ability of a light source to reveal the color of an object, as compared to the color a natural light source would provide. The highest possible CRI is 100. A CRI of 100 generally refers to a perfect black body, like a tungsten light source or the sun."
https://www.studiobinder.com/blog/what-is-color-rendering-index
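To make the black-body definition of color temperature concrete, the sketch below evaluates Planck's law for two conventional temperatures (a 3200 K tungsten-like source and 5600 K "daylight") and prints their relative spectral power across the visible range; the warmer source comes out relatively stronger at the red end.

```python
# Planck's law: spectral radiance of a perfect black body at temperature T.
# This is what a colour temperature in Kelvin ultimately refers to.
import math

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck(wavelength_nm: float, temp_k: float) -> float:
    """Black-body spectral radiance (arbitrary relative units here)."""
    lam = wavelength_nm * 1e-9
    return (2.0 * H * C**2 / lam**5) / (math.exp(H * C / (lam * KB * temp_k)) - 1.0)

# Relative SPD of a 3200 K tungsten-like source vs 5600 K "daylight":
for temp in (3200.0, 5600.0):
    spd = {nm: planck(nm, temp) for nm in range(380, 781, 100)}
    peak = max(spd.values())
    print(temp, {nm: round(v / peak, 3) for nm, v in spd.items()})
```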
- 
What light is best to illuminate gems for resale
www.palagems.com/gem-lighting2
Artificial light sources, not unlike the diverse phases of natural light, vary considerably in their properties. As a result, some lamps render an object's color better than others do. The most important criterion for assessing the color-rendering ability of any lamp is its spectral power distribution curve.
Natural daylight varies too much in strength and spectral composition to be taken seriously as a lighting standard for grading and dealing colored stones. For anything to be a standard, it must be constant in its properties, which natural light is not. For dealers in particular to make the transition from natural light to an artificial light source, that source must offer:
1- A degree of illuminance at least as strong as the common phases of natural daylight.
2- Spectral properties identical or comparable to a phase of natural daylight.
A source combining these two things makes gems appear much the same as when viewed under a given phase of natural light. From the viewpoint of many dealers, this corresponds to a natural appearance. The 6000 K xenon short-arc lamp appears closest to meeting the criteria for a standard light source. Besides the strong illuminance this lamp affords, its spectrum is very similar to CIE standard illuminants of similar color temperature.
- 
Gamma correction
http://www.normankoren.com/makingfineprints1A.html#Gammabox
https://en.wikipedia.org/wiki/Gamma_correction
http://www.photoscientia.co.uk/Gamma.htm
https://www.w3.org/Graphics/Color/sRGB.html
http://www.eizoglobal.com/library/basics/lcd_display_gamma/index.html
https://forum.reallusion.com/PrintTopic308094.aspx
Basically, gamma is the relationship between the brightness of a pixel as it appears on the screen and the numerical value of that pixel. Generally, gamma is just about defining relationships. Three main types:
– Image gamma, encoded in images
– Display gamma, encoded in hardware and/or applied at viewing time
– System or viewing gamma, which is the net effect of all gammas when you look back at a final image. In theory this should flatten back to 1.0 gamma.
A short sketch of these relationships follows.
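The minimal Python sketch below shows the encode/decode relationship: a simple power-law gamma, plus the official sRGB transfer curve (per the w3.org reference above), which behaves roughly like gamma 2.2 but has a linear toe. Encoding followed by display decoding cancels out to a system gamma near 1.0.

```python
# Pixel-value <-> brightness relationship, normalised to 0..1.
def gamma_encode(linear: float, gamma: float = 2.2) -> float:
    """Linear light -> stored pixel value (image/encoding gamma)."""
    return linear ** (1.0 / gamma)

def gamma_decode(encoded: float, gamma: float = 2.2) -> float:
    """Stored pixel value -> linear light (display gamma)."""
    return encoded ** gamma

def srgb_encode(linear: float) -> float:
    """sRGB transfer curve: linear toe below 0.0031308, power 1/2.4 above."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * (linear ** (1.0 / 2.4)) - 0.055

mid_grey = 0.18
print(gamma_decode(gamma_encode(mid_grey)))  # ~0.18: system gamma flattens to ~1.0
print(srgb_encode(mid_grey))                 # ~0.46: the familiar encoded mid grey
```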
- 
RawTherapee – a free, open source, cross-platform raw image and HDRI processing program
Version 5.10 of this tool includes excellent tools to clean up CR2 and CR3 raw files used on set to support HDRI processing.
It converts raw files to ACEScg 32-bit TIFFs with metadata (a rough sketch of the same conversion follows).
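For illustration only, the sketch below performs a similar raw-to-ACEScg conversion in Python using the third-party rawpy and tifffile libraries rather than RawTherapee itself. The file name is a placeholder, the matrix is the standard CIE XYZ to ACEScg (AP1) matrix, and chromatic adaptation to the ACES white point plus metadata handling are omitted.

```python
# Rough raw -> ACEScg 32-bit float TIFF sketch (not RawTherapee's own pipeline).
import numpy as np
import rawpy
import tifffile

# Standard CIE XYZ -> ACEScg (AP1) matrix; white-point adaptation omitted here.
XYZ_TO_ACESCG = np.array([
    [ 1.6410234, -0.3248033, -0.2364247],
    [-0.6636629,  1.6153316,  0.0167563],
    [ 0.0117219, -0.0082844,  0.9883949],
])

with rawpy.imread("IMG_0001.CR3") as raw:      # hypothetical file name
    xyz = raw.postprocess(
        gamma=(1.0, 1.0),        # keep the data linear
        no_auto_bright=True,     # no automatic exposure changes
        output_bps=16,
        use_camera_wb=True,
        output_color=rawpy.ColorSpace.XYZ,
    ).astype(np.float32) / 65535.0

acescg = xyz @ XYZ_TO_ACESCG.T                 # per-pixel 3x3 matrix multiply
tifffile.imwrite("IMG_0001_acescg.tif", acescg.astype(np.float32))
```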
LIGHTING
- 
What is physically correct lighting all about?
http://gamedev.stackexchange.com/questions/60638/what-is-physically-correct-lighting-all-about
2012-08, Nathan Reed wrote:
Physically-based shading means leaving behind phenomenological models, like the Phong shading model, which are simply built to "look good" subjectively without being based on physics in any real way, and moving to lighting and shading models that are derived from the laws of physics and/or from actual measurements of the real world, and rigorously obey physical constraints such as energy conservation.
For example, in many older rendering systems, shading models included separate controls for specular highlights from point lights and reflection of the environment via a cubemap. You could create a shader with the specular and the reflection set to wildly different values, even though those are both instances of the same physical process. In addition, you could set the specular to any arbitrary brightness, even if it would cause the surface to reflect more energy than it actually received.
In a physically-based system, both the point light specular and the environment reflection would be controlled by the same parameter, and the system would be set up to automatically adjust the brightness of both the specular and diffuse components to maintain overall energy conservation. Moreover you would want to set the specular brightness to a realistic value for the material you're trying to simulate, based on measurements.
Physically-based lighting or shading includes physically-based BRDFs, which are usually based on microfacet theory, and physically correct light transport, which is based on the rendering equation (although heavily approximated in the case of real-time games). It also includes the necessary changes in the art process to make use of these features.
Switching to a physically-based system can cause some upsets for artists. First of all it requires full HDR lighting with a realistic level of brightness for light sources, the sky, etc. and this can take some getting used to for the lighting artists. It also requires texture/material artists to do some things differently (particularly for specular), and they can be frustrated by the apparent loss of control (e.g. locking together the specular highlight and environment reflection as mentioned above; artists will complain about this). They will need some time and guidance to adapt to the physically-based system.
On the plus side, once artists have adapted and gained trust in the physically-based system, they usually end up liking it better, because there are fewer parameters overall (less work for them to tweak). Also, materials created in one lighting environment generally look fine in other lighting environments too. This is unlike more ad-hoc models, where a set of material parameters might look good during daytime, but it comes out ridiculously glowy at night, or something like that.
Here are some resources to look at for physically-based lighting in games:
– SIGGRAPH 2013 Physically Based Shading Course, particularly the background talk by Naty Hoffman at the beginning. You can also check out the previous incarnations of this course for more resources.
– Sébastien Lagarde, Adopting a physically-based shading model and Feeding a physically-based shading model.
And of course, I would be remiss if I didn't mention Physically-Based Rendering by Pharr and Humphreys, an amazing reference on this whole subject and well worth your time, although it focuses on offline rather than real-time rendering.
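To make the energy-conservation point concrete, here is a toy Python sketch in which diffuse and specular are driven by a single reflectance split instead of two independent knobs, so the surface can never return more light than it receives. The function name and parameters are invented for illustration; this is not any particular engine's shading model.

```python
# Toy energy-conservation illustration: one split controls both diffuse and
# specular, so reflected energy can never exceed incoming energy.
def shade(incoming: float, albedo: float, specular_weight: float) -> dict:
    """Split incoming light into diffuse and specular while conserving energy.

    albedo          -- fraction of light the surface reflects at all (0..1)
    specular_weight -- how much of that reflected light is specular (0..1)
    """
    reflected = incoming * min(max(albedo, 0.0), 1.0)
    specular = reflected * specular_weight   # drives BOTH highlight and env reflection
    diffuse = reflected - specular           # whatever remains goes to diffuse
    assert specular + diffuse <= incoming + 1e-6
    return {"diffuse": diffuse, "specular": specular}

# The same specular_weight scales a point-light highlight and a cubemap
# reflection together, which is the "loss of control" artists notice at first.
print(shade(incoming=1.0, albedo=0.8, specular_weight=0.25))
# {'diffuse': 0.6, 'specular': 0.2}
```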
- 
Light and Matter: The 2018 theory of Physically-Based Rendering and Shading by Allegorithmic
academy.substance3d.com/courses/the-pbr-guide-part-1
academy.substance3d.com/courses/the-pbr-guide-part-2
Local copy:
- 
IES Light Profiles and editing software
http://www.derekjenson.com/3d-blog/ies-light-profiles
https://ieslibrary.com/en/browse#ies
https://leomoon.com/store/shaders/ies-lights-pack
https://docs.arnoldrenderer.com/display/a5afmug/ai+photometric+light
IES profiles are useful for creating life-like lighting, as they can represent the physical distribution of light from any light source. The IES format was created by the Illuminating Engineering Society, and most lighting manufacturers provide IES profiles for the lights they manufacture. A minimal parsing sketch follows.
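To show roughly what an IES profile contains, the Python sketch below pulls the angle grids and candela values out of an LM-63 file. It assumes the common TILT=NONE layout and skips units, photometric type and tilt data; a production parser needs to handle far more variation.

```python
# Minimal IES (LM-63) reader sketch, assuming the common TILT=NONE layout.
def read_ies(path: str) -> dict:
    with open(path, "r", errors="ignore") as f:
        lines = f.read().splitlines()

    # Keyword header ends at the TILT= line; photometric data follows it.
    tilt_index = next(i for i, ln in enumerate(lines) if ln.upper().startswith("TILT"))
    values = [float(t) for t in " ".join(lines[tilt_index + 1:]).split()]

    lumens, multiplier = values[1], values[2]
    n_vert, n_horiz = int(values[3]), int(values[4])
    cursor = 13                      # 10 values on the first data line + 3 on the second
    vertical = values[cursor:cursor + n_vert]
    horizontal = values[cursor + n_vert:cursor + n_vert + n_horiz]
    start = cursor + n_vert + n_horiz
    candela = [values[start + h * n_vert: start + (h + 1) * n_vert]
               for h in range(n_horiz)]
    return {
        "lumens_per_lamp": lumens,
        "candela_multiplier": multiplier,
        "vertical_angles": vertical,
        "horizontal_angles": horizontal,
        "candela": candela,          # candela[h][v], to be scaled by the multiplier
    }
```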
- 
Beeble Switchlight's Plugin for Foundry Nuke
https://www.cutout.pro/learn/beeble-switchlight/
https://www.switchlight-api.beeble.ai/pricing
https://www.switchlight-api.beeble.ai
https://github.com/beeble-ai/SwitchLight-Studio
https://beeble.ai/terms-of-use
https://www.switchlight-api.beeble.ai/docs
- 
Tracing Spherical harmonics and how Weta used them in production
A way to approximate complex lighting in ultra realistic renders.
All SH lighting techniques involve replacing parts of standard lighting equations with spherical functions that have been projected into frequency space using the spherical harmonics as a basis (a small projection sketch follows).
http://www.cs.columbia.edu/~cs4162/slides/spherical-harmonic-lighting.pdf
Spherical harmonics as used at Weta Digital
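As a small illustration of that projection step, the Python sketch below Monte-Carlo integrates an arbitrary spherical function against the first nine real SH basis functions (bands 0-2), in the spirit of the Green course notes linked above. The constants are the standard real SH normalisation factors; the example environment function is made up.

```python
# Project a spherical function onto SH bands 0-2 by Monte-Carlo integration.
import math
import random

def sh_basis(x: float, y: float, z: float) -> list:
    """Real spherical harmonics, bands 0-2, for a unit direction (x, y, z)."""
    return [
        0.282095,                        # l=0
        0.488603 * y,                    # l=1, m=-1
        0.488603 * z,                    # l=1, m=0
        0.488603 * x,                    # l=1, m=+1
        1.092548 * x * y,                # l=2, m=-2
        1.092548 * y * z,                # l=2, m=-1
        0.315392 * (3.0 * z * z - 1.0),  # l=2, m=0
        1.092548 * x * z,                # l=2, m=+1
        0.546274 * (x * x - y * y),      # l=2, m=+2
    ]

def project(func, samples: int = 20000) -> list:
    """Estimate the 9 SH coefficients of func(x, y, z) over the sphere."""
    coeffs = [0.0] * 9
    for _ in range(samples):
        # Uniform random direction on the unit sphere.
        z = random.uniform(-1.0, 1.0)
        phi = random.uniform(0.0, 2.0 * math.pi)
        r = math.sqrt(max(0.0, 1.0 - z * z))
        x, y = r * math.cos(phi), r * math.sin(phi)
        f = func(x, y, z)
        for i, b in enumerate(sh_basis(x, y, z)):
            coeffs[i] += f * b
    # Monte-Carlo estimate: scale by the sphere's solid angle over sample count.
    return [c * 4.0 * math.pi / samples for c in coeffs]

# Example: a crude "sky is bright overhead, ground is dark" environment.
sky = project(lambda x, y, z: max(z, 0.0))
print([round(c, 3) for c in sky])
```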