COMPOSITION
- Photography basics: Depth of Field and composition
Depth of field is the range within which focus is resolved in a photo.
Aperture has a huge effect on the depth of field. Changing the f-stop (f/#) of a lens changes the aperture and, with it, the DOF. An f-stop is simply a number that tells you the size of the aperture; that is how the f-stop relates to the aperture (and to DOF). Increasing the f-stop increases the DOF, the area in focus (and decreases the aperture). Conversely, decreasing the f-stop decreases the DOF (and increases the aperture). In the figure, the red cone is an angular representation of the resolution of the system, while the dotted lines indicate the aperture coverage. Where the lines of the two cones intersect defines the total range of the depth of field. The image illustrates why a longer depth of field gives a greater range of clarity.
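As a rough, hedged illustration of the f-stop/DOF relationship described above, here is a minimal Python sketch using the common hyperfocal-distance approximation; the focal length, f-stops, subject distance and circle-of-confusion value are arbitrary example numbers, not taken from the post:

```python
# Minimal sketch: how the f-number (f-stop) affects depth of field.
# Uses the standard hyperfocal-distance approximation; all values are example assumptions.

def depth_of_field(focal_mm, f_number, subject_m, coc_mm=0.03):
    """Return (near_m, far_m, dof_m) for a simple thin-lens camera model."""
    f = focal_mm / 1000.0            # focal length in metres
    c = coc_mm / 1000.0              # circle of confusion in metres
    H = f * f / (f_number * c) + f   # hyperfocal distance
    s = subject_m
    near = H * s / (H + (s - f))
    far = float("inf") if s >= H else H * s / (H - (s - f))
    return near, far, far - near

if __name__ == "__main__":
    # 50 mm lens focused at 3 m: a higher f-number gives a larger area in focus.
    for n in (1.8, 4, 8, 16, 22):
        near, far, dof = depth_of_field(50, n, 3.0)
        far_txt = "inf" if far == float("inf") else f"{far:.2f}"
        print(f"f/{n:<4} near {near:.2f} m  far {far_txt} m  DOF {dof:.2f} m")
```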
- 9 Best Hacks to Make a Cinematic Video with Any Camera
https://www.flexclip.com/learn/cinematic-video.html
- Frame Your Shots to Create Depth
- Create Shallow Depth of Field
- Avoid Shaky Footage and Use Flexible Camera Movements
- Properly Use Slow Motion
- Use Cinematic Lighting Techniques
- Apply Color Grading
- Use Cinematic Music and SFX
- Add Cinematic Fonts and Text Effects
- Create the Cinematic Bar at the Top and the Bottom
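The last hack above, the cinematic bars, is easy to automate. Below is a minimal Python sketch that shells out to FFmpeg and paints 2.39:1 letterbox bars over a 16:9 clip with the drawbox filter; the file names are placeholders and the exact filter expression is an assumption you may need to adapt to your footage:

```python
# Minimal sketch: paint 2.39:1 "cinematic" letterbox bars over a 16:9 clip with FFmpeg.
# Assumes ffmpeg is installed and on PATH; the input/output names are placeholders.
import subprocess

def add_letterbox(src: str, dst: str, aspect: float = 2.39) -> None:
    # Height of each black bar = (frame height - width/aspect) / 2, painted over the image.
    bar_h = f"(ih-iw/{aspect})/2"
    vf = (
        f"drawbox=x=0:y=0:w=iw:h={bar_h}:color=black:t=fill,"
        f"drawbox=x=0:y=ih-{bar_h}:w=iw:h={bar_h}:color=black:t=fill"
    )
    subprocess.run(["ffmpeg", "-y", "-i", src, "-vf", vf, "-c:a", "copy", dst], check=True)

if __name__ == "__main__":
    add_letterbox("input_16x9.mp4", "output_cinematic.mp4")
```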
  
DESIGN
- How to paint boardgame miniatures
Steps:
- soap wash cleaning
- primer
- base-coat layer (black/white)
- detailing
- washing aka shade (could be done after highlighting)
- highlights aka dry brushing (could be done after washing)
- varnish (gloss/satin/matte)
 
- Disco Diffusion V4.1 Google Colab, Dall-E, Starryai – creating images with AI
Disco Diffusion (DD) is a Google Colab notebook which leverages an AI image-generation technique called CLIP-Guided Diffusion to let you create compelling and beautiful images from text inputs alone. Created by Somnai, augmented by Gandamu, and building on the work of RiversHaveWings, nshepperd, and many others.
Phone app: https://www.starryai.com/
docs.google.com/document/d/1l8s7uS2dGqjztYSjPpzlmXLjl5PM3IGkRWI3IiCuK7g
colab.research.google.com/drive/1sHfRn5Y0YKYKi1k-ifUSBFRNJ8_1sa39
Colab, or “Colaboratory”, allows you to write and execute Python in your browser, with:
- Zero configuration required
- Access to GPUs free of charge
- Easy sharing
https://80.lv/articles/a-beautiful-roman-villa-made-with-disco-diffusion-5-2/
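Disco Diffusion itself lives in the Colab notebook linked above, but the general text-to-image workflow it popularized can be sketched in a few lines of Python. The snippet below uses the Hugging Face diffusers library rather than DD's CLIP-guided notebook code, and the model checkpoint and prompt are placeholder assumptions, so treat it as an illustration of the idea rather than the DD pipeline:

```python
# Illustrative text-to-image sketch with Hugging Face diffusers (NOT the Disco Diffusion
# notebook itself). Requires: pip install diffusers transformers torch, plus a GPU.
import torch
from diffusers import StableDiffusionPipeline

model_id = "runwayml/stable-diffusion-v1-5"  # example checkpoint; swap for any diffusion model
pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipe = pipe.to("cuda")  # the free Colab GPUs mentioned above are enough for this

prompt = "a beautiful Roman villa, golden hour, highly detailed matte painting"
image = pipe(prompt, num_inference_steps=30, guidance_scale=7.5).images[0]
image.save("villa.png")
```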
COLOR
- About color: What is a LUT
http://www.lightillusion.com/luts.html
https://www.shutterstock.com/blog/how-use-luts-color-grading
A LUT (Lookup Table) is essentially the modifier between two images, the original image and the displayed image, based on a mathematical formula; in practice, LUTs are conversion matrices of varying complexity. There are different types of LUTs: viewing, transform, calibration, 1D and 3D.
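As a minimal illustration of the idea (my own sketch, not taken from either linked article), a 1D LUT can be built and applied with NumPy: precompute one output value for every possible input code value, then index the image through that table. The gamma value here is an arbitrary example:

```python
# Minimal sketch of a 1D LUT: one table entry per 8-bit code value, applied by indexing.
import numpy as np

def make_gamma_lut(gamma: float = 2.2) -> np.ndarray:
    """Build a 256-entry 1D LUT that applies a simple gamma curve."""
    x = np.arange(256) / 255.0
    return np.clip((x ** (1.0 / gamma)) * 255.0, 0, 255).astype(np.uint8)

def apply_lut(image_u8: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Apply the LUT to every channel of an 8-bit (H, W, 3) image by indexing."""
    return lut[image_u8]

if __name__ == "__main__":
    lut = make_gamma_lut(2.2)
    img = np.random.randint(0, 256, size=(4, 4, 3), dtype=np.uint8)  # stand-in image
    out = apply_lut(img, lut)
    print(img[0, 0], "->", out[0, 0])
```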
- Space bodies’ components and light spectroscopy
www.plutorules.com/page-111-space-rocks.html
Spectroscopy helps us understand the composition of components in/on solar system bodies. Dips in the observed light spectrum, also known as lines of absorption, occur as gasses absorb energy from light at specific points along the spectrum. These dips, or darkened zones (lines of absorption), leave a fingerprint which identifies elements and compounds. In the accompanying image, the dark absorption bands appear as lines of emission, which occur as the result of emitted rather than reflected (absorbed) light.
[Figures: lines of absorption; lines of emission]
- sRGB vs REC709 – An introduction and FFmpeg implementations
1. Basic Comparison
- What they are
- sRGB: A standard “web”/computer-display RGB color space defined by IEC 61966-2-1. It’s used for most monitors, cameras, printers, and the vast majority of images on the Internet.
- Rec. 709: An HD-video color space defined by ITU-R BT.709. It’s the go-to standard for HDTV broadcasts, Blu-ray discs, and professional video pipelines.
 
- Why they exist
- sRGB: Ensures consistent colors across different consumer devices (PCs, phones, webcams).
- Rec. 709: Ensures consistent colors across video production and playback chains (cameras → editing → broadcast → TV).
 
- What you’ll see
- On your desktop or phone, images tagged sRGB will look “right” without extra tweaking.
- On an HDTV or video-editing timeline, footage tagged Rec. 709 will display accurate contrast and hue on broadcast-grade monitors.
 
 
2. Digging Deeper
| Feature | sRGB | Rec. 709 |
| White point | D65 (6504 K), same for both | D65 (6504 K) |
| Primaries (x, y) | R (0.640, 0.330), G (0.300, 0.600), B (0.150, 0.060) | R (0.640, 0.330), G (0.300, 0.600), B (0.150, 0.060) |
| Gamut size | Identical triangle on the CIE 1931 chart | Identical to sRGB |
| Gamma / transfer | Piecewise curve: approximately 2.2 with a linear toe | Pure power law, γ ≈ 2.4 (often approximated as 2.2 in practice) |
| Matrix coefficients | N/A (pure RGB usage) | Y = 0.2126 R + 0.7152 G + 0.0722 B (Rec. 709 matrix) |
| Typical bit depth | 8-bit/channel (with 16-bit variants) | 8-bit/channel (10-bit for professional video) |
| Usage metadata | Tagged as “sRGB” in image files (PNG, JPEG, etc.) | Tagged as “bt709” in video containers (MP4, MOV) |
| Color range | Full-range RGB (0–255) | Studio-range Y′CbCr (Y′ 16–235, Cb/Cr 16–240) |
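To make a few of the table rows concrete, here is a small NumPy sketch (my own illustration, not from the original post) that decodes an sRGB value with the piecewise transfer curve, computes Rec. 709 luma with the 0.2126/0.7152/0.0722 weights, and rescales a full-range value to the 16–235 studio range:

```python
# Minimal sketch of three rows from the table above: sRGB transfer curve,
# Rec. 709 luma weights, and full-range -> studio-range scaling.
import numpy as np

def srgb_to_linear(v: np.ndarray) -> np.ndarray:
    """sRGB piecewise decode: linear toe below 0.04045, power curve above."""
    v = np.asarray(v, dtype=np.float64)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def rec709_luma(rgb: np.ndarray) -> np.ndarray:
    """Y = 0.2126 R + 0.7152 G + 0.0722 B (applied to nonlinear R'G'B' in video practice)."""
    return rgb @ np.array([0.2126, 0.7152, 0.0722])

def full_to_studio_luma(y: np.ndarray) -> np.ndarray:
    """Map full-range [0, 1] luma to the 8-bit studio range Y' in [16, 235]."""
    return 16.0 + y * (235.0 - 16.0)

if __name__ == "__main__":
    pixel = np.array([0.5, 0.25, 0.75])          # example nonlinear sRGB pixel in [0, 1]
    print("linear:", srgb_to_linear(pixel))
    y = rec709_luma(pixel)
    print("Rec. 709 luma:", y, "-> studio-range code:", full_to_studio_luma(y))
```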
Why the Small Differences Matter (more…)
- If a blind person gained sight, could they recognize objects previously touched?
Blind people who regain their sight may find themselves in a world they don’t immediately comprehend. “It would be more like a sighted person trying to rely on tactile information,” Moore says. Learning to see is a developmental process, just like learning language, Prof. Cathleen Moore continues. “As far as vision goes, a three-and-a-half-year-old child is already a well-calibrated system.”
- VES Cinematic Color – Motion-Picture Color Management
This paper presents an introduction to the color pipelines behind modern feature-film visual effects and animation. Authored by Jeremy Selan, and reviewed by the members of the VES Technology Committee, including Rob Bredow, Dan Candela, Nick Cannon, Paul Debevec, Ray Feeney, Andy Hendrickson, Gautham Krishnamurti, Sam Richards, Jordan Soles, and Sebastian Sylwan.
LIGHTING
- Simulon – a Hollywood production studio app in the hands of an independent creator with access to consumer hardware, LDRi to HDRi through ML
Divesh Naidoo: The video below was made with a live in-camera preview and auto-exposure matching, no camera solve, no HDRI capture and no manual compositing setup, using the new Simulon phone app.
LDR to HDR through ML
https://simulon.typeform.com/betatest
(more…) Process example
- Custom bokeh in a raytraced DOF render
To achieve a custom pinhole-camera effect with a custom bokeh in the Arnold raytracer, you can follow these steps:
- Set the render camera with a focal length around 50 mm (or as needed).
- Set the F-Stop to a high value (e.g., 22).
- Set the focus distance as required.
- Turn on DOF.
- Place a plane a few centimeters in front of the camera.
- Texture the plane with a transparent shape at its center (transmission with no specular roughness).
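The reason this trick works is that out-of-focus highlights take on the shape of whatever aperture the light passes through. Below is a small 2D post-process simulation of that idea in Python (my own illustration, not the Arnold setup above): bright point highlights convolved with a triangular “aperture” kernel come out as triangular bokeh.

```python
# Post-process illustration of custom bokeh: defocused highlights take the aperture's shape.
# This only simulates the idea in 2D; it is not the in-render Arnold setup described above.
import numpy as np
from scipy.signal import fftconvolve

def triangle_aperture(size: int = 31) -> np.ndarray:
    """Normalized binary mask of a rough triangular 'aperture', used as the defocus kernel."""
    y, x = np.mgrid[0:size, 0:size]
    mask = (y >= size // 4) & (x >= y / 2) & (x <= size - 1 - y / 2)
    k = mask.astype(np.float64)
    return k / k.sum()

if __name__ == "__main__":
    img = np.zeros((256, 256))
    img[64, 64] = img[128, 200] = img[200, 90] = 1.0   # a few bright point highlights
    bokeh = fftconvolve(img, triangle_aperture(), mode="same")
    print("each highlight now spreads over", int((bokeh > 1e-6).sum()) // 3, "pixels")
```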
 



























