BREAKING NEWS
LATEST POSTS
-
Lovis Odin ComfyUI-8iPlayer – Seamlessly integrate 8i volumetric videos into your AI workflows
Load holograms, animate cameras, capture frames, and feed them to your favorite AI models. Developed by Lovis Odin for Kartel.ai
You can obtain the MPD URL directly from the official 8i Web Player.
https://github.com/Kartel-ai/ComfyUI-8iPlayer/
-
Thomas Müller nv-tlabs GEN3C – 3D-Informed World-Consistent Video Generation with Precise Camera Control
https://github.com/nv-tlabs/GEN3C
Load a picture, define a camera path in 3D, and then render a photoreal video.
-
AI and the Law – Disney, NBCU sue Midjourney over copyright infringement
https://www.axios.com/2025/06/11/disney-nbcu-midjourney-copyright
Why it matters: It’s the first legal action that major Hollywood studios have taken against a generative AI company.
The complaint, filed in the U.S. District Court for the Central District of California, accuses Midjourney of both direct and secondary copyright infringement: using the studios’ intellectual property to train its generative AI models, and displaying AI-generated images of their copyrighted characters.
-
ComfyRun – A fully open source and self-hosted solution to run your ComfyUI workflows at blazing fast speeds on cloud GPUs
https://github.com/punitda/ComfyRun
Best suited for individuals who want to:
- Run complex workflows in seconds on powerful GPUs like the A10G, A100, and H100
- Experiment with any workflow you find across the web without worrying about breaking your local ComfyUI environment
- Edit workflows on the go
-
Python Windows environment requirements vs apps and custom venv installs
Think of Python like a big toolkit of tools (the interpreter and all its libraries). On Windows, you need to install that toolkit in one place so the operating system knows “Here’s where Python lives.” Once that’s in place, each application can make its own little copy of the toolkit (a venv) to keep its dependencies separate. This setup keeps the system-wide install stable while letting each app pin its own dependency versions.
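As a minimal sketch of that flow on Windows (assuming a standard python.org install that provides the py launcher; the folder name .venv is just a convention):

:: 1. create this app's own private copy of the toolkit
py -m venv .venv
:: 2. point the current shell at the venv's interpreter and pip
.venv\Scripts\activate
:: 3. dependencies now land inside .venv, not in the system-wide install
pip install -r requirements.txt

Deleting the .venv folder later removes that app's dependencies without touching the shared Python install.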
FEATURED POSTS
-
Shutter Speed and Rolling Shutter
https://www.studiobinder.com/blog/what-is-rolling-shutter
Rendering rolling shutter in Arnold
The rolling_shutter parameter simulates the rolling shutter effect seen in footage shot on digital cameras that use CMOS-based sensors, such as Blackmagic, ARRI Alexa, and RED cameras, and even iPhones. It works by rolling (moving) the shutter across the image area during the exposure instead of exposing the entire image area all at the same time.
https://help.autodesk.com/view/ARNOL/ENU/?guid=arnold_user_guide_ac_cameras_html
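As a minimal sketch of enabling this effect through Arnold's Python bindings (assuming an Arnold 6.x-style API; Arnold 7 additionally takes a universe argument in AiNode, so check your version's docs):

from arnold import *

AiBegin()
# a perspective camera whose shutter sweeps across the sensor, CMOS-style
cam = AiNode("persp_camera", "render_cam")
AiNodeSetStr(cam, "rolling_shutter", "top")         # sweep direction: off, top, bottom, left or right
AiNodeSetFlt(cam, "rolling_shutter_duration", 0.3)  # fraction of the shutter interval the sweep takes (0-1)
AiEnd()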
-
Methods for creating motion blur in Stop motion
en.wikipedia.org/wiki/Go_motion
Petroleum jelly
This crude but reasonably effective technique involves smearing petroleum jelly (“Vaseline”) on a plate of glass in front of the camera lens, also known as vaselensing, then cleaning and reapplying it after each shot — a time-consuming process, but one which creates a blur around the model. This technique was used for the endoskeleton in The Terminator, by Jim Danforth to blur the pterodactyl’s wings in Hammer Films’ When Dinosaurs Ruled the Earth, and by Randal William Cook on the terror dogs sequence in Ghostbusters.
Bumping the puppet
Gently bumping or flicking the puppet before taking the frame will produce a slight blur; however, care must be taken that the puppet does not move too much and that one does not bump or move props or set pieces.
Moving the table
Moving the table on which the model is standing while the film is being exposed creates a slight, realistic blur. This technique was developed by Ladislas Starevich: when the characters ran, he moved the set in the opposite direction. This is seen in The Little Parade when the ballerina is chased by the devil. Starevich also used this technique in his films The Eyes of the Dragon, The Magical Clock and The Mascot. Aardman Animations used it for the train chase in The Wrong Trousers and again during the lorry chase in A Close Shave. In both cases the cameras were physically moved during a 1-2 second exposure. The technique was revived for the full-length Wallace & Gromit: The Curse of the Were-Rabbit.
Go motion
The most sophisticated technique, quite different from traditional stop motion, was originally developed for The Empire Strikes Back and used for some shots of the tauntauns; it was later used in films like Dragonslayer. The model is essentially a rod puppet. The rods are attached to motors which are linked to a computer that can record the movements as the model is traditionally animated. When enough movements have been made, the model is reset to its original position, the camera rolls and the model is moved across the table. Because the model is moving during shots, motion blur is created. A variation of go motion was used in E.T. the Extra-Terrestrial to partially animate the children on their bicycles.
-
Scene Referred vs Display Referred color workflows
Display Referred is tied to the target hardware; as such, it bakes display-specific color requirements into every type of media output request.
Scene Referred instead keeps all work in a common, unified wide gamut and targets each audience through CDL and DI libraries.
That way the color information stays untouched and is only “transformed” as and when needed (a small OCIO sketch follows the sources below).
Sources:
– Victor Perez – Color Management Fundamentals & ACES Workflows in Nuke
– https://z-fx.nl/ColorspACES.pdf
– Wicus
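As a minimal sketch of the scene-referred idea using the OCIO v2 Python bindings (the colorspace names below are assumptions that depend on the config in use, e.g. an ACES config):

import PyOpenColorIO as ocio

# load whatever config the OCIO environment variable points to
config = ocio.Config.CreateFromEnv()

# build a transform from the scene-referred working space to a display output
processor = config.getProcessor("ACEScg", "Output - sRGB")
cpu = processor.getDefaultCPUProcessor()

pixel = [0.18, 0.18, 0.18]   # scene-linear mid-grey, untouched working data
print(cpu.applyRGB(pixel))   # display-referred values, produced only at output time

The working-space pixel data is never modified in place; the display transform is applied only when a specific output is requested.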
-
Yasuharu YOSHIZAWA – Comparison of sRGB vs ACEScg in Nuke
Answering the often-asked question, “Do I need to use ACEScg even if the final output is displayed on an sRGB monitor?” (Demonstration shown at an in-house seminar)
Comparison of scanlineRender output with extreme-colored lights on color charts, using sRGB vs ACEScg as the OCIO working space in Nuke.
Download the Nuke script:
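As a rough sketch of how such a comparison could be set up from Nuke's Python API (assuming OCIO color management with an ACES config; the root knob names and colorspace strings are assumptions that vary by Nuke version and config):

import nuke

# switch the project to OCIO color management with an ACES config
nuke.root()["colorManagement"].setValue("OCIO")
nuke.root()["OCIO_config"].setValue("aces_1.2")

# render the chart once in each working space and diff the scanlineRender output
nuke.root()["workingSpaceLUT"].setValue("ACES - ACEScg")
# ...render, then set an sRGB-based working space and render again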