LATEST POSTS
-
memerwala_londa – Ghibli-like Midjourney and Kling video
https://www.reddit.com/r/midjourney/comments/1lbblfq/ghibli_style_game_guide_included/
Made everything on Edits App
Image Generation on Midjourney
Video Generation on Kling 2.1. I used a joystick PNG to add buttons, then some ASMR video sounds to make it look more lively, and I used text as buttons.
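For anyone who wants to reproduce the HUD compositing step outside the Edits app, here is a minimal sketch using Python and ffmpeg's overlay filter. The file names are placeholders and this is not the author's actual workflow, just one way to layer a transparent joystick PNG over an AI-generated clip.

import subprocess

# Hypothetical file names; the original post did this step in the Edits app.
clip = "kling_shot_01.mp4"   # AI-generated clip from Kling 2.1
hud = "joystick.png"         # transparent joystick / button PNG

# Composite the PNG in the lower-left corner with a 40 px margin,
# keeping the original audio track untouched.
subprocess.run([
    "ffmpeg", "-y",
    "-i", clip,
    "-i", hud,
    "-filter_complex", "[0:v][1:v]overlay=40:main_h-overlay_h-40",
    "-c:a", "copy",
    "kling_shot_01_hud.mp4",
], check=True)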
Prompts:
All prompts are in order, just like in the video.
First-person POV video game screenshot, playing as a young anime protagonist in a slightly oversized white t-shirt and knee-length blue shorts. Visible hands pushing open a sun-faded wooden door, forearms resting on the frame. In a dusty hallway mirror reflection: character’s soft Ghibli-style face with windblown hair. Inside a cozy coastal cottage: slanted sunlight through lace curtains, pastel walls with watercolor seascapes, overstuffed bookshelf spilling seashells. Foreground: ‘E: Rest’ prompt over a quilted sofa. Background: steaming teacup on a driftwood table, open window revealing distant lighthouse and Miyazaki fluffy clouds. Soft painterly textures, slight fisheye lens, identical HUD (minimap corner, health bar)
First-person POV video game screenshot, playing as a young anime protagonist in a slightly oversized white t-shirt and knee-length blue shorts. View includes visible hands gripping a steering wheel, sunlit arms resting on car door, and rearview mirror showing character’s soft Ghibli-style face with windblown hair. Driving through a vibrant coastal town: cobblestone streets, pastel houses with flower boxes, distant lighthouse. Soft painterly textures, Miyazaki skies with fluffy clouds, slight fisheye lens effect, HUD elements (minimap corner, health bar).
First-person POV video game screenshot, playing as a young protagonist in a loose white t-shirt and faded denim shorts. Visible arms holding a woven basket, sneakers stepping on rain-damp cobblestones. Walking through a chaotic Ghibli street market: cramped stalls selling glowing mushrooms, floating lanterns, and spiral-cut fruits. Fishmonger shouts while soot sprites dart between crates. Foreground: vendor handing you a peach (interactive ‘E’ prompt). Background: yakuza thugs lurking near a steaming noodle cart. Soft painterly lighting, depth of field, subtle HUD (minimap corner, health bar). Studio Ghibli meets Grand Theft Auto
First-person POV video game screenshot, playing as a young anime protagonist in a slightly oversized white t-shirt (salt-stained sleeves) and knee-length blue shorts, visible hands gripping a bamboo fishing rod. Kneeling on a mossy dock pier at sunset, arms resting on knees. Foreground: ‘E: Reel In’ prompt as line pulls taut. Background: pastel fishing boats, distant lighthouse under Miyazaki’s fluffy clouds. Glowing koi fish breaching turquoise water, soot sprites stealing bait from a tin. Identical soft painterly textures, fisheye lens effect, HUD (minimap corner, health bar).
Video Prompts:
All prompts are in order, just like in the video.
The black-haired boy strides from the rustic house toward the ocean, the camera tracking his movement in a GTA-style third-person perspective as coastal winds flutter white curtains and sunlight glimmers on distant sailboats, blending warm interior details with expanding seaside horizons under a tranquil sky.
The brown-haired boy drives a vintage blue convertible along the coastal cobblestone street, colorful flower-adorned buildings passing by as the camera follows the car’s journey toward the sunlit ocean horizon, sea breeze gently tousling his hair under a serene sky.
The young boy navigates the bustling cobblestone market, basket of oranges in arm, as vibrant stalls and fluttering awnings frame his journey, the camera tracking his focused stride through chattering crowds under swaying traditional lanterns.
A school of fish swims gracefully through crystal-clear water, sunlight filtering through the surface, coral reefs swaying gently, creating a serene underwater scene with the camera stationary.
-
Free or Open Source VFX Asset Management Systems
There are several free or open-source VFX asset management systems that can be used in production environments. They vary in scope, from lightweight trackers to full-fledged pipeline frameworks. Below is a breakdown of the most notable ones and what makes them stand out.
1. Free & Open-Source VFX Asset Management Systems
1.1 OpenPype (formerly Pype)
License: Open source (Apache 2.0)
– Asset management and project structure setup
– Integrates with Maya, Houdini, Nuke, Blender, and others
– Includes publishing, versioning, and task tracking
– Web interface (OpenPype Studio) for overview and management
Strengths: Actively developed, modular and extendable, production-proven in real studios
URL: https://openpype.io/
1.2 Kitsu
License: GNU GPL v3
– Production tracking, shot management
– Web-based interface with intuitive UX
– Built-in review and feedback system
– API for integration into pipelines
Strengths: Great for team collaboration, focuses on communication between departments
URL: https://www.cg-wire.com/kitsu
https://github.com/cgwire/kitsu
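As a rough illustration of that API, Kitsu is usually scripted through its official Python client, gazu. The sketch below assumes a reachable Kitsu instance; the host URL and credentials are placeholders.

import gazu  # pip install gazu -- the official Kitsu Python client

# Placeholder host and credentials for an assumed Kitsu instance.
gazu.set_host("https://kitsu.example.com/api")
gazu.log_in("artist@example.com", "password")

# Walk every open production and list its assets.
for project in gazu.project.all_open_projects():
    print("Project:", project["name"])
    for asset in gazu.asset.all_assets_for_project(project):
        print("  Asset:", asset["name"])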
1.3 ftrack Community Edition
License: Proprietary (older versions may be available for small studios/educational users)
– Project management, review, and pipeline integration
– Strength: Industry-proven
Note: Current versions are commercial; older community editions may still be used.
1.4 Tactic
License: Open source (EPL 1.0)
– General-purpose asset and workflow management
– Web-based, highly configurable
Strengths: Adaptable to VFX pipelines, powerful templating/scripting
Drawbacks: Steep learning curve, not VFX-specific out of the box
URL: https://www.southpawtech.com/
2. Most Powerful Open Source Option
Best Overall: OpenPype
Why:
– Specifically built for VFX and animation pipelines
– Extensively integrates with key DCCs
– Actively maintained with a large community
– Includes both asset and task management
– Works out-of-the-box but is customizable
-
Tencent Hunyuan3D 2.1 goes Open Source and adds MV (Multi-view) and MV Mini
https://huggingface.co/tencent/Hunyuan3D-2mv
https://huggingface.co/tencent/Hunyuan3D-2mini
https://github.com/Tencent/Hunyuan3D-2
Tencent just made Hunyuan3D 2.1 open-source.
This is the first fully open-source, production-ready PBR 3D generative model with cinema-grade quality.
https://github.com/Tencent-Hunyuan/Hunyuan3D-2.1
What makes it special?
• Advanced PBR material synthesis brings realistic materials like leather, bronze, and more to life with stunning light interactions.
• Complete access to model weights, training/inference code, data pipelines.
• Optimized to run on accessible hardware.
• Built for real-world applications with professional-grade output quality.
They’re making it accessible to everyone:
• Complete open-source ecosystem with full documentation.
• Ready-to-use model weights and training infrastructure.
• Live demo available for instant testing.
• Comprehensive GitHub repository with implementation details.
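As a quick orientation, the shape-generation pipeline can be driven from Python roughly as shown in the Hunyuan3D-2 README. This is a sketch based on that repository; module and class names may differ slightly in the 2.1 codebase, and the input image path is a placeholder.

# Sketch based on the Hunyuan3D-2 README (names may differ in 2.1).
from hy3dgen.shapegen import Hunyuan3DDiTFlowMatchingPipeline

# Download the weights from Hugging Face and build the image-to-shape pipeline.
pipeline = Hunyuan3DDiTFlowMatchingPipeline.from_pretrained("tencent/Hunyuan3D-2")

# "input.png" is a placeholder for your own reference image.
mesh = pipeline(image="input.png")[0]
mesh.export("output.glb")  # the result is a trimesh object
-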
Lovis Odin ComfyUI-8iPlayer – Seamlessly integrate 8i volumetric videos into your AI workflows
Load holograms, animate cameras, capture frames, and feed them to your favorite AI models. Developed by Lovis Odin for Kartel.ai
You can obtain the MPD URL directly from the official 8i Web Player.
https://github.com/Kartel-ai/ComfyUI-8iPlayer/
-
Thomas Müller nv-tlabs GEN3C – 3D-Informed World-Consistent Video Generation with Precise Camera Control
https://github.com/nv-tlabs/GEN3C
Load a picture, define a camera path in 3D, and then render a photoreal video.
-
AI and the Law – Disney, NBCU sue Midjourney over copyright infringement
https://www.axios.com/2025/06/11/disney-nbcu-midjourney-copyright
Why it matters: It’s the first legal action that major Hollywood studios have taken against a generative AI company.
The complaint, filed in the U.S. District Court for the Central District of California, accuses Midjourney of both direct and secondary copyright infringement: using the studios’ intellectual property to train its image-generation model, and displaying AI-generated images of their copyrighted characters.
FEATURED POSTS
-
Methods for creating motion blur in Stop motion
en.wikipedia.org/wiki/Go_motion
Petroleum jelly
This crude but reasonably effective technique involves smearing petroleum jelly (“Vaseline”) on a plate of glass in front of the camera lens, also known as vaselensing, then cleaning and reapplying it after each shot — a time-consuming process, but one which creates a blur around the model. This technique was used for the endoskeleton in The Terminator. This process was also employed by Jim Danforth to blur the pterodactyl’s wings in Hammer Films’ When Dinosaurs Ruled the Earth, and by Randall William Cook on the terror dogs sequence in Ghostbusters.
Bumping the puppet
Gently bumping or flicking the puppet before taking the frame will produce a slight blur; however, care must be taken that the puppet does not move too much and that props or set pieces are not bumped or moved.
Moving the table
Moving the table on which the model is standing while the film is being exposed creates a slight, realistic blur. This technique was developed by Ladislas Starevich: when the characters ran, he moved the set in the opposite direction. This is seen in The Little Parade when the ballerina is chased by the devil. Starevich also used this technique on his films The Eyes of the Dragon, The Magical Clock and The Mascot. Aardman Animations used this for the train chase in The Wrong Trousers and again during the lorry chase in A Close Shave. In both cases the cameras were moved physically during a 1-2 second exposure. The technique was revived for the full-length Wallace & Gromit: The Curse of the Were-Rabbit.
Go motion
The most sophisticated technique was originally developed for The Empire Strikes Back, where it was used for some shots of the tauntauns; it was later used on films such as Dragonslayer, and it is quite different from traditional stop motion. The model is essentially a rod puppet. The rods are attached to motors which are linked to a computer that can record the movements as the model is traditionally animated. When enough movements have been made, the model is reset to its original position, the camera rolls and the model is moved across the table. Because the model is moving during shots, motion blur is created.
A variation of go motion was used in E.T. the Extra-Terrestrial to partially animate the children on their bicycles.
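To make the underlying idea concrete, here is a small illustrative Python sketch (not from the article): instead of moving a puppet while the shutter is open, it averages many sub-positions of a toy "puppet" over a simulated exposure, which produces the same kind of blur that go motion creates photographically.

import numpy as np
from PIL import Image

def render_frame(t, size=(240, 320)):
    # Toy "puppet": a white square sliding horizontally across a dark set.
    img = np.zeros((*size, 3), dtype=np.uint8)
    x = int(t * 200)  # position driven by time, like motorised rods
    img[100:140, x:x + 40] = 255
    return img

def exposure_blur(t0, t1, substeps=32):
    # Average many sub-positions sampled while the virtual shutter is "open",
    # mimicking how go motion moves the puppet during the exposure.
    acc = np.zeros_like(render_frame(t0), dtype=np.float64)
    for t in np.linspace(t0, t1, substeps):
        acc += render_frame(t)
    return (acc / substeps).astype(np.uint8)

# One sharp snapshot vs. one frame "exposed" while the puppet travels.
sharp = render_frame(0.5)
blurred = exposure_blur(0.5, 0.6)
Image.fromarray(np.hstack([sharp, blurred])).save("go_motion_demo.png")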