BREAKING NEWS
LATEST POSTS
-
Runway – Introducing Gen-3 Alpha, a new frontier for high-fidelity, controllable video generation.
https://runwayml.com/blog/introducing-gen-3-alpha/
Gen-3 Alpha is the first of an upcoming series of models trained by Runway on a new infrastructure built for large-scale multimodal training. It is a major improvement in fidelity, consistency, and motion over Gen-2, and a step towards building General World Models.
-
Immersity.AI turns 2D art and videos into 3D animation
Immersity AI (formerly LeiaPix) turns 2D illustrations into 3D animations, ideal for bringing a sketch, painting, or scene to life.
It first converts the source into an animated depth map, then uses that depth to drive the parallax in the final output.
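The depth-driven idea can be illustrated with a minimal sketch (this is not Immersity AI's actual pipeline, just a naive forward warp under the assumption that a per-pixel depth map shifts closer pixels farther sideways):

```python
import numpy as np

def parallax_frame(image, depth, max_shift=8):
    """Naive depth-based parallax: shift each pixel horizontally in
    proportion to its depth value. Closer pixels (depth near 1.0)
    move more; holes left by the warp stay black.
    image: (H, W, 3) uint8, depth: (H, W) floats in [0, 1]."""
    h, w = depth.shape
    out = np.zeros_like(image)
    shifts = (depth * max_shift).astype(int)
    for y in range(h):
        for x in range(w):
            nx = min(w - 1, x + shifts[y, x])  # clamp at the right edge
            out[y, nx] = image[y, x]
    return out
```

Rendering several frames with a slowly oscillating `max_shift` is what produces the familiar "2.5D" wobble effect; production tools additionally inpaint the holes the warp leaves behind.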
-
Planning to move jobs? These cities are now so expensive they’re considered ‘impossibly unaffordable’
https://www.cnn.com/2024/06/14/business/house-prices-impossibly-unaffordable-intl-hnk/index.html
Top 10 “impossibly unaffordable” cities
- Hong Kong
- Sydney
- Vancouver
- San Jose
- Los Angeles
- Honolulu
- Melbourne
- San Francisco/Adelaide
- San Diego
- Toronto
-
Lighting Every Darkness with 3DGS: Fast Training and Real-Time Rendering and Denoising for HDR View Synthesis
https://srameo.github.io/projects/le3d/
LE3D is a method for real-time HDR view synthesis from RAW images. It is particularly effective for nighttime scenes.
https://github.com/Srameo/LE3D
FEATURED POSTS
-
Zibra.AI – Real-Time Volumetric Effects in Virtual Production. Now free for Indies!
A New Era for Volumetrics
For a long time, volumetric visual effects were viable only in high-end offline VFX workflows. Large data footprints and poor real-time rendering performance limited their use: most teams simply avoided volumetrics altogether. It’s similar to the early days of online video: limited computational power and low network bandwidth made video content hard to share or stream. Today, of course, we can’t imagine the internet without it, and we believe volumetrics are on a similar path.
With advanced data compression and real-time, GPU-driven decompression, anyone can now bring CGI-class visual effects into Unreal Engine.
From now on, it’s completely free for individual creators!
What does this mean for you?
(more…)
-
What Is the Resolution and View Coverage of the Human Eye? And at What Distance Is a TV Best Viewed?
https://www.discovery.com/science/mexapixels-in-human-eye
About 576 megapixels for the entire field of view.
Consider a view in front of you that is 90 degrees by 90 degrees, like looking through an open window at a scene. Assuming an acuity of 0.3 arc-minutes per pixel, the number of pixels would be:

90 degrees * 60 arc-minutes/degree * 1/0.3 * 90 * 60 * 1/0.3 = 324,000,000 pixels (324 megapixels).

At any one moment you do not actually perceive that many pixels, but your eye moves around the scene to take in all the detail you want. The human eye really covers a larger field of view, close to 180 degrees. Being conservative and using 120 degrees for the field of view, we get:

120 * 120 * 60 * 60 / (0.3 * 0.3) = 576,000,000 pixels (576 megapixels).
Alternatively: about 7 megapixels for the 2-degree foveal focus area, plus roughly 1 megapixel for the rest of the field.
https://clarkvision.com/articles/eye-resolution.html
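The arithmetic above can be checked with a few lines of Python (the 0.3 arc-minute figure is the per-pixel acuity assumed in the linked articles):

```python
ACUITY_ARCMIN = 0.3  # assumed visual acuity: arc-minutes resolved per "pixel"

def megapixels(fov_h_deg, fov_v_deg, acuity_arcmin=ACUITY_ARCMIN):
    """Pixels needed to cover a field of view at the given acuity."""
    px_h = fov_h_deg * 60 / acuity_arcmin  # degrees -> arc-minutes -> pixels
    px_v = fov_v_deg * 60 / acuity_arcmin
    return px_h * px_v / 1e6

print(round(megapixels(90, 90)))    # 324
print(round(megapixels(120, 120)))  # 576
```

The same function makes it easy to see how fast the count falls for a narrower field, e.g. the 2-degree foveal region alone needs well under a megapixel at this flat-acuity assumption, which is why the foveal-plus-periphery estimate lands so much lower.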
Details in the post