A short 170-year history of Neural Radiance Fields (NeRF), Holograms, and Light Fields
/ photogrammetry, production, software

https://neuralradiancefields.io/history-of-neural-radiance-fields/

 

“Lightfield and hologram capture started with a big theoretical idea 115 years ago and we have struggled to make them viable ever since. Neural Radiance fields aka NeRF along with gaming computers now for the first time provide a promising easy and low cost way for everybody to capture and display lightfields.”

“Neural Radiance fields (NeRF) recently had its third birthday but the technology is just the latest answer to a question people have been chasing since the 1860s: How do you capture and recreate space (from images)?”

 

“The plenoptic function measures physical light properties at every point in space and it describes how light transport occurs throughout a 3D volume.”
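To make the definition concrete, here is a toy Python sketch (my own illustration, not from the article) of the 5-D plenoptic function P(x, y, z, θ, φ) for a trivial scene: a single emissive unit sphere at the origin. Approximating this kind of function for a real scene is exactly what a NeRF's MLP learns to do.

```python
import math

def plenoptic(x, y, z, theta, phi):
    """Radiance arriving at position (x, y, z) along viewing direction
    (theta, phi) -- the 5-D plenoptic function for a toy scene: one
    emissive unit sphere at the origin (radiance 1.0), empty space
    everywhere else (radiance 0.0)."""
    # spherical angles to a unit direction vector
    dx = math.sin(theta) * math.cos(phi)
    dy = math.sin(theta) * math.sin(phi)
    dz = math.cos(theta)
    if x * x + y * y + z * z <= 1.0:
        return 1.0                      # viewpoint is inside the sphere
    t = -(x * dx + y * dy + z * dz)     # ray parameter of closest approach
    if t < 0.0:
        return 0.0                      # sphere is behind the viewpoint
    cx, cy, cz = x + t * dx, y + t * dy, z + t * dz
    return 1.0 if cx * cx + cy * cy + cz * cz <= 1.0 else 0.0

# From (0, 0, 5), looking back toward the origin (theta = pi) the ray
# hits the sphere; looking away from it (theta = 0) it does not.
assert plenoptic(0, 0, 5, math.pi, 0.0) == 1.0
assert plenoptic(0, 0, 5, 0.0, 0.0) == 0.0
```

A full plenoptic function adds wavelength and time for a 7-D signature; light fields and NeRFs work with reduced-dimension slices of it like the 5-D version above.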

 

Google’s Project Starline – the latest in real-time compression and image-to-3D technology

RICOH THETA Z1 51GB camera – 360° images in RAW format
/ hardware, production

https://theta360.com/en/about/theta/z1.html

 

  • 23MP (6720 x 3360, 7K) still images
  • superior noise-reduction performance
  • selectable apertures: f/2.1, f/3.5, and f/5.6
  • 4K video (3840 x 1920, 29.97 fps)
  • RAW (DNG) image format
  • 360° live streaming in 4K
  • records sound from 4 different directions when shooting video
  • editing of 360° images in Adobe Photoshop Lightroom Classic CC
  • Android™-based OS; use plug-ins to customize your THETA
  • Wireless 2.4 GHz: channels 1 to 11 or 1 to 13
  • Wireless 5 GHz: W52 (channels 36 to 48, channel bandwidth 20/40/80 MHz supported)

 

The Theta Z1 is Ricoh’s flagship 360° camera, featuring 1-inch sensors, the largest available in dual-lens 360° cameras. It is highly regarded among 360° photographers for its excellent image quality, color accuracy, and ability to shoot RAW DNG photos with exceptional exposure latitude.

 

Bracketing mode (2022)

Requirements: basic app iOS ver. 2.20.0, Android ver. 2.5.0; camera firmware ver. 2.10.3

https://community.theta360.guide/t/new-feature-ae-bracket-added-in-the-shooting-mode-z1-only/8247

 

HDRi for VFX

https://community.theta360.guide/t/create-high-quality-hdri-for-vfx-using-ricoh-theta-z1/4789/4

 

ND filtering

 

https://community.theta360.guide/t/neutral-density-solution-for-most-theta-cameras/7331

 

https://community.theta360.guide/t/long-exposure-nd-filter-for-ricoh-theta/1100

Ben McEwan – Demystifying ST Maps
/ production, software

https://benmcewan.com/blog/2020/02/02/demystifying-st-maps/

 

An ST map is an image in which every pixel has a unique red and green value corresponding to an X and Y coordinate in screen space. You can use one to efficiently warp an image in Nuke.

In 3D, you project the ST map onto all the elements in the scene, then render glass/transmissive elements on top of that.
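The lookup itself is simple. Below is a minimal NumPy sketch of an ST-map warp (my own illustration, using nearest-neighbour sampling and ignoring Nuke's bottom-left origin convention; real tools interpolate and use float maps to avoid banding):

```python
import numpy as np

def apply_st_map(image, st_map):
    """Warp `image` using an ST map.

    image  : (H, W, C) float array, the source frame.
    st_map : (H, W, 2) float array; channel 0 (red) holds the normalized
             X coordinate (s), channel 1 (green) the normalized Y
             coordinate (t) of the source pixel to sample.
    """
    h, w = image.shape[:2]
    # Convert normalized [0, 1] ST values to integer pixel indices.
    xs = np.clip((st_map[..., 0] * (w - 1)).round().astype(int), 0, w - 1)
    ys = np.clip((st_map[..., 1] * (h - 1)).round().astype(int), 0, h - 1)
    return image[ys, xs]

# An "identity" ST map (red = x/width, green = y/height) leaves the
# image unchanged -- this gradient is also how ST maps are authored.
h, w = 4, 6
img = np.random.rand(h, w, 3)
ty, tx = np.mgrid[0:h, 0:w]
identity = np.dstack([tx / (w - 1), ty / (h - 1)])
assert np.allclose(apply_st_map(img, identity), img)
```

Any lens-distortion, undistortion, or projection warp can then be baked once into such a map and replayed cheaply on every frame.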

 

rayshader – open source tool for producing 2D and 3D data visualizations
/ production, software

https://www.rayshader.com/

 

rayshader is an open source package for producing 2D and 3D data visualizations in R. rayshader uses elevation data in a base R matrix and a combination of raytracing, hillshading algorithms, and overlays to generate stunning 2D and 3D maps. In addition to maps, rayshader also allows the user to translate ggplot2 objects into beautiful 3D data visualizations.

The models can be rotated and examined interactively, or the camera movement can be scripted to create animations. Scenes can also be rendered with rayrender, a high-quality pathtracer, and a cinematic depth-of-field post-processing effect can direct the viewer’s focus to important regions of the figure. The 3D models can be exported to a 3D-printable format with a built-in STL export function, or to an OBJ file.

 

Autodesk open sources RV playback tool to democratize access and drive open standards
/ production, software

https://github.com/AcademySoftwareFoundation/OpenRV

 

https://adsknews.autodesk.com/news/rv-open-source

 

“Autodesk is committed to helping creators envision a better world, and having access to great tools allows them to do just that. So we are making RV, our Sci-Tech award-winning media review and playback software, open source. Code contributions from RV along with DNEG’s xStudio and Sony Pictures Imageworks’ itview will shape the Open Review Initiative, the Academy Software Foundation’s (ASWF) newest sandbox project to build a unified, open source toolset for playback, review, and approval.”

 

Texel Density measurement unit
/ production

Texel density (also referred to as pixel density or texture density) is a measurement unit used to keep asset textures consistent with one another throughout your entire world.

It’s measured in pixels per centimeter (e.g. 2.56 px/cm) or pixels per meter (e.g. 256 px/m).
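The arithmetic is just texture pixels covered by a UV edge divided by that edge's world-space length. A quick sketch (function name and parameters are my own, for illustration):

```python
def texel_density(texture_px, uv_length, world_length_m):
    """Texels per meter along one edge of an asset.

    texture_px     -- texture resolution along that axis (e.g. 1024)
    uv_length      -- the edge's length in UV space (0.0 to 1.0)
    world_length_m -- the same edge's length in world space, in meters
    """
    return texture_px * uv_length / world_length_m

# A 2 m wall whose UV shell spans half the width of a 1024 px texture:
density = texel_density(1024, 0.5, 2.0)  # 256.0 px/m, i.e. 2.56 px/cm
```

Running the same check on every asset is how you spot the one crate whose texture will look blurry next to everything else.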

 

https://www.beyondextent.com/deep-dives/deepdive-texeldensity

 

Mohsen Tabasi – Stable Diffusion for Houdini through DreamStudio
/ A.I., production

https://github.com/proceduralit/StableDiffusion_Houdini

 

https://github.com/proceduralit/StableDiffusion_Houdini/wiki/

 

This is a Houdini HDA that submits the render output as the init_image and, with help from PDG, lets artists easily define variations on Stable Diffusion parameters such as Sampling Method, Steps, Prompt Strength, and Noise Strength.

Right now DreamStudio is the only public server the HDA supports, so you need an account there and must connect the HDA to it.
DreamStudio: https://beta.dreamstudio.ai/membership

 

What is Neural Rendering?
/ A.I., production

https://www.zumolabs.ai/post/what-is-neural-rendering

 

“The key concept behind neural rendering approaches is that they are differentiable. A differentiable function is one whose derivative exists at each point in the domain. This is important because machine learning is basically the chain rule with extra steps: a differentiable rendering function can be learned with data, one gradient descent step at a time. Learning a rendering function statistically through data is fundamentally different from the classic rendering methods we described above, which calculate and extrapolate from the known laws of physics.”
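The "chain rule with extra steps" point can be made concrete with a toy example (my own sketch, not from the article): a differentiable "renderer" whose only scene parameter is the center of a 1-D Gaussian blob, fitted to a target image by plain gradient descent.

```python
import numpy as np

# Toy differentiable "renderer": the scene's only parameter is the
# center mu of a 1-D Gaussian blob; render(mu) produces a row of
# 64 pixel intensities.
xs = np.linspace(0.0, 1.0, 64)
SIGMA = 0.05

def render(mu):
    return np.exp(-((xs - mu) ** 2) / (2 * SIGMA ** 2))

def loss_and_grad(mu, target):
    img = render(mu)
    resid = img - target
    # Chain rule: dL/dmu = sum(2 * resid * d(img)/dmu), analytically.
    dimg_dmu = img * (xs - mu) / SIGMA ** 2
    return np.sum(resid ** 2), np.sum(2.0 * resid * dimg_dmu)

target = render(0.7)   # the "photograph" of the true scene
mu = 0.6               # initial guess -- close enough that the blobs
                       # overlap; otherwise the gradient vanishes
for _ in range(500):
    _, grad = loss_and_grad(mu, target)
    mu -= 2e-4 * grad  # one gradient-descent step at a time

# mu has converged toward the true center, 0.7
```

A NeRF does the same thing at scale: the renderer is a differentiable volume-rendering integral, the parameters are MLP weights instead of one scalar, and the gradients come from autodiff rather than a hand-derived formula.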

Foundry Nuke Cattery – A library of open-source machine learning models
/ A.I., production, software

The Cattery is a library of free third-party machine learning models converted to .cat files to run natively in Nuke, designed to bridge the gap between academia and production, providing all communities access to different ML models that all run in Nuke. Users will have access to state-of-the-art models addressing segmentation, depth estimation, optical flow, upscaling, denoising, and style transfer, with plans to expand the models hosted in the future.

 

https://www.foundry.com/insights/machine-learning/the-artists-guide-to-cattery

 

https://community.foundry.com/cattery