Unity 3D resources
/ IOS, lighting, production, software

http://answers.unity3d.com/questions/12321/how-can-i-start-learning-unity-fast-list-of-tutori.html

 

If you have no previous experience with Unity, start with these six video tutorials, which give a quick overview of the Unity interface and some important features: http://unity3d.com/support/documentation/video/


Using Meta’s Llama 3 for your business
/ A.I., software

https://www.linkedin.com/posts/tobias-zwingmann_meta-facebook-just-spent-over-100000000-activity-7187500623704076288-_vbG

 

Meta is the only Big Tech company committed to developing AI, particularly large language models, with an open-source approach.

 

There are 3 ways you can use Llama 3 for your business:

 

1- Llama 3 as a Service
Use Llama 3 from any cloud provider as a service. You pay per use, and the price is typically much lower than that of proprietary models like GPT-4 or Claude.
→ Use Llama 3 on Azure AI catalog:
https://techcommunity.microsoft.com/t5/ai-machine-learning-blog/introducing-meta-llama-3-models-on-azure-ai-model-catalog/ba-p/4117144

 

2- Self-Hosting
If you have GPU infrastructure (on-premises or cloud), you can run Llama 3 internally at your desired scale.
→ Deploy Llama 3 on Amazon SageMaker:
https://www.philschmid.de/sagemaker-llama3

 

3- Desktop (Offline)
Tools like Ollama allow you to run the smaller models offline on consumer hardware such as current MacBooks.
→ Tutorial for Mac:
https://ollama.com/blog/llama3
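
As a rough sketch of the offline option: once Ollama is installed and the model has been pulled (ollama pull llama3), you can query it from Python through Ollama’s local REST endpoint. The prompt below is made up, and the default port and model tag may differ on your setup.

# Minimal sketch: query a locally running Llama 3 via Ollama's REST API.
# Assumes Ollama is serving on its default port (11434) and 'llama3' has been pulled.
import json
import urllib.request

payload = {
    "model": "llama3",
    "prompt": "Summarize the benefits of open-source LLMs for a small business.",
    "stream": False,
}
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])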

Why The New York Times might win its copyright lawsuit against OpenAI
/ A.I., ves

https://arstechnica.com/tech-policy/2024/02/why-the-new-york-times-might-win-its-copyright-lawsuit-against-openai/

 

Daniel Jeffries wrote:

“Trying to get everyone to license training data is not going to work because that’s not what copyright is about,” Jeffries wrote. “Copyright law is about preventing people from producing exact copies or near exact copies of content and posting it for commercial gain. Period. Anyone who tells you otherwise is lying or simply does not understand how copyright works.”

 

The AI community is full of people who understand how models work and what they’re capable of, and who are working to improve their systems so that the outputs aren’t full of regurgitated inputs. Google won the Google Books case because it could explain both of these persuasively to judges. But the history of technology law is littered with the remains of companies that were less successful in getting judges to see things their way.

Generative AI Glossary
/ A.I.

https://education.civitai.com/generative-ai-glossary/

 

TurboSquid moves towards supporting AI against its own policies
/ A.I., software, ves

https://www.turbosquid.com/ai-3d-generator

 

The AI is being trained using a mix of Shutterstock 2D imagery and 3D models drawn from the TurboSquid marketplace. However, it’s only being trained on models that artists have approved for this use. 

 

People cannot generate a model and then immediately sell it. A generated 3D model can, however, be used as a starting point for further customization, which could then be sold on the TurboSquid marketplace. Note that models created using TurboSquid’s generative 3D tool—and their derivatives—can only be sold on the TurboSquid marketplace.

 

https://resources.turbosquid.com/general-info/terms-agreements/turbosquids-policy-on-publishing-ai-generated-content/

 

TurboSquid does not accept AI-generated content from our artists
As AI-powered tools become more accessible, it is important for us to address the impact AI has on our artist community as it relates to content made licensable on TurboSquid. TurboSquid, in line with its parent company Shutterstock, is taking an ethically responsible approach to AI on its platforms. We want to ensure that artists are properly compensated for their contributions to AI projects while supporting customers with the protections and coverage issued through the TurboSquid license.

 

In order to ensure that customers are protected, that intellectual property is not misused, and that artists are compensated for their work, TurboSquid will not accept content uploaded and sold on our marketplace that is generated by AI. Per our Publisher Agreement, artists must have proven IP ownership of all content that is submitted. AI-generated content is produced using machine learning models that are trained using many other creative assets. As a result, we cannot accept content generated by AI because its authorship cannot be attributed to an individual person, and we would be unable to ensure that all artists who were involved in the generation of that content are compensated.

Quiet Quitting at work – Causes and remedies
/ quotes, ves

Quiet quitting isn’t about leaving a job.
It’s when people stay but mentally check out. They do the bare minimum. No excitement. No extra effort.

 

It’s a silent alarm. Your team may be losing interest right under your nose.

 

And it’s a big deal. Why? Because it affects:

  • Your team’s morale
  • Your team’s productivity
  • Your company’s profitability
  • And everyone’s overall success

Resources are already stretched thin. You need to get the best from your team.

 

What can employers do? Many of the causes are within your control:

 

➡️ Listen Well
Talk to your team often.
Listen to what they say. Then take action.

 

➡️ Recognize Efforts
Public recognition can boost morale.
A simple “thank you” goes a long way.

 

➡️ Promote Balance
Allow time for life outside work.
Overworked employees burn out.

 

➡️ Give Chances to Grow
Invest in them. Provide training.
Show them a career path.

 

➡️ Build a Positive Culture
Ensure everyone feels valued and respected.

 

➡️ Set Clear Goals
Clearly define roles. Tell them what you expect.

 

➡️ Lead by Example
Show excitement. Work hard.
Be the way you want them to be.

 

Quiet quitting isn’t just an employee issue. It’s a leadership opportunity. It’s a chance to re-engage, re-inspire, and revitalize your workplace.

 

Resources

 

https://www.linkedin.com/posts/jwmba_gallup-reports-59-of-employees-are-filling-activity-7128733317364944896-vrdC

 

https://www.cbsnews.com/news/workers-disengaged-quiet-quitting-their-jobs-gallup/

 

https://www.gallup.com/workplace/349484/state-of-the-global-workplace.aspx

 

https://www.linkedin.com/posts/gallup_increasing-engagement-is-good-news-for-employees-activity-7079881851128983552-bpHr

 

https://www.gallup.com/workplace/398306/quiet-quitting-real.aspx

https://hbr.org/2022/09/when-quiet-quitting-is-worse-than-the-real-thing

Luma Interactive Scenes announced: Gaussian Splatting
/ A.I., photogrammetry

https://neuralradiancefields.io/luma-interactive-scenes-announced/

 

“…these are in fact Gaussian Splats that are being run and it’s a proprietary iteration of the original Inria paper. They hybridize the performance gain of realtime rendering with Gaussian Splatting with robust cloud based rendering that’s already widely being used in commercial applications. This has been in the works for a while over at Luma and I had the opportunity to try out some of my datasets on their new method.”

MICHAEL RUBLOFF

 

https://lumalabs.ai/embed/95aa8119-a2fd-4ba2-9e1d-a9ea668f4be2?mode=sparkles&background=%23ffffff&color=%23000000&showTitle=true&loadBg=true&logoPosition=bottom-left&infoPosition=bottom-right&cinematicVideo=undefined&showMenu=false

 

https://lumalabs.ai/capture/ccd7a96a-c2e2-44cf-a358-80b2fea8532f

 

DNEG announces pay cuts of up to 25% and artists’ repayment loans
/ ves

EDIT 20230919

https://www.cartoonbrew.com/artist-rights/vfx-giant-dneg-puts-forth-new-salary-reduction-proposal-after-worker-backlash-to-initial-proposal-232735.html

Revised Proposal: After the initial proposal was met with backlash, DNEG revised it over the weekend, introducing a third option that focuses on reducing work hours instead of salaries, along with additional paid leave to compensate for the income reduction.

 

  1. A salary reduction of 20% to 25% for seven months, with paid leave to compensate.
  2. A temporary 50% salary reduction, supplemented by a company loan, totalling 90% of the original salary, repayable over three years.
  3. Reduced working hours to a 3-day week for seven months, with no hourly rate reduction.

 

 

https://www.linkedin.com/posts/avuuk_animation-visualeffects-dneg-activity-7107674426275442688-Fd1d

 

Today, we want to address a concerning development at DNEG. The company very recently announced pay cuts of up to 25% for its employees, coupled with a rather unconventional approach to compensating for these losses through ‘loans’, which their staff need to repay over time.

 

As of now, DNEG is imposing these pay cuts for a period of 7 months. To ‘help’ offset the financial impact on their staff, the company is offering ‘loans’ to their employees. While offering financial support during challenging times is usually commendable, the repayment terms are causing deep concern within the Animation & Visual Effects community, especially around their legality.

 

The loan offered by DNEG comes with a significant catch: employees are required to pay back the loan over a three-year period. This means that even after full salaries are reinstated, employees will be obligated to allocate a portion of their pay to repay the company. Allegedly, there is no interest on the loan (tbc). This approach has sparked a considerable backlash within our industry.

 

We at the Animation & Visual Effects Union voice very strong concern and opposition to the pay cuts, as well as the loan method. We believe pay cuts should not be compensated through loans with long-term repayment plans, placing a heavy burden on the employees who are already facing financial challenges.

 

This situation underscores the importance of open dialogue and collaboration between employers and employees during challenging times. While businesses often need to make tough decisions to navigate economic uncertainties, it’s crucial to strike a balance that doesn’t disproportionately impact the livelihoods of their dedicated workforce.

 

What can be done about this?

 

If you are a member of the Animation & Visual Effects Union, get in touch with us immediately and do not accept any pay cuts yet. You can email your BECTU official Stefan Vassalos stefan.vassalos@prospect.org.uk to get advice and organise with your colleagues at DNEG.

 

Remember, you MUST give your consent for a pay cut. It is ILLEGAL to impose a cut without it. You DO NOT have to consent to a pay cut. Legal action can and will be taken against pay cuts imposed without consent. Anyone affected please get in touch with us immediately so we can represent and protect you and your livelihood as much as possible. BECTU has the power and resources to challenge moments like this, so it is imperative YOU take action and contact us. Talk to your colleagues and get in touch. It is only through solidarity and collective effort that we can address these challenges and shape a brighter future for our industry.

 

Please feel free to share your thoughts and insights on this matter. Your input and perspective are valuable as we navigate these unprecedented times together.

Why VFX workers are unionising in Hollywood — and why it’s important
/ ves

https://www.digitalspy.com/movies/a44936059/vfx-workers-unionising-hollywood/

 

Last March, IATSE released a damning survey that showed how visual effects workers lack access to essential benefits, such as health insurance and retirement plans.

It also found VFX crews are lacking breaks and rest periods, and they’re not getting paid for working overtime, resulting in some workers failing to even make minimum wage.

This survey aimed to help organise VFX workers, one of the last parts of the production community that is still not unionised. Given the worsening of their working conditions while their craft is increasingly in demand within the industry, seeking protection has become a necessity.

SAG-AFTRA votes to strike against greedy studio execs and producers
/ ves

Fran Drescher
“Thank you everybody for coming to this press conference today. It’s really important that this negotiation be covered, because the eyes of the world, and particularly the eyes of labor, are upon us. What happens here is important because what’s happening to us is happening across all fields of labor, when employers make Wall Street and greed their priority and they forget about the essential contributors that make the machine run.

 

“We have a problem, and we are experiencing that right at this moment. This is a very seminal hour for us. I went in in earnest thinking that we would be able to avert a strike. The gravity of this move is not lost on me, or our negotiating committee, or our board members. It’s a very serious thing that impacts thousands, if not millions, of people all across this country and around the world — not only members of this union, but people who work in other industries.

 

“And so it came with great sadness that we came to this crossroads. But we had no choice. We are the victims here. We are being victimized by a very greedy entity. I am shocked by the way the people that we have been in business with are treating us. I cannot believe it, quite frankly: How far apart we are on so many things. How they plead poverty, that they’re losing money left and right when giving hundreds of millions of dollars to their CEOs. It is disgusting. Shame on them.

 

“They stand on the wrong side of history at this very moment. We stand in solidarity, in unprecedented unity. Our union and our sister unions and the unions around the world are standing by us, as well as other labor unions. Because at some point, the jig is up. You cannot keep being dwindled and marginalized and disrespected and dishonored. The entire business model has been changed by streaming, digital, AI.

 

“This is a moment of history and a moment of truth. If we don’t stand tall right now, we are all going to be in trouble. We are all going to be in jeopardy of being replaced by machines and big business who cares more about Wall Street than you and your family. Most Americans don’t have more than $500 in case of an emergency. This is a very big deal, and it weighed heavy on us. But at some point you have to say, ‘No, we’re not going to take this anymore. You people are crazy. What are you doing? Why are you doing this?’

 

“Privately, they all say we’re the center of the wheel. Everybody else tinkers around our artistry, but actions speak louder than words, and there was nothing there. It was insulting. So we came together in strength and solidarity and unity with the largest strike authorization vote in our union’s history. And we made the hard decision that we tell you as we stand before you today. This is major. It’s really serious and it’s going to impact every single person that is in labor. We are fortunate enough to be in a country right now that happens to be labor-friendly, and yet we were facing opposition that was so labor-unfriendly, so tone deaf to what we are saying.

 

“You cannot change the business model as much as it has changed and not expect the contract to change too. We’re not going to keep doing incremental changes on a contract that no longer honors what is happening right now with this business model that was foisted upon us. What are we doing… moving around furniture on the Titanic? It’s crazy. So the jig is up AMPTP. We stand tall. You have to wake up and smell the coffee. We are labor and we stand tall and we demand respect and to be honored for our contribution. You share the wealth because you cannot exist without us. Thank you.”

 

 

John H. Liu

“Not only is the AMPTP planning on holding out until non-essential writers and actors go broke, but they are also engaging in union-busting by proposing that background actors be paid $200 for scan rights. Background actors make up the vast majority of SAG. By paying them for one day and never again, these actors won’t make minimums for health insurance and SAG membership. Less SAG, less Pension/Health to pay out, less leverage to strike ever again. Not to mention that if a background actor one day becomes famous, the studios will own the rights to that actor’s scan, and can circumvent paying the actor a fair wage. This is a fundamental alteration to the way that business is done in the entertainment industry and threatens to throw the entire landscape of Hollywood into chaos.”

RICOH THETA Z1 51GB camera – 360° images in RAW format
/ hardware, production

https://theta360.com/en/about/theta/z1.html

 

  • 23MP (6720 x 3360, 7K)
  • superior noise reduction performance
  • F2.1, F3.5 and F5.6
  • 4K videos (3840 x 1920, 29.97fps)
  • RAW (DNG) image format
  • 360° live streaming in 4K
  • record sound from 4 different directions when shooting video
  • editing of 360° images in Adobe Photoshop Lightroom Classic CC
  • Android™ base system for the OS. Use plug-ins to customize your own THETA.
  • Wireless 2.4 GHz: 1 to 11ch or 1 to 13ch
  • Wireless 5 GHz: W52 (36 to 48ch, channel bandwidth 20/40/80 MHz supported)

 

Theta Z1 is Ricoh’s flagship 360 camera that features 1-inch sensors, which are the largest available for dual lens 360 cameras.  It has been a highly regarded camera among 360 photographers because of its excellent image quality, color accuracy, and its ability to shoot Raw DNG photos with exceptional exposure latitude.

 

Bracketing mode 2022

Requirement: Basic app iOS ver.2.20.0, Android ver.2.5.0, Camera firmware ver.2.10.3

https://community.theta360.guide/t/new-feature-ae-bracket-added-in-the-shooting-mode-z1-only/8247

 

HDRi for VFX

https://community.theta360.guide/t/create-high-quality-hdri-for-vfx-using-ricoh-theta-z1/4789/4

 

 

 

ND filtering

 

https://community.theta360.guide/t/neutral-density-solution-for-most-theta-cameras/7331

 

https://community.theta360.guide/t/long-exposure-nd-filter-for-ricoh-theta/1100

How to learn to become a VFX artist
/ ves
  • Structure your learning time
  • Consistency
  • Retention
  • Mental health
  • Don’t feel intimidated
  • Don’t feel rushed
  • Be kind
  • Luck is when preparation meets opportunity
Foundry Nuke Cattery – A library of open-source machine learning models
/ A.I., production, software

The Cattery is a library of free third-party machine learning models converted to .cat files to run natively in Nuke, designed to bridge the gap between academia and production, providing all communities access to different ML models that all run in Nuke. Users will have access to state-of-the-art models addressing segmentation, depth estimation, optical flow, upscaling, denoising, and style transfer, with plans to expand the models hosted in the future.
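
As a rough illustration of the workflow (not from the Foundry guide itself): once a .cat file has been downloaded from the Cattery, it can be loaded into a script through Nuke’s Inference node. The node class and knob name below (‘Inference’, ‘modelFile’) and the file path are assumptions based on recent Nuke releases; check the artist’s guide linked below for the exact steps.

# Hypothetical sketch: hook a Cattery .cat model up to a plate via an Inference node.
# 'Inference' and 'modelFile' are assumed names; verify against your Nuke version.
import nuke

read = nuke.toNode('Read1')                            # plate to process (assumed to exist)
inference = nuke.createNode('Inference')               # ML inference node (Nuke 13+)
inference['modelFile'].setValue('/path/to/model.cat')  # downloaded Cattery model
inference.setInput(0, read)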

 

https://www.foundry.com/insights/machine-learning/the-artists-guide-to-cattery

 

https://community.foundry.com/cattery

 

Remote working pros and cons
/ production

www.leforttalentgroup.com/business-blog/is-the-genie-out-forever

Cons of remote working:

  1. Two distinct locations in life are preferred — one for work, one for everything else.
  2. Managing a group of employees in one location is easier — meetings, training, and management of teams and personalities.
  3. Confidentiality and security — depending on the nature of the business, containing the work location lessens liabilities.
  4. Social community — many fully enjoy the traditional work community and build lifelong connections.
  5. Love — a quick Google search shows various sources citing that anywhere from 20-33 percent of people met their spouse through work. What will those stats look like a year or two from now?
  6. Road warriors with great sound systems in their cars — some enjoy the commute to unwind after work, cranking tunes or catching up with friends and family while waiting for the gridlock to ease; others continue working from the car.

Pros of remote working:

  1. Lower overhead costs — no need to keep large commercial real estate holdings and pay the related maintenance costs.
  2. No killer commutes — 5-20 hours per week per employee in lost time can now be used for other purposes.
  3. No daily daycare scramble — no more racing to drop the kids off or pick them up each day.
  4. A lower carbon footprint — less traffic, less pollution.
  5. Quality family time — many parents are spending more time with their growing children.

Some useful tips about working online:

  • Clarify and focus on priorities.
  • Define and manage expectations more explicitly than normal (give context to everything).
  • Log all your working hours.
  • Learn about and respect people’s boundaries.
  • Pay attention to people’s verbal and physical cues.
  • Pay attention to people’s emotional and hidden cues as well as the factual ones.
  • Be wary of anticipating, judging, rationalizing, competing, defending, rebutting…

Visual Studio Code – Free. Built on open source. Runs everywhere.
/ production, python, software

code.visualstudio.com/

 

https://www.freecodecamp.org/news/how-to-set-up-vs-code-for-web-development/

 

Visual Studio Code is a lightweight but powerful source code editor which runs on your desktop and is available for Windows, macOS and Linux. It comes with built-in support for JavaScript, TypeScript and Node.js and has a rich ecosystem of extensions for other languages (such as C++, C#, Java, Python, PHP, Go) and runtimes (such as .NET and Unity).
Python and TCL: Tips and Tricks for Foundry Nuke
/ Featured, production, python, software

www.andreageremia.it/tutorial_python_tcl.html

https://www.gatimedia.co.uk/list-of-knobs-2

https://learn.foundry.com/nuke/developers/63/ndkdevguide/knobs-and-handles/knobtypes.html

 

http://www.andreageremia.it/tutorial_python_tcl.html

 

http://thoughtvfx.blogspot.com/2012/12/nuke-tcl-tips.html


Check final image quality

https://www.compositingpro.com/tech-check-compositing-shot-in-nuke/

Local copy:
http://pixelsham.com/wp-content/uploads/2023/03/compositing_pro_tech_check_nuke_script.nk

 

Nuke tcl procedures
https://www.gatimedia.co.uk/nuke-tcl-procedures

 

Knobs
https://learn.foundry.com/nuke/developers/63/ndkdevguide/knobs-and-handles/knobtypes.html

 

# return to the top (root) level of the node graph
nuke.Root().begin()
nuke.allNodes()
nuke.Root().end()



# check if Nuke is running in UI or batch mode
nuke.env['gui'] # True or False



# preformatted (monospace) font to use in a Text node:
Liberation Mono




# import node from a path
# Replace '/path/to/your/script.nk' with the actual path to your Nuke script
script_path = '/path/to/your/script.nk'
# Create the node in the script
mynode = nuke.nodePaste(script_path)
# or
mynode = nuke.scriptReadFile(script_path)  # asynchronous, so the code won't wait for completion and mynode will be empty
# same as
mynode = nuke.tcl('source "{}"'.format(script_path))
# mynode will be empty and it won't select the node either
# or synchronous
mynode = NukeUI.Scriptlets.loadScriptlet(script_path)




# connect a knob on an internal node to a parent's knob
# add a python expression to the internal node's knob like:
nuke.thisNode().parent()['falseColors'].getValue()
# or the opposite
not nuke.thisNode().parent()['falseColors'].getValue()
# or as tcl expression
1- parent.thatNode.disable




# check session performance
Sebastian Schütt – Monitoring Nuke’s sessions performance
nuke.startPerformanceTimers()
nuke.resetPerformanceTimers()
nuke.stopPerformanceTimers()

# set a project's start and end frames
new_frame_start = 1
new_frame_end = 100
project_settings = nuke.Root()
project_settings['first_frame'].setValue(new_frame_start)
project_settings['last_frame'].setValue(new_frame_end)

# disable/enable a node
newReadNode['disable'].setValue(True)

# force refresh a node
myNode['update'].execute()
# or
myNode.forceValidate()

# pop up a UI alert warning
nuke.alert('prompt')

# return a given node
hdriGenNode = nuke.toNode('HDRI_Light_Export')
clampTo1 = nuke.toNode('HDRI_Light_Export.Clamp To 1')

# access nodes within a group
nuke.toNode('GroupNodeName.nestedNodeName')

# access a knob on a node
hdriGenNode.knob('checkbox').getValue()

# return the node type
topNode.Class()
nuke.selectedNode().Class()
nuke.selectedNode().name()

# return nodes within a group
hdriGenNode = nuke.toNode('HDRI_Light_Export')
hdriGenNode.begin()
sel = nuke.selectedNodes()
hdriGenNode.end()

# nodes position
node.setXpos(111)
node.setYpos(222)
xPos = node.xpos()
yPos = node.ypos()
print('new x position is', xPos)
print('new y position is', yPos)

# execute a node's button through python
node['button'].execute()

# add knobs
div = nuke.Text_Knob('someTextKnob', '')
myNode.addKnob(div)
lgt_name = nuke.EvalString_Knob('lgt1_name', 'LGT1 name', 'some text')  # id, label, txt
myNode.addKnob(lgt_name)
lgt_size = nuke.XY_Knob('lgt1_size', 'LGT1 size')
myNode.addKnob(lgt_size)
lgt_3Dpos = nuke.XYZ_Knob('lgt1_3Dpos', 'LGT1 3D pos')
myNode.addKnob(lgt_3Dpos)
lgt_distance = nuke.Double_Knob('lgt1_distance', ' distance')
myNode.addKnob(lgt_distance)
lgt_isSun = nuke.Boolean_Knob('lgt1_isSun', ' sun/HMI')
myNode.addKnob(lgt_isSun)
lgt_mask_clr = nuke.AColor_Knob('lgt1_maskClr', 'LGT1 mask clr')
lgt_mask_clr.setValue([0.12, 0.62, 0.115, 0.65])
lgt_mask_clr.setVisible(False)
myNode.addKnob(lgt_mask_clr)

# add tab group knob
lightTab = nuke.Tab_Knob('lgt1_tabBegin', 'LGT1', nuke.TABBEGINGROUP)
myNode.addKnob(lightTab)
lightTab = nuke.Tab_Knob('lgt1_tabEnd', 'LGT1', nuke.TABENDGROUP)
myNode.addKnob(lightTab)
# note: if you have only one tab and you are programmatically adding to the bottom of it,
# remove the last endGroup knob to make sure the new knobs go into the tab
myNode.removeKnob(myNode['endGroup'])

# python script knob
remove_script = """
node = nuke.thisNode()
for knob in node.knobs():
    print(knob)
    if "lgt%s" in knob:
        node.removeKnob(node.knobs()[knob])
node.begin()
lightGizmo = nuke.toNode('lgt%s')
nuke.delete(lightGizmo)
node.end()
""" % (str(length), str(length))
lgt_remove = nuke.PyScript_Knob('lgt1_remove', 'LGT1 Remove', remove_script)
myNode.addKnob(lgt_remove)

# link a checkbox to a function through knobChanged
hdriGenNode.knob('knobChanged').setValue('''
nk = nuke.thisNode()
k = nuke.thisKnob()
if "Jabuka_checkbox" in k.name():
    print('ciao')
''')

# knobChanged production example
my_code = """
n = nuke.thisNode()
k = nuke.thisKnob()
if k.name() == "sheetOrSequence" or k.name() == "showPanel":
    # print(nuke.toNode(n.name() + '.MasterSwitch')['which'].getValue())
    if nuke.toNode(n.name() + '.MasterSwitch')['which'].getValue() == 0.0:
        n['frameEnd'].setValue(nuke.toNode(n.name() + '.MasterSwitch')['masterAppendClip_lastFrame'].getValue())
    elif nuke.toNode(n.name() + '.MasterSwitch')['which'].getValue() == 1.0:
        n['frameEnd'].setValue(nuke.toNode(n.name() + '.MasterSwitch')['masterContactSheet_lastFrame'].getValue())
"""
nuke.toNode("JonasCSheet1").knob("knobChanged").setValue(my_code)

# retrieve the knobChanged callback
node['knobChanged'].toScript()

# nuke knobChanged callback https://corson.be/nuke_python_snippet/
# "knobChanged" is a "hidden" knob which holds code executed each time we touch any of the node's knobs.
# Thanks to that, we can filter user actions on the node and do cool stuff like dynamically adding things inside a group.
code = """
knob = nuke.thisKnob()
if knob.name() == 'size':
    print("size : %s" % knob.value())
"""
nuke.selectedNode()["knobChanged"].setValue(code)

# find dependent nodes of a given class
def find_dependent_nodes(selected_node, targetClass):
    dependent_nodes = set()
    visited_nodes = set()
    def recursive_search(node):
        if node in visited_nodes:
            return
        visited_nodes.add(node)
        for dependent_node in node.dependent():
            print(dependent_node.Class())
            if dependent_node.Class() == targetClass:
                dependent_nodes.add(dependent_node)
            recursive_search(dependent_node)
    recursive_search(selected_node)
    return dependent_nodes

find_dependent_nodes(node, 'Write')

# react to knob changes through a global nuke callback
def myCallback():
    # code to execute when any knob changes
    print("Some checkbox value has changed!")
    n = nuke.thisNode()
    k = nuke.thisKnob()
    if k.name() == "myknob" or k.name() == "showPanel":
        print('do this')

nuke.addKnobChanged(myCallback)
nuke.removeKnobChanged(myCallback)  # remove it first every time you wish to change the callback

# nuke callback production example (note: this needs to be saved in a place that nuke can retrieve:
# https://support.foundry.com/hc/en-us/articles/115000007364-Q100248-Adding-Callbacks-in-Nuke)
def sheetOrSequenceCallback():
    n = nuke.thisNode()
    k = nuke.thisKnob()
    if k.name() == "sheetOrSequence" or k.name() == "showPanel":
        if nuke.toNode(n.name() + '.MasterSwitch')['which'].getValue() == 0.0:
            n['frameEnd'].setValue(nuke.toNode(n.name() + '.MasterSwitch')['masterAppendClip_lastFrame'].getValue())
        elif nuke.toNode(n.name() + '.MasterSwitch')['which'].getValue() == 1.0:
            n['frameEnd'].setValue(nuke.toNode(n.name() + '.MasterSwitch')['masterContactSheet_lastFrame'].getValue())

# add / remove the callback function
nuke.addKnobChanged(sheetOrSequenceCallback)
nuke.removeKnobChanged(sheetOrSequenceCallback)
# more about callbacks
# return all knobs
for label, knob in sorted(jonasNode.knobs().items()):
    print(label, knob.value())

# remove a knob
for label, knob in sorted(mynode.knobs().items()):
    if 'keyshot' in label.lower():
        mynode.removeKnob(knob)

# work inside a node group
posNode.begin()
posNode.end()

# move back to root level
nuke.Root().begin()

# add a button link to docs
import webbrowser
browser = webbrowser.get('chrome')
site = 'https://yoursite'
browser.open(site)

# return all nodes
nuke.allNodes()

# python code inside a text node message
[python -exec {
import re
import json
output = 'hello'
...
...
}]
[python output]

# connect nodes
blur.setInput(0, read)

# label a node
blur['label'].setValue("Size: [value size]\nChannels: [value channels]\nMix: [value mix]")

# disconnect nodes
node.setInput(0, None)

# arrange nodes
for n in nuke.allNodes():
    n.autoplace()

# snap them to the closest grid
for n in nuke.allNodes():
    nuke.autoplaceSnap(n)

# help on commands
help(nuke.Double_Knob)

# rename nodes
node['name'].setValue('new')

# query the format of an image at a given node level
myNode.input(0).format().width()

# select a given node
all_nodes = nuke.allNodes()
for i in all_nodes:
    i.knob("selected").setValue(False)
myNode.setSelected(True)

# return the connected nodes
metaNode.dependent()

# return the input node
metaNode.input(0)

# copy and paste a node
nuke.toNode('original node').setSelected(True)
nuke.nodeCopy(nukescripts.cut_paste_file())
nukescripts.clear_selection_recursive()
newNode = nuke.nodePaste(nukescripts.cut_paste_file())

# copy and paste node alternative https://corson.be/nuke_python_snippet/
node = nuke.selectedNode()
newNode = nuke.createNode(node.Class(), node.writeKnobs(nuke.WRITE_NON_DEFAULT_ONLY | nuke.TO_SCRIPT), inpanel=False)
node.writeKnobs(nuke.WRITE_USER_KNOB_DEFS | nuke.WRITE_NON_DEFAULT_ONLY | nuke.TO_SCRIPT)

# set knob value
metaNode.knob('operation').setValue('Avg Intensities')

# get knob value
writeNode.knob('file').value()

# get a pulldown choice knob label
pulldown_knob = node[knob_name]
pulldown_index = pulldown_knob.value()  # get the current index of the pulldown knob
pulldown_label = pulldown_knob.enumName(pulldown_index)

# link two knobs' attributes: add a knob link
k = nuke.Link_Knob('attr1_id', 'attr1')
k.makeLink(node.name(), 'attr2_id.attr2')

# link two knobs between different nodes
sel = nuke.selectedNode()
lgt_colorspace = nuke.Link_Knob('colorspace', 'Colorspace')
sel.addKnob(lgt_colorspace)
Read1 = nuke.selectedNode()
sel.knob('colorspace').makeLink(Read1.name(), 'colorspace')

# link pulldown menus
Ben, how do I expression-link Pulldown Knobs?
# This syntax can be read as {child node}.{knob} — link to {parent node}.{knob}
nuke.toNode('lgtRenderStatistics.Text2').knob('yjustify').setExpression('lgtRenderStatistics.yjustify')
nuke.toNode('lgtRenderStatistics.Text2').knob('xjustify').setExpression('lgtRenderStatistics.xjustify')

# create a grade node set to only red and change its gain
mg = nuke.nodes.Grade(name='test2', channels='red')
mg['white'].setValue(2)

# remove a node
nuke.delete(newNode)

# get one value out of an array parameter
mynode.knob(pos_name).value()[0]
mynode.knob(pos_name).value()[1]

# find all nodes of type Write
writeNodesList = []
for node in nuke.allNodes('Write'):
    writeNodesList.append(node)

# create an expression in python to connect parameters
myNode.knob("ROI").setExpression("parent." + pos_name)

# link knobs between nodes through an expression (https://learn.foundry.com/nuke/content/comp_environment/expressions/linking_expressions.html)
Transform1.scale

# connect two checkbox knobs so that one works the opposite of the other (False:True)
node = nuke.toNode('myNode.Text2_all_sphericalcameratest_beauty')
node.knob('disable').setExpression('parent.viewStats ? 0 : 1')

# connect parameters between nodes at different levels through a (non-python) expression
maskGradeNode.knob('white').setExpression('parent.parent.lgt1_maskClr')

# connect parameters between nodes at different levels through a python expression
nuke.thisNode().parent()['sheetOrSequence'].getValue()
# or using python in TCL
[python {nuke.thisNode().parent()['sheetOrSequence'].getValue()}]

# multiline python expression in TCL
[python {nuke.thisNode().parent()['sheetOrSequence'].getValue()}]
[python {print(nuke.thisNode())}]

# multiline python expression from code with a return statement
nuke.selectedNode().knob('which').setExpression('''[python -execlocal
x = 2
for i in range(10):
    x += i
ret = x]''', 0)

# connect 2d knobs on the same node through a python expression
nuke.thisNode()['TL'].getValue()[0] + ((nuke.thisNode()['TR'].getValue()[0] - nuke.thisNode()['TL'].getValue()[0]) / 2)
# to add this as a python expression on each x and y of a 2d knob
newLabel_expression_x = "[python nuke.thisNode()\['TL'\].getValue()\[0\] + ((nuke.thisNode()\['TR'\].getValue()\[0\] - nuke.thisNode()\['TL'\].getValue()\[0\])/2)]"
newLabel_expression_y = "[python nuke.thisNode()\['TL'\].getValue()\[1\] + 10]"
node['lightLabel'].setExpression(newLabel_expression_x, 0)
node['lightLabel'].setExpression(newLabel_expression_y, 1)
# note: this may raise some errors when generating the node; a tcl expression seems to work best
newLabel_expression_x = "lgt" + str(length) + "_tl.x() + 20"
newLabel_expression_y = "lgt" + str(length) + "_tl.y() + 20"
lgt_label.setExpression(newLabel_expression_x, 0)
lgt_label.setExpression(newLabel_expression_y, 1)

# load a gizmo
nuke.load('Offset')

# set knob colors
hdriGenNode.knob('add').setLabel("<span style='color: yellow;'>Add Light")
# or at creation, add a knob with color
lgt_LUX = nuke.Text_Knob('lgt%s_LUX' % str(length), "<font color='yellow'> LUX", '0')  # id, label, txt
# or when creating the knob manually: <font color='#FF0000'>Keyshot 1 or <font color='red'>Keyshot 1

# set color knob values
hdriGenNode.knob('lgt_maskClr_1').setValue([0.0, 0.5, 0.0, 0.8])
hdriGenNode.knob('lgt_maskClr_1').setValue(0.4, 3)  # set only the alpha

# return nuke file path
nuke.root().knob('name').value()

# write metadata
metadata_content = ('{set %sName %s}\n{set %sMaxLuma %s}\n{set %sEV %s}\n{set %sLUX %s}\n'
                    '{set %sPos2D %s}\n{set %sPos3D %s}\n{set %sDistance %s}\n{set %sScale %s}\n'
                    '{set %sOutputPath %s}\n') % (
    lgtName, lgtCustomName, lgtName, str(maxL[0]), lgtName, str(lgt_EV), lgtName, str(lgt_LUX),
    lgtName, string.replace(str(pos2D), ' ', ''), lgtName, string.replace(str(pos3D), ' ', ''),
    lgtName, str(distance), lgtName, string.replace(str(scale), ' ', ''), lgtName, outputPath)
metadataNode["metadata"].fromScript(metadata_content)

# read metadata
# metadata should be stored under the read node itself, under one of its tabs
readNode.metadata().get('exr/arnold/host/name')

# return scene name
nuke.root().knob('name').value()

# animate text by getting a knob's value of a specific node:
[value Read1.first]
# animate text by getting a knob's value of the current node:
[value this.size]

# add to the menus
mainMenu = nuke.menu("Nodes")
mainMenuItem = mainMenu.findItem("NewMenuName")
if not mainMenuItem:
    mainMenuItem = mainMenu.addMenu("NewMenuName")
subMenuItem = mainMenuItem.findItem("subMenu")
if not subMenuItem:
    subMenuItem = mainMenuItem.addMenu("subMenu")
return [mainMenuItem]
menus = myMenus()
for menu in menus:
    menu.addCommand('my tool', 'mytool.file.function()', None)

# add an aov layer
nuke.Layer(mynode, [mynode + '.red', mynode + '.green', mynode + '.blue', mynode + '.alpha'])

# onCreate options (like an onload option)
# https://community.foundry.com/discuss/topic/106936/how-to-use-the-oncreate-callback
# https://benmcewan.com/blog/2018/09/10/add-new-functionality-to-default-nodes-with-addoncreate/
# For example, you could do this:
def setIt():
    n = nuke.thisNode()
    k = n.knob('artist')
    user = envTools.getUser()
    k.setValue(user)
nuke.addOnCreate(setIt, nodeClass="")

# retrieve the onCreate function
sel = nuke.selectedNodes()
code = sel[0]['onCreate'].getValue()
print(code)

# or if you want to bake your code directly onto a node:
code = """
n = nuke.thisNode()
k = n.knob('artist')
user = envTools.getUser()
k.setValue(user)
"""
nuke.selectedNode()["onCreate"].setValue(code)
# The problem with onCreate is that it runs every time the node is created, which means even opening a script will trigger the code.
# replace known nodes
nodeToPaste = '''set cut_paste_input [stack 0]
version 12.2 v10
push $cut_paste_input
Group {
 name DeepToImage
 tile_color 0x60ff
 selected true
 xpos 862
 ypos -3199
 addUserKnob {20 DeepToImage}
 addUserKnob {6 volumetric_composition l "volumetric composition" +STARTLINE}
 volumetric_composition true
}
 Input {
  inputs 0
  name Input1
  xpos -891
  ypos -705
 }
 DeepToImage {
  volumetric_composition {{parent.volumetric_composition}}
  name DeepToImage
  xpos -891
  ypos -637
 }
 ModifyMetaData {
  metadata {
   {remove exr/chunkCount ""}
  }
  name ModifyMetaData1
  xpos -891
  ypos -611
 }
 Output {
  name Output1
  xpos -891
  ypos -530
 }
end_group
'''
fileName = '/tmp/deleteme.cache'
out_file = open(fileName, "w")
out_file.write(str(nodeToPaste))
out_file.close()
allNodes = nuke.allNodes()
for i in allNodes:
    i.knob("selected").setValue(False)
for node in allNodes:
    if 'DeepToImage' in node.name():
        node.setSelected(True)
        newNode = nuke.nodePaste(fileName)
        nuke.delete(node)

# force a knob onto the same line
hdriGenNode.addKnob(lgt_name)
# stay on the same line
lgt_lightGroup.clearFlag(nuke.STARTLINE)
hdriGenNode.addKnob(lgt_lightGroup)
# start a new line
lgt_extractMode.setFlag(nuke.STARTLINE)
hdriGenNode.addKnob(lgt_extractMode)

# text message per frame (paste into the node graph)
set cut_paste_input [stack 0]
version 12.2 v10
push 0
push 0
push 0
push 0
Text2 {
 inputs 0
 font_size_toolbar 100
 font_width_toolbar 100
 font_height_toolbar 100
 message "MultiplyFloat.a 0.003\nMultiplyFloat2.a 0.35"
 old_message {{77 117 108 116 105 112 108 121 70 108 111 97 116 46 97 32 32 32 48 46 48 48 51 10 77 117 108 116 105 112 108 121 70 108 111 97 116 50 46 97 32 48 46 51 53} }
 box {175.2000122 896 1206.200012 1014}
 transforms {{0 2} }
 global_font_scale 0.5
 center {1024 540}
 cursor_initialised true
 autofit_bbox false
 initial_cursor_position {{175.2000122 974.4000854} }
 group_animations {{0} imported: 0 selected: items: "root transform/"}
 animation_layers {{1 11 1024 540 0 0 1 1 0 0 0 0} }
 name Text20
 selected true
 xpos 1197
 ypos -132
}
FrameRange {
 first_frame 1017
 last_frame 1017
 time ""
 name FrameRange19
 selected true
 xpos 1197
 ypos -77
}
AppendClip {
 inputs 5
 firstFrame 1017
 meta_from_first false
 time ""
 name AppendClip3
 selected true
 xpos 1433
}
push 0
Reformat {
 format "2048 1080 0 0 2048 1080 1 2K_DCP"
 name Reformat4
 selected true
 xpos 1843
 ypos -251
}
Merge2 {
 inputs 2
 name Merge5
 selected true
 xpos 1843
}
push $cut_paste_input
Reformat {
 format "2048 1080 0 0 2048 1080 1 2K_DCP"
 name Reformat5
 selected true
 xpos 1715
 ypos 80
}
Merge2 {
 inputs 2
 name Merge6
 selected true
 xpos 1843
 ypos 86
}

# check negative pixels (paste into the node graph)
set cut_paste_input [stack 0]
version 12.2 v10
push $cut_paste_input
Expression {
 expr0 "r < 0 ? 1 : 0"
 expr1 "g < 0 ? 1 : 0"
 expr2 "b < 0 ? 1 : 0"
 name Expression4
 selected true
 xpos 1032
 ypos -106
}
FilterErode {
 channels rgba
 size -1.3
 name FilterErode4
 selected true
 xpos 1032
 ypos -58
}

# check where the user is clicking on the viewer area
import nuke
from PySide2.QtWidgets import QApplication
from PySide2.QtCore import QObject, QEvent, Qt
from PySide2.QtGui import QMouseEvent

class ViewerClickCallback(QObject):
    def eventFilter(self, obj, event):
        if event.type() == QEvent.MouseButtonPress and event.button() == Qt.LeftButton:
            # mouse click detected
            mouse_pos = event.pos()
            print("Mouse clicked at position:", mouse_pos.x(), mouse_pos.y())
        return super(ViewerClickCallback, self).eventFilter(obj, event)

# create an instance of the callback
callback = ViewerClickCallback()
# install the event filter on the application
qapp = QApplication.instance()
qapp.installEventFilter(callback)
qapp.removeEventFilter(callback)

## you can put this under a node's button and close the callback after a given mouse click
## OR close the callback through a different button
import nuke
from PySide2.QtCore import QObject, QEvent, Qt
from PySide2.QtWidgets import QApplication

class ViewerClickCallback(QObject):
    def __init__(self, arg1):
        super().__init__()
        self.arg1 = arg1
    def eventFilter(self, obj, event):
        if event.type() == QEvent.MouseButtonPress and event.button() == Qt.LeftButton:
            # mouse click detected
            mouse_pos = event.pos()
            print("Mouse clicked at position:", mouse_pos.x(), mouse_pos.y(), '\n')
            if self.arg1 == 'bl':
                jonasNode['bl'].setValue([mouse_pos.x(), mouse_pos.y()])
            qapp.removeEventFilter(callback)
        return super(ViewerClickCallback, self).eventFilter(obj, event)

# create an instance of the callback
callback = ViewerClickCallback('bl')
# store the callback as a global variable
nuke.root().knob('custom_callback').setValue(callback)
# install the event filter on the application
qapp = QApplication.instance()
qapp.installEventFilter(callback)

## on the second button:
import nuke
# retrieve the callback object from the global variable
callback = nuke.root().knob('custom_callback').value()
# retrieve the QApplication instance
qapp = QApplication.instance()
# remove the event filter
qapp.removeEventFilter(callback)

# collect deep samples for a given node
nodelist = ['DeepSampleB_hdri','DeepSampleA_hdri','DeepSampleB_spheres','DeepSampleA_spheres','DeepSampleB_volume','DeepSampleA_volume','DeepSampleB_furmg','DeepSampleA_furmg','DeepSampleB_furfg','DeepSampleA_furfg','DeepSampleB_furbg','DeepSampleA_furbg','DeepSampleB_checkers','DeepSampleA_checkers']
nodelist = ['DeepSampleB_hdri']
finalSamplesList = []
for nodeName in nodelist:
    print(nodeName)
    finalSamplesList.append(nodeName)
    finalSamples = 0
    for posX in range(0, 1921):
        for posY in range(0, 1081):
            nukeNode = nuke.toNode(nodeName)
            nukeNode['pos'].setValue([posX, posY])
            current_posSamples = nukeNode['samples'].getValue()
            finalSamples = finalSamples + current_posSamples
    print(finalSamples)
    finalSamplesList.append(finalSamples)

# false color expressions (paste into the node graph)
set cut_paste_input [stack 0]
version 13.2 v8
push $cut_paste_input
Expression {
 expr0 "(r > 2) || (g > 2) || (b > 2) ? 3:0"
 expr1 "((r > 1) && (r < 2)) || ((g > 1) && (g < 2)) || ((b > 1) && (b < 2))\n ? 2:0"
 expr2 "((r > 0) && (r < 1)) || ((g > 0) && (g < 1)) || ((b > 0) && (b < 1))\n ? 1:0"
 name Expression4
 selected true
 xpos 90
 ypos 1795
}


Photography basics: Exposure Value vs Photographic Exposure vs Il/Luminance vs Pixel luminance measurements
/ Featured, lighting, photography

Also see: http://www.pixelsham.com/2015/05/16/how-aperture-shutter-speed-and-iso-affect-your-photos/

 

In photography, exposure value (EV) is a number that represents a combination of a camera’s shutter speed and f-number, such that all combinations that yield the same exposure have the same EV (for any fixed scene luminance).

 

 

The EV concept was developed in an attempt to simplify choosing among combinations of equivalent camera settings. Although all camera settings with the same EV nominally give the same exposure, they do not necessarily give the same picture. EV is also used to indicate an interval on the photographic exposure scale, with 1 EV corresponding to a standard power-of-2 exposure step, commonly referred to as a stop.

 

EV 0 corresponds to an exposure time of 1 sec and a relative aperture of f/1.0. If the EV is known, it can be used to select combinations of exposure time and f-number.

 

https://www.streetdirectory.com/travel_guide/141307/photography/exposure_value_ev_and_exposure_compensation.html

Note that EV does not equal photographic exposure. Photographic exposure is defined as how much light hits the camera’s sensor. It depends on the camera settings, mainly aperture and shutter speed. Exposure value (known as EV) is a number that represents the exposure setting of the camera.

 

Thus, strictly, EV is not a measure of luminance (indirect or reflected exposure) or illuminance (incidental exposure); rather, an EV corresponds to a luminance (or illuminance) for which a camera with a given ISO speed would use the indicated EV to obtain the nominally correct exposure. Nonetheless, it is common practice among photographic equipment manufacturers to express luminance in EV for ISO 100 speed, as when specifying metering range or autofocus sensitivity.

 

The exposure depends on two things: how much light gets through the lenses to the camera’s sensor and for how long the sensor is exposed. The former is a function of the aperture value while the latter is a function of the shutter speed. Exposure value is a number that represents this potential amount of light that could hit the sensor. It is important to understand that exposure value is a measure of how exposed the sensor is to light and not a measure of how much light actually hits the sensor. The exposure value is independent of how lit the scene is. For example a pair of aperture value and shutter speed represents the same exposure value both if the camera is used during a very bright day or during a dark night.

 

Each exposure value number represents all the possible shutter and aperture settings that result in the same exposure. Although the exposure value is the same for different combinations of aperture values and shutter speeds the resulting photo can be very different (the aperture controls the depth of field while shutter speed controls how much motion is captured).

EV 0.0 is defined as the exposure when setting the aperture to f-number 1.0 and the shutter speed to 1 second. All other exposure values are relative to that number. Exposure values are on a base two logarithmic scale. This means that every single step of EV – plus or minus 1 – represents the exposure (actual light that hits the sensor) being halved or doubled.

https://www.streetdirectory.com/travel_guide/141307/photography/exposure_value_ev_and_exposure_compensation.html

 

Formula

https://en.wikipedia.org/wiki/Exposure_value

 

https://www.scantips.com/lights/math.html

 

EV = log₂(N² / t), which means 2^EV = N² / t

where

  • N is the relative aperture (f-number). Important: note that f/stop values must first be squared in most calculations
  • t is the exposure time (shutter speed) in seconds

EV 0 corresponds to an exposure time of 1 sec and an aperture of f/1.0.

Example: If f/16 and 1/4 second, then this is:

(N² / t) = (16 × 16 ÷ 1/4) = (16 × 16 × 4) = 1024.

Log₂(1024) is EV 10. Meaning, 2¹⁰ = 1024.
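
To make the arithmetic concrete, here is a small sketch (not part of the original article) that computes EV from an f-number and a shutter speed using the same 2^EV = N² / t relationship:

# EV = log2(N^2 / t), where N is the f-number and t the shutter speed in seconds.
import math

def exposure_value(f_number, shutter_seconds):
    return math.log2(f_number ** 2 / shutter_seconds)

print(exposure_value(1.0, 1.0))    # EV 0  (f/1.0, 1 s)
print(exposure_value(16.0, 0.25))  # EV 10 (f/16, 1/4 s), matching the example above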

 

Collecting photographic exposure using Light Meters

https://photo.stackexchange.com/questions/968/how-can-i-correctly-measure-light-using-a-built-in-camera-meter

The exposure meter in the camera does not know whether the subject itself is bright or not. It simply measures the amount of light that comes in and makes a guess based on that. The camera will aim for 18% gray: if you take a photo of an entirely white surface and of an entirely black surface, you should get two nearly identical gray images (at least in theory).

https://en.wikipedia.org/wiki/Light_meter

For reflected-light meters, camera settings are related to ISO speed and subject luminance by the reflected-light exposure equation:

N² / t = L × S / K

where

  • N is the relative aperture (f-number)
  • t is the exposure time (“shutter speed”) in seconds
  • L is the average scene luminance
  • S is the ISO arithmetic speed
  • K is the reflected-light meter calibration constant

 

For incident-light meters, camera settings are related to ISO speed and subject illuminance by the incident-light exposure equation:

N² / t = E × S / C

where

  • E is the illuminance (in lux)
  • C is the incident-light meter calibration constant

 

Two values for K are in common use: 12.5 (Canon, Nikon, and Sekonic) and 14 (Minolta, Kenko, and Pentax); the difference between the two values is approximately 1/6 EV.
For C a value of 250 is commonly used.

 

Nonetheless, it is common practice among photographic equipment manufacturers to also express luminance in EV for ISO 100 speed. Using K = 12.5, the relationship between EV at ISO 100 and luminance L is then:

L = 2^(EV-3)

 

The situation with incident-light meters is more complicated than that for reflected-light meters, because the calibration constant C depends on the sensor type. Illuminance is measured with a flat sensor; a typical value for C is 250 with illuminance in lux. Using C = 250, the relationship between EV at ISO 100 and illuminance E is then:

E = 2.5 × 2^EV

 

https://nofilmschool.com/2018/03/want-easier-and-faster-way-calculate-exposure-formula

Three basic factors go into the exposure formula: aperture, shutter, and ISO, plus a light meter calibration constant.

f-stop²/shutter (in seconds) = lux * ISO/C

 

If you know four of those variables, you can calculate the missing one.

So, say you want to figure out how much light you’re going to need in order to shoot at a certain f-stop. Well, all you do is plug your values (you should know the f-stop, shutter speed, ISO, and your light meter calibration constant) into the formula below:

lux = C (f-stop²/shutter (in seconds))/ISO
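
As a quick sanity check of that rearrangement, here is a short sketch with made-up values:

# Incident-light relation: N^2 / t = E * S / C  ->  E = C * (N^2 / t) / S
# N = f-number, t = shutter speed (s), S = ISO, E = illuminance (lux), C = meter constant.
def required_lux(f_number, shutter_seconds, iso, calibration_c=250.0):
    return calibration_c * (f_number ** 2 / shutter_seconds) / iso

# Example: f/4 at 1/60 s and ISO 100 with C = 250 needs roughly 2400 lux.
print(round(required_lux(4.0, 1.0 / 60.0, 100)))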

 

Exposure Value Calculator:

https://www.vroegop.nu/exposure-value-calculator/

 

From that perspective, an exposure stop is a measurement of exposure and provides a universal linear scale to measure the increase and decrease in light exposed to the image sensor due to changes in shutter speed, ISO and f-stop.
±1 stop is a doubling or halving of the amount of light let in when taking a photo.
1 EV is just another way to say one stop of exposure change.

 

One major use of EV (Exposure Value) is just to measure any change of exposure, where one EV implies a change of one stop of exposure. Like when we compensate our picture in the camera.

 

If the picture comes out too dark, our manual exposure could correct the next one by directly adjusting one of the three exposure controls (f-stop, shutter speed, or ISO). Or, if using camera automation, the camera meter is controlling it, but we might apply +1 EV exposure compensation (or +1 EV flash compensation) to make the result brighter, as desired. This use of 1 EV is just another way to say one stop of exposure change.

 

On a perfect day the difference between sampling the sky and sampling the sun with diffusing spot meters is about 3.2 EV.

 ~15.4 EV for the sun
 ~12.2 EV for the sky

That is a ballpark figure, still influenced by surroundings, accuracy parameters, the fov of the sensor…

 

 

EV calculator

https://www.scantips.com/lights/evchart.html#calc

http://www.fredparker.com/ultexp1.htm

 

Exposure value is basically used to indicate an interval on the photographic exposure scale, with a difference of 1 EV corresponding to a standard power-of-2 exposure step, also commonly referred to as a “stop”.

 

https://contrastly.com/a-guide-to-understanding-exposure-value-ev/

 

Retrieving photographic exposure from an image

All you can hope to measure with your camera and some images is the relative reflected luminance. Even if you have the camera settings. https://en.wikipedia.org/wiki/Relative_luminance

 

If you REALLY want to know the amount of light in absolute radiometric units, you’re going to need to use some kind of absolute light meter or measured light source to calibrate your camera. For references on how to do this, see: Section 2.5 Obtaining Absolute Radiance from http://www.pauldebevec.com/Research/HDR/debevec-siggraph97.pdf

 

If you are still trying to gauge relative brightness, the level of the sun in Nuke can vary, but it should be in the tens of thousands, i.e. between 30,000 and 65,000 RGB value, depending on the time of day, season and atmospherics.

 

The values for a 12 o’clock sun, with the sun sampled at EV 15.5 (shutter 1/30, ISO 100, f/22), are about 32,000 RGB max values (or 32,000 pixel luminance).
The thing to keep an eye on is the level of contrast between the sunny side and the fill side. The terminator should be quite obvious; there can be up to 3 stops of difference between fill and key on sunlit objects.

 

Note: In Foundry’s Nuke, the software will map 18% gray to whatever your center f/stop is set to in the viewer settings (f/8 by default… change that to EV by following the instructions below).
You can experiment with this by attaching an Exposure node to a Constant set to 0.18, setting your viewer read-out to Spotmeter, and adjusting the stops in the node up and down. You will see that a full stop up or down will give you the respective next value on the aperture scale (f8, f11, f16 etc.).
One stop doubles or halves the amount of light that hits the filmback/CCD, so everything works in powers of 2.
So starting with 0.18 in your constant, you will see that raising it by a stop will give you .36 as a floating point number (in linear space), while your f/stop will be f/11 and so on.

If you set your center stop to 0 (see below) you will get a relative readout in EVs, where EV 0 again equals 18% constant gray.
Note: make sure to set your Nuke read node to ‘raw data’

 

In other words, setting the center f-stop to 0 means that in a neutral plate, the middle gray in the Macbeth chart will equal exposure value 0. EV 0 corresponds to an exposure time of 1 sec and an aperture of f/1.0.

 

To switch Foundry’s Nuke’s SpotMeter to return the EV of an image, click on the main viewport and then press s; this opens the viewer’s properties. Now set the center f-stop to 0 in there, and the SpotMeter in the viewport will change from aperture f-stops to EV.

 

If you are trying to gauge the EV from the pixel luminance in the image:
– Setting the center f-stop to 0 means that in a neutral plate, the middle 18% gray will equal exposure value 0.
– So if EV 0 equals 0.18 middle gray in Nuke (a pixel luminance of 0.18), doubling that luminance adds one EV.

.18 pixel luminance = 0EV
.36 pixel luminance = 1EV
.72 pixel luminance = 2EV
1.44 pixel luminance = 3EV
...
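
Put as code, the mapping between pixel luminance and EV (with 0.18 pinned to EV 0) is a simple base-2 logarithm; this is a sketch for illustration, not production code:

# With the viewer's center f-stop set to 0, 0.18 linear pixel luminance reads as EV 0.
# Doubling the luminance adds one EV; halving it subtracts one EV.
import math

MIDDLE_GRAY = 0.18

def luminance_to_ev(pixel_luminance):
    return math.log2(pixel_luminance / MIDDLE_GRAY)

def ev_to_luminance(ev):
    return MIDDLE_GRAY * (2 ** ev)

print(luminance_to_ev(0.72))  # 2.0
print(ev_to_luminance(3))     # 1.44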

 

This is a Geometric Progression function: xₙ = a·r^(n-1)

The most basic example of this function is 1,2,4,8,16,32,… The sequence starts at 1 and doubles each time, so

  • a=1 (the first term)
  • r=2 (the “common ratio” between terms is a doubling)

And we get:

{a, ar, ar², ar³, … }

= {1, 1×2, 1×2², 1×2³, … }

= {1, 2, 4, 8, … }

In this example the function translates to: xₙ = 2^(n-1)
You can graph this curve through this expression: x = 2^(y-1):

You can go back and forth between the two values through a geometric progression function and a log function:

(Note: in a spreadsheet this is =POWER(2; cell# - 1) and =LOG(cell#, 2)+1)

x = 2^(y-1)    y = log₂(x)+1
1              1
2              2
4              3
8              4
16             5
32             6
64             7
128            8
256            9
512            10
1024           11
2048           12
4096           13

 

Translating this into a geometric progression between an image pixel luminance and EV:

(more…)

Teaching AI + ethics from elementary to high school
/ A.I.

codeorg.medium.com/microsoft-code-org-partner-to-teach-ai-ethics-from-elementary-to-high-school-4b983fd809e3

At a time when AI and machine learning are changing the very fabric of society and transforming entire industries, it is more important than ever to give every student the opportunity to not only learn how these technologies work, but also to think critically about the ethical and societal impacts of AI.

Polycam integration with Sketchfab
/ IOS, photogrammetry, software

https://sketchfab.com/blogs/community/polycam-adds-sketchfab-integration/

3D scanning is becoming more accessible with the LiDAR scanners in the new iPhone 12 Pro and iPad Pro.
Polycam’s integration lets users log in to their Sketchfab account and publish directly to Sketchfab.

Chandigarh Design School – GO48 International Challenge
/ cool

GO48 Challenge is an international competition that celebrates the creative skills of the global community comprising students, artists, designers, faculty, professionals and industry experts.

The contest comprises 5 Exciting Competitions, each made up of 2 Challenges, all bound by the common thread of 48.
That means you submit your art/design solution either in 48 minutes or in 48 hours.

To this effect, you can work on design solutions in the following 10 Challenges:

Go48 Graphix : Visual Communication Design : LoGO48 and MotionX.

Go48 Anim8 : 2d & 3d Animation : 3D As8 and Anim8.

Go48 Live : Filmmaking : Photography and Live!

Go48 GameIT : Game Design : CharACTer and Game IT.

Go48 UI/UX : User Interaction / User Experience Design : UI Eye and XD48.

chandigarhdesignschool.com/go48-competition/

Epic Games Invests in SideFX
/ ves

www.sidefx.com/community/epic-games-invests-in-sidefx/

Epic Games is now a minority investor in SideFX

AnimationXpress.com interviews Daniele Tosti for TheCgCareer.com channel
/ Featured, ves

https://www.animationxpress.com/vfx/meet-daniele-tosti-a-senior-cg-artist-who-is-on-a-mission-to-inspire-the-next-generation-of-artists/

 

You’ve been in the VFX Industry for over a decade. Tell us about your journey.

It all started with my older brother giving me a Commodore 64 personal computer as a gift back in the late ’80s. I realised then that I could create something directly from my imagination using this new digital medium. And, eventually, make a living in the process.
That led me to start my professional career in 1990, moving from live TV to games to animation, all the way to live-action VFX in recent years.

I have never stopped craving to create art since those early days. And I have been incredibly fortunate to work with really great talent along the way, which made my journey so much more effective.

 

What inspired you to pursue VFX as a career?

An incredible combination of opportunities, really. The opportunity to express myself as an artist and earn money in the process. The opportunity to learn about how the world around us works and how best to solve problems. The opportunity to share my time with other talented people with similar passions. The opportunity to grow and adapt to new challenges. The opportunity to develop something that had never been done before. A perfect storm of creativity that fed my continuous curiosity about life and genuinely drove my inspiration.

 

Tell us about the projects you’ve particularly enjoyed working on in your career

I quite enjoyed working on live TV projects, as the combination of tight deadlines and high quality was an incredible learning platform for a professional artist. But working on large, high-end live-action feature projects was really where I learnt most of my trade, and where I found the most satisfaction.

Every film I worked on had some memorable experiences, from Avatar to Iron Man 3 to The Jungle Book to Planet of the Apes to The Hobbit, to name a few.

But above all the technical challenges and the high quality we reached in each and every project I worked on, the best memories come from working with amazing and skilled artists from a variety of disciplines, as those were my true mentors and they became my best friends.

Post Production, Animation, VFX, Motion Graphics, Video Editing …

 

What are some technologies and trends that you think are emerging in the VFX Industry?

In the last few years there has definitely been a bias from some major studios to make VFX a commodity, in the more negative sense of the word. When any product reaches a level of quality that attracts a mass of consumers and reaches a plateau of opportunities, large corporations tend to respond by maximising its sale value through marketing schemes and deliverables rather than through the core values of the product itself. This commoditisation approach tends to empower agents who are not necessarily knowledgeable about a product’s cycles, and in the process it lowers the quality of the product itself for the sake of profits. It is a pretty common pattern in modern society and it applies to any brand, not just VFX.

One challenge with VFX’s technology and artistry is that, for the most part, it relies on the effectiveness of artists and visionaries. Limiting the authority, ownership and perspective of that crowd has directly impacted the overall quality of the last decade of productions, both technically and artistically. The creative forces able to deliver projects one could identify as truly creative breakthroughs have been few and far between, while the majority of productions seem to have suffered from some of these commoditisation patterns.

The other, bigger challenge with this current trend is that VFX, due to various historical business arrangements, often relies on unbalanced resources as well as very small and fragile economic cycles and margins, which makes the entire industry extremely susceptible to marketing failures and to unstable leadership, as a few recent bankruptcies have demonstrated.

It is understandably taking time for the VFX crowd to acknowledge these trends and learn to be profitable, as the majority has never been educated on fair business practices.

But, thankfully, the VFX circle is also a crowd of extremely adaptable and talented individuals, who are quite capable of resolving issues, finding alternatives and leveraging their passion. I believe that is one of the drives behind the current evolution in the use of artificial intelligence, virtual reality, virtual production, real-time rendering, and so on.

There is still a long path ahead of us, but I hope we are all learning ways to make our passion speak in profitable ways for everyone.

It is also highly likely that, in the near future, larger software and hardware corporations, thanks to their more profitable business practices, large development teams and better understanding of marketing, will eventually take over a lot of the cycles that production houses currently run, and in the process allow creative studios to focus back on VFX artistry.

 

What effect has the pandemic-induced lockdown had on the industry?

It is still early to say. I fear that if live-action production does not restart soon, we may see some of the economic challenges I mentioned above, at both the studio and the artist scale. There is definitely a push from production houses to make large distribution clients understand the fragility of the moment, especially in relation to payment cycles and economic support. Thus, there is still a fair risk that the few studios which adopted a more commoditised view of production will make their artists pay some price for their choices.

But any challenge brings opportunities. For example, there is finally some momentum behind recognising work-from-home as a feasible answer to a lot of the limitations and artistry restrictions of office-based production. While there is no win-win in this pandemic, that could be a silver lining.

 

What would you say to the budding artists who wish to become CG artists or VFX professionals?

Follow your passion but treat this career as any other business.
Learn to be adaptable. Find a true balance between professional and family life. Carefully plan your future. And watch our channel to learn more about all of this.

Being a VFX artist is fundamentally based on mistrust.
This is because schedules, pipelines, technology, creative calls… all have a native and naive instability to them that causes everyone to grow a genuine but beneficial lack of trust in the status quo. The VFX motto, “Love everyone but trust no one,” was born from that.

 

What inspired you to create a channel for aspiring artists?

Like many fellow and respected artists, I love this industry, but I had to learn a lot of its business practices at my own expense.
You can learn tools, cycles and software from books and schools. But production life tends to drive its own rhythms, and there are fewer opportunities to absorb those.

Along my career I had some challenges finding professionals willing to share their time and invest in me. But I was still extremely fortunate to find mentors who helped me to be economically and professionally successful in this business. I owe a lot to these people, and I promised myself I would return that favour by helping other artists myself.

 

What can students expect to learn from your channel?

I am excited to have the opportunity to fill some of the voids that the current education systems and industry may have, by helping new artists with true life stories from some of the most accomplished and successful talents I have met during my career. We will talk about technology trends as much as our life experiences as artists, discussing career advice, trying to look into the future of the industry, and offering professional tips. The aim of this mentorship is to inspire new generations to focus on what matters most for the VFX industry: taking responsibility for their art and passions as much as for their families.

And, in the process, to feel empowered to materialise from their imagination more and more of those creative, awe-inspiring moments that this art form has gifted us with so far.

 

http://TheCGCareer.com

 

14 Signs Of An Adaptable Person
/ quotes

www.forbes.com/sites/jeffboss/2015/09/03/14-signs-of-an-adaptable-person/#46bd90e016ea

1. Adaptable people experiment.

2. Adaptable people see opportunity where others see failure.

3. Adaptable people are resourceful.

4. Adaptable people think ahead.

5. Adaptable people don’t whine.

6. Adaptable people talk to themselves.

7. Adaptable people don’t blame.

8. Adaptable people don’t claim fame.

9. Adaptable people are curious.

10. Adaptable people adapt.

11. Adaptable people stay current.

12. Adaptable people see systems.

13. Adaptable people open their minds.

14. Adaptable people know what they stand for.

Pandemic Production Prospects, Possibilities, Concerns
/ ves

www.shootonline.com/news/pandemic-production-prospects-possibilities-concerns

 

“For many, production has stopped in its tracks due to the coronavirus pandemic. ”

 

“Others have stepped up their in-house activity, tapping into their homegrown production and post capabilities.” [Or working from home]

 

“While losing the physical proximity and communal nature of collaboration, creatives and artists have managed to stay connected through technology”

 

“While some projects have “completely died,” said Gavin Wellsman [a creative director at The Mill in New York], others are still in the pipeline and have adapted to a world where social distancing is imperative and live-action production as we’ve known it is no longer feasible at the moment. Clients are turning to visual effects, CG and other options.”

 

“Still, much work has fallen by the wayside. And many projects don’t translate properly from live action to another [full CG] discipline.”

 

“London-based independent production house MindsEye launched HomeStudio…. HomeStudio brings together a lineup of directors who have their own equipment, DPs with studio space, and stop-frame animators who can turn out content in this period of imposed self-isolation. This isn’t a roster of talent that a company has signed in the traditional sense; rather it’s a collection of talent that’s being made available to agencies and brands.”

 

“However, ingenuity, imagination and improvisation can only go so far when production and post companies are suffering from poor cash flow, a situation which is exacerbated by the COVID-19 crisis. …many companies would settle for–or at least welcome with open arms–getting paid in a timely fashion by marketers and ad agencies for services already rendered. ”

 

“In a live poll of over 500 AICP member participants during a Zoom Town Hall last month, the issue of outstanding receivables was the most immediate concern. It was found that 28% of companies reported that they are owed in excess of $1 million, while 23% are owed between $500,000-$1 million and 34% are owed between $100,000-$500,000. The members were also polled on how late these payments are: 29% reported that payments are 45 or more days late (per their contracted terms), and one-third are 30-45 days late. Extrapolating across the industry, conservatively, this is well in excess of $200 million.”

 

“Matt Miller, president and CEO of AICP: A healthy production and post community is integral to the overall economy’s recovery once we are clear of the pandemic. Production and post talent will be needed to help brands connect with the consumer marketplace and bring it back to life. It’s thus in the interest of [all] marketers and agencies to do what they can–and should do–to contribute to keeping the production and post sectors whole. “

A question of ethics – What CG simulation and deepfakes means for the future of performance
/ A.I., production, ves

www.ibc.org/create-and-produce/re-animators-night-of-the-living-avatars/5504.article

“When your performance is captured as data it can be manipulated, reworked or sampled, much like the music industry samples vocals and beats. If we can do that then where does the intellectual property lie? Who owns authorship of the performance? Where are the boundaries?”

“Tracking use of an original data captured performance is tricky given that any character or creature you can imagine can be animated using the artist’s work as a base.”

“Conventionally, when an actor contracts with a studio they will assign rights to their performance in that production to the studio. Typically, that would also licence the producer to use the actor’s likeness in related uses, such as marketing materials, or video games.

Similarly, a digital avatar will be owned by the commissioners of the work who will buy out the actor’s performance for that role and ultimately own the IP.

However, in UK law there is no such thing as an ‘image right’ or ‘personality right’ because there is no legal process in the UK which protects the Intellectual Property Rights that identify an image or personality.

The only way in which a pure image right can be protected in the UK is under the Law of Passing-Off.”

“Whether a certain project is ethical or not depends mainly on the purpose of using the ‘face’ of the dead actor,” “Legally, when an actor dies, the rights of their [image/name/brand] are controlled through their estate, which is often managed by family members. This can mean that different people have contradictory ideas about what is and what isn’t appropriate.”

“The advance of performance capture and VFX techniques can be liberating for much of the acting community. In theory, they would be cast on talent alone, rather than defined by how they look.”

“The question is whether that is ethically right.”

The modern phenomenon of the two-day weekend break
/ ves

www.bbc.com/worklife/article/20200117-the-modern-phenomenon-of-the-weekend

“The idea of reducing the working week from an average of five days to four is gaining traction around the world.

“There are a number of parallels between debates today and those that took place in the 19th century when the weekend as we now know it was first introduced. Having Saturdays as well as Sundays off work is actually a relatively modern phenomenon.

“the weekend did not simply arise from government legislation – it was shaped by a combination of campaigns

“Religious bodies argued that a break on Saturday would improve working class “mental and moral culture”…. and greater attendance at church on Sundays.

“In 1842 a campaign group called the Early Closing Association was formed. It lobbied government to keep Saturday afternoon free for worker leisure in return for a full day’s work on (Saint) Monday.

“a burgeoning leisure industry saw the new half-day Saturday as a business opportunity… Perhaps the most influential leisure activity to help forge the modern week was the decision to stage football matches on Saturday afternoon.

“The adoption of the modern weekend was neither swift nor uniform as, ultimately, the decision for a factory to adopt the half-day Saturday rested with the manufacturer. Campaigns for an established weekend had begun in the 1840s but it did not gain widespread adoption for another 50 years…. it was embraced by employers who found that the full Saturday and Sunday break reduced absenteeism and improved efficiency.