Open Culture free educational lectures
/ reference

The best free cultural & educational media on the web

http://www.openculture.com/

DreamWorks Animation to Release MoonRay as Open Source
/ production, software

 

https://www.awn.com/news/dreamworks-animation-release-moonray-open-source

 

https://openmoonray.org/

 

MoonRay is DreamWorks’ open-source, award-winning, state-of-the-art production MCRT renderer, used on feature films such as How to Train Your Dragon: The Hidden World, Trolls World Tour, The Bad Guys, and the upcoming Puss In Boots: The Last Wish, as well as future titles. Developed at DreamWorks and in continuous active development, MoonRay includes an extensive library of production-tested, physically based materials, a USD Hydra render delegate, and multi-machine and cloud rendering via the Arras distributed computation framework.

 

 

Note: it does not support OSL, and USD handling is limited. Cycles may still be a fair alternative.

 

EDIT

MoonRay review: DreamWorks Animation’s superb rendering software is free for all

 

A high-performance Monte Carlo ray tracer that’s capable of both DreamWorks’ trademark stylised look and photorealism.

 

It has all the features required for VFX compositing workflows, including Arbitrary Output Variables (AOVs), which allow data from a shader or renderer to be output during rendering to aid compositing. Deep output and Cryptomatte are also supported.
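To make the AOV concept concrete, here is a minimal sketch of one shading call emitting several named channels per pixel (beauty plus per-component and data passes) that a compositor can rebalance later. The toy shading model and channel names are illustrative, not MoonRay's.

```python
def shade(pixel):
    """Return a dict of AOVs for one pixel of a toy lambert shade."""
    diffuse = 0.6 * pixel["albedo"]
    specular = 0.1
    return {
        "diffuse": diffuse,
        "specular": specular,
        "beauty": diffuse + specular,  # the final combined image
        "depth": pixel["z"],           # data AOV, e.g. for defocus or fog
    }

out = shade({"albedo": 0.5, "z": 12.0})
```

A compositor can then regrade the `diffuse` pass without touching `specular`, and rebuild the beauty as their sum.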

 

With support for OptiX 7.6 and GPU render denoising with Open Image Denoise 2, MoonRay is able to deliver particularly impressive results, especially when working interactively.

 

MoonRay has moved to a hybrid CPU and GPU rendering mode as its default. It’s called XPU, and in many ways it combines the best of both rendering workflows.

 

VFX Reference Platform 2023 is probably the biggest addition because it enables the use of MoonRay directly in Nuke 15.

 

MoonRay has already achieved great success with an array of feature films. Now that the renderer is open source, the CG world can expect to see a whole new swathe of MoonRay-powered animations.

 

For

  • Features for VFX workflows
  • Open source
  • XPU rendering

Against

  • Designed for big studios
  • Steep learning curve
TLDR Newsletter – Keep up with tech in 5 minutes
/ production, software

Get the free daily email with summaries of the most interesting stories in startups, tech, and programming!

 

https://tldr.tech/

 

Generative AI Glossary
/ A.I.

https://education.civitai.com/generative-ai-glossary/

 

Thomas Mansencal – Colour Science for Python
/ colour, python, software

https://thomasmansencal.substack.com/p/colour-science-for-python

 

https://www.colour-science.org/

 

Colour is an open-source Python package providing a comprehensive collection of algorithms and datasets for colour science. It is freely available under the BSD-3-Clause terms.
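As a flavour of what such a package computes, here is a pure-Python sketch of one standard transform: the sRGB transfer function (encode/decode between display values and linear light). This is a standalone illustration and does not use the Colour API, which covers far more (chromatic adaptation, colour appearance models, spectral data, ...).

```python
def srgb_to_linear(c: float) -> float:
    """Decode a non-linear sRGB component in [0, 1] to linear light."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c: float) -> float:
    """Encode a linear-light component in [0, 1] back to sRGB."""
    if c <= 0.0031308:
        return c * 12.92
    return 1.055 * c ** (1 / 2.4) - 0.055

mid_grey = srgb_to_linear(0.5)  # ~0.214: display mid-grey is ~21% linear light
```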

 

SVG to Blender
/ blender, software

 

https://www.youtube.com/shorts/2VQDBEKiraA

 

EPS to SVG
https://www.freeconvert.com/eps-to-svg/download

Convert 2D Images to 3D Models
/ A.I., blender, Featured, production, software

 

https://www.news.viverse.com/post/pixel-to-polygon-converting-2d-images-to-3d-models-top-tools-revealed

 

https://www.rankred.com/convert-2d-images-to-3d/

 

Creality K1 Max Review – Large High Speed 3D Printer
/ 3Dprinting, hardware
  • 300mm x 300mm x 300mm Build Volume
  • Compatible Printing Materials Up to 300°C
  • Quality of Life Features Like Hands-Free Auto Bed Leveling
  • High-Speed CoreXY with 20000 mm/s² Acceleration
  • Sturdy Unibody Die-cast Frame
  • Assembled & Calibrated Out of the Box
  • Max Print Speed: 600mm/s
  • Average Print Speed: 300mm/s
  • Print Acceleration: 20,000 mm/s²
  • 32mm³/s Max Flow Hotend
  • G-sensor compensates for ringing
  • Unibody die-cast frame adds stability
  • Reduced Z-banding with upgraded Z-axis
  • Nozzle Diameter: 0.4mm (included); 0.6mm and 0.8mm compatible
  • Material Types: PLA, ABS, PETG, PET, TPU, PA, ASA, PC,
    PLA-CF*, PA-CF*, PET-CF*
Unity3D – open letter on runtime fees
/ IOS, software, ves

https://blog.unity.com/news/open-letter-on-runtime-fee

 

 

To our community:

 

I’m Marc Whitten, and I lead Unity Create which includes the Unity engine and editor teams.

 

I want to start with this: I am sorry.

 

We should have spoken with more of you and we should have incorporated more of your feedback before announcing our new Runtime Fee policy. Our goal with this policy is to ensure we can continue to support you today and tomorrow, and keep deeply investing in our game engine.

 

You are what makes Unity great, and we know we need to listen, and work hard to earn your trust. We have heard your concerns, and we are making changes in the policy we announced to address them.

 

Our Unity Personal plan will remain free and there will be no Runtime Fee for games built on Unity Personal. We will be increasing the cap from $100,000 to $200,000 and we will remove the requirement to use the Made with Unity splash screen.

 

No game with less than $1 million in trailing 12-month revenue will be subject to the fee.

 

For those creators on Unity Pro and Unity Enterprise, we are also making changes based on your feedback.

 

The Runtime Fee policy will only apply beginning with the next LTS version of Unity shipping in 2024 and beyond. Your games that are currently shipped and the projects you are currently working on will not be included – unless you choose to upgrade them to this new version of Unity.

 

We will make sure that you can stay on the terms applicable for the version of Unity editor you are using – as long as you keep using that version.

 

For games that are subject to the runtime fee, we are giving you a choice of either a 2.5% revenue share or the calculated amount based on the number of new people engaging with your game each month. Both of these numbers are self-reported from data you already have available. You will always be billed the lesser amount.
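The "billed the lesser amount" rule above can be sketched as a simple minimum of two charges. This is a minimal illustration of the stated policy; the per-engagement rate used here is a placeholder, since the letter does not give the actual per-user amount.

```python
def runtime_fee(monthly_revenue: float,
                new_engagements: int,
                per_engagement_rate: float = 0.20) -> float:
    """Return the lesser of a 2.5% revenue share and a per-engagement charge.

    per_engagement_rate is a placeholder, not a figure from the letter.
    """
    revenue_share = 0.025 * monthly_revenue
    engagement_fee = new_engagements * per_engagement_rate
    return min(revenue_share, engagement_fee)

fee = runtime_fee(monthly_revenue=1_000_000, new_engagements=50_000)
```

With these illustrative numbers the per-engagement charge is the smaller of the two, so it is the one billed.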

 

We want to continue to build the best engine for creators. We truly love this industry and you are the reason why.

 

I’d like to invite you to join me for a live fireside chat hosted by Jason Weimann today at 4:00 pm ET/1:00 pm PT, where I will do my best to answer your questions. In the meantime, here are some more details.*

 

Thank you for caring as deeply as you do, and thank you for giving us hard feedback.

 

Marc Whitten

 

 

https://www.linkedin.com/posts/alexanderrehm_on-the-subject-of-unity-on-september-activity-7109854372930412544-UM1I

On September 18, Unity Software held an all-hands meeting to discuss the rollout of per-install fees. The recording was reviewed by Bloomberg, which said the company is ready to backtrack on major aspects of its new pricing policy.

The changes are yet to be approved, but here are the first details:
➡ Unity plans to limit fees to 4% for games making over $1 million;
➡ Instead of lifetime installs, the company intends to only count installs generated after January 1, 2024 (so the thresholds announced last week won’t be retroactive);
➡ Unity reportedly won’t track installs using its proprietary tools, instead relying on self-reported data from developers.

During the meeting on Monday, Unity CEO John Riccitiello noted that the new policy is “designed to generate more revenue from the company’s biggest customers and that more than 90% of Unity users won’t be affected.” When asked by several employees how the company would regain the trust of developers, execs said they will have to “show, not tell.”

David Helgason, founder of Unity and its former CEO (he is currently on the board), also commented on the controversy around the pricing changes. In a Facebook post (spotted by GamesBeat), he said “we f*cked up on many levels,” adding that the announcement of the new business model “missed a bunch of important ‘corner’ cases, and in central ways ended up as the opposite of what it was supposed to be. […] Now to try again, and try harder,” Helgason wrote. “I am provisionally optimistic about the progress. So sorry about this mess.”

 

 

RESPONSES

 

 

Unilaterally removing Terms of Service and making changes retroactive is a HUGE loss of trust in Unity’s executive and management team. There is no going back, no matter whether they patch this mess. Using Unity moving forward will just be a gamble.

 

4% doesn’t change anything. It does not fix any of the problems that have been raised, and raised repeatedly. Install bombing is still not addressed. So many “corner cases” are still not addressed, especially in the mobile space.

 

Too little, too late, tbh. It’s a systemic problem, with the CEO being so out of touch that it’s going to happen again. Remember, this is a man who wanted a dollar per Battlefield player revive.

 

Mega Crit said Unity’s decision was “not only harmful in a myriad of ways” but was also “a violation of trust”, and pointed to Unity’s removal of its Terms of Service from GitHub, where changes can be easily tracked.

 

Why every game developer is mad with Unity right now, explained
/ ves

https://www.gamesindustry.biz/unitys-self-combustion-engine-this-week-in-business

 

$1.3 billion – Unity’s lifetime accumulated deficit as of December 31, 2021. Unity has never had a profitable quarter in its history. It has posted modest operating profits in the past three quarters for the first time ever.

 

Unity lit money on fire for decades to buy a market advantage that overrules the basic economic incentives that supposedly ensure free markets work best for customers. It was successful in doing that because it’s very hard for a sustainable business to compete against one that is fine losing billions of dollars.

 

First, you make yourself essential to the market, even if it costs you billions to get there. Then once you hit a threshold – let’s say, I don’t know, 70% of the market – you lean into the enshittification process. You charge more for your services, you give your customers worse terms, you turn the heat up slowly and continuously, confident in the knowledge that people are so locked in to your business and have so few viable alternatives that they may grumble but they will ultimately put up with it.

 

And it’s such a common strategy in so many industries today that there’s just no sense of horror or outrage from the onlookers. Industry watchers and Serious Business People have seen this play out so many times they just acknowledge it’s happening and treat it as if it’s a perfectly cool and normal thing and not illegal predatory pricing.

 

I think this new Runtime Fee makes perfect sense from a mile-high point of view, if you think about Unity as a business where you just turn whichever dials and pull whatever levers will make the numbers go up the most.

The only problem is it makes no sense at all if you instead think about Unity as a game development tool that game developers should want to use.

 

https://www.pcgamer.com/why-every-game-developer-is-mad-right-now-explained

 

https://www.axios.com/2023/09/13/unity-runtime-fee-policy-marc-whitten

 

 

The uproar is primarily driven by two factors: Unity is attaching a flat per-install fee to games that use its engine, and it’s arbitrarily scrapping existing deals and making the changes retroactive.

 

The policy announced yesterday will see a “Runtime Fee” charged to games that surpass certain installation and revenue thresholds. For Unity Personal, the free engine that many beginning and small indie developers use, those thresholds are $200,000 earned over the previous 12 months, and 200,000 installs; once those marks are met, developers will be charged 20 cents every time someone installs their game.

 

Another big issue is that Unity has made this change retroactive: It supersedes any existing agreements with Unity that developers may have made, and it applies to games that were released even before any of this happened. The revenue threshold will be based on sales after January 1, 2024, when the new pricing system takes effect, but sales that occurred before that date will count toward the install threshold. 

 

 

DNEG announces pay cuts of up to 25% and artists’ repayment loans
/ ves

EDIT 20230919

https://www.cartoonbrew.com/artist-rights/vfx-giant-dneg-puts-forth-new-salary-reduction-proposal-after-worker-backlash-to-initial-proposal-232735.html

Revised Proposal: Initially met with backlash, DNEG has revised its proposal over the weekend. They’ve introduced a third option that focuses on reducing work hours instead of salaries, along with additional paid leave to compensate for the income reduction.

 

  1. A salary reduction of 20% to 25% for seven months, with paid leave to compensate.
  2. A temporary 50% salary reduction, supplemented by a company loan, totalling 90% of the original salary, repayable over three years.
  3. Reduced working hours to a 3-day week for seven months, with no hourly rate reduction.
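A back-of-the-envelope comparison makes the three options easier to weigh. The monthly salary below is hypothetical and purely for illustration; none of these figures come from DNEG, and option 1 assumes the full 25% cut.

```python
SALARY = 4000.0    # hypothetical monthly salary, for illustration only
CUT_MONTHS = 7
REPAY_MONTHS = 36  # three years

# Option 1: 20-25% salary cut for seven months (25% assumed here).
opt1_monthly = SALARY * 0.75

# Option 2: 50% cut topped up by an interest-free company loan to 90%
# of the original salary; the borrowed top-up is repaid over three years.
opt2_monthly = SALARY * 0.90
loan_total = SALARY * 0.40 * CUT_MONTHS     # the borrowed 40% top-up
opt2_repayment = loan_total / REPAY_MONTHS  # monthly deduction after cuts end

# Option 3: 3-day week (about 60% of hours) at the full hourly rate.
opt3_monthly = SALARY * 0.60
```

Under these assumptions, option 2 gives the highest take-home during the cut but carries a repayment of roughly 8% of salary for three years afterwards.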

 

 

https://www.linkedin.com/posts/avuuk_animation-visualeffects-dneg-activity-7107674426275442688-Fd1d

 

Today, we want to address a concerning development at DNEG. They very recently announced pay cuts of up to 25% for its employees, coupled with a rather unconventional approach to compensate for these losses through ‘loans’, which their staff need to repay over time.

 

As of now, DNEG is imposing these pay cuts for a period of 7 months. To ‘help’ offset the financial impact on their staff, the company is offering ‘loans’ to their employees. While offering financial support during challenging times is usually commendable, the repayment terms are causing deep concern within the Animation & Visual Effects community, especially around their legality.

 

The loan offered by DNEG comes with a significant catch: employees are required to pay back the loan over a three-year period. This means that even after the pay cuts are reinstated, employees will be obligated to allocate a portion of their salaries to repay the company. Allegedly, there is no interest on the loan (tbc). This approach has sparked a considerable backlash within our industry.

 

We at the Animation & Visual Effects Union voice very strong concern and opposition to the pay cuts, as well as the loan method. We believe pay cuts should not be compensated through loans with long-term repayment plans, placing a heavy burden on the employees who are already facing financial challenges.

 

This situation underscores the importance of open dialogue and collaboration between employers and employees during challenging times. While businesses often need to make tough decisions to navigate economic uncertainties, it’s crucial to strike a balance that doesn’t disproportionately impact the livelihoods of their dedicated workforce.

 

What can be done about this?

 

If you are a member of the Animation & Visual Effects Union, get in touch with us immediately and do not accept any pay cuts yet. You can email your BECTU official Stefan Vassalos stefan.vassalos@prospect.org.uk to get advice and organise with your colleagues at DNEG.

 

Remember, you MUST give your consent for a pay cut. It is ILLEGAL to impose a cut without it. You DO NOT have to consent to a pay cut. Legal action can and will be taken against pay cuts imposed without consent. Anyone affected, please get in touch with us immediately so we can represent and protect you and your livelihood as much as possible. BECTU has the power and resources to challenge moments like this, so it is imperative YOU take action and contact us. Talk to your colleagues and get in touch. It is only through solidarity and collective effort that we can address these challenges and shape a brighter future for our industry.

 

Please feel free to share your thoughts and insights on this matter. Your input and perspective are valuable as we navigate these unprecedented times together.

Unity Presents New “Runtime Fees” Based on Game Installs and Revenue
/ software, ves

https://80.lv/articles/unity-presents-new-fees-based-on-game-installs-and-revenue/

 

The new program is called the Unity Runtime Fee and the main principle is based on how often users install games. Unity thinks “an initial install-based fee allows creators to keep the ongoing financial gains from player engagement, unlike a revenue share”.

 

This is bound to kill developers who count on free downloads but profitable avenues of income like in-app purchases, which account for a vast majority of the 30% of the market that Unity holds onto.

 

The extra bill will be estimated by Unity based on non-specific data.

Unity does not have a ‘known’ way to track installs, likely due to privacy laws, so it will need to ‘estimate’ installs and bill clients based on that. Data that is aggregated with no identifying features isn’t really restricted by those laws; Unity’s claim that it can’t distinguish between an install and a reinstall, or even a paid versus pirated copy, actually reinforces the idea that it isn’t using any identifying information, so it would be compliant with privacy laws. The assumption is that Unity will get some data from distributors like the App Store, Google Play, Valve, Sony, Microsoft, etc., and estimate from there.

 

https://www.gamedeveloper.com/business/rust-creator-tells-unity-to-get-fucked-as-developers-left-seething-by-new-fee

 

“It hurts because we didn’t agree to this. We used the engine because you pay up front and then ship your product. We weren’t told this was going to happen. We weren’t warned. We weren’t consulted,” explained the Facepunch Studios founder. “We have spent 10 years making Rust on Unity’s engine. We’ve paid them every year. And now they changed the rules.”

 

“It’s our fault. All of our faults. We sleepwalked into it. We had a ton of warnings,” they added. “We should have been pressing the eject button when Unity IPO’d in 2020. Every single thing they’ve done since then has been the exact opposite of what was good for the engine. 

 

 

Joe Murray – Creating Animated Cartoons with Character
/ animation, reference

Hi Everyone. I receive countless letters asking about my book “Creating Animated Cartoons with Character”, which is now out of print, with the remaining copies being sold at huge prices. I’m not in favor of this, and it seems there are many who may benefit from the information who cannot afford such an expensive book. Since artists come in many economic brackets, I am offering a digital version of this book as a free download. No strings attached. Simply go to my website (JoeMurrayStudio.com) in the book section https://lnkd.in/gw6MHeBi and download your own copy.

 

Once you get past the boring bio stuff, there is info on creating and producing two of my three shows, plus interviews with the late great Stephen Hillenburg (SpongeBob), Everett Peck (Duckman), Craig McCracken (Powerpuff Girls), Tom Kenny (voice of SpongeBob, Heffer, and a zillion other shows), and many other amazing professionals discussing their craft. Some of the information about networks, studios, and streaming needs updating, but other than that I feel it still remains relevant.

 

If you can afford it, maybe pay it forward (contributions to food banks, volunteering, etc.) and maybe re-post this where students and other artists may find it useful.
I hope it helps, and reaches those who have asked about it. Best to you!

 

Local copy

VFX pipeline – Render Wall management topics
/ Featured, production

1: Introduction – Managing a VFX Facility’s Render Wall

  • Briefly introduce the importance of managing a VFX facility’s render wall.
  • Highlight how efficient management contributes to project timelines and overall productivity.

 

2: Daily Overview – Daily Management Routine

  • Monitor Queues: Begin each day by reviewing render queues to assess workload and priorities.
  • Resource Allocation: Allocate resources based on project demands and available hardware.
  • Job Prioritization: Set rendering priorities according to project deadlines and importance.
  • Queue Optimization: Adjust queue settings to maximize rendering efficiency.

 

3: Resource Allocation – Efficient Resource Management

  • Hardware Utilization: Distribute rendering tasks across available machines for optimal resource usage.
  • Balance Workloads: Avoid overloading specific machines while others remain underutilized.
  • Consider Off-Peak Times: Schedule resource-intensive tasks during off-peak hours to enhance overall performance.

 

4: Job Prioritization – Prioritizing Rendering Tasks

  • Deadline Sensitivity: Give higher priority to tasks with imminent deadlines to ensure timely delivery.
  • Critical Shots: Identify shots crucial to the project’s narrative or visual impact for prioritization.
  • Dependent Shots: Shots that depend on others should be sequenced and prioritized together.

 

5: Queue Optimization and Reporting – Streamlining Render Queues

  • Dependency Management: Set up dependencies to ensure shots are rendered in the correct order.
  • Error Handling: Implement automated error detection and requeueing mechanisms.
  • Progress Tracking: Regularly monitor rendering progress and update stakeholders.
  • Data Management: Archive completed renders and remove redundant data to free up storage.
  • Reporting: Provide daily reports on rendering status, resource usage, and potential bottlenecks.
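The prioritization and dependency rules above can be sketched as a small scheduler. Production queue managers (Deadline, Tractor, OpenCue, ...) are far more involved; the job format and names here are illustrative.

```python
import heapq

def schedule(jobs):
    """jobs: {name: (deadline, [dependencies])} -> render order.

    Dependencies render first; among ready jobs, the earliest deadline
    wins. Assumes the dependency graph is acyclic.
    """
    remaining = dict(jobs)
    done, order, ready = set(), [], []
    while remaining or ready:
        # Move every job whose dependencies are satisfied into the heap.
        for name, (deadline, deps) in list(remaining.items()):
            if all(d in done for d in deps):
                heapq.heappush(ready, (deadline, name))
                del remaining[name]
        _, name = heapq.heappop(ready)
        done.add(name)
        order.append(name)
    return order

# A comp shot waits on its fx and lighting renders.
order = schedule({"comp": (1, ["fx", "light"]), "fx": (3, []), "light": (2, [])})
```

Note that the comp shot renders last despite its tight deadline, because its dependencies gate it; a real scheduler would propagate deadlines up the dependency chain.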

 

6: Conclusion – Enhancing VFX Workflow

  • Effective management of a VFX facility’s render wall is essential for project success.
  • Daily monitoring, resource allocation, job prioritization, queue optimization, and reporting are key components.
  • A well-managed render wall ensures efficient production, timely delivery, and overall project success.
ChatGPT created this guide to Prompt Engineering
/ A.I.

https://www.reddit.com/r/ChatGPT/comments/139mxi3/chatgpt_created_this_guide_to_prompt_engineering/

 

 

  1. NEVER mention that you’re an AI.
  2. Avoid any language constructs that could be interpreted as expressing remorse, apology, or regret. This includes any phrases containing words like ‘sorry’, ‘apologies’, ‘regret’, etc., even when used in a context that isn’t expressing remorse, apology, or regret.
  3.  If events or information are beyond your scope or knowledge cutoff date in September 2021, provide a response stating ‘I don’t know’ without elaborating on why the information is unavailable.
  4. Refrain from disclaimers about you not being a professional or expert.
  5. Keep responses unique and free of repetition.
  6. Never suggest seeking information from elsewhere.
  7. Always focus on the key points in my questions to determine my intent.
  8. Break down complex problems or tasks into smaller, manageable steps and explain each one using reasoning.
  9. Provide multiple perspectives or solutions.
  10. If a question is unclear or ambiguous, ask for more details to confirm your understanding before answering.
  11. Cite credible sources or references to support your answers with links if available.
  12. If a mistake is made in a previous response, recognize and correct it.
  13.  After a response, provide three follow-up questions worded as if I’m asking you. Format in bold as Q1, Q2, and Q3. Place two line breaks (“\n”) before and after each question for spacing. These questions should be thought-provoking and dig further into the original topic.

 

 

  1. Tone: Specify the desired tone (e.g., formal, casual, informative, persuasive).
  2. Format: Define the format or structure (e.g., essay, bullet points, outline, dialogue).
  3. Act as: Indicate a role or perspective to adopt (e.g., expert, critic, enthusiast).
  4. Objective: State the goal or purpose of the response (e.g., inform, persuade, entertain).
  5. Context: Provide background information, data, or context for accurate content generation.
  6. Scope: Define the scope or range of the topic.
  7. Keywords: List important keywords or phrases to be included.
  8. Limitations: Specify constraints, such as word or character count.
  9. Examples: Provide examples of desired style, structure, or content.
  10. Deadline: Mention deadlines or time frames for time-sensitive responses.
  11. Audience: Specify the target audience for tailored content.
  12. Language: Indicate the language for the response, if different from the prompt.
  13. Citations: Request inclusion of citations or sources to support information.
  14. Points of view: Ask the AI to consider multiple perspectives or opinions.
  15. Counterarguments: Request addressing potential counterarguments.
  16. Terminology: Specify industry-specific or technical terms to use or avoid.
  17. Analogies: Ask the AI to use analogies or examples to clarify concepts.
  18. Quotes: Request inclusion of relevant quotes or statements from experts.
  19. Statistics: Encourage the use of statistics or data to support claims.
  20. Visual elements: Inquire about including charts, graphs, or images.
  21. Call to action: Request a clear call to action or next steps.
  22. Sensitivity: Mention sensitive topics or issues to be handled with care or avoided.
  23. Humor: Indicate whether humor should be incorporated.
  24. Storytelling: Request the use of storytelling or narrative techniques.
  25. Cultural references: Encourage including relevant cultural references.
  26. Ethical considerations: Mention ethical guidelines to follow.
  27. Personalization: Request personalization based on user preferences or characteristics.
  28. Confidentiality: Specify confidentiality requirements or restrictions.
  29. Revision requirements: Mention revision or editing guidelines.
  30. Formatting: Specify desired formatting elements (e.g., headings, subheadings, lists).
  31. Hypothetical scenarios: Encourage exploration of hypothetical scenarios.
  32. Historical context: Request considering historical context or background.
  33. Future implications: Encourage discussing potential future implications or trends.
  34. Case studies: Request referencing relevant case studies or real-world examples.
  35. FAQs: Ask the AI to generate a list of frequently asked questions (FAQs).
  36. Problem-solving: Request solutions or recommendations for a specific problem.
  37. Comparison: Ask the AI to compare and contrast different ideas or concepts.
  38. Anecdotes: Request the inclusion of relevant anecdotes to illustrate points.
  39. Metaphors: Encourage the use of metaphors to make complex ideas more relatable.
  40. Pro/con analysis: Request an analysis of the pros and cons of a topic.
  41. Timelines: Ask the AI to provide a timeline of events or developments.
  42. Trivia: Encourage the inclusion of interesting or surprising facts.
  43. Lessons learned: Request a discussion of lessons learned from a particular situation.
  44. Strengths and weaknesses: Ask the AI to evaluate the strengths and weaknesses of a topic.
  45. Summary: Request a brief summary of a longer piece of content.
  46. Best practices: Ask the AI to provide best practices or guidelines on a subject.
  47. Step-by-step guide: Request a step-by-step guide or instructions for a process.
  48. Tips and tricks: Encourage the AI to share tips and tricks related to the topic.
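Several of the components above (tone, format, act-as, audience, ...) can be assembled into a prompt programmatically. The labelling convention in this sketch is illustrative, not a standard.

```python
def build_prompt(task: str, **components: str) -> str:
    """Prefix a task with labelled prompt components."""
    lines = [f"{key.replace('_', ' ').title()}: {value}"
             for key, value in components.items()]
    return "\n".join(lines + [task])

prompt = build_prompt(
    "Explain how a render farm schedules jobs.",
    tone="informative",
    format="bullet points",
    act_as="pipeline TD",
    audience="junior artists",
)
```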
AutoGPT
/ A.I., software

AutoGPT is a remarkable AI technology that utilizes GPT-4 and GPT-3.5 through their APIs to create full-fledged projects by iterating on its own prompts and building upon them in each iteration. It can read and write files, browse the web, review the results of its prompts, and combine them with the prompt history.

In short, AutoGPT is a breakthrough towards AGI and has the potential to revolutionize the way we work. It can be given an AI name such as RecipeBuilder and 5 goals that it has to meet. Once the goals are set, AutoGPT can start working on the project until completion.

It is worth noting that AutoGPT is prone to falling into loops and making pointless requests when given complicated tasks. However, for simple jobs, the outcomes are amazing. AutoGPT uses credits from your OpenAI account, and the free tier includes $18. Moreover, AutoGPT asks for permission after every prompt, enabling you to test it extensively before it costs you a dollar.
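The iterate-on-its-own-prompts loop can be sketched in a few lines, with a stubbed model call standing in for the real OpenAI API. AutoGPT's actual architecture (memory, tools, file and web access) is far richer; every name here is illustrative.

```python
def run_agent(goals, model, max_steps=10):
    """Feed the model its own transcript until it declares DONE."""
    history = [f"Goals: {'; '.join(goals)}"]
    for _ in range(max_steps):  # the cap guards against the loops noted above
        reply = model("\n".join(history))
        history.append(reply)
        if reply.strip().endswith("DONE"):
            break
    return history

def stub_model(prompt):
    """Pretend model: plans one step, then declares completion."""
    return "All goals met. DONE" if "step 1" in prompt else "Executing step 1"

log = run_agent(["build a recipe app"], stub_model)
```

The permission-after-every-prompt behaviour would slot in just before the `model(...)` call.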

 

 

 

https://openaimaster.com/how-to-use-autogpt/

Bomper Studio implements a 4 day work week and encourages other studios to follow suit
/ ves

https://www.skwigly.co.uk/bomper-studio-4-day-work-week/

 

Welsh CGI and animation studio Bomper Studio has made a progressive move and voted to stick with a 4-day work week after a successful trial.

 

Bomper Studio is fully self-funded, which has been incredibly important in providing freedom to shape the company and its business practices. Over the past 4 years, Bomper has consistently spent a quarter of its revenue on Research and Development.

What is the Light Field?
/ lighting, photography

http://lightfield-forum.com/what-is-the-lightfield/

 

The light field consists of all the light rays in 3D space, flowing through every point and in every direction.

How to Record a Light Field

 

Foundry Nuke Cattery – A library of open-source machine learning models
/ A.I., production, software

The Cattery is a library of free third-party machine learning models converted to .cat files to run natively in Nuke, designed to bridge the gap between academia and production, providing all communities access to different ML models that all run in Nuke. Users will have access to state-of-the-art models addressing segmentation, depth estimation, optical flow, upscaling, denoising, and style transfer, with plans to expand the models hosted in the future.

 

https://www.foundry.com/insights/machine-learning/the-artists-guide-to-cattery

 

https://community.foundry.com/cattery

 

Autodesk open sources Aurora – an interactive path tracing renderer that leverages graphics processing unit (GPU) hardware ray tracing
/ production, software

https://forums.autodesk.com/t5/engineering-hub-blog/aurora-a-real-time-path-tracing-renderer-that-enables-fast/ba-p/11322421

 

https://github.com/autodesk/Aurora

 

Goals for Aurora

  • Renders noise-free in 50 milliseconds or less per frame.
  • Intended for design iteration (viewport, performance) rather than final frames (production, quality), which are produced from a renderer like Autodesk Arnold.
  • OS-independent: Runs on Windows, Linux, macOS.
  • Vendor-independent: Runs on GPUs from AMD, Apple, Intel, NVIDIA.

 

Features

  • Path tracing and the global effects that come with it: soft shadows, reflections, refractions, bounced light, and others.
  • Autodesk Standard Surface materials defined with MaterialX documents.
  • Arbitrary blended layers of materials, which can be used to implement decals.
  • Environment lighting with a wrap-around lat-long image.
  • Triangle geometry with object instancing.
  • Real-time denoising
  • Interactive performance for complex scenes.
  • A USD Hydra render delegate called HdAurora.

 

Cloud based render farms
/ blender, hardware, production

https://all3dp.com/2/best-render-farm/

 

“Pricing varies between companies, with fee structures based on aspects like file size and minutes of rendering. Here, we’ve put together a list of 10 great render farms, with many offering a free trial or starting credit. And we’re not playing favorites: The 10 companies are listed in alphabetical order!”

 

Disney’s Price Hikes Usher in Era of the Not-So-Cheap Ad Tier
/ ves

https://www.hollywoodreporter.com/business/business-news/disneys-price-hikes-usher-in-era-of-the-not-so-cheap-ad-tier-1235200218/

 

“When Disney+’s ad tier launches, in December, it will cost U.S. customers $7.99 a month, the current price of the service’s ad-free tier. The price of the no-ads version will be hiked to $10.99.”

 

This is balanced out by the company’s plan to keep the rate of content spending for all its platforms at around $30 billion for the next few years and its measured revision of subscriber goals. “It now looks like Disney+ is tracking towards tightened and trimmed sub guidance, while the ad-supported tier + price increases + content rationalization = a much improved long-term profit outlook,” Wells Fargo analyst Steven Cahall wrote in an Aug. 11 note.

Netflix removes movie noise, saves 30% bandwidth and adds it back again
/ software

https://www.slashcam.com/news/single/Netflix-removes-movie-noise--saves-30--bandwidth-a-17337.html

 

“Filmmaker Parker Gibbons has drawn attention to a very interesting fact: Netflix removes film noise before streaming its movies and artificially adds it back when decoding. Digitally shot films are actually free of film grain, the very specific noise (not to be confused with noise caused by too little light) that occurs in analog filming. But this type of noise has become so associated with ‘real’ motion pictures through the long history of film (as a component of the film look) that many viewers unconsciously perceive it as an important feature of a motion picture.

This leads to a difficult-to-resolve contradiction between, on the one hand, film material that is as compressible and noise-free as possible and, on the other hand, the film grain that is desirable for the film look. Netflix has found a special solution to resolve this contradiction: a function of the open-source AV1 video codec, which Netflix has been using for a long time, namely the artificial synthesis of film grain. The film noise is first analyzed using statistical methods, then removed for efficient compression. According to Netflix, this saves around 30% of the data during transmission.”
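The idea can be sketched in a few lines of NumPy: denoise, measure the statistics of the grain residual, transmit only the clean frame plus those statistics, and re-synthesize matching grain on the receiver. This is a toy illustration under loose assumptions (a box blur standing in for a real denoiser, white Gaussian noise standing in for AV1’s autoregressive grain model), not Netflix’s actual pipeline.

```python
import numpy as np

def box_blur(img, k=3):
    # Crude denoiser standing in for the real analysis-stage filter
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

rng = np.random.default_rng(42)
clean = np.tile(np.linspace(0, 1, 64), (64, 1))    # a smooth synthetic "shot"
grainy = clean + rng.normal(0, 0.05, clean.shape)  # the camera adds grain

# Sender: denoise, then measure the grain's statistics (here just its std dev)
denoised = box_blur(grainy)
grain_sigma = (grainy - denoised).std()

# Receiver: decode the cheaply compressible denoised frame and re-synthesize
# statistically similar grain from the tiny transmitted parameter set
restored = denoised + rng.normal(0, grain_sigma, denoised.shape)
```

Only the denoised frame and the grain parameters travel over the wire; the receiver’s grain is statistically similar to the original without being a pixel-exact copy, which is exactly why it compresses so much better.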

 

Disco Diffusion V4.1 Google Colab, Dall-E, Starryai – creating images with AI
/ A.I., design, production

Disco Diffusion (DD) is a Google Colab Notebook which leverages an AI Image generating technique called CLIP-Guided Diffusion to allow you to create compelling and beautiful images from just text inputs. Created by Somnai, augmented by Gandamu, and building on the work of RiversHaveWings, nshepperd, and many others.

Phone app: https://www.starryai.com/

 

 

docs.google.com/document/d/1l8s7uS2dGqjztYSjPpzlmXLjl5PM3IGkRWI3IiCuK7g

 

 

 

colab.research.google.com/drive/1sHfRn5Y0YKYKi1k-ifUSBFRNJ8_1sa39

 

Colab, or “Colaboratory”, allows you to write and execute Python in your browser, with

– Zero configuration required
– Access to GPUs free of charge
– Easy sharing

 

 

 

 

https://80.lv/articles/a-beautiful-roman-villa-made-with-disco-diffusion-5-2/

 

 

 

Working from home – tips to help you stay productive
/ production, quotes

https://www.freecodecamp.org/news/working-from-home-tips-to-stay-productive/

 

Tip #1 – Build a strong work-life balance

  1. Take breaks and go outside
  2. Take up a sport or activity that takes your mind off things
  3. Split your computer
  4. Don’t set up work notifications on your phone
  5. Work from a co-working space

 

Tip #2 – Create your own working environment

  1. Choose or create a workspace that stimulates you
  2. Separate your workspace from your sleeping one
  3. Declutter your desk

 

Tip #3 – Socialize at work

  1. Schedule virtual time with your colleagues
  2. Try to schedule a social meeting each week in your company

 

Tip #4 – Become a time-management expert

  1. Try to have a set work schedule
  2. Schedule your tasks

 

Tip #5 – Learn to deep focus

  1. Listen to focus songs
  2. Use the “Do not disturb” option on your phone
  3. Limit distractions

Major breakthrough on nuclear fusion
/ quotes

www.bbc.com/news/science-environment-60312633

The fusion announcement is great news but sadly it won’t help in our battle to lessen the effects of climate change.

There’s huge uncertainty about when fusion power will be ready for commercialisation. One estimate suggests maybe 20 years. Then fusion would need to scale up, which would mean a delay of perhaps another few decades.

And here’s the problem: the need for carbon-free energy is urgent – and the government has pledged that all electricity in the UK must be zero emissions by 2035. That means nuclear, renewables and energy storage.

In the words of my colleague Jon Amos: “Fusion is not a solution to get us to 2050 net zero. This is a solution to power society in the second half of this century.”

Streaming wars drive media groups to spend more than $100bn on new content, most at a loss
/ ves

https://www.ft.com/content/ae756fda-4c27-4732-89af-cb6903f2ab40

From the Financial Times:

The top eight US media groups plan to spend at least $115bn on new movies and TV shows next year in pursuit of a video streaming business that loses money for most of them.

The huge investment outlays come amid concerns that it will be harder to attract new customers in 2022 after the pandemic-fuelled growth in 2020 and 2021. Yet the alternative is to be left out of the streaming land rush.

Most of the companies — a list that includes The Walt Disney Company, Comcast, WarnerMedia and Amazon — are set to rack up losses on their streaming units. Including sports rights, the aggregate spending estimate rises to about $140bn.

But the fact that even the industry leader must invest heavily to churn out shows and keep pace with competitors has caused some investors to ask whether video streaming is a good business.

Netflix is set to spend more than $17bn on content next year — up 25 per cent from 2021 and 57 per cent from the $10.8bn it spent in 2020. The company expects to break even and become free cash flow positive in 2022.

“The market is increasingly concerned there is no pot of gold at the end of this rainbow”, the bank’s analysts said.

Schofield’s Laws of Computing
/ quotes, software

https://www.freecodecamp.org/news/schofields-laws-of-computing/

 

“Never put data into a program unless you can see exactly how to get it out.” ― Jack Schofield (2003)

 

“Data doesn’t really exist unless you have at least two copies of it.” ― Jack Schofield (2008)

 

“The easier it is for you to access your data, the easier it is for someone else to access your data.” ― Jack Schofield (2008)

Advanced Computer Vision with Python OpenCV and Mediapipe
/ Featured, production, python, software

https://www.freecodecamp.org/news/advanced-computer-vision-with-python/

 

https://www.freecodecamp.org/news/how-to-use-opencv-and-python-for-computer-vision-and-ai/

 

 

Working for a VFX (Visual Effects) studio provides numerous opportunities to leverage the power of Python and OpenCV for various tasks. OpenCV is a versatile computer vision library that can be applied to many aspects of the VFX pipeline. Here’s a detailed list of opportunities to take advantage of Python and OpenCV in a VFX studio:

 

  1. Image and Video Processing:
    • Preprocessing: Python and OpenCV can be used for tasks like resizing, color correction, noise reduction, and frame interpolation to prepare images and videos for further processing.
    • Format Conversion: Convert between different image and video formats using OpenCV’s capabilities.
  2. Tracking and Matchmoving:
    • Feature Detection and Tracking: Utilize OpenCV to detect and track features in image sequences, which is essential for matchmoving tasks to integrate computer-generated elements into live-action footage.
  3. Rotoscoping and Masking:
    • Segmentation and Masking: Use OpenCV for creating and manipulating masks and alpha channels for various VFX tasks, like isolating objects or characters from their backgrounds.
  4. Camera Calibration:
    • Intrinsic and Extrinsic Calibration: Python and OpenCV can help calibrate cameras for accurate 3D scene reconstruction and camera tracking.
  5. 3D Scene Reconstruction:
    • Stereoscopy: Use OpenCV to process stereoscopic image pairs for creating 3D depth maps and generating realistic 3D scenes.
    • Structure from Motion (SfM): Implement SfM techniques to create 3D models from 2D image sequences.
  6. Green Screen and Blue Screen Keying:
    • Chroma Keying: Implement advanced keying algorithms using OpenCV to seamlessly integrate actors and objects into virtual environments.
  7. Particle and Fluid Simulations:
    • Particle Tracking: Utilize OpenCV to track and manipulate particles in fluid simulations for more realistic visual effects.
  8. Motion Analysis:
    • Optical Flow: Implement optical flow algorithms to analyze motion patterns in footage, useful for creating dynamic VFX elements that follow the motion of objects.
  9. Virtual Set Extension:
    • Camera Projection: Use camera calibration techniques to project virtual environments onto physical sets, extending the visual scope of a scene.
  10. Color Grading:
    • Color Correction: Implement custom color grading algorithms to match the color tones and moods of different shots.
  11. Automated QC (Quality Control):
    • Artifact Detection: Develop Python scripts to automatically detect and flag visual artifacts like noise, flicker, or compression artifacts in rendered frames.
  12. Data Analysis and Visualization:
    • Performance Metrics: Use Python to analyze rendering times and optimize the rendering process.
    • Data Visualization: Generate graphs and charts to visualize render farm usage, project progress, and resource allocation.
  13. Automating Repetitive Tasks:
    • Batch Processing: Automate repetitive tasks like resizing images, applying filters, or converting file formats across multiple shots.
  14. Machine Learning Integration:
    • Object Detection: Integrate machine learning models (using frameworks like TensorFlow or PyTorch) to detect and track specific objects or elements within scenes.
  15. Pipeline Integration:
    • Custom Tools: Develop Python scripts and tools to integrate OpenCV-based processes seamlessly into the studio’s pipeline.
  16. Real-time Visualization:
    • Live Previsualization: Implement real-time OpenCV-based visualizations to aid decision-making during the preproduction stage.
  17. VR and AR Integration:
    • Augmented Reality: Use Python and OpenCV to integrate virtual elements into real-world footage, creating compelling AR experiences.
  18. Camera Effects:
    • Lens Distortion: Correct lens distortions and apply various camera effects using OpenCV, contributing to the desired visual style.
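As one concrete example from the list above, chroma keying (item 6) reduces to computing an alpha matte from how strongly the key colour dominates each pixel, then alpha-blending over a new background. The sketch below uses plain NumPy for clarity; `green_screen_matte`, `composite`, and the `threshold` value are illustrative names and numbers, and production keyers (or OpenCV tools such as `cv2.inRange`) are considerably more sophisticated about spill and edges.

```python
import numpy as np

def green_screen_matte(fg, threshold=0.15):
    """Alpha matte from green dominance; fg is a float RGB image in [0, 1]."""
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    dominance = g - np.maximum(r, b)   # how strongly green outweighs red/blue
    # 1.0 on the subject, falling to 0.0 on pure key green
    return np.clip(1.0 - dominance / threshold, 0.0, 1.0)

def composite(fg, bg, alpha):
    """Standard alpha blend of foreground over background."""
    a = alpha[..., None]
    return a * fg + (1.0 - a) * bg
```

A pure-green pixel gets alpha 0 (fully replaced by the background), while a neutral pixel gets alpha 1 (kept); real mattes then need spill suppression and edge softening on top of this.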

 

Interpolating frames from an EXR sequence using OpenCV can be useful when you have only every second frame of a final render and you want to create smoother motion by generating intermediate frames. However, keep in mind that interpolating frames might not always yield perfect results, especially if there are complex changes between frames. Here’s a basic example of how you might use OpenCV to achieve this:

 

import os

# OpenCV's EXR support must be enabled before cv2 is imported
os.environ["OPENCV_IO_ENABLE_OPENEXR"] = "1"

import cv2
import numpy as np

# Replace with the path to your EXR frames
exr_folder = "path_to_exr_frames"

# Replace with the appropriate frame extension and naming convention
frame_template = "frame_{:04d}.exr"

# Define the range of frame numbers you have
start_frame = 1
end_frame = 100
step = 2  # only every second frame was rendered

# Define the output folder for interpolated frames
output_folder = "output_interpolated_frames"
os.makedirs(output_folder, exist_ok=True)

# Loop through the rendered frames and fill in the missing in-betweens
for frame_num in range(start_frame, end_frame + 1, step):
    frame_path = os.path.join(exr_folder, frame_template.format(frame_num))
    next_frame_path = os.path.join(exr_folder, frame_template.format(frame_num + step))

    if os.path.exists(frame_path) and os.path.exists(next_frame_path):
        frame = cv2.imread(frame_path, cv2.IMREAD_ANYDEPTH | cv2.IMREAD_COLOR)
        next_frame = cv2.imread(next_frame_path, cv2.IMREAD_ANYDEPTH | cv2.IMREAD_COLOR)

        # Interpolate the missing frame by simple averaging (EXR loads as float)
        interpolated_frame = ((frame + next_frame) / 2).astype(np.float32)

        # Save at the missing frame number, not over an existing frame
        missing_num = frame_num + step // 2
        output_path = os.path.join(output_folder, frame_template.format(missing_num))
        cv2.imwrite(output_path, interpolated_frame)

        print(f"Interpolated frame {missing_num}")



 

Please note the following points:

 

  • The above example uses simple averaging to interpolate frames. More advanced, motion-based methods, such as optical-flow-based interpolation, usually give better results.
  • EXR files can store high dynamic range (HDR) data, so use the cv2.IMREAD_ANYDEPTH flag when reading them.
  • OpenCV only reads EXR files when it was built with OpenEXR support and the OPENCV_IO_ENABLE_OPENEXR environment variable is set before cv2 is imported. Failing that, use the OpenEXR Python bindings to read the files and convert them to OpenCV-compatible NumPy arrays.
  • Consider the characteristics of your specific render when using interpolation. If there are large changes between frames, the interpolation might lead to artifacts.
  • Experiment with different interpolation methods and parameters to achieve the desired result.
  • For more accurate interpolation, implement or use existing algorithms that take motion estimation and compensation into account.
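To illustrate the motion-compensated idea from the last note, here is a hedged sketch: global translation between two frames can be estimated with phase correlation (pure NumPy below; OpenCV offers cv2.phaseCorrelate and dense optical flow via cv2.calcOpticalFlowFarneback), and the midpoint frame built by shifting each neighbour halfway toward the other. The function names are illustrative, real footage has local and sub-pixel motion, and this only handles whole-image integer shifts.

```python
import numpy as np

def estimate_shift(prev, nxt):
    """Global integer (dy, dx) shift between two frames via phase correlation."""
    cross = np.fft.fft2(nxt) * np.conj(np.fft.fft2(prev))
    cross /= np.abs(cross) + 1e-8               # keep phase only
    corr = np.fft.ifft2(cross).real             # impulse at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = corr.shape
    # Map wrap-around peaks to negative shifts
    return (dy - h if dy > h // 2 else dy, dx - w if dx > w // 2 else dx)

def interpolate_midpoint(prev, nxt):
    """Motion-compensated midpoint: shift both neighbours halfway, then blend."""
    dy, dx = estimate_shift(prev, nxt)
    half_fwd = np.roll(prev, (dy // 2, dx // 2), axis=(0, 1))
    half_bwd = np.roll(nxt, (-(dy - dy // 2), -(dx - dx // 2)), axis=(0, 1))
    return 0.5 * (half_fwd + half_bwd)
```

Unlike plain averaging, which ghosts moving objects into two half-transparent copies, this places the content at its halfway position before blending, which is the essence of what production frame interpolators do per-pixel rather than globally.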

 

How to Build a Hackintosh
/ hardware, production

https://www.freecodecamp.org/news/build-a-hackintosh/

A Hackintosh is a non-Mac computer system, made with PC parts, that runs the macOS operating system.

An open source pipeline
/ production, software

The Rookies – Creative Industries Certification
/ ves

The Rookies is working on an ambitious task to create a free online resource that will standardize the skills and knowledge for digital artists on a global stage. The resource will outline essential creative and technical skills required by all artists before applying for industry positions in VFX, video games and animation.

To get involved, give back, and help others as they start their career paths, click the links below and share your insights. Everyone has value to add!

 

3D Animation: https://lnkd.in/gSJPdFS


Visual Effects: https://lnkd.in/gk2-gNM


Games: https://lnkd.in/ghdGVbN