BREAKING NEWS
LATEST POSTS
-
VFX House DNEG Acquires Exclusive License to Ziva Technologies From Unity
“Unity has also entered into an agreement with DNEG, a leading technology-enabled visual effects (VFX) and animation company for the creation of feature film, television, and multiplatform content, for an exclusive perpetual license of the Ziva IP. Unity will continue to retain ownership of all the technology acquired from Ziva Dynamics, and we will continue to evaluate the best way to enhance our core offerings with it over time.”
-
Foundry Nuke – VectorFrameBlend v1.1 by Nikolai Wüstemann – Blend up to 11 nearby frames together, while preserving all detail
https://www.nukepedia.com/gizmos/time/vectorframeblend
VectorFrameBlend can average/median/min/max/plus up to ±5 neighbouring frames with full motion awareness. Compared to the previous version and other similar solutions, I built it to be as technically correct as possible, and it provides thorough settings to improve the filtering quality and handle edge cases (literally).
You can also use the ‘External’ mode and connect the ‘vec’ input to another VectorFrameBlend to use its internally generated vectors.
This can be useful if you want to analyse a certain layer (for example a diffuse color pass that holds a lot of clean detail) but apply the frame blending somewhere else. Apart from that, the tool can of course be used on live action plates, utility passes, or whatever comes to mind.
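For intuition, the core technique (warp each neighbouring frame onto the current frame along its motion vectors, then combine the aligned stack) can be sketched outside of Nuke. The Python/OpenCV snippet below is a rough, hypothetical illustration of that general idea, not the gizmo's actual implementation; the function names, the Farneback flow estimator, and all parameters are assumptions made for the example.

# Rough sketch of motion-aware frame blending (not the VectorFrameBlend gizmo itself).
# Each neighbouring frame is warped onto the reference frame along dense optical-flow
# vectors, then the motion-aligned stack is reduced (average/median/min/max/plus).
import numpy as np
import cv2

def warp_to_reference(ref_gray, src, src_gray):
    # Estimate dense motion from the reference frame to the source frame,
    # then pull source pixels back onto the reference pixel grid.
    flow = cv2.calcOpticalFlowFarneback(ref_gray, src_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = ref_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)
    return cv2.remap(src, map_x, map_y, cv2.INTER_LINEAR)

def vector_frame_blend(frames, center, radius=5, mode="average"):
    # frames: list of uint8 BGR images; center: index of the current frame.
    ref = frames[center]
    ref_gray = cv2.cvtColor(ref, cv2.COLOR_BGR2GRAY)
    stack = [ref]
    for offset in range(-radius, radius + 1):
        i = center + offset
        if offset == 0 or i < 0 or i >= len(frames):
            continue
        src_gray = cv2.cvtColor(frames[i], cv2.COLOR_BGR2GRAY)
        stack.append(warp_to_reference(ref_gray, frames[i], src_gray))
    stack = np.stack(stack)  # up to 11 motion-aligned frames
    reducers = {"average": np.mean, "median": np.median,
                "min": np.min, "max": np.max, "plus": np.sum}
    return reducers[mode](stack, axis=0)

In the gizmo itself the vectors can instead come from the ‘vec’ input (the ‘External’ mode described above), which is what lets one node reuse the vectors another VectorFrameBlend generated from a cleaner pass.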
FEATURED POSTS
-
AI and the Law – Copyright Traps for Large Language Models – This new tool can tell you whether AI has stolen your work
https://github.com/computationalprivacy/copyright-traps
Copyright traps (see Meeus et al., ICML 2024) are unique, synthetically generated sequences that have been included in the training dataset of CroissantLLM. This dataset allows for the evaluation of Membership Inference Attacks (MIAs) using CroissantLLM as the target model, where the goal is to infer whether a certain trap sequence was included in or excluded from the training data.
This dataset contains non-member (label=0) and member (label=1) trap sequences, which have been generated using this code by sampling text from LLaMA-2 7B while controlling for sequence length and perplexity. The dataset contains splits named seq_len_{XX}_n_rep_{YY}, where sequences of XX={25, 50, 100} tokens are considered and YY={10, 100, 1000} is the number of repetitions for member sequences. Each split also contains the ‘perplexity bucket’ for each trap sequence; the original paper showed that higher-perplexity sequences tend to be more vulnerable. Note that for a fixed sequence length, and across the various numbers of repetitions, each split contains the same set of non-member sequences (n_rep=0). Additional non-members generated in exactly the same way are also provided, which might be required for MIA methodologies that make additional assumptions about the attacker.
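As a rough idea of how such a split could be used, the sketch below runs the simplest possible loss-based MIA: score every trap sequence by its loss under the target model and check how well that score separates members from non-members. The dataset identifier, split name, text field, and model id are assumptions made for illustration, and this threshold baseline is not the calibrated attack evaluated in the paper.

# Hypothetical sketch of a simple loss-based membership inference attack on the
# copyright-trap sequences. Dataset/model identifiers and field names below are
# illustrative assumptions, not taken from the repository README.
import torch
from datasets import load_dataset
from sklearn.metrics import roc_auc_score
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "croissantllm/CroissantLLMBase"           # assumed target-model id
DATASET_ID = "computationalprivacy/copyright-traps"  # assumed dataset path
SPLIT = "seq_len_100_n_rep_1000"                     # one seq_len/n_rep split

device = "cuda" if torch.cuda.is_available() else "cpu"
tok = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID).to(device).eval()

def sequence_loss(text):
    # Mean next-token cross-entropy of the sequence under the target model.
    ids = tok(text, return_tensors="pt").input_ids.to(device)
    with torch.no_grad():
        return model(ids, labels=ids).loss.item()

ds = load_dataset(DATASET_ID, split=SPLIT)
scores, labels = [], []
for row in ds:
    # Lower loss (lower perplexity) suggests memorisation, i.e. the trap was a
    # member of the training data (label=1), so negate the loss as the score.
    scores.append(-sequence_loss(row["text"]))   # "text" field name is assumed
    labels.append(row["label"])

print("loss-threshold MIA AUC:", roc_auc_score(labels, scores))

The premise is that sequences repeated during training end up with measurably lower loss than comparable non-members; the per-sequence perplexity buckets matter because, as noted above, how vulnerable a trap is depends on how surprising the sequence was to begin with.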
-
STOP FCC – SAVE THE FREE NET
Help save free sites like this one.
The FCC voted to kill net neutrality and let ISPs like Comcast ruin the web with throttling, censorship, and new fees. Congress has 60 legislative days to overrule them and save the Internet using the Congressional Review Act.
https://www.battleforthenet.com/
http://mashable.com/2012/01/17/sopa-dangerous-opinion/