If a blind person gained sight, could they recognize objects previously touched?
/ colour, quotes

news.psu.edu/story/141360/2006/04/17/research/probing-question-if-blind-person-gained-sight-could-they-recognize

 

Blind people who regain their sight may find themselves in a world they don’t immediately comprehend. “It would be more like a sighted person trying to rely on tactile information,” Moore says.

 

Learning to see is a developmental process, just like learning language, Prof. Cathleen Moore continues. “As far as vision goes, a three-and-a-half-year-old child is already a well-calibrated system.”

Photography basics: Color Temperature and White Balance
/ colour, Featured, lighting, photography

 

Color Temperature of a light source describes the spectrum of light which is radiated from a theoretical “blackbody” (an ideal physical body that absorbs all radiation and incident light – neither reflecting it nor allowing it to pass through) with a given surface temperature.

https://en.wikipedia.org/wiki/Color_temperature

 

Or, most simply: it is a method of describing the color characteristics of light through a numerical value that corresponds to the color emitted by a light source, measured in kelvin (K) on a scale that typically runs from 1,000 to 10,000.

 

More accurately: the color temperature of a light source is the temperature of an ideal blackbody that radiates light of comparable hue to that of the light source.

As such, the color temperature of a light source is a numerical measurement of its color appearance. It is based on the principle that any object will emit light if it is heated to a high enough temperature, and that the color of that light will shift in a predictable manner as the temperature is increased. The system is based on the color changes of a theoretical “blackbody radiator” as it is heated from a cold black to a white hot state.

 

So, why do we measure the hue of the light as a “temperature”? This was started in the late 1800s, when the British physicist William Thomson (Lord Kelvin) heated a block of carbon. It glowed in the heat, producing a range of different colors at different temperatures. The black cube first produced a dim red light, increasing to a brighter yellow as the temperature went up, and eventually produced a bright blue-white glow at the highest temperatures. In his honor, color temperatures are measured in kelvin, a scale whose degrees are the same size as Centigrade (Celsius) degrees. Instead of starting at the temperature at which water freezes, the Kelvin scale starts at “absolute zero,” which is approximately -273 Centigrade.

 

More about black bodies here: http://www.pixelsham.com/2013/03/14/black-body-color

 

 

The Sun closely approximates a black-body radiator. Its effective temperature, defined by the total radiative power per unit area, is about 5,780 K. The color temperature of sunlight above the atmosphere is about 5,900 K. The time of day and atmospheric conditions alter the color and purity of the light that reaches us from the sun.

Some think that the Sun’s output in visible light peaks in the yellow. However, the Sun’s visible output peaks in the green:

  

 

 

http://solar-center.stanford.edu/SID/activities/GreenSun.html

Regardless, we commonly refer to the sun as a pure white light source, and we use its spectrum as a reference for other light sources.

Because the sun’s spectrum can change depending on so many factors (including pollution), a standard called D65 was defined (by the International Commission on Illumination) to represent what is considered as the average spectrum of the sun in average conditions.

In practice this corresponds roughly to overcast daylight at about 6,500 K. And while it is implemented at slightly different temperatures by different manufacturers, it remains the most common standard.

 

https://en.wikipedia.org/wiki/Illuminant_D65

 

https://www.scratchapixel.com/lessons/digital-imaging/colors

 

 

In this context, the White Point of a light defines the neutral color of its given color space.

https://chrisbrejon.com/cg-cinematography/chapter-1-color-management/#Colorspace

 

D65 corresponds to what the spectrum of the sun typically looks like at midday somewhere in Western/Northern Europe (figure 9). D65, which is also called the daylight illuminant, is not a spectrum we can exactly reproduce with a light source, but rather a reference against which we can compare the spectra of existing lights.

 

Another rough analogue of blackbody radiation in our day to day experience might be in heating a metal or stone: these are said to become “red hot” when they attain one temperature, and then “white hot” for even higher temperatures.

 

Similarly, black bodies at different temperatures also have varying color temperatures of “white light.” Despite its name, light which may appear white does not necessarily contain an even distribution of colors across the visible spectrum.

 

The Kelvin color temperature scale imagines a blackbody object (such as a lamp filament) being heated. At some point the object will get hot enough to begin to glow. As it gets hotter its glowing color will shift, moving from deep reds, such as a low-burning fire would give, to oranges and yellows, all the way up to white hot.

 

Color temperatures over 5,000 K are called cool colors (bluish white), while lower color temperatures (2,700–3,000 K) are called warm colors (yellowish white through red).

  

 

https://www.ni.com/en-ca/innovations/white-papers/12/a-practical-guide-to-machine-vision-lighting.html
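
To make the warm/cool ranges above concrete, here is a minimal Python sketch that converts a color temperature in kelvin to an approximate sRGB color. It uses the curve fits from the Tanner Helland article listed in the references below; the constants are approximations, so treat the output as indicative only.

import math

def kelvin_to_rgb(kelvin):
    # Approximate conversion of a color temperature (1000-40000 K) to sRGB,
    # based on the curve fits from tannerhelland.com (see references below).
    t = max(1000, min(40000, kelvin)) / 100.0

    # Red channel
    if t <= 66:
        r = 255
    else:
        r = 329.698727446 * ((t - 60) ** -0.1332047592)

    # Green channel
    if t <= 66:
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        g = 288.1221695283 * ((t - 60) ** -0.0755148492)

    # Blue channel
    if t >= 66:
        b = 255
    elif t <= 19:
        b = 0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307

    clamp = lambda x: int(max(0, min(255, x)))
    return clamp(r), clamp(g), clamp(b)

print(kelvin_to_rgb(3200))   # tungsten: warm, orange-ish white
print(kelvin_to_rgb(6500))   # D65-ish daylight: near neutral white
print(kelvin_to_rgb(10000))  # very cool, blue-ish white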

 

Our eyes are very good at judging what is white under different light sources, but digital cameras often have great difficulty with auto white balance (AWB) — and can create unsightly blue, orange, or even green color casts. Understanding digital white balance can help you avoid these color casts, thereby improving your photos under a wider range of lighting conditions.

 

 

White balance (WB) is the process of removing these color casts from captured media, so that objects which are perceived (or expected) to be white are rendered white in your medium.

This color cast is due to the way light itself is formed and spread.

 

What a white-balancing procedure does is identify what should be treated as white in your footage. The camera doesn’t know what white is until you tell it.

 

You can often do this with AWB (Automatic White Balance), but the results are not always desirable. That is why you may choose to manually change your white balance.

When you white balance you are telling your camera to treat any object with similar chrominance and luminance as white.
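
As a minimal sketch of that idea: given a patch of the image that should read as neutral white or gray, compute per-channel gains that neutralize the cast and apply them to the whole frame. The NumPy layout and the load_linear_image helper are assumptions for illustration, not any camera’s actual API.

import numpy as np

def white_balance(img, patch):
    # img: float RGB image with values in 0..1 (ideally linear light)
    # patch: a region of the image that should be neutral (white/gray)
    ref = patch.reshape(-1, 3).mean(axis=0)   # average color of the "white" reference
    gains = ref.mean() / ref                   # per-channel gains that neutralize the cast
    return np.clip(img * gains, 0.0, 1.0)

# Hypothetical usage: tell the algorithm which pixels are "white"
# img = load_linear_image("shot.exr")              # assumed helper, not a real API
# balanced = white_balance(img, img[100:120, 200:220])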

 

Different types of light sources generate different color casts.

 

As such, camera white balance has to take into account this “color temperature” of a light source, which mostly refers to the relative warmth or coolness of white light.

 

Matching your camera’s white balance setting to the color temperature of the indoor or outdoor light removes the cast.
The two color temperatures you’ll hear most often discussed are outdoor lighting, which is often ballparked at 5,600 K, and indoor (tungsten) lighting, which is generally ballparked at 3,200 K. These are the two numbers you’ll hear over and over again. Higher color temperatures (over 5,000 K) are considered “cool” (i.e. bluish). Lower color temperatures (under 5,000 K) are considered “warm” (i.e. orangish).

 

Therefore, if you are shooting indoors under tungsten lighting at 3,200 K, you will set your white balance for indoor shooting at this color temperature. In this case, the camera will adjust its settings to ensure that white appears white. Your camera will either have an indoor 3,200 K preset (even the most basic cameras have this option) or you can choose to set it manually.

 

Things get complicated if you’re filming indoors during the day under tungsten lighting while outdoor light is coming through a window. Now what you have is a mix of color temperatures. What you need to understand in this situation is that there is no perfect white balance setting for a mixed color temperature scene. You will need to compromise toward one end of the spectrum or the other. If you set your white balance to tungsten 3,200 K, the daylight areas will appear very blue. If you set your white balance for daylight 5,600 K, then your tungsten lighting will appear very orange.

 

Where to use which light:
For lighting building interiors, it is often important to take into account the color temperature of illumination. A warmer (i.e., a lower color temperature) light is often used in public areas to promote relaxation, while a cooler (higher color temperature) light is used to enhance concentration, for example in schools and offices.

 

 

REFERENCES

 


How to Convert Temperature (K) to RGB: Algorithm and Sample Code

https://tannerhelland.com/2012/09/18/convert-temperature-rgb-algorithm-code.html

 

http://www.vendian.org/mncharity/dir3/blackbody/UnstableURLs/bbr_color.html

 

http://riverfarenh.com/light-bulb-color-chart/

 

https://www.lightsfilmschool.com/blog/filmmaking-white-balance-and-color-temperature

 

https://astro-canada.ca/le_spectre_electromagnetique-the_electromagnetic_spectrum-eng

 

http://www.3drender.com/glossary/colortemp.htm

 

http://pernime.info/light-kelvin-scale/

 

http://lowel.tiffen.com/edu/color_temperature_and_rendering_demystified.html

 

https://en.wikipedia.org/wiki/Color_temperature

 

https://www.sylvania.com/en-us/innovation/education/light-and-color/Pages/color-characteristics-of-light.aspx

 


 

  

 

 

https://help.autodesk.com/view/ARNOL/ENU/?guid=arnold_for_cinema_4d_ci_Lights_html

 

 

 

Jim Carrey Motivational speech
/ quotes
https://www.youtube.com/watch?v=YAzTIOy0ID0
Gamma correction

http://www.normankoren.com/makingfineprints1A.html#Gammabox

 

https://en.wikipedia.org/wiki/Gamma_correction

 

http://www.photoscientia.co.uk/Gamma.htm

 

https://www.w3.org/Graphics/Color/sRGB.html

 

http://www.eizoglobal.com/library/basics/lcd_display_gamma/index.html

 

https://forum.reallusion.com/PrintTopic308094.aspx

 

Basically, gamma is the relationship between the brightness of a pixel as it appears on the screen and the numerical value of that pixel. More generally, gamma is just about defining relationships.

Three main types:
– Image gamma, encoded in images
– Display gamma, encoded in hardware and/or applied at viewing time
– System (or viewing) gamma, which is the net effect of all gammas when you look at the final image. In theory this should flatten back to a gamma of 1.0.

 

Our eyes, as well as different camera and video recording devices, do not capture luminance linearly (they are not linear).
Different display devices (monitor, phone screen, TV) do not display luminance linearly either. So one needs to correct for this, hence the gamma correction function.

The human perception of brightness, under common illumination conditions (not pitch black nor blindingly bright), follows an approximate power function (note: no relation to the gamma function), with greater sensitivity to relative differences between darker tones than between lighter ones, consistent with the Stevens’ power law for brightness perception. If images are not gamma-encoded, they allocate too many bits or too much bandwidth to highlights that humans cannot differentiate, and too few bits or too little bandwidth to shadow values that humans are sensitive to and would require more bits/bandwidth to maintain the same visual quality.
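
As a rough illustration of that bit-allocation argument, assuming a pure power-law encode (exponent about 1/2.2): count how many 8-bit code values fall in the darkest 10% of linear light with and without gamma encoding.

# How many of the 256 8-bit code values land in the darkest 10% of linear light?
decode_gamma = 2.2                                # display decode; the encode is its inverse (~0.45)

linear_codes  = sum(1 for v in range(256) if v / 255 <= 0.1)
encoded_codes = sum(1 for v in range(256) if (v / 255) ** decode_gamma <= 0.1)

print(linear_codes)    # 26 codes if stored linearly
print(encoded_codes)   # 90 codes if gamma-encoded: much finer precision in the shadows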

https://blog.amerlux.com/4-things-architects-should-know-about-lumens-vs-perceived-brightness/

Cones manage color receptivity; rods determine how large our pupils should be. The larger (more dilated) our pupils are, the more light enters our eyes. In dark situations, our rods dilate our pupils so we can see better. This impacts how we perceive brightness.

 

https://www.cambridgeincolour.com/tutorials/gamma-correction.htm

A gamma encoded image has to have “gamma correction” applied when it is viewed — which effectively converts it back into light from the original scene. In other words, the purpose of gamma encoding is for recording the image — not for displaying the image. Fortunately this second step (the “display gamma”) is automatically performed by your monitor and video card. The following diagram illustrates how all of this fits together:

 

Display gamma
The display gamma can be a little confusing because this term is often used interchangeably with gamma correction, since it corrects for the file gamma. This is the gamma that you are controlling when you perform monitor calibration and adjust your contrast setting. Fortunately, the industry has converged on a standard display gamma of 2.2, so one doesn’t need to worry about the pros/cons of different values.

 

Gamma encoding of images is used to optimize the usage of bits when encoding an image, or the bandwidth used to transport an image, by taking advantage of the non-linear way in which humans perceive light and color. Human response to luminance is biased, especially sensitive to dark areas.
Thus, the human visual system has a non-linear response to the power of the incoming light, so a fixed increase in power will not produce a fixed increase in perceived brightness.
We perceive a value as half bright when it is actually about 18% of the original intensity, not 50%. As such, our perception is not linear.

 

You probably already know that a pixel can have any ‘value’ of Red, Green, and Blue between 0 and 255, and you would therefore think that a pixel value of 127 would appear as half of the maximum possible brightness, and that a value of 64 would represent one-quarter brightness, and so on. Well, that’s just not the case.
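
A quick numeric check of that claim, assuming a simple pure power-law display gamma of 2.2 (the real sRGB curve also has a small linear toe, ignored here):

# Pixel value 127 out of 255 does not mean half the light.
gamma = 2.2
encoded = 127 / 255.0                    # "half" of the 8-bit code range
linear = encoded ** gamma                # light actually emitted by the display
print(round(linear, 3))                  # ~0.216, i.e. roughly 18-22% of max intensity,
                                         # which is what we perceive as about half bright

# Conversely, emitting 50% of the light requires a much higher code value:
print(round(0.5 ** (1 / gamma) * 255))   # ~186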

 

Pixar Color Management
https://renderman.pixar.com/color-management


– Why do we need linear gamma?
Because light transport works linearly, and therefore lighting calculations only work properly when they operate on linear values.

 

– Why do we need to view in sRGB?
Because the resulting linear image is not suitable for viewing, but contains all the proper data. Pixar’s IT viewer can compensate by showing the rendered image through an sRGB look-up table (LUT), which is identical to what the final image will be after the sRGB gamma curve is applied in post.

This would be simple enough if every software would play by the same rules, but they don’t. In fact, the default gamma workflow for many 3D software is incorrect. This is where the knowledge of a proper imaging workflow comes in to save the day.

 

Cathode-ray tubes have a peculiar relationship between the voltage applied to them, and the amount of light emitted. It isn’t linear, and in fact it follows what’s called by mathematicians and other geeks, a ‘power law’ (a number raised to a power). The numerical value of that power is what we call the gamma of the monitor or system.

 

Thus. Gamma describes the nonlinear relationship between the pixel levels in your computer and the luminance of your monitor (the light energy it emits) or the reflectance of your prints. The equation is,

Luminance = C * value^gamma + black level

– C is set by the monitor Contrast control.

– Value is the pixel level normalized to a maximum of 1. For an 8 bit monitor with pixel levels 0 – 255, value = (pixel level)/255.

 

– Black level is set by the (misnamed) monitor Brightness control. The relationship is linear if gamma = 1. The chart illustrates the relationship for gamma = 1, 1.5, 1.8 and 2.2 with C = 1 and black level = 0.
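
Here is a small sketch of that equation, evaluated at the same gamma values the chart uses (with C = 1 and black level = 0, as in the chart):

def luminance(pixel_level, gamma, C=1.0, black_level=0.0, max_level=255):
    value = pixel_level / max_level           # normalize pixel level to 0..1
    return C * value ** gamma + black_level

# Mid-gray pixel (128) at the gammas shown in the chart:
for g in (1.0, 1.5, 1.8, 2.2):
    print(g, round(luminance(128, g), 3))
# gamma 1.0 -> ~0.50 (linear), gamma 2.2 -> ~0.22 (middle tones come out darker)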

 

Gamma affects middle tones; it has no effect on black or white. If gamma is set too high, middle tones appear too dark. Conversely, if it’s set too low, middle tones appear too light.

 

The native gamma of monitors – the relationship between grid voltage and luminance – is typically around 2.5, though it can vary considerably. This is well above any of the display standards, so you must be aware of gamma and correct it.

 

A display gamma of 2.2 is the de facto standard for the Windows operating system and the Internet-standard sRGB color space.

 

The old standard for Macintosh and prepress file interchange was 1.8. It is now 2.2 as well.

 

Video cameras have gammas of approximately 0.45 – the inverse of 2.2. The viewing or system gamma is the product of the gammas of all the devices in the system – the image acquisition device (film+scanner or digital camera), color lookup table (LUT), and monitor. System gamma is typically between 1.1 and 1.5. Viewing flare and other factors make images look flat at system gamma = 1.0.
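
As a quick check of those numbers, multiplying a typical camera encoding gamma by a typical uncorrected display gamma lands in that range (a rough sketch, ignoring the LUT and viewing flare):

camera_gamma  = 0.45             # typical video camera encode (inverse of 2.2)
display_gamma = 2.5              # native CRT response before correction
system_gamma  = camera_gamma * display_gamma
print(system_gamma)              # 1.125, within the typical 1.1 - 1.5 viewing gamma range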

 

Most laptop LCD screens are poorly suited for critical image editing because gamma is extremely sensitive to viewing angle.

 

More about screens

https://www.cambridgeincolour.com/tutorials/gamma-correction.htm

CRT Monitors. Due to an odd bit of engineering luck, the native gamma of a CRT is 2.5 — almost the inverse of our eyes. Values from a gamma-encoded file could therefore be sent straight to the screen and they would automatically be corrected and appear nearly OK. However, a small gamma correction of ~1/1.1 needs to be applied to achieve an overall display gamma of 2.2. This is usually already set by the manufacturer’s default settings, but can also be set during monitor calibration.

LCD Monitors. LCD monitors weren’t so fortunate; ensuring an overall display gamma of 2.2 often requires substantial corrections, and they are also much less consistent than CRT’s. LCDs therefore require something called a look-up table (LUT) in order to ensure that input values are depicted using the intended display gamma (amongst other things). See the tutorial on monitor calibration: look-up tables for more on this topic.
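
Here is a minimal sketch of what such a 1D look-up table does, assuming a hypothetical panel with a native gamma of 1.8 that should present an overall display gamma of 2.2 (the values are illustrative, not any manufacturer’s actual LUT):

# Remap 8-bit input codes so a panel with a different native response
# ends up with an overall display gamma of 2.2.
native_gamma = 1.8                                 # assumed native panel response
target_gamma = 2.2

lut = [round(255 * ((v / 255) ** (target_gamma / native_gamma))) for v in range(256)]

# Sending code v through the LUT and then the panel approximates (v/255)**2.2 overall:
v = 128
panel_out = (lut[v] / 255) ** native_gamma
print(round(panel_out, 2), round((v / 255) ** target_gamma, 2))   # both ~0.22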

About black level (brightness). Your monitor’s brightness control (which should actually be called black level) can be adjusted using the mostly black pattern on the right side of the chart. This pattern contains two dark gray vertical bars, A and B, which increase in luminance with increasing gamma. (If you can’t see them, your black level is way low.) The left bar (A) should be just above the threshold of visibility opposite your chosen gamma (2.2 or 1.8) – it should be invisible where gamma is lower by about 0.3. The right bar (B) should be distinctly visible: brighter than (A), but still very dark. This chart is only for monitors; it doesn’t work on printed media.

 

The 1.8 and 2.2 gray patterns at the bottom of the image represent a test of monitor quality and calibration. If your monitor is functioning properly and calibrated to gamma = 2.2 or 1.8, the corresponding pattern will appear smooth neutral gray when viewed from a distance. Any waviness, irregularity, or color banding indicates incorrect monitor calibration or poor performance.

 

Another test of whether one’s computer monitor is properly hardware-adjusted and can display shadow detail in sRGB images: one should see the left half of the circle in the large black square very faintly, while the right half should be clearly visible. If not, one can adjust the monitor’s contrast and/or brightness settings. This alters the monitor’s perceived gamma. The image is best viewed against a black background.

 

This procedure is not suitable for calibrating or print-proofing a monitor. It can be useful for making a monitor display sRGB images approximately correctly, on systems in which profiles are not used (for example, the Firefox browser prior to version 3.0 and many others) or in systems that assume untagged source images are in the sRGB colorspace.

 

On some operating systems running the X Window System, one can set the gamma correction factor (applied to the existing gamma value) by issuing the command xgamma -gamma 0.9 to set the gamma correction factor to 0.9, and xgamma to query the current value of that factor (the default is 1.0). On OS X systems, the gamma and other related screen calibrations are made through the System Preferences panel.

 

https://www.kinematicsoup.com/news/2016/6/15/gamma-and-linear-space-what-they-are-how-they-differ

Linear color space means that numerical intensity values correspond proportionally to their perceived intensity. This means that the colors can be added and multiplied correctly. A color space without that property is called ”non-linear”. Below is an example where an intensity value is doubled in a linear and a non-linear color space. While the corresponding numerical values in linear space are correct, in the non-linear space (gamma = 0.45, more on this later) we can’t simply double the value to get the correct intensity.
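
A small sketch of that difference, assuming a pure power-law encode with gamma 0.45 as in the example above:

encode = lambda linear: linear ** 0.45            # linear -> gamma space
decode = lambda encoded: encoded ** (1 / 0.45)    # gamma space -> linear

intensity = 0.2
print(intensity * 2)                              # 0.4: doubling in linear space is correct

# Naively doubling the *encoded* value does not double the intensity:
naive = decode(encode(intensity) * 2)
print(round(naive, 3))                            # ~0.933, far more than double

# To double the intensity while staying in gamma space, scale by 2**0.45 instead:
correct = decode(encode(intensity) * 2 ** 0.45)
print(round(correct, 3))                          # ~0.4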

 

The need for gamma arises for two main reasons: The first is that screens have been built with a non-linear response to intensity. The other is that the human eye can tell the difference between darker shades better than lighter shades. This means that when images are compressed to save space, we want to have greater accuracy for dark intensities at the expense of lighter intensities. Both of these problems are resolved using gamma correction, which is to say the intensity of every pixel in an image is put through a power function. Specifically, gamma is the name given to the power applied to the image.

 

CRT screens, simply by how they work, apply a gamma of around 2.2, and modern LCD screens are designed to mimic that behavior. A gamma of 2.2, roughly the reciprocal of 0.45, when applied to the brightened (gamma-encoded) images, will darken them, recovering the original image.

Comic bubbles template
/ reference

QR code logos
/ design

 

You are unique
/ jokes

Questions to ask the interviewer
/ production, quotes, ves

(2017) Digital Domain Holdings Reports $64 Million in Losses
/ ves

http://variety.com/2017/biz/asia/losses-at-digital-domain-holdings-double-1202020056/

Digital Domain Holdings, the Hong Kong-based visual effects and virtual reality group, saw losses in 2016 more than double to $64.3 million.

The core Digital Domain 3.0 business provided VFX for films including “Beauty and the Beast,” “Deadpool” and “X-Men: Apocalypse” during the year.

Revenues across the group increased 45% from $68 million (HK$527 million) in 2015 to $98.5 million (HK$763 million) last year. Net losses, which totaled $23.1 million (HK$179 million) in 2015, reached $64.3 million (HK$498 million) in 2016.

The company pointed to content development and research and development costs for virtual reality content and games, 360° and virtual humans, and a more than fourfold increase in amortization of intangible assets, as causes of the financial pain.

Autodesk announces new products to replace 123D
/ software

http://blog.123dapp.com/

We’re incredibly proud of these products, and even more proud of what you all have MADE with them. But we recognize that the portfolio has become complex. We are making some changes to simplify our Autodesk portfolio and workflows for people everywhere who love to make things. We are consolidating these tools and features into key apps such as Tinkercad, Fusion 360, and ReMake.

Today, we are sharing the news that in early 2017, after we complete this consolidation, we’ll be shutting down 123Dapp.com

5 Thought Experiments That Will Melt Your Brain
/ quotes

https://medium.com/pcmag-access/5-thought-experiments-that-will-melt-your-brain-bb5ab7c7fe3c#.ikhd2rsvq

 

1- the basic concept of the “Swampman” thought experiment posited by the philosopher Donald Davidson in the late-1980s. In this experiment a man is traveling through a swamp and killed by a bolt of lightning, but — by sheer chance — another bolt of lightning strikes a nearby swamp and rearranges all the organic particles to create an exact replica (including all the memories and such) of the man who was killed. The new Swampman wakes up and lives the rest of the deceased man’s life.

 

2- Achilles and the tortoise are racing at constant speeds: Very fast and very slow, respectively. At some point in the race, Achilles reaches the tortoise’s original starting point. But in the time it took Achilles to get there, the tortoise has moved forward. So, then Achilles’s next task would be to make up the new gap between himself and the tortoise, however by the time he did that, the tortoise would have again moved forward by some smaller amount. The process then repeats itself again and again. Achilles is always faced with a new (if smaller) gap to overcome. The takeaway: The great Achilles loses a race to a big dumb lumbering tortoise and no deficit is ever surmountable.

 

3- let’s say you just froze time at some point along an arrow’s trajectory. At that particular instant, the arrow is suspended in space in a single location. In any one instant of time, no motion is occurring. The arrow can only be in one place or the other and never in-between. So, how does it get from one instant to another if there is never a moment when it is in between the two places?

 

4- the question at hand is would a blind person who learned to distinguish basic shapes by touch be able to distinguish those objects when he suddenly received the power of sight? In other words, does information from one sensation translate to another, or do we associate them only in our minds?

 

https://news.psu.edu/story/141360/2006/04/17/research/probing-question-if-blind-person-gained-sight-could-they-recognize

 

5- You are on a bridge overlooking a set of trolley tracks and you notice that five people have been tied down to the tracks by a devious (and presumably moustache-twirling) villain. Then you see an out-of-control trolley barreling down the tracks that will certainly kill the unfortunate people unless someone intervenes. You realize that you are sharing your bridge with a gigantic fat man who, if you were to push him in front of the trolley, would have enough girth to stop the trolley and save the five bound people, though he will certainly be killed. You are now faced with the following options: 1) Do nothing and the five people will die, or 2) Push the fat man in front of the trolley and sacrifice him for the five people. In either scenario, are you at all culpable in these innocent people’s deaths? Should the law make any distinction?
 

Photography basics: Shutter angle and shutter speed and motion blur
/ Featured, photography

http://www.shutterangle.com/2012/cinematic-look-frame-rate-shutter-speed/

 

https://www.cinema5d.com/global-vs-rolling-shutter/

 

https://www.wikihow.com/Choose-a-Camera-Shutter-Speed

 

https://www.provideocoalition.com/shutter-speed-vs-shutter-angle/

 

 

The shutter is the device that controls the amount of light passing through the lens. Essentially, it controls the amount of time the film (or sensor) is exposed.

 

Shutter speed is how long this device stays open, which also defines motion blur: the longer it stays open, the blurrier moving objects appear in the captured image.

 

The shutter speed number (e.g. 1/48) refers to the fraction of a second the shutter stays open, and therefore to the amount of light actually allowed through.

 

As a reference, shooting at 24 fps with a 180-degree shutter angle, i.e. a 1/48 s shutter speed (about 0.0208 s exposure time), will produce motion blur similar to what we perceive with the naked eye.

 

Exposure is talked of in (shutter) angles for historical reasons, as the original exposure mechanism in film cameras was a rotating, pie-shaped mirrored shutter spinning in front of the film gate.

 

 

A 180-degree shutter blocks light for half of each rotation (half blocked, half open). A 270-degree shutter has only a quarter of the disc closed, which allows a longer exposure (three quarters open, one quarter closed). A 90-degree shutter has three quarters of the disc closed, which allows a shorter exposure (one quarter open, three quarters closed).

 

The shutter angle can be converted back and forth to shutter speed with the following formulas:
https://www.provideocoalition.com/shutter-speed-vs-shutter-angle/

 

shutter angle = (360 * fps) / shutter speed

shutter speed = (360 * fps) / shutter angle

(here “shutter speed” means the denominator x of the 1/x-second exposure, e.g. 48 for a 1/48 s exposure)

 

For example, here is a chart from shutter angle to shutter speed at 24 fps:
270 = 1/32
180 = 1/48
172.8 = 1/50
144 = 1/60
90 = 1/96
72 = 1/120
45 = 1/192
22.5 = 1/384
11 = 1/785 (approx.)
8.6 = 1/1000 (approx.)

 

The above is basically the relation between the way a video camera calculates shutter (fractions of a second) and the way a film camera calculates shutter (in degrees).
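
As a quick sanity check, here is a minimal Python sketch of that conversion, reproducing the chart above (again, “shutter speed” means the denominator x of a 1/x-second exposure):

def shutter_speed_from_angle(angle, fps=24):
    # returns x such that the exposure is 1/x of a second
    return 360.0 * fps / angle

def shutter_angle_from_speed(speed, fps=24):
    # speed is the denominator x of a 1/x-second exposure
    return 360.0 * fps / speed

for angle in (270, 180, 172.8, 144, 90, 72, 45, 22.5, 11, 8.6):
    print(angle, "=> 1/%d" % round(shutter_speed_from_angle(angle)))

print(shutter_angle_from_speed(48))   # 180.0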

Smaller shutter angles show strobing artifacts. Even with a typical 180-degree shutter, the camera only sees the scene half of the time; because the film is obscured by the shutter for the rest of each frame, the scene is not captured continuously.

 

This means that fast moving objects, and especially objects moving across the frame, will exhibit jerky movement. This is called strobing. The defect is also very noticeable during pans.  Smaller shutter angles (shorter exposure) exhibit more pronounced strobing effects.

 

Larger shutter angles show more motion blur, as the longer exposure captures more motion.

Note that in 3D you want to first sum the shutter open and shutter close values, then convert that total into the equivalent shutter angle (see the sketch after these examples), i.e.:

 

shutter open -0.0625
shutter close 0.0625
Total shutter = 0.0625+0.0625 = 0.125
Shutter angle = 360*0.125 = 45

 

shutter open -0.125
shutter close 0.125
Total shutter = 0.125+0.125 = 0.25
Shutter angle = 360*0.25 = 90

 

shutter open -0.25
shutter close 0.25
Total shutter = 0.25+0.25 = 0.5
Shutter angle = 360*0.5 = 180

 

shutter open -0.375
shutter close 0.375
Total shutter = 0.375+0.375 = 0.75
Shutter angle = 360*0.75 = 270
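
A tiny sketch of that bookkeeping (the function name is mine; the open/close values are the renderer’s shutter interval expressed as fractions of a frame):

def shutter_angle_from_open_close(shutter_open, shutter_close):
    # Total shutter time as a fraction of the frame, then expressed in degrees.
    total = abs(shutter_open) + abs(shutter_close)
    return 360.0 * total

print(shutter_angle_from_open_close(-0.0625, 0.0625))  # 45.0
print(shutter_angle_from_open_close(-0.25, 0.25))      # 180.0
print(shutter_angle_from_open_close(-0.375, 0.375))    # 270.0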

 

 

Faster frame rates can resolve both these issues.

Bicycles history graph
/ design

Sparkles and Wine by Nacho Guzman
/ music
