Using a high lumen projector instead of a flat screen

If you are thinking about getting a projector instead of a flat screen, the basic math works like this: a 1,000-lumen projector spreading its light over one square meter gives you 1,000 lux, because 1 lux is 1 lumen per square meter. But there are lots of ways to measure what a projector throws out as a light source, and lots of different units for what we actually see on the screen.

The basic idea is that lumens (or nits for a flatscreen) measure what is emitted and lux measures what is reflected off a screen and seen by the eye.

But of course there are complexities. First: what is the projector pumping out? The first figure is "raw lumens" from the bulb, which lots of folks advertise. Then there are ANSI lumens, which are lower because they measure only the visible light under a standardized test. Finally, that light gets processed and reduced on its way through the optics, so the output can be 30% of the raw figure: a 6,000-lumen light source could end up more like 1,000-2,000 ANSI lumens. This is the measure for the point source that is a projector.

So if you have a 12’ x 9’ screen (about 3.7 x 2.7 meters, or roughly 10 square meters), you need 3,000 lumens minimum to cover it, and ideally a 4,500-lumen projector to get good brightness: that works out to about 300 to 450 lux. So the actual brightness depends on how big a screen you are projecting onto.
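The screen-size math above can be sketched in a few lines of Python. This is a rough sketch: it assumes the projector's lumens spread evenly over the screen and ignores screen gain and bulb aging.

```python
# Rough lux estimate for a projector filling a screen.
# Assumption: ANSI lumens spread evenly over the screen area.

FEET_TO_METERS = 0.3048

def screen_lux(ansi_lumens: float, width_ft: float, height_ft: float) -> float:
    """Illuminance (lux) = lumens / screen area in square meters."""
    area_m2 = (width_ft * FEET_TO_METERS) * (height_ft * FEET_TO_METERS)
    return ansi_lumens / area_m2

# A 12' x 9' screen is about 10 square meters:
print(round(screen_lux(3000, 12, 9)))  # 299 lux
print(round(screen_lux(4500, 12, 9)))  # 448 lux
```

The takeaway: the same projector gets dimmer per square meter the bigger you blow up the picture.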

Moreover, there are a lot of different ways of measuring the brightness as reflected off a screen. Lux, as you saw above, is simply lumens per square meter.

A nit, from the Latin nitere, "to shine," is the unit usually used for a flat panel. It is measured in candelas per square meter of the panel. What's a candela? Roughly the light of a single spermaceti (whale-wax) candle from the 1800s. So a 1,000-nit screen pumps out 1,000 candelas per square meter. And because a flat screen emits its own light rather than bouncing a projector's light, its brightness is quoted directly in nits instead of being derived from lux. Note that lumens measure total light output, whereas nits measure brightness per unit of area.

Nits for flat screens (vs Lux)

For an outdoor LED sign, as an example, 800-1,500 nits works in shade. What is a nit? It is the light of one candle (a candela) radiating into one square radian (a steradian) of viewing angle, spread over one square meter of surface. Because it is measured per unit of viewing angle, a screen's nits don't actually change with distance: a screen two miles away looks just as bright per unit of area, it just takes up a much smaller part of your field of view.

Technically, a candela is defined the same way: it is 1 lumen per steradian. As an aside, the nit is not an official SI unit but is the common shorthand for 1 cd/m². Confused yet? So a nit can also be written as 1 lumen per steradian per square meter, and the whole system is calibrated at a specific frequency near green (540 THz, about 555 nm), where the eye is most sensitive. A lumen, then, is a measure of visible light output, and a candela is that much light concentrated into a patch of sphere one radian by one radian.

A lux is a lumen falling on a square meter. You can picture it as a one-square-meter sensor placed one meter from a point source: at that distance the sensor covers exactly one steradian, one radian by one radian (about 57 x 57 degrees), so a one-candela source delivers exactly one lux to it.
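That picture also gives you the inverse-square law for free: a candela is 1 lumen per steradian, and at distance d meters one steradian covers d² square meters. A minimal sketch:

```python
# Illuminance from a point source: a candela is 1 lumen per steradian,
# and at distance d meters one steradian covers d^2 square meters,
# so lux = candelas / d^2 (the inverse-square law).

def point_source_lux(candelas: float, distance_m: float) -> float:
    return candelas / distance_m ** 2

print(point_source_lux(1, 1))  # 1 candela at 1 m -> 1.0 lux
print(point_source_lux(1, 2))  # twice as far -> a quarter the light: 0.25 lux
```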

To get a sense of brightness: a moonless night is 0.001 lux, an ordinary living room is 50 lux, office lighting is 300-500 lux, an overcast day is 1,000 lux, full daylight is 10,000 lux, and direct sunlight is 32K-100K lux. Woo, that's a lot.

The final complexity is that nits are measured per unit of viewing angle, so technically light per steradian. (A steradian is a "square radian," a unit you've probably never heard of; there are 4π of them in a full sphere.) For an ideal matte surface the conversion is simple: to go from nits back to lux, multiply by π, so 1 nit corresponds to about 3.14 lux. Office lighting of 300-500 lux, for instance, bounces off white paper at only about 100-160 nits (candelas per square meter).
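The π conversion can be sketched in code. This assumes an ideal matte (Lambertian) surface that reflects 100% of the light; real surfaces reflect less, which the reflectance parameter lets you dial in:

```python
import math

# Converting between illuminance (lux) falling on a matte surface and
# the luminance (nits) it sends back toward the viewer.
# Assumption: an ideal Lambertian surface, 100% reflective by default.

def lux_to_nits(lux: float, reflectance: float = 1.0) -> float:
    return lux * reflectance / math.pi

def nits_to_lux(nits: float) -> float:
    return nits * math.pi

print(round(nits_to_lux(1), 2))  # 1 nit corresponds to ~3.14 lux
print(round(lux_to_nits(500)))   # 500-lux office light off white paper: 159 nits
```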

And since we are nerding out: a foot-candle is the light of a single candle from one foot away (1 lumen per square foot) and is equal to 10.76 lux.

For a sign in full sun you would want 2,000-3,500 nits, and ultra-bright displays run 4,500-6,000.

So what about phones? As an example, the iPhone 12 and 13 are rated at 625 and 800 nits respectively, while the Pro versions run 800 to 1,000 nits. So you can see why it's hard to read your phone screen in bright sunlight.

And as a comparison, let's look at TVs. The very good LG OLED G1 hits about 870 nits, versus 780 for last year's GX, measured on 10% of the screen. So you can see why the recommendation is to use OLED in darkened rooms. In contrast, the Samsung QN90A LED hits 1,500 nits, so it is more suitable for brighter rooms, although not direct sunlight. The reason the measure is taken on a 10% window is that brightness falls if the panel has to light the entire screen (unusual for movies): full-screen brightness drops to 175 nits for the LG and 700 for the Samsung. Of course, OLED has incredible blacks, since OLED pixels can go fully dark, unlike LED backlights. The standard for good HDR is 1,000 nits.

So, for instance, to figure out what you need for good HDR from a projector: 1,000 nits means about 3,141 lux on the screen.

There's a great chart that gives you a sense of how many lumens you need for a given screen size. As BenQ puts it, to get 600 nits with a projector on a 120-inch-diagonal screen you are going to need a 12,000-lumen projector. Wow! And that is still less than what you need for a 1,000-nit HDR picture, which is why projectors that really do HDR have to be serious projectors.
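Putting the pieces together, the idealized version of that calculation looks like this (a sketch assuming a unity-gain matte screen and a 16:9 aspect ratio; real-world figures like BenQ's run higher because they build in optical and screen losses):

```python
import math

# How many projector lumens for a target screen brightness in nits?
# lumens = nits * pi * screen area (assumption: unity-gain matte screen).

def required_lumens(target_nits: float, diagonal_in: float,
                    aspect_w: float = 16, aspect_h: float = 9) -> float:
    diag_m = diagonal_in * 0.0254          # inches to meters
    ratio = math.hypot(aspect_w, aspect_h)  # diagonal of the aspect triangle
    width_m = diag_m * aspect_w / ratio
    height_m = diag_m * aspect_h / ratio
    return target_nits * math.pi * width_m * height_m

# 600 nits on a 120-inch 16:9 screen (about 2.66 m x 1.49 m):
print(round(required_lumens(600, 120)))  # roughly 7,500 lumens in an ideal world
```

The gap between this idealized ~7,500 and a quoted 12,000 lumens is the price of real optics, real screens, and real bulbs.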

As an example, the $65K Sony VPL-VW5000ES is an incredible 5,000 lumens. The thing weighs 95 pounds and is pretty much state of the art. Or, for the billionaires, there is the VPL-GTZ180 at $88K, an astonishing 10K lumens.
