Interesting fact of the day: Any source of light that doesn't appear as a single point (i.e. anything other than a point source) will have the same brightness no matter what distance you happen to be from it.
In fact, in a sense, the opposite is true: once you get close enough to an object that you can no longer see the whole object within your field of view, it will get **less** bright as you get closer.
Amazingly this even applies to the Sun. If you were at the surface of the Sun, just a few feet away (such that one square meter of the surface of the Sun filled your field of vision), it would only appear to be 93 lumens bright. That would be equivalent to only a 6 watt incandescent light bulb! Compare that to the brightness of the Sun from Earth, which is a whopping 127,000 lumens.
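(A quick sanity check of the bulb comparison, as a minimal sketch; the ~15 lm/W luminous efficacy for incandescent bulbs is my assumed round figure, not from the post:)

```python
# Rough check: is a 6 W incandescent bulb really about 93 lumens?
LUMENS_PER_WATT_INCANDESCENT = 15  # assumed typical luminous efficacy

bulb_watts = 6
print(bulb_watts * LUMENS_PER_WATT_INCANDESCENT)  # 90 lumens, close to the 93 quoted
```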
I am talking about perceived brightness at the eye... and yes, it's very much as I described; it has to do with the pixelated nature of the eye.
@freemo Also, on the surface of the Sun, the area from which light reaches you is 180°×180°, way more than the 0.5°×0.5° as seen from Earth. So you will be toast within a fraction of a second, although a 0.5°×0.5° piece of the Sun would have the same brightness as seen from Earth.
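(A rough numeric sketch of that solid-angle comparison; the 0.265° angular radius of the Sun as seen from Earth is an assumed standard value:)

```python
import math

theta = math.radians(0.265)  # assumed angular radius of the Sun seen from Earth
omega_from_earth = 2 * math.pi * (1 - math.cos(theta))  # solid angle of the solar disk
omega_at_surface = 2 * math.pi                          # a full hemisphere of glowing surface

print(omega_from_earth)                     # ~6.7e-5 steradians
print(omega_at_surface / omega_from_earth)  # ~90,000x the solid angle, at the same surface brightness
```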
Incorrect, but only because you seem to have misunderstood what was asserted.
@freemo Well, you will certainly be toast very quickly, but you didn't define what "brightness" means, so the post was ambiguous. To a layman, brightness usually means how brightly something is lit by the light source (at a particular distance, that is). Surface brightness is a concept that needs to be introduced.
@AlphaCephei Yes, I should have said "perceived brightness of an object" to be more clear.
Little side note... you are right, of course, that you'd burn up, but that brings up an unrelated interesting fact:
The Sun is very poor at generating heat. In fact, the human body outputs significantly more energy per cubic meter than the Sun does.
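(A back-of-envelope check, assuming round figures of ~100 W resting output and ~0.07 m³ volume for a human, plus the standard solar luminosity and radius:)

```python
import math

SUN_LUMINOSITY_W = 3.8e26   # total power output of the Sun
SUN_RADIUS_M = 7.0e8
HUMAN_POWER_W = 100         # assumed resting metabolic output
HUMAN_VOLUME_M3 = 0.07      # assumed body volume (~70 kg at water density)

sun_volume = (4 / 3) * math.pi * SUN_RADIUS_M ** 3
print(SUN_LUMINOSITY_W / sun_volume)    # ~0.3 W per cubic meter, averaged over the whole Sun
print(HUMAN_POWER_W / HUMAN_VOLUME_M3)  # ~1400 W per cubic meter for the body
```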
Another little relevant point... I did use lumens, and while your point is fair, my wording wasn't poor: lux measures the light falling per unit area on a surface, while lumens measure the total amount of light falling on a surface, which in this case implied the eye.
@freemo True, the heat production of a particular volume of the solar core has been likened to that of a dunghill of the same volume. The Sun is very ineffective at fusing hydrogen; that's why it lasts so long. But it is also a very big dunghill, so to say...😁
@AlphaCephei haha yup... big dung hill, minimized surface area (as a sphere), poor heat transfer at certain layers AND is sitting in a near-vacuum... so the heat builds up over time.
Another interesting fact: did you know the Sun is not massive enough to cause fusion at all if it weren't for quantum tunneling? Tunneling gives the nuclei just enough chance of overcoming their electrical repulsion to allow the process to begin at all.
@freemo Perceived brightness of a *particular viewing angle smaller than the viewing angle of the object*. The object in its whole expanse, as long as it fits the field of view, does in fact get brighter when it is closer, because the viewing angle increases. And therefore, it lights up closer objects (say, Mercury) more than objects farther away (e.g. Saturn).
> Perceived brightness of a *particular viewing angle smaller than the viewing angle of the object*.
This was already stated in the OP when I said the following:
> such that one square meter of the surface of the Sun filled your field of vision
> The object in its whole expanse, as long as it fits the field of view, does in fact get brighter when it is closer,
Incorrect, but a common misconception.
As you get closer to an object, the object fills a larger portion of your retina. This will cause the object to transmit more overall energy to your eye, but the **same** energy per receptor of the eye. Therefore the brightness of the object appears the same; there are simply more of your pixels detecting it.
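(A minimal numeric sketch of that cancellation, in arbitrary units:)

```python
# Total flux from an extended object falls off as 1/d^2, but so does the
# solid angle it covers on the retina, so flux per receptor is constant.
for d in [1.0, 2.0, 10.0]:
    total_flux = 1.0 / d ** 2      # inverse-square falloff of received energy
    apparent_area = 1.0 / d ** 2   # retinal patch covered shrinks the same way
    print(d, total_flux / apparent_area)  # always 1.0: same energy per receptor
```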
@freemo That's surface brightness again. Each retina cell receives the same amount of light, but more cells receive light, so the *object* gets brighter *overall*. The projection of the Sun inside the eye is a different thing from the illumination of a piece of surface like a planet.
I know all that. I know that a telescope will not make the Andromeda galaxy appear brighter per unit area than the naked eye does, but it will increase contrast against the background, and large objects are easier to see than small ones.
@freemo I'm dropping out here, it's 1:12 am, need to go to bed now. Have a good night and clear skies!🌌
@AlphaCephei have a great night, thanks for the wonderful chat.
@AlphaCephei The illuminated piece of surface here is the opening of the eye, not a planet.
And yea, a telescope would make Andromeda brighter to the eye, since it would collect more light and focus it into that same area... it would be like having a larger retina.
We might be misconnecting again.
@freemo Nope. This only holds for point sources. Objects get brighter in telescopes the less magnification is used (more light in less area). However, the *exit pupil* of the telescope also gets larger as magnification decreases. An exit pupil larger than the eye pupil wastes light outside the eye pupil. The optimal magnification (Mo) for aperture D is at exit pupil = eye pupil diameter (≈7mm).
Exit pupil = D/M, so Mo = D/7mm. A telescope of aperture D gathers (D/7mm)^2 more light, but extends the object's apparent area by Mo^2 = (D/7mm)^2, which cancels. See
https://www.rocketmime.com/astronomy/Telescope/SurfaceBrightness.html
@freemo Therefore, D = N×7mm yields the same surface brightness as D = 7mm, only at N× magnification. This is the reason why there are 7x50 glasses but no 6x50. It's basically the same principle as in your initial post: telescopes make things appear closer. That's why I thought we were on the same line here. It's different for photography, where exposure time depends on f-stop number (focal length/D), or for point sources, where all light ends up in one retina cell.
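(A small sketch of that cancellation; the 7 mm eye pupil and the 140 mm example aperture are assumed values:)

```python
EYE_PUPIL_MM = 7.0  # assumed dark-adapted eye pupil diameter

def optimal_magnification(aperture_mm):
    """Lowest useful magnification: exit pupil (D/M) matches the eye pupil."""
    return aperture_mm / EYE_PUPIL_MM

D = 140.0                             # hypothetical aperture in mm
Mo = optimal_magnification(D)         # 20x
light_gain = (D / EYE_PUPIL_MM) ** 2  # 400x more light gathered...
area_gain = Mo ** 2                   # ...spread over 400x the apparent area
print(light_gain / area_gain)         # 1.0: surface brightness is unchanged
```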
I've known telescopes for ~50 years; I have 3.
@freemo I have even studied astronomy for a couple of semesters. I would have become an astronomer if computer science, which I also loved, hadn't been more promising financially. But I'm still an amateur astronomer and write articles on astronomical topics. I know what I'm talking about.
@AlphaCephei I think we are getting bogged down with technical terms... let me put this in a technical but practical way.
If you have a camera in an otherwise dark scene with a large bonfire (a non-point source), the exposure time, ISO, etc. would be the same no matter the distance you are from the fire. In addition, the light meter would measure the same brightness of the fire no matter your distance from it.
@freemo Correct, as the surface brightness is constant, which is what the light meter measures. You could also take the Moon as the bonfire. 1/250 s at f/16 and ISO 100 will give you a nicely exposed image. You could use the same settings on Earth (in sunshine) - or on the surface of the Moon.
However, the *overall* brightness of the Moon as seen from Earth is barely enough to read a newspaper headline, while on the Moon, you could easily read the newspaper in reflected moonlight.
@freemo Not entirely true. You are confusing (overall) brightness with surface brightness. Surface brightness, that is, light power received per unit solid angle, does not decrease with distance, but since the apparent luminous area decreases with distance squared, so does overall brightness = solid angle × surface brightness. For this reason, photographing the Moon (same distance from the Sun as Earth) requires the same exposure settings as a sunlit landscape, but Saturn (at 10× the distance) requires 100× more (+6.6 EV).
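(Checking that EV arithmetic, as a quick sketch; one EV step is a factor of 2 in light:)

```python
import math

distance_ratio = 10                 # Saturn is ~10x farther from the Sun
light_ratio = distance_ratio ** 2   # inverse-square law: 100x less light
print(math.log2(light_ratio))       # ~6.64 EV, matching the "+6.6 EV" above
```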