Engine exhausts and light

N_Molson

Addon Developer
Donator
Joined
Mar 5, 2010
Messages
9,272
Reaction score
3,244
Points
203
Location
Toulouse
I was thinking about the most realistic way to implement the lighting coming from the engine exhausts of a rocket. The way I understand it, that light comes from ionized gases generated by the extreme heat coming out of the nozzles. Then there is the diffusion thing, meaning that in the atmosphere light will bounce from molecule to molecule and create some kind of halo. This is magnified by weather conditions such as fog (when there is a lot of water in the air). So the model of "a big ball of light" (pointlight) below the nozzles is reasonably accurate.

Now, while the rocket climbs to space, the atmosphere gets thinner and thinner (we see in rocket launch videos that the exhaust's shape changes completely: as the gases are much less "contained" by the surrounding atmosphere, they tend to expand in all directions, creating large and ghostly plumes).

What should be the effect on lighting? It seems to me that with a lower "ionized gas density" below the rocket, the light coming from it should dim, right? And once in vacuum, some engines leave almost no visible exhaust plume at all (unlike what you have in, say, Kerbal Space Program and a lot of 3D animations...). So in those conditions, I mean without any visible exhaust plume, no light should come "from below the rocket", right?

Now, in vacuum, the long nozzles typically used do that "containment" job, which means the volume of gas inside those nozzles is still compressed and ionized. So, instead of a "pointlight" behind the rocket, we should have "spotlights", where each nozzle acts as a projector, right?

I'd like to have your input on this, discuss it, maybe share links to launch videos, etc... :cheers:
 

n72.75

Move slow and try not to break too much.
Orbiter Contributor
Addon Developer
Tutorial Publisher
Donator
Joined
Mar 21, 2008
Messages
2,687
Reaction score
1,337
Points
128
Location
Saco, ME
Website
mwhume.space
Preferred Pronouns
he/him
On my phone right now, so I can't write as much as I'd like. I'll add more on this subject later.

The luminosity of the exhaust plume should decrease with altitude, as the density of the plume decreases and the exhaust goes from over-expanded to under-expanded. The light from the chamber should remain relatively unchanged: it's upstream of the sonic throat, so downstream changes in pressure should leave it virtually unaffected.
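(In other words: the plume is over-expanded when the nozzle exit pressure is below ambient and under-expanded when it is above, so a fixed nozzle inevitably crosses from one regime to the other as the ambient pressure drops during ascent; the chamber, sitting upstream of the choked throat, doesn't feel any of that.)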
 

N_Molson

Addon Developer
Donator
Joined
Mar 5, 2010
Messages
9,272
Reaction score
3,244
Points
203
Location
Toulouse
So yeah, I'd say it should pretty much follow the ParticleStreams ATM_LOG setting: as you say, the light is emitted from the exhaust stream we simulate with the ParticleStream mechanism, and the LOG function works rather well there.
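To illustrate what I mean, something like this (just a sketch - the reference pressures are arbitrary tuning values, and "ATM_LOG" here only names the idea, not an exact SDK constant):

Code:
// Sketch only: a logarithmic falloff of exhaust-light level with ambient
// pressure, in the spirit of the log-type particle stream mapping above.
// The reference pressures are arbitrary tuning values, not SDK constants.
#include <cmath>

double ExhaustLightLevel (double p_ambient)    // ambient static pressure [Pa]
{
    const double p_vac = 1e2;      // below ~100 Pa: treat as vacuum, no glow
    const double p_sl  = 101.4e3;  // sea-level pressure: full intensity
    if (p_ambient <= p_vac) return 0.0;
    if (p_ambient >= p_sl)  return 1.0;
    return std::log (p_ambient / p_vac) / std::log (p_sl / p_vac);  // 0..1
}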
 

N_Molson

Addon Developer
Donator
Joined
Mar 5, 2010
Messages
9,272
Reaction score
3,244
Points
203
Location
Toulouse
So I managed to set the light intensity as a function of atmospheric pressure, and the result is quite interesting: above 20 km it dims very quickly, which gives a visual clue that you are transitioning from atmosphere to space. (y)
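For reference, the coupling looks roughly like this (a sketch only: it assumes Orbiter's VESSEL3 light-emitter interface (AddPointLight / LightEmitter::SetIntensity) and GetAtmPressure(); MyRocket, exhaust_light and the position/attenuation/colour numbers are placeholders, and ExhaustLightLevel() is the log mapping sketched in my previous post):

Code:
// Sketch: scale an exhaust point light with ambient pressure every frame.
void MyRocket::clbkPostCreation ()
{
    COLOUR4 col_d = {1.0f, 0.9f, 0.7f, 0.0f};   // warm diffuse colour
    COLOUR4 col_s = {1.0f, 0.9f, 0.7f, 0.0f};   // specular colour
    COLOUR4 col_a = {0.0f, 0.0f, 0.0f, 0.0f};   // no ambient term
    exhaust_light = AddPointLight (_V(0, 0, -12), 200.0,
                                   1e-3, 0, 2e-3, col_d, col_s, col_a);
}

void MyRocket::clbkPreStep (double simt, double simdt, double mjd)
{
    double lvl = GetThrusterGroupLevel (THGROUP_MAIN)
               * ExhaustLightLevel (GetAtmPressure ());
    exhaust_light->SetIntensity (lvl);          // dim as the air thins out
}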
 

4throck

Enthusiast !
Joined
Jun 19, 2008
Messages
3,502
Reaction score
1,008
Points
153
Location
Lisbon
Website
orbiterspaceport.blogspot.com
Here you can get an idea of how much light is generated by the exhausts:

Separation is at around 3:49, but you can also see how it illuminates the pad at lift-off.
Sharing a SpaceX video because their cameras are more consistent.
 

Thorsten

Active member
Joined
Dec 7, 2013
Messages
785
Reaction score
56
Points
43
Been working on that quite a lot myself, so I have a well-developed set of ideas...

Then there is the diffusion thing, meaning that in the atmosphere light will bounce from molecule to molecule and create some kind of halo.

In theory - yes. In practice, however, the only light source that creates appreciable scattering on air molecules is the sun (that's what makes the sky blue, and the reason why shadows on a sunny day are not grey but bluish). A rocket exhaust produces far too little light to visibly illuminate the atmosphere, except...


This is magnified by weather conditions such as fog (when there is a lot of water in the air).

... when the atmosphere is optically thick (in clean air you can see some 300 km at sea level, in heavy fog it might be merely a couple of meters). But the scattering in fog tends to be odd - that's usually Mie scattering, which is pretty directional - the fog lights up when it can do forward scattering. That's why less dense clouds with the sun behind them acquire spectacular colors, but when the sun is elsewhere the same clouds look pretty unassuming.

There's no simple way to get Mie scattering properly implemented - it has to be part of a rendering framework - but I'd argue that, to a first approximation, a rocket exhaust seen through fog isn't the most important thing you need to worry about.
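(For completeness: the usual way to approximate that directionality is the Henyey-Greenstein phase function, p(θ) = (1 - g²) / (4π (1 + g² - 2g cosθ)^(3/2)), where the asymmetry parameter g says how strongly forward-peaked the scattering is - g = 0 is isotropic, g close to 1 throws nearly all the light forward.)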

Now, while the rocket climbs to space, the atmosphere gets thinner and thinner (we see in rocket launch videos that the exhaust's shape changes completely: as the gases are much less "contained" by the surrounding atmosphere, they tend to expand in all directions, creating large and ghostly plumes).

That's, in my opinion, the most important change as you climb from the ground to space - the exhaust plume widens (and, browsing through a screenshot collection of many space apps, games, etc., most commercial developers seem blissfully unaware of the effect...).



It seems to me that with a lower "ionized gas density" below the rocket, the light coming from it should dim, right?

Yes, a bit. But the plume doesn't dim dramatically. You still see lots of flame from the Shuttle SRBs at the separation point, and that's 45 km in altitude - more than five atmospheric scale heights, so most of the density is already well gone.

Say the plume expands by a factor of 10. The light intensity you perceive is the line integral through the plume, so the expansion along your view direction doesn't matter for the light you see; you lose just about sqrt(10), or a factor of ~3. But the eye perceives intensities logarithmically, so a factor of three under the perception log isn't really that much of an effect - it dims slightly. The sky-blue background, however, dims a lot, which gives you better contrast, so in the end it depends more on where your viewpoint is.
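(Spelling out the arithmetic, assuming the factor of 10 refers to the plume's cross-section: the perceived brightness goes with the column density, i.e. plume density times path length along the line of sight. If the cross-section grows by 10 at roughly constant length, the density drops by ~10 while the path length through the plume only grows by ~sqrt(10), so the column density falls by 10/sqrt(10) = sqrt(10) ≈ 3.)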

Also, if you want to compare with videos, photos, etc., a recurring theme is that one has to be careful, because the low-light characteristics of the human eye and of a camera are rather different. I've had more than one discussion about afterburner flames at night with people pulling very bright photos from the internet - while the consulted military aviator confirmed that this isn't how you actually see the exhaust. Generally, video and photo comparisons by day are far more trustworthy.
 

N_Molson

Addon Developer
Donator
Joined
Mar 5, 2010
Messages
9,272
Reaction score
3,244
Points
203
Location
Toulouse
Good post, thank you!

Yeah, obviously at night things are different. I'd say cameras saturate, just like microphones do, and tend to output a big ball of white light, while the eye adapts better.

I must say I have never seen a rocket launch in person; that would involve traveling to Florida, French Guiana or Baikonur, or maybe China if they make launches public. All those destinations are quite far from mainland France, and the trip would be quite costly. One day...

Another interesting, almost philosophical question is: what do we want to simulate, cameras or the human eye? I'd say both: we want things like rocket cams and 'god-mode' external views... Those are choices to make...
 

DaveS

Addon Developer
Donator
Beta Tester
Joined
Feb 4, 2008
Messages
9,429
Reaction score
680
Points
203
Well, something most games actually lack is a good simulation of an Automatic Lighting Control (ALC). The ALC is what controls the opening of the iris on a camera. When in auto mode, the ALC will automatically adjust the iris opening to prevent what you describe, which is oversaturation or over-exposure. It also prevents under-exposure.
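In renderer terms that behaviour is basically an auto-exposure loop. A minimal sketch of the idea (purely illustrative; the target luminance and time constant are made-up numbers):

Code:
// Sketch of a simple automatic-exposure loop, as a stand-in for an ALC:
// drive the exposure gain so the scene's average luminance converges
// on a target value, with a lag that mimics the iris reaction time.
#include <algorithm>

struct AutoExposure {
    double exposure = 1.0;                       // gain applied to the frame

    void update (double avgSceneLum, double dt)  // dt in seconds
    {
        const double targetLum = 0.18;           // "18% grey" target
        const double tau       = 0.5;            // iris reaction time [s]
        double wanted = targetLum / std::max (avgSceneLum, 1e-6);
        exposure += (wanted - exposure) * std::min (dt / tau, 1.0);
    }
};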
 

N_Molson

Addon Developer
Donator
Joined
Mar 5, 2010
Messages
9,272
Reaction score
3,244
Points
203
Location
Toulouse
I guess that would be work for our gifted friend Jarmonik... But he's doing so much already, I'll leave him alone ;)
 

4throck

Enthusiast !
Joined
Jun 19, 2008
Messages
3,502
Reaction score
1,008
Points
153
Location
Lisbon
Website
orbiterspaceport.blogspot.com
Some games support HDR monitors/TVs, so you can certainly work based on physics and nit values.
But you'd need to use physical values for the whole Solar System, every material, etc.

My suggestion is to simply simulate what regular SDR cameras show....
 

Thorsten

Active member
Joined
Dec 7, 2013
Messages
785
Reaction score
56
Points
43
Another interesting, almost philosophical question is: what do we want to simulate, cameras or the human eye?

Well, I've always preferred what the human eye can see (my theme in simulation is immersion, I want the feeling of 'being there') - so in designing the ALS rendering framework for Flightgear, I've consistently banned effects like lens flare or motion blur, which are really characteristic of cameras, and instead have used a human-eye-tuned model for the mapping of light source intensity to pixel color. Apart from the very bright end (a screen conceptually can't blind you the way the real sun can when you look directly into the virtual sun) and the very dim end (when perception transits from the focus area of the eye to the periphery, light sensitivity increases a lot and the perception formula no longer behaves like a log), that works quite well.
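As an illustration of the general shape of such a mapping (not the actual ALS formula, just a log-like response clamped at the bright end):

Code:
// Sketch of a log-like intensity-to-pixel mapping: roughly logarithmic over
// the mid range, clamped at the bright end where a monitor simply cannot
// get any brighter. Not the ALS formula, just the general shape.
#include <algorithm>
#include <cmath>

double PerceivedBrightness (double intensity, double intensity_white)
{
    // intensity_white: the light level that maps to full pixel brightness
    double x = std::log1p (intensity) / std::log1p (intensity_white);
    return std::clamp (x, 0.0, 1.0);             // 0..1 pixel value
}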

Though I admit lens flare etc. is really popular with some people; that kind of 'game aesthetics' is what many are used to.

Whatever you decide to do, the trick is to do it consistently across the whole scene, otherwise the effect will be much reduced.

When in auto mode, the ALC will automatically adjust the iris opening to prevent what you describe, which is oversaturation or over-exposure. It also prevents under-exposure.

But human eye perception doesn't work that way - you know that the color of something didn't change just because the window shutter opened, so the brain corrects for the pupil reflex much more smoothly than a camera - in reality the light intensity can easily change by a factor of 100,000 and you still see pretty much the same color. When a camera closes the iris, you see a very visible change in the scene in the footage (I've struggled with that often enough, because more often than not we don't want the scene to change color when panning the camera, and it's a bitch to correct in post-processing...).
 

n72.75

Move slow and try not to break too much.
Orbiter Contributor
Addon Developer
Tutorial Publisher
Donator
Joined
Mar 21, 2008
Messages
2,687
Reaction score
1,337
Points
128
Location
Saco, ME
Website
mwhume.space
Preferred Pronouns
he/him
One other thing to remember, speaking of the difference between the human eye and cameras: in old films of rocket launches, the exhaust can look a bit pinkish due to halation.
 

Linguofreak

Well-known member
Joined
May 10, 2008
Messages
5,017
Reaction score
1,254
Points
188
Location
Dallas, TX
Well, I've always preferred what the human eye can see (my theme in simulation is immersion, I want the feeling of 'being there') - so in designing the ALS rendering framework for Flightgear, I've consistently banned effects like lens flare or motion blur

I take it you don't wear glasses? :) Of course, lens flare from dirty glasses doesn't tend to be quite as spectacular as camera lens flare, but it is a thing.

Motion blur is also something that you definitely get with the human eye, the big difference being that human vision doesn't work frame-by-frame. Blur can help avoid frame-based artifacts like wagon-wheeling.

It just struck me: I've generally found refresh rates on displays in excess of 60 Hz rather silly, since I don't tend to notice framerate dips until things get below ~20 Hz. But a good way of simulating the frameless nature of human vision would be to not update the whole screen at once and instead stochastically choose which pixels to update every frame - and that strategy will probably work better the higher your framerate is. So 120+ Hz may not be so silly after all.
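A toy sketch of that update scheme (purely illustrative; a real renderer obviously wouldn't produce a full new frame just to copy a fraction of it):

Code:
// Toy sketch of the stochastic-update idea: each frame, copy only a random
// subset of pixels from the freshly rendered frame into the displayed buffer,
// so individual pixels update at different times rather than all at once.
#include <cstdint>
#include <random>
#include <vector>

void StochasticUpdate (std::vector<std::uint32_t> &displayed,
                       const std::vector<std::uint32_t> &rendered,
                       double fraction,          // e.g. 0.25 at high refresh rates
                       std::mt19937 &rng)
{
    std::uniform_real_distribution<double> pick (0.0, 1.0);
    for (std::size_t i = 0; i < displayed.size (); ++i)
        if (pick (rng) < fraction)
            displayed[i] = rendered[i];          // this pixel updates this frame
}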
 