Rendering the Sky, Sunsets, and Planets
There’s this photo that’s been sitting on my inspiration board for a while: the space shuttle Endeavour, suspended in low Earth orbit at sunset. It shows Earth’s upper atmosphere as a backdrop, featuring beautiful, colorful layers ranging from dark orange to blue before fading away into the deep black of space. Not only is that gradient of color aesthetically pleasing, but the phenomenon behind those colors, atmospheric scattering, is an even more interesting topic once you start looking into how it works and how to reproduce it. I wanted to build my own version of this effect with shaders, rendering the sky’s distinctive blue color and realistic sunsets and sunrises directly in the browser. The goal was to get as close as I could to that photo, while also moving toward the kind of atmospheric rendering often seen in games and other shader-based media. Here’s a compilation of what came out of this month-long journey, all running in real time.

I didn’t originally plan on writing about this subject, but the enthusiasm around the recent Artemis II mission, combined with my own interest in all things space, made it feel worth exploring in depth. It also felt like the perfect opportunity to build an interactive experience that could make the topic more accessible.

In this write-up, we’ll see how to implement an atmospheric scattering shader as a post-processing effect, step by step, starting with the implementation of the different building blocks (raymarching, Rayleigh and Mie scattering, as well as ozone absorption) to render a realistic sky dome, and then adapting the result to render it as an atmospheric shell around a planet. Finally, we'll look into Sébastien Hillaire’s LUT-based approach for a more performant result, or at least my attempt at implementing it, as this was very much the stepping-outside-of-my-comfort-zone phase of this project. I'll link the key references I relied on throughout the article as they come up.

You may have, at some point or another, tried to slap a blue gradient background behind some of your work in an attempt to give it a more "atmospheric" look and call it a day, but quickly noticed that doing so never feels quite right [1]. For a more true-to-life implementation, we must treat the sky and its color as the result of light interacting with air and its constituents, while taking into account several variables, such as the altitude of the observer, the amount of dust, and the time of day, all of that in a volume. With that established, our goal for this first part is to use this as a guiding principle to lay the foundation for our atmosphere shader, and get to a result that feels almost indistinguishable from a real sky at any time of the day.

Much like how we’d approach volumetric clouds or volumetric light, one easy way to sample the atmosphere is through raymarching. We can cast rays from the camera’s position into the scene and step through the transparent medium to answer two questions: how much atmosphere does light travel through along the ray, and how much of that light ends up scattered toward the viewer? If you need a quick refresher on raymarching with some simple examples, I invite you to check out Painting with Math: A Gentle Study of Raymarching.

To answer the first one, we need to accumulate the atmospheric density encountered along the ray to obtain what is known as the optical depth. We will model this using the Rayleigh density function, which tells us how much "air" there is at a given altitude h. This is important because it captures the fact that the atmosphere gets thinner as altitude increases.
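As a concrete starting point, here is a minimal GLSL sketch of what sampling that density and accumulating optical depth can look like. The scale height H_R and the altitudeAt helper are assumptions for illustration, not my exact shader code:

```glsl
// Rayleigh scale height in km: density drops by ~63% every H_R (assumed value)
const float H_R = 8.0;

// How much "air" there is at a given altitude h (in km)
float rayleighDensity(float h) {
  return exp(-h / H_R);
}

// Accumulate optical depth along a ray with a simple fixed-step march.
// altitudeAt() is an assumed helper returning the altitude above the ground.
float accumulateOpticalDepth(vec3 rayOrigin, vec3 rayDir, float rayLength, int steps) {
  float stepSize = rayLength / float(steps);
  float opticalDepth = 0.0;
  for (int i = 0; i < steps; i++) {
    vec3 samplePos = rayOrigin + rayDir * (float(i) + 0.5) * stepSize;
    opticalDepth += rayleighDensity(altitudeAt(samplePos)) * stepSize;
  }
  return opticalDepth;
}
```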
Sampling Rayleigh density and accumulating optical depth

Then, from the optical depth, we can compute the transmittance T at a given point along the ray: the fraction of light that survives while traveling through the atmosphere. If you’ve read my article on volumetric clouds [2], we’re using a formula that may look familiar for this: Beer's law. The rayleighBeta variable, or Rayleigh scattering coefficient, tells us how much red, green, and blue light gets scattered by air molecules over a given distance. In shader code, we store it as vec3(0.0058, 0.0135, 0.0331). With this in place, we can now describe how light is attenuated as it travels through the atmosphere.

However, density and transmittance only tell us how much light is available to scatter, not how that light is distributed toward the viewer. For that, we need to account for the angle between the incoming sunlight and the view ray, which is what the Rayleigh phase function models.
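Both pieces are compact enough to sketch directly in GLSL. This is a minimal version, assuming the optical depth computed above; the exact normalization may differ from implementation to implementation:

```glsl
const float PI = 3.14159265359;

// Rayleigh scattering coefficients for red, green, and blue (from the article)
const vec3 rayleighBeta = vec3(0.0058, 0.0135, 0.0331);

// Beer's law: the fraction of light that survives a path of given optical depth
vec3 transmittance(float opticalDepth) {
  return exp(-rayleighBeta * opticalDepth);
}

// Rayleigh phase function: how scattered light is distributed relative to the
// angle between the sun direction and the view ray (cosTheta = their dot product)
float rayleighPhase(float cosTheta) {
  return 3.0 / (16.0 * PI) * (1.0 + cosTheta * cosTheta);
}
```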
Putting all this together, we can have a somewhat accurate representation of how much scattered light accumulates along a given ray at any given altitude. The widget below represents the process we just described, showing the ray being marched through the atmosphere and the color that accumulates along it at different altitudes. As you can see, we’re accumulating shades of blue at lower altitudes! This is mostly due to the Rayleigh scattering coefficient’s value: since shorter wavelengths scatter more strongly, more blue light is redirected toward the viewer, resulting in the sky appearing blue during daytime. If we expand this idea into a full-on fragment shader, going from a single ray to one ray per pixel, we can render a realistic sky, as demonstrated below. This raymarching process yields a beautiful blue sky, with a lighter white haze towards the horizon, where rays travel through more atmosphere, and deeper, darker blue colors as the altitude increases and the atmosphere gets thinner.

While Rayleigh scattering alone yields a decent result, there are two additional atmospheric effects we can take into account to bring our sky rendering closer to reality: Mie scattering, caused by larger particles such as aerosols and dust, and ozone absorption. The first one can be modeled with two functions: a Mie density function and a Mie phase function. To get the updated scattering term that takes Mie scattering and ozone into account, we simply add them to the current implementation of our sky shader, on top of the Rayleigh density and phase functions. Mie scattering, like its Rayleigh counterpart, requires a few variables to model its effect accurately: its own scattering coefficient, its own scale height, and an anisotropy term for its phase function. Regarding the ozone model, we also have a few additional variables to go through, mainly an absorption coefficient and a density distribution concentrated around the ozone layer. The widget below showcases the result of integrating both of those new terms into our sky shader. As you can see, this version yields both a brighter, hazier glow around the sun, courtesy of Mie scattering, and a deeper, more saturated blue overhead, thanks to ozone absorption.

At this point, we have a decent sky fragment shader capable of rendering a natural color for any altitude, taking into account a diverse set of transmittance models (Rayleigh, Mie, and ozone). That still leaves us with lighting to work on. You may have noticed in the previous widget that moving the sun close to the horizon only results in a white, hazy glow, without any light attenuation or a sunset/sunrise effect. This is expected, as our current raymarching loop only accounts for light being attenuated along the view ray, from the camera to each sample. It does not yet account for how much sunlight is lost while traveling through the atmosphere before reaching that sample point.

As we did in related past articles, we need to introduce, for any given sample point along our ray, a standalone nested loop to light-march in the direction of the light source and sample the transmittance along that path. In our previous implementation, the optical depth was only computed along the view ray, through viewODR, viewODM, and viewODO. For this updated version, we also accumulate, for each sample point, the equivalent optical depths toward the sun and fold them into the transmittance. With this in place, we now have the ability to render our sky under any light condition: sunsets, sunrises, zenith, and anything in between.

I invite you to take a little break and play with the widget above to appreciate the different colors of the sky our shader can now yield through this fully implemented sky model. Notice how the sky shifts toward orange and red as the sun approaches the horizon: sunlight travels through much more atmosphere, and its shorter, bluer wavelengths get scattered away before reaching the viewer.

The shader we just built in this first section checks a lot of boxes, but what we have in place right now is just a mere flat background. If we were to use it in a React Three Fiber scene in its current state, we would simply have a nice backdrop for our scenes and not much more beyond that. In this section, we will turn our flat shader into a proper post-processing effect, allowing us to render the atmosphere both as a backdrop and as a volume filling the scene itself. To apply atmospheric scattering to a scene, we aren't just drawing a sky; we need to fill the space between the camera and the different objects rendered on screen. Lucky for us, we already partially did that work in part one: we have all the density data necessary to compute what sits in the volume that is our 3D scene. The only thing needed here is a way to reconstruct the world position of each pixel from its screen coordinates and the depth buffer: a getWorldPosition function.

This is the exact same process as the one we performed in On Shaping Light, where we explored ways to render volumetric lighting as a post-processing effect in a React Three Fiber scene. In that article, we went from screen space to world space using the formula below:

```
xNDC = uv.x * 2.0 - 1.0
yNDC = uv.y * 2.0 - 1.0
zNDC = depth * 2.0 - 1.0

clipSpace = { x: xNDC, y: yNDC, z: zNDC, 1.0 }

worldSpace = viewMatrixInverse * projectionMatrixInverse * clipSpace
worldSpace /= worldSpace.w
```

Now that we know how to obtain the worldPosition of the current pixel, we can cast a ray from the camera through that position and distribute our sample points along it. Doing this ensures our raymarch loop now marches along a 3D ray. The last thing we need to do is have our raymarching take any geometry in the scene into account. To do so, we will use the depth buffer of our scene to define our raymarch stepSize, rather than using a constant, so that we can space our sample points to fit the ray we are currently marching along.
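Here's a minimal GLSL sketch tying those steps together: reconstructing the pixel's world position, then fitting the march to the camera-to-pixel ray. The uniform names and the scatteringAt helper are assumptions for illustration:

```glsl
uniform mat4 projectionMatrixInverse;
uniform mat4 viewMatrixInverse;
uniform vec3 cameraPos;

// Screen-space UV + depth -> world-space position (same formula as above)
vec3 getWorldPosition(vec2 uv, float depth) {
  vec4 ndc = vec4(uv * 2.0 - 1.0, depth * 2.0 - 1.0, 1.0);
  vec4 world = viewMatrixInverse * (projectionMatrixInverse * ndc);
  return world.xyz / world.w;
}

// March along the 3D ray from the camera to the pixel's world position,
// spacing the samples to fit the ray instead of using a constant step size
vec3 marchScene(vec2 uv, float depth, int steps) {
  vec3 worldPos = getWorldPosition(uv, depth);
  vec3 rayDir = normalize(worldPos - cameraPos);
  float stepSize = distance(worldPos, cameraPos) / float(steps);
  vec3 scattered = vec3(0.0);
  for (int i = 0; i < steps; i++) {
    vec3 samplePos = cameraPos + rayDir * (float(i) + 0.5) * stepSize;
    scattered += scatteringAt(samplePos, stepSize); // accumulation from part one
  }
  return scattered;
}
```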
The playground below renders the same shader we put together earlier, but this time as a post-processing effect, letting us render atmospheric scattering throughout the scene’s volume, taking its geometries into account, with our sky shader as a backdrop. With that implemented, we can start providing a more realistic ambient sky to any scene that needs it, and also have some fun with silly interactions like the one below, implemented with a Raycaster:

> atmosphere post-processing effect now with draggable celestial objects https://t.co/xejzC5SWuc https://t.co/U342icnvxz

I purposefully omitted anything related to setting up the underlying React Three Fiber scene with a custom post-processing effect, as I walked through those steps in many previous articles. The demo provided above contains all the code for you to read through if necessary.

We’re finally reaching the part you probably came here for in the first place: rendering a realistic atmosphere around planets! Luckily, with everything we built up to this point, we only have two steps missing to achieve that: handling depth precision at planetary scale, and bounding our raymarch to the atmosphere itself.

Since we’re working at a planetary scale in this section, we can expect a lot of “depth fighting” when viewing our planet from afar, as it is hard for our shader to differentiate the depth between the atmosphere and the planet shell from a large distance (the atmosphere height being only a few km). We need to adjust both the way our depth buffer is defined in our React Three Fiber scene and how it’s read. To do so, we set logarithmicDepthBuffer to true in the gl prop of the Canvas component that wraps the entire scene definition [3]. Then, in our shader, we redefine our sceneDepth to convert the logarithmic depth buffer received by the post-processing effect back into a distance along the ray.

For the second point, we will use a ray-sphere intersection test to find where our view ray enters and exits the atmospheric sphere. Once we have those two points, we can limit our raymarching loop to that segment without wasting samples outside the atmosphere. However, a single test is not enough: we also want to model our planet as a sphere mesh surrounded by a slightly larger atmosphere sphere, and thus we will need to perform the same test against the planet itself. If the ray hits the ground before it exits the atmosphere, we use that ground intersection as the end of our raymarching segment. For the definition of the raySphereIntersect function itself, I recommend checking out Inigo Quilez's post on the topic: Ray-Surface Intersection Functions. Going forward, we will use an implementation along those lines for our shader, sketched below.
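This is a minimal sketch in the spirit of Inigo Quilez's formulation; it assumes rayDir is normalized:

```glsl
// Ray-sphere intersection: returns (tNear, tFar) along the ray, or
// vec2(-1.0) if the ray misses the sphere. Assumes rayDir is normalized.
vec2 raySphereIntersect(vec3 rayOrigin, vec3 rayDir, vec3 center, float radius) {
  vec3 oc = rayOrigin - center;
  float b = dot(oc, rayDir);
  float c = dot(oc, oc) - radius * radius;
  float h = b * b - c;
  if (h < 0.0) return vec2(-1.0);
  h = sqrt(h);
  return vec2(-b - h, -b + h);
}
```

We run this test twice, once against the atmosphere sphere and once against the planet, and clamp the raymarching segment accordingly.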
One additional thing we need to adapt is the end of our raymarching segment, so it can handle objects within the scene. The atmosphere may stop for two different reasons: either the ray hits the planet's ground (or exits the atmosphere shell), or it hits a regular scene object captured by the depth buffer. In both cases, we want to stop marching at the closest relevant surface. Notice how, without this logic, the surface of the planet would appear in front of our object.

With those two parts now in place, we have a full implementation of atmospheric scattering as a post-processing effect and can render atmospheres around planets. The scene below renders a simple “Sun-Earth system” in React Three Fiber, with our custom effect in place. I invite you to take some time to adjust the position of the sun, zoom out, and enjoy the sky colors this shader can yield from different angles, from ground to orbit. The effect you can see in this demo is the same one I used to take the photos for the posters I posted in early April to announce this article:

> outline for my upcoming, and very much on theme, article on atmospheric scattering felt inspired and made posters with photos of actual renders made with the techniques you’ll learn in it :) very excited for this one https://t.co/wSjdQPyoI0

You may notice that the torus in the scene is still "lit up" even once the sun has set. This is mostly a scale issue: the shadow map / shadow camera of our main directional light is small and does not cover the torus, which is simply too far away. For now, the cleanest workaround I'd try first would be to reuse the shadow-mapping approach from my volumetric lighting article: render a depth map from the sun’s point of view, then use that data to decide whether the mesh should receive light or not. That's at least what my instincts tell me; I have not tried it.

This is a little bonus section where I’d like us to answer the question: how can we handle large celestial objects blocking the sun? We now have a decent understanding of what’s at play in this atmospheric scattering shader when it comes to lighting, and adding this extra test is relatively easy. We can add, after our lightMarch function, a function call that returns a sunVisibility value in the [0, 1] range, and multiply the transmittance by this value. The function itself could be as simple as a dot product between the direction from the sample point to the sun and the direction to the occluding moon. If the two directions match closely, i.e., a dot product close to 1.0, the moon would be obstructing the sun; conversely, if they were orthogonal, close to 0.0, there would be no obstruction. However, this doesn’t take into account the size and scale of the objects in the scene. We need a function that can handle three cases: no overlap, partial overlap, and total overlap, as sketched below. Here, float angularSep = acos(clamp(dot(sunDir, moonDir), -1.0, 1.0)) represents the angular separation between the sun and moon directions. We can then compare this value with two angular thresholds, outerEdge and innerEdge, representing, respectively, the angles at which the two discs start touching externally and internally.
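Here's a hypothetical sketch of such a sunVisibility function; the angular radii parameters and the smoothstep falloff between the two thresholds are assumptions on my part:

```glsl
// Fraction of the sun that remains visible given an occluding moon disc.
// sunRadius and moonRadius are angular radii (in radians) as seen by the observer.
float sunVisibility(vec3 sunDir, vec3 moonDir, float sunRadius, float moonRadius) {
  float angularSep = acos(clamp(dot(sunDir, moonDir), -1.0, 1.0));
  float outerEdge = sunRadius + moonRadius;      // discs start touching externally
  float innerEdge = abs(sunRadius - moonRadius); // one disc fully contains the other
  if (angularSep >= outerEdge) return 1.0;       // no overlap: sun fully visible
  if (angularSep <= innerEdge) return 0.0;       // total overlap: full eclipse
                                                 // (assumes moon disc >= sun disc)
  return smoothstep(innerEdge, outerEdge, angularSep); // partial overlap: fade
}
```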
The demo below implements this sunVisibility function on top of our previous example, and also adds a moon mesh to our system. Try to align the moon with the sun, and notice how our atmospheric scattering shader properly handles the lack of light in those cases. As always, this only scratches the surface; some talented folks have dived deep into the topic of rendering solar eclipses and simulating the corona accurately. If you're curious, here's a paper on the matter: Physically Based Real-Time Rendering of Eclipses. I haven't gone through the process of reading the paper and porting the implementation to WebGL, but if you do, let me know!

Another bonus section! It’s your lucky day! The model we’ve been using throughout this article to simulate atmospheric density and scattering is mostly governed by a handful of constants: the Rayleigh, Mie, and ozone coefficients, their scale heights, and the planet and atmosphere radii. These are the main knobs that make our rendered atmosphere look the way it does. Thus, by tweaking them to the right set of values, we could, in theory, approach a Martian atmosphere, or even other planets'. Just replacing our constants with a set of values tuned for Mars gives us a dustier, more orange atmosphere. Even better, we get Mars' distinctive blue hue at sunset! Below are a couple of screenshots I took while working on this. You can try plugging Mars-tuned values into the previous demo to see the result for yourself. You guessed it, yes, there's a paper on this topic as well: Physically Based Rendering of the Martian Atmosphere.

The resulting shader we’ve built, albeit intuitive and able to render an atmosphere at small and large scales, is unfortunately quite expensive to run: every pixel raymarches through the atmosphere, and every sample along the way performs its own nested light march toward the sun. Alongside tackling those drawbacks, I also wanted to study how the pros were doing it once I reached this point in my exploration of atmospheric scattering. Sébastien Hillaire proposed, in his paper titled A Scalable and Production Ready Sky and Atmosphere Rendering Technique, a method to render the atmosphere based on Look-Up Tables (LUTs), i.e. textures that can hold expensive scattering calculations, so the final render samples and composes those precomputed textures.

In this part, we will look into the respective implementations of the Transmittance LUT, the Sky View LUT, and the Aerial Perspective LUT. While this method is well documented, the process highlighted in the paper was still a little bit beyond my current skill level, and despite spending a decent amount of time on it, I did not implement the full paper; I took shortcuts on a few things, which I'll call out as they come up. Despite that, the result is on par with our original, fully raymarched implementation, and it is still interesting to investigate together.

In our original shader, every sample point calls the lightmarch function to get the amount of light from our sun that reaches it, which, as you may guess, is quite expensive. The goal of the Transmittance LUT is to store that data beforehand, preferably at a low resolution, so we can then load it into subsequent LUTs whenever we need that light data. My implementation for this LUT, and any that follows, consists of rendering a full-screen quad into a dedicated FBO from an off-screen scene, with a fragment shader doing the heavy lifting, then passing the resulting texture to subsequent passes. It may seem a bit convoluted, but ideally you’d use WebGPU and compute shaders for this and thus not need those FBOs. For the transmittance, we’re extracting the expensive lightmarch loop into its own pass by putting it in the transmittanceLUTFragmentShader. The resulting transmittance LUT texture can be interpreted as follows: each pixel encodes how much light survives its trip through the atmosphere for a given combination of altitude and sun angle. Subsequent LUTs can now answer the question of "how much light survives at a given angle and altitude through our atmosphere" very quickly by just looking up that value in this texture.

The Sky View and Aerial Perspective LUTs leverage the transmittance data we just computed and answer two complementary questions: what color is the sky in a given view direction, and how much scattering accumulates between the camera and the geometry at each pixel? Combining both LUTs gives us the full atmospheric scattering effect: the former handles far-field color while the latter calculates near-field haze. Using a similar process involving FBOs and off-screen scenes, we can define distinct shaders to generate both LUTs. For the Sky View texture, the major thing to highlight is the getSkyViewRayDir function, which defines our raymarching ray directions by mapping the texture coordinates across the sky dome: the horizontal axis covers the azimuth around the observer, and the vertical axis covers the view elevation. With this definition of our rayDir, our raymarching loop yields a texture representing the color of the sky for directions across the entire sky dome.

When it comes to the Aerial Perspective, as mentioned earlier, I slightly diverged from Hillaire’s paper. My resulting texture is a 2D texture where each pixel corresponds to one visible screen pixel, and I rely on the depth buffer of the scene to tell how far along the ray we should march and accumulate scattering. As a result, this lets me reuse more or less the same scattering code introduced in the first part, except that now each sample pulls sunlight visibility from the Transmittance LUT. The output stores the accumulated atmospheric scattering in RGB and a packed view transmittance value in alpha, which we use during composition. I may revisit this process later this year once I muster the courage to convert all this code to WebGPU and follow Sébastien Hillaire’s process a bit more closely. I will make sure to update this article with the relevant up-to-date code when the time comes.

With the Sky View and Aerial Perspective LUTs generated, we have only one step remaining: combining them in a final post-processing pass to achieve the full LUT-based atmospheric scattering result. The composition mainly consists of sampling the Sky View LUT for sky pixels and, for geometry pixels, attenuating the scene color by the Aerial Perspective LUT's transmittance before adding its accumulated in-scattering on top.
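Here's a hypothetical sketch of that final pass, assuming the LUT layouts described above; the uniform names and the getRayDir / skyViewUVFromRayDir helpers are illustrative:

```glsl
uniform sampler2D skyViewLUT;           // far-field sky color, indexed by view direction
uniform sampler2D aerialPerspectiveLUT; // per-pixel in-scattering (RGB) + transmittance (A)
uniform sampler2D sceneColor;
uniform sampler2D depthBuffer;

varying vec2 vUv;

void main() {
  float depth = texture2D(depthBuffer, vUv).r;

  vec3 color;
  if (depth >= 1.0) {
    // Sky pixel: look up the precomputed sky color for this view direction
    vec3 rayDir = getRayDir(vUv); // camera ray reconstruction, as in part two
    color = texture2D(skyViewLUT, skyViewUVFromRayDir(rayDir)).rgb;
  } else {
    // Geometry pixel: attenuate the scene by the view transmittance,
    // then add the in-scattered light accumulated in front of it
    vec4 ap = texture2D(aerialPerspectiveLUT, vUv);
    color = texture2D(sceneColor, vUv).rgb * ap.a + ap.rgb;
  }

  gl_FragColor = vec4(color, 1.0);
}
```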
The playground below contains the full implementation of our LUT-based atmosphere: all the LUTs and their corresponding shaders, as well as the final post-processing pass. It is a bit dense, so I’d recommend checking the implementation directly at this GitHub link, where you’ll find the code that renders the scene below.

This version of atmospheric scattering may look almost identical to the one we worked on in the earlier parts of this post, but the underlying process is different: we split the work into smaller LUTs that we then compose in the final effect. Most importantly, instead of repeatedly raymarching toward the sun to figure out how much light reaches each sample, we can fetch that lighting information directly from the Transmittance LUT, replacing a costly nested loop with a simple texture lookup and resulting in a non-negligible performance boost for the final scene. Despite that, my LUT-based implementation pales in comparison to what Sébastien Hillaire and others in the field came up with. If you want to look at a real production-grade implementation, I highly recommend checking out three-geospatial by Shota Matsuda (@shotamatsuda). His work on skies, clouds, and geospatial rendering has been a huge reference point for me, and the images he shares on social media speak for themselves.

Nonetheless, I learned a lot throughout this entire project, especially through the LUT-based approach, which took me out of my comfort zone when it comes to creating screen-space, depth-aware post-processing effects. It also consolidated some previous learnings and resulted in a series of beautiful visuals (which is the most important part, after all). I’m very happy with the result of these experiments. I also worked on adding volumetric clouds on top of all this, but the result is still a bit of a mixed bag and needs more work before I'd be proud enough to showcase it in a write-up. That will have to wait. Until then, I’m looking forward to leveraging this work to complement the upcoming projects and scenes I have been slowly shaping in my head.

[1] I have tried this many times over.
[2] Real-time Cloudscapes with Volumetric Raymarching introduces a lot of concepts used here.
[3] This is a workaround to avoid too much blinking of the sky view at a large distance.