Moving from SceneKit - fog missing

I am rewriting an unfinished SceneKit project in RealityKit (non-AR). As far as I can see, RealityKit is missing basic fog functionality?

Fog was simple & easy to implement in SceneKit (fogStartDistance / fogEndDistance / fogDensityExponent / fogColor). Are there any plans to implement something like this in RealityKit? Are there any simple workarounds?
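For reference, this is roughly the SceneKit setup being replaced, using the properties named above (the distance values here are arbitrary examples):

```swift
import SceneKit
import UIKit

// SceneKit's built-in distance fog: pixels are blended toward fogColor
// between fogStartDistance and fogEndDistance, with a falloff curve
// controlled by fogDensityExponent (1.0 = linear).
let scene = SCNScene()
scene.fogStartDistance = 10        // example value, in scene units
scene.fogEndDistance = 50          // example value
scene.fogDensityExponent = 1.0     // linear falloff
scene.fogColor = UIColor.gray
```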

Answered by DTS Engineer in 855961022

Hello,

As @kemalenver suggests, RealityKit's modern rendering API requires some familiarity with shaders for post-processing effects like depth-based fog.

The simplest depth-based fog is implemented in a fragment shader that linearly blends a pixel in the frame buffer with a fog color weighted by distance.
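As a plain-Swift sketch of that per-fragment math (in practice this would live in a Metal fragment or compute shader, with `distance` read from the scene depth texture; `start` and `end` are hypothetical fog distances):

```swift
import simd

// Reference implementation of the linear depth-fog blend described above:
// blend the rendered pixel toward the fog color by a factor that grows
// linearly with distance between `start` and `end`.
func fogBlend(pixel: SIMD3<Float>, fogColor: SIMD3<Float>,
              distance: Float, start: Float, end: Float) -> SIMD3<Float> {
    // 0 at `start`, 1 at `end`, clamped outside that range
    let t = simd_clamp((distance - start) / (end - start), 0, 1)
    return simd_mix(pixel, fogColor, SIMD3<Float>(repeating: t))
}
```

An exponential falloff (closer to SceneKit's fogDensityExponent) would simply raise `t` to a power before blending.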

You'll find something similar in the sample Creating a fog effect using scene depth, which renders with Metal.

Likewise, check out Explore advanced rendering with RealityKit 2 for another fog effect implemented in RealityKit.

Re: "I think you can also use CGFilters for post processing which is pretty cool." do you mean CIFilters? (and yes, cool!)

Please request more sample code with the Feedback Assistant.

I'm not sure how it was implemented in SceneKit behind the scenes; I suspect it was a post-processing effect. If that's the case, you can use the new post-processing functionality in RealityKit.
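As a sketch of where that hook lives (assuming a non-AR ARView and RealityKit 2's render callbacks; the fog pipeline itself is left as a placeholder you'd build from your own Metal shader):

```swift
import RealityKit

// RealityKit 2's post-processing entry point: the closure receives the
// rendered color and depth textures plus a command buffer to encode into.
func installFog(on arView: ARView) {
    arView.renderCallbacks.postProcess = { context in
        // context.sourceColorTexture  - the rendered frame
        // context.sourceDepthTexture  - per-pixel scene depth
        // context.targetColorTexture  - where the final image must be written
        //
        // Encode your fog effect here, e.g. a compute pass that blends
        // sourceColorTexture toward a fog color based on depth and writes
        // the result into targetColorTexture.
        if let encoder = context.commandBuffer.makeComputeCommandEncoder() {
            // ... set your fog pipeline state, bind the textures, dispatch ...
            encoder.endEncoding()
        }
    }
}
```

Note that once a postProcess callback is installed, the effect is responsible for filling targetColorTexture, even for pixels it leaves unfogged.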

The downside is that you have to write the effect yourself. They likely moved to this model to give people more freedom with their effects, but it does mean the average developer now needs to be familiar with writing shaders. As these things go, the new model is fairly straightforward, but there's a learning curve if you're not familiar with it.

I think you can also use CGFilters for post processing which is pretty cool.

I'd read up on the new beta functionality on RealityView and search for deferred fog for the technique.

I hope the Apple graphics team puts out a few example post-processing effects so that people can drop them in as replacements.


Thanks DTS Engineer! Here's a feedback with the request: https://feedbackassistant.apple.com/feedback/19964911
