
Reply to RealityKit Blend Modes
Just found this, which works a treat:

let surfaceShader = CustomMaterial.SurfaceShader(named: "textureAndTint", in: library)

var descriptor = CustomMaterial.Program.Descriptor()
descriptor.lightingModel = .unlit
descriptor.blendMode = .add

let program = try! await CustomMaterial.Program(surfaceShader: surfaceShader, descriptor: descriptor)
var customMaterial = CustomMaterial(program: program)
Topic: Graphics & Games SubTopic: RealityKit
1w
Reply to RealityKit Blend Modes
My shader so far; I'm not sure how to get the source colours or set the blend modes.

#include <metal_stdlib>
#include <RealityKit/RealityKit.h>

using namespace metal;

// Metal samplers only support nearest/linear filtering, so this uses
// filter::linear (filter::bicubic won't compile).
constexpr sampler textureSampler(address::clamp_to_edge, filter::linear);

[[visible]]
void additiveShader(realitykit::surface_parameters params)
{
    float2 uv = params.geometry().uv0();
    auto tex = params.textures();
    half3 color = (half3)tex.base_color().sample(textureSampler, uv).rgb;
    params.surface().set_emissive_color(color);
}

In RealityKit:

var customMaterial = try CustomMaterial(surfaceShader: surfaceShader, lightingModel: .unlit)
customMaterial.baseColor.texture = .init(tex)
Topic: Graphics & Games SubTopic: RealityKit
1w
Reply to Moving from SceneKit - fog missing
I'm not sure how it was implemented in SceneKit behind the scenes; I suspect it was a post-processing effect. If that's the case, you can use the new post-processing functionality in RealityKit. The downside is that you have to write the effect yourself. They likely moved to this model to give people more freedom with their effects, but it does mean the average developer now needs to be familiar with writing shaders. As these things go, the new model is fairly straightforward, but there's a learning curve if you're not familiar with it. I think you can also use CIFilters for post processing, which is pretty cool. I'd read up on the new beta functionality on RealityView and search for deferred fog for the technique. I hope the Apple graphics team pump out a few example post-processing effects so that people can drop them in as replacements.
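For reference, a minimal sketch of the RealityKit post-process hook on ARView (renderCallbacks.postProcess, iOS 15+). The Gaussian blur here is just a placeholder; a real fog effect would instead read context.sourceDepthTexture in a Metal kernel and blend a fog colour by distance:

import RealityKit
import CoreImage

// Installs a Core Image filter as a RealityKit post-process pass,
// assuming an existing ARView.
func installPostProcess(on arView: ARView) {
    let ciContext = CIContext()
    arView.renderCallbacks.postProcess = { context in
        // Wrap the rendered frame in a CIImage and filter it.
        guard let source = CIImage(mtlTexture: context.sourceColorTexture) else { return }
        let filter = CIFilter(name: "CIGaussianBlur")!
        filter.setValue(source, forKey: kCIInputImageKey)
        filter.setValue(4.0, forKey: kCIInputRadiusKey)
        // Render the result into the texture RealityKit will present.
        let destination = CIRenderDestination(mtlTexture: context.targetColorTexture,
                                              commandBuffer: context.commandBuffer)
        _ = try? ciContext.startTask(toRender: filter.outputImage ?? source,
                                     to: destination)
    }
}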
1w
Reply to Reacting to hover on a RealityKit Entity
Hi, thanks for your reply. I've created an enhancement request, FB14241109 'Create developer hooks to respond to hovering over Entities'. It might be a bit late in the day, but it would be great to have this for the release of visionOS 2. I have a bunch of use cases across various apps that would be better with this functionality.
Topic: Graphics & Games SubTopic: RealityKit
Jul ’24
Reply to MusicKit on VisionOS
+1 I'm working around this by keeping my music model code separate from the main app. That way I can target it at an iOS device and get it working, then use it in my Vision Pro app with mocked data. Very clunky. So basically: build to an interface, mock the data on the Vision Pro when testing, and use your non-mock data when going to production. Check it actually works by running on your iPhone or iPad. I really wish there was a way to simulate this... (sketch of the split below)
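A minimal sketch of that interface-plus-mock split; the protocol and type names here are illustrative, not part of MusicKit:

import Foundation

// The interface the app builds against.
protocol MusicCatalog {
    func searchAlbums(matching term: String) async throws -> [String]
}

// Real implementation, compiled only where MusicKit actually works.
#if canImport(MusicKit) && !os(visionOS)
import MusicKit

struct LiveMusicCatalog: MusicCatalog {
    func searchAlbums(matching term: String) async throws -> [String] {
        var request = MusicCatalogSearchRequest(term: term, types: [Album.self])
        request.limit = 10
        let response = try await request.response()
        return response.albums.map(\.title)
    }
}
#endif

// Mock used when running on the Vision Pro.
struct MockMusicCatalog: MusicCatalog {
    func searchAlbums(matching term: String) async throws -> [String] {
        ["Mock Album A", "Mock Album B"]
    }
}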
Topic: Programming Languages SubTopic: Swift
Apr ’24
Reply to Where are the SceneKit WWDC videos..?
It's certainly not had a lot of development over the past couple of years, but I think that's largely down to two reasons:

1. RealityKit has clearly been the priority due to the Vision Pro.
2. SceneKit is already fairly mature and works well, although some things are fiddly.

SceneKit has a scene graph, PBR works, there are hooks for post processing (SCNTechnique), and there is a way to override rendering for an object (SCNProgram or shader modifier snippets), so you can basically do anything you want. But you need to be fairly experienced in rendering tech if you want to go beyond the realm of the editor. I use SceneKit all the time for hobby projects and it works really well; the docs could be a lot better, but there is stuff out there. SceneKit does not provide any VR/AR functionality, so if that's a use case you can scrub it from your options.
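As an example of how far those hooks go, here's a minimal shader modifier sketch (the geometry and tint are illustrative); SceneKit compiles the snippet into its own shading pipeline, so you tweak the final colour without replacing the whole program the way SCNProgram does:

import SceneKit

// Illustrative geometry; any SCNGeometry works.
let sphere = SCNSphere(radius: 1.0)

// Fragment shader modifier: runs after SceneKit's lighting and blends a
// red tint into the final output colour.
sphere.firstMaterial?.shaderModifiers = [
    .fragment: """
    _output.color.rgb = mix(_output.color.rgb, vec3(1.0, 0.2, 0.2), 0.3);
    """
]

let node = SCNNode(geometry: sphere)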
Topic: Graphics & Games SubTopic: General
Feb ’24