Nothing yet. Clear doesn't match the bottom style. So there's something else going on; I guess it's adaptive, but it would be useful to know how and why it adapts.
Thanks DTS Engineer! Here's a feedback with the request: https://feedbackassistant.apple.com/feedback/19964911
Topic:
Graphics & Games
SubTopic:
RealityKit
Just found this which works a treat:
// `library` is the MTLLibrary containing the compiled surface shader,
// e.g. MTLCreateSystemDefaultDevice()?.makeDefaultLibrary()
let surfaceShader = CustomMaterial.SurfaceShader(named: "textureAndTint", in: library)
var descriptor = CustomMaterial.Program.Descriptor()
descriptor.lightingModel = .unlit
descriptor.blendMode = .add // additive blending
let program = try! await CustomMaterial.Program(surfaceShader: surfaceShader, descriptor: descriptor)
var customMaterial = CustomMaterial(program: program)
Topic:
Graphics & Games
SubTopic:
RealityKit
Tags:
My shader so far; I'm not sure how to get the source colours or set the blend modes.
#include <metal_stdlib>
#include <RealityKit/RealityKit.h>
using namespace metal;
// Metal samplers only support nearest/linear filtering; bicubic isn't available.
constexpr sampler textureSampler(address::clamp_to_edge, filter::linear);
[[visible]]
void additiveShader(realitykit::surface_parameters params)
{
float2 uv = params.geometry().uv0();
auto tex = params.textures();
half3 color = (half3)tex.base_color().sample(textureSampler, uv).rgb;
params.surface().set_emissive_color(color);
}
In RealityKit
var customMaterial = try CustomMaterial(surfaceShader: surfaceShader,
                                        lightingModel: .unlit)
customMaterial.baseColor.texture = .init(tex) // tex is a TextureResource
Topic:
Graphics & Games
SubTopic:
RealityKit
Tags:
Hi, it's still not visible. I'm running Xcode beta 6. My app is an iPadOS app and I'm using RealityView.
I can only see the Capture GPU Workload button. There's no way for me to debug my scene using the RealityKit debugger. Is it only available for visionOS?
Topic:
Graphics & Games
SubTopic:
RealityKit
Tags:
I'm not sure how it was implemented in SceneKit behind the scenes; I suspect it was a post-processing effect. If that's the case, you can use the new post-processing functionality in RealityKit.
The downside is that you have to write the effect yourself. They likely moved to this model to give people more freedom with effects, but it does mean the average developer now needs to be familiar with writing shaders. As these things go, the new model is fairly straightforward, but there's a learning curve if you're not familiar with it.
I think you can also use CIFilters for post-processing, which is pretty cool.
I'd read up on the new beta functionality in RealityView and search for deferred fog for the technique.
I hope the Apple graphics team put out a few example post-processing effects so that people can drop them in as replacements.
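For reference, on iOS the post-processing hook lives on ARView's renderCallbacks. A minimal sketch that runs a Core Image blur over each frame (`arView` is assumed to exist already; the RealityView path on visionOS is different):

```swift
import RealityKit
import CoreImage
import CoreImage.CIFilterBuiltins

// Create the CIContext once, not per frame.
let ciContext = CIContext()

arView.renderCallbacks.postProcess = { context in
    // Wrap the rendered frame as a CIImage.
    guard let source = CIImage(mtlTexture: context.sourceColorTexture) else { return }

    // Any CIFilter works here; gaussian blur is just an example.
    let blur = CIFilter.gaussianBlur()
    blur.inputImage = source
    blur.radius = 8

    // Render the filtered image into the target texture on RealityKit's
    // command buffer.
    let destination = CIRenderDestination(mtlTexture: context.targetColorTexture,
                                          commandBuffer: context.commandBuffer)
    _ = try? ciContext.startTask(toRender: blur.outputImage ?? source,
                                 to: destination)
}
```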
Topic:
Graphics & Games
SubTopic:
RealityKit
I have a similar problem. I have a particle effect that is contained within an object that is semi-opaque (it looks like glass). The particles are flashing in and out. Once I get through the glass it all works OK.
I filed feedback with the RealityKit team explaining my issue: FB14285128
Topic:
Graphics & Games
SubTopic:
RealityKit
Tags:
Hi thanks for your reply. I've created an enhancement request FB14241109 'Create developer hooks to respond to hovering over Entities'.
It might be a bit late in the day, but it would be great to have this for the release of visionOS 2. I have a bunch of use cases in various apps that would be better with this functionality.
Topic:
Graphics & Games
SubTopic:
RealityKit
Tags:
+1
I'm working around this by making my music model code separate from the main app. That way I can target an iOS device and get it working. Then I can use that code in my Vision Pro app, but with mocked data. Very clunky.
So basically: build to an interface, mock the data on the Vision Pro when testing, and use your non-mock data when going to production.
Check it actually works by using your iPhone or iPad.
I really wish there was a way to simulate this...
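The workaround above boils down to a protocol with a mock behind it. A minimal sketch (all type and method names here are made up for illustration, not a real API):

```swift
// Protocol the app codes against; the real implementation would wrap
// the music service and only be exercised on a physical device.
protocol MusicLibrary {
    func recentAlbumTitles() -> [String]
}

// Mock used in the visionOS simulator, where the real service is unavailable.
struct MockMusicLibrary: MusicLibrary {
    func recentAlbumTitles() -> [String] {
        ["Mock Album One", "Mock Album Two"]
    }
}

// At the composition root, pick the implementation once; everything else
// only ever sees the protocol, so swapping mock for live is one line.
let library: MusicLibrary = MockMusicLibrary() // swap for the live type on device
print(library.recentAlbumTitles())
```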
Topic:
Programming Languages
SubTopic:
Swift
Tags:
It's certainly not had a lot of development over the past couple of years, but I think that's partly for two reasons:
RealityKit has clearly been a priority due to the Vision Pro.
SceneKit is already fairly mature and works well, although some things are fiddly.
SceneKit has a scene graph, PBR works, there are hooks for post-processing (SCNTechnique), and there is a way to override rendering for an object (SCNProgram or shader modifier snippets), so you can basically do anything you want. But you need to be fairly experienced in rendering tech if you want to go outside the realm of the editor.
I use SceneKit all the time for hobbies and it works really well - docs could be a lot better but there is stuff out there.
SceneKit does not provide any VR/AR functionality so if that's a use case you can scrub it from your options.
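To illustrate the hooks mentioned above, a shader modifier snippet can be attached to any SceneKit material in a couple of lines (the tint values here are just an example):

```swift
import SceneKit
import UIKit

// Attach a fragment-stage shader modifier that tints the final colour.
// `_output.color` is provided by SceneKit's shader modifier environment.
let material = SCNMaterial()
material.diffuse.contents = UIColor.white
material.shaderModifiers = [
    .fragment: """
    _output.color.rgb *= vec3(1.0, 0.8, 0.8); // warm tint
    """
]
```

SCNProgram goes further and replaces SceneKit's shading entirely, which is why it needs more rendering experience.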
Topic:
Graphics & Games
SubTopic:
General
Tags:
Hi Quinn. This is an iOS app.
Topic:
App & System Services
SubTopic:
General
Tags:
This is annoying beyond words. For now you can get to App Store Connect by logging in through https://developer.apple.com/ and then choosing App Store Connect from the menu on the left.
Topic:
App Store Distribution & Marketing
SubTopic:
App Store Connect
Tags:
Another gotcha to look out for is things in asset catalogues, such as colours. E.g.
var mycol = UIColor(named: "mycol")!
will crash because the colour can't be found in whatever bundle context the xib is running in. Do this instead:
var mycol = UIColor(named: "mycol", in: Bundle(for: MyView.self), compatibleWith: nil)!
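One way to avoid repeating the bundle lookup at every call site is a small helper (a sketch; `MyView` and the asset name come from the example above, the helper name is made up):

```swift
import UIKit

// Convenience wrapper that always resolves assets against the framework's
// own bundle, so it works no matter which context the xib runs in.
extension UIColor {
    static func fromFrameworkCatalogue(_ name: String) -> UIColor {
        guard let colour = UIColor(named: name,
                                   in: Bundle(for: MyView.self),
                                   compatibleWith: nil) else {
            // Fail loudly in development if the asset is missing.
            fatalError("Colour asset '\(name)' not found in framework bundle")
        }
        return colour
    }
}

// Usage:
// let tint = UIColor.fromFrameworkCatalogue("mycol")
```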
Topic:
Programming Languages
SubTopic:
Swift
Tags: