How to create a visionOS project in Xcode 15.0 beta
I want to make games for visionOS. Which is better, SwiftUI or UIKit, and what are the advantages of each?
I am developing a visionOS app. I am now very interested in Metal and Compositor Services, but I have not explored them in depth. I know that Metal offers a greater degree of control. I am wondering whether using Compositor Services provides fewer AR capabilities than RealityKit (such as scene reconstruction and understanding, hover effects, etc.).
I'm working with RealityView in visionOS and noticed that the content closure seems to run twice, causing content.add to be called twice. This results in duplicate entities being added to the scene unless I manually check for duplicates. How can I fix that? Thanks.
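A minimal sketch of one possible workaround, assuming the make closure really does run more than once as described; the view name and the asset name "MyModel" are placeholders:

import SwiftUI
import RealityKit

struct DeduplicatedView: View {
    var body: some View {
        RealityView { content in
            // If the make closure runs again, skip adding when content already has entities.
            guard content.entities.isEmpty else { return }
            if let model = try? await Entity(named: "MyModel") {
                content.add(model)
            }
        }
    }
}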
I have a USDZ model called 'GooseNModel' in my visionOS app project. I'm sure this model contains an animation, so I wrote the following code to display the model with its animation:
import SwiftUI
import RealityKit

struct GooseView: View {
    var body: some View {
        RealityView { content in
            if let gooseNModel = try? await Entity(named: "GooseNModel"),
               let animation = gooseNModel.availableAnimations.first {
                gooseNModel.playAnimation(animation)
                content.add(gooseNModel)
            }
        }
    }
}
But when I ran it, the model appeared static, without the animation I expected. How can I solve this?
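Not a definitive fix, but a minimal sketch of a common adjustment: add the entity first, then loop every animation found on it with .repeat(). If availableAnimations is empty on the loaded root, the animation may live on a child entity instead.

import SwiftUI
import RealityKit

struct AnimatedGooseView: View {
    var body: some View {
        RealityView { content in
            if let goose = try? await Entity(named: "GooseNModel") {
                content.add(goose)
                // Loop every animation that was loaded with the USDZ.
                for animation in goose.availableAnimations {
                    goose.playAnimation(animation.repeat())
                }
            }
        }
    }
}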
When I wanted to load a Reality Composer Pro scene that contains Object Tracking, I tried the following code:
RealityView { content in
if let model = try? await Entity(named: "Scene", in: realityKitContentBundle) {
content.add(model)
}
}
Obviously, this alone is not enough. We need to add some configuration to RealityView to enable Object Tracking. What do we need to add?
Note: I have watched https://developer.apple.com/videos/play/wwdc2024/10101/, but I still don't understand it well.
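Not an authoritative answer, but a minimal sketch of the ARKit side of object tracking covered in that session, assuming a reference-object file named "MyObject.referenceobject" is bundled with the app; the resulting anchor updates can then drive the entity loaded from the Reality Composer Pro scene:

import ARKit
import RealityKit

let session = ARKitSession()

func runObjectTracking() async throws {
    // "MyObject.referenceobject" is an assumed reference-object file in the app bundle.
    guard let url = Bundle.main.url(forResource: "MyObject", withExtension: "referenceobject") else { return }
    let referenceObject = try await ReferenceObject(from: url)
    let provider = ObjectTrackingProvider(referenceObjects: [referenceObject])
    try await session.run([provider])

    // Read pose updates for the tracked object and use them to place RealityKit content.
    for await update in provider.anchorUpdates {
        let anchor = update.anchor
        print(update.event, anchor.originFromAnchorTransform)
    }
}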
In RealityView I have two entities that have tracking components and collision components, which are used to follow the hand and detect collisions. In the Behaviors component of one of the entities, there is an OnCollision trigger that should run an action. However, when I test, the action does not run after a collision. Why is this?
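Not a definitive diagnosis, but one way to verify in code whether the collision itself is detected, independent of the Reality Composer Pro behavior; this sketch assumes the default visionOS template's RealityKitContent package and a scene named "Scene":

import SwiftUI
import RealityKit
import RealityKitContent

struct CollisionDebugView: View {
    @State private var subscription: EventSubscription?

    var body: some View {
        RealityView { content in
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
                // Log every collision that begins anywhere in the scene; keep the token alive in @State.
                subscription = content.subscribe(to: CollisionEvents.Began.self) { event in
                    print("Collision:", event.entityA.name, event.entityB.name)
                }
            }
        }
    }
}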
Topic: Spatial Computing
SubTopic: Reality Composer Pro
Tags: AR / VR, RealityKit, Reality Composer Pro, visionOS
Does visionOS support MapKit?
What APIs have been added to SwiftUI for visionOS development compared with the original SwiftUI?
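A minimal sketch of a few of the scene types and modifiers that exist only on visionOS; the app name, scene identifiers, and the "Robot" asset are placeholders:

import SwiftUI
import RealityKit

@main
struct ExampleApp: App {
    var body: some Scene {
        // Volumetric window: a visionOS-specific window style.
        WindowGroup(id: "Volume") {
            Model3D(named: "Robot")
        }
        .windowStyle(.volumetric)

        // Immersive space: another scene type that only exists on visionOS.
        ImmersiveSpace(id: "Immersive") {
            RealityView { content in
                // Build RealityKit content here.
            }
        }
        .immersionStyle(selection: .constant(.mixed), in: .mixed, .full)
    }
}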
If I learn everything at https://developer.apple.com/visionos/planning/, will I have mastered all the knowledge needed for visionOS development?
Which frameworks are compatible with visionOS?
I have some questions about visionOS:
Does Apple open the eye-tracking API to developers? I want to know how to achieve this: when the eyes look at a specific View, a Boolean value becomes true, and when the eyes move away from that View, it becomes false.
In ImmersiveSpace, when the immersionStyle is .full or .progressive, a black background appears by default. How can I replace this background with a panorama of my own? (See the sketch after these questions.)
In ImmersiveSpace, how can I make a View always follow the user?
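For the panorama question, a minimal sketch of one common approach, assuming an equirectangular image named "Panorama" in the app bundle: texture an inward-facing sphere with an unlit material and add it to the immersive content.

import SwiftUI
import RealityKit

struct PanoramaView: View {
    var body: some View {
        RealityView { content in
            guard let texture = try? TextureResource.load(named: "Panorama") else { return }
            var material = UnlitMaterial()
            material.color = .init(texture: .init(texture))

            let sphere = ModelEntity(mesh: .generateSphere(radius: 1000), materials: [material])
            // Flip the sphere so the texture is visible from the inside.
            sphere.scale = SIMD3<Float>(x: -1, y: 1, z: 1)
            content.add(sphere)
        }
    }
}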
I didn't find any errors in my program, and Xcode didn't report any errors in the code, but when I built and ran it, it inexplicably reported this error:
Command CompileAssetCatalog failed with a nonzero exit code
What should I do?
Topic: Programming Languages
SubTopic: Swift
Tags: Swift, Xcode, Xcode Sanitizers and Runtime Issues, visionOS
Apple launched the Developer Strap on the developer website, which allows developers to connect Vision Pro to a Mac with USB-C, but I can't get it in China, so I want to ask developers who already have a Vision Pro, or people who understand these questions: Can a Vision Pro without the Developer Strap be debugged wirelessly? If so, how is the experience of wireless debugging, and how does it compare with a wired connection? Thank you!
I saw the Notification trigger in the Behaviors configuration of Reality Composer Pro, which asks me to enter a notification name. That means the app needs Swift code in Xcode to send a notification carrying that name. Could you show me how to use Swift in Xcode to send a notification containing the notification name?
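A sketch of the posting side, assuming the identifier entered in Reality Composer Pro is "MyNotification"; the notification name and userInfo keys below follow the pattern used in Apple's sample code for notification triggers and should be verified against current documentation:

import Foundation
import RealityKit

// Post the notification that a Reality Composer Pro "Notification" trigger listens for.
// `entity` is any entity in the loaded Reality Composer Pro scene.
func sendNotificationTrigger(to entity: Entity, identifier: String = "MyNotification") {
    guard let scene = entity.scene else { return }
    NotificationCenter.default.post(
        name: Notification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": scene,
            "RealityKit.NotificationTrigger.Identifier": identifier
        ]
    )
}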
Topic: Spatial Computing
SubTopic: Reality Composer Pro
Tags: SwiftUI, RealityKit, Reality Composer Pro, visionOS