When I wanted to load a Reality Composer Pro scene that contains Object Tracking, I tried the following code:
RealityView { content in
    if let model = try? await Entity(named: "Scene", in: realityKitContentBundle) {
        content.add(model)
    }
}
Obviously, this is not enough on its own. We need to add some configuration to the RealityView to enable Object Tracking. What do we need to add?
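From reading the documentation, my guess is that an ARKitSession with an ObjectTrackingProvider has to run alongside the RealityView, roughly like the sketch below ("MyObject.referenceobject" is a placeholder for my exported reference object, and I am not sure this is the right configuration, so please correct me):

import ARKit
import RealityKit
import RealityKitContent
import SwiftUI

struct ObjectTrackingView: View {
    @State private var session = ARKitSession()

    var body: some View {
        RealityView { content in
            if let model = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(model)
            }
        }
        .task {
            // "MyObject.referenceobject" is a placeholder for the reference object
            // exported from Create ML / Reality Composer Pro.
            guard let url = Bundle.main.url(forResource: "MyObject", withExtension: "referenceobject"),
                  let referenceObject = try? await ReferenceObject(from: url) else { return }
            let provider = ObjectTrackingProvider(referenceObjects: [referenceObject])
            try? await session.run([provider])
        }
    }
}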
With the AudioServicesPlaySystemSound function of AudioToolbox, you can pass a SystemSoundID to play one of the sound effects that come with the system. However, I can't tell which sound effect each number corresponds to, so I would like a list of all the sound effects in visionOS and their corresponding SystemSoundIDs.
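For context, this is how I am currently probing the IDs one by one (a small sketch; 1104 is just an arbitrary number I tried, not a documented visionOS sound):

import AudioToolbox

// Plays one of the built-in system sounds by its numeric ID.
func playSystemSound(_ id: UInt32) {
    AudioServicesPlaySystemSound(SystemSoundID(id))
}

// 1104 is just an arbitrary ID I am probing, not a documented visionOS sound.
playSystemSound(1104)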
In visionOS, the tab bar of a TabView appears outside the window by default.
If my app switches from a page without a TabView to a page that needs a TabView, the tab bar suddenly appears on the left side of the screen without any animation. I would like it to animate in when it appears (for example easeIn, or a move transition). I tried adding .animation and other animation-related modifiers to the TabView, but the tab bar itself never animates; only the view inside the tab gets an animation effect, and that is not what I want. What I want is for the tab bar outside the window to animate. What should I do?
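For reference, this is roughly the setup I am describing (a simplified sketch; ContentA and ContentB are placeholder views, and showTabs is toggled elsewhere in my app):

import SwiftUI

struct RootView: View {
    @State var showTabs = false   // toggled elsewhere in my app

    var body: some View {
        if showTabs {
            // The tab bar ornament appears on the left instantly,
            // even with animation modifiers attached.
            TabView {
                ContentA().tabItem { Label("A", systemImage: "circle") }
                ContentB().tabItem { Label("B", systemImage: "square") }
            }
            .animation(.easeIn(duration: 0.5), value: showTabs)
        } else {
            ContentA()
        }
    }
}

// Placeholder views for the sketch.
struct ContentA: View { var body: some View { Text("A") } }
struct ContentB: View { var body: some View { Text("B") } }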
My app needs to send and receive messages to my server, but the server does not have SSL, so I can only disable ATS during development. If I want to ship the app with ATS still disabled and the server still without SSL: will the packaged app be warned about or blocked by Xcode? Will it be rejected by App Review? Can it be published on the App Store normally and provided to all users?
Note: my server is completely safe, without any security risks. I only skipped SSL because I didn't have enough funds for a certificate.
Topic:
App & System Services
SubTopic:
Networking
Tags:
Sign in with Apple
Authentication Services
App Store Server API
visionOS
We can add many models to a Reality Composer Pro scene, but when I display the scene with RealityView and add modifiers in SwiftUI, the modifiers affect the whole scene, and that is not what I want. I would like a modifier to apply to only a single model in the Reality Composer Pro scene.
May I ask how to apply modifiers to a single model in the Reality Composer Pro scene?
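For context, this is how I imagine targeting one model by name after loading the scene (a rough sketch; "Scene" and "Robot" are placeholder names, and HoverEffectComponent is only an example of a per-entity effect):

import RealityKit
import RealityKitContent
import SwiftUI

struct SingleModelView: View {
    var body: some View {
        RealityView { content in
            guard let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) else { return }
            content.add(scene)

            // "Robot" is a placeholder for the name of one model inside the scene.
            if let robot = scene.findEntity(named: "Robot") {
                // Per-entity effects go on the entity itself rather than on the SwiftUI view.
                // (A CollisionComponent is also needed for hover/input to work; omitted here.)
                robot.components.set(InputTargetComponent())
                robot.components.set(HoverEffectComponent())
            }
        }
    }
}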
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
Tags:
SwiftUI
RealityKit
Reality Composer Pro
visionOS
I am learning SharePlay from the WWDC video. I understand how to create seats, but I couldn't follow some of the later content, so I hope you can help me. The details are as follows: I have set up the seats.
struct TeamSelectionTemplate: SpatialTemplate {
    let elements: [any SpatialTemplateElement] = [
        .seat(position: .app.offsetBy(x: 0, z: 4)),
        .seat(position: .app.offsetBy(x: 1, z: 4)),
        .seat(position: .app.offsetBy(x: -1, z: 4)),
        .seat(position: .app.offsetBy(x: 2, z: 4)),
        .seat(position: .app.offsetBy(x: -2, z: 4)),
    ]
}
In one of my previous posts I wrote: "I hope you can give me a SharePlay button. After pressing it, it will assign every user in the FaceTime call to one of the seats defined in TeamSelectionTemplate." Someone replied and asked me to try systemCoordinator.configuration.spatialTemplatePreference = .custom(TeamSelectionTemplate()), but Xcode reports the error "Cannot find 'systemCoordinator' in scope". How can I solve this? Thank you!
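From the WWDC session, my understanding is that the systemCoordinator has to be obtained from the GroupSession rather than used as a global symbol, roughly like this (a sketch; MyActivity is a placeholder for my own GroupActivity type, so please correct me if I misunderstood):

import GroupActivities

// MyActivity is a placeholder for the app's own GroupActivity type.
func observeSessions() async {
    for await session in MyActivity.sessions() {
        // The coordinator comes from the session; it is not a global symbol.
        guard let systemCoordinator = await session.systemCoordinator else { continue }

        var configuration = SystemCoordinator.Configuration()
        configuration.spatialTemplatePreference = .custom(TeamSelectionTemplate())
        systemCoordinator.configuration = configuration

        session.join()
    }
}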
I followed the WWDC video to learn SharePlay. I understood the first part about creating seats, but I couldn't follow some of the later content very well, so I hope you can give me some example code. The details are as follows:
I have already set up the seats.
struct TeamSelectionTemplate: SpatialTemplate {
    let elements: [any SpatialTemplateElement] = [
        .seat(position: .app.offsetBy(x: 0, z: 4)),
        .seat(position: .app.offsetBy(x: 1, z: 4)),
        .seat(position: .app.offsetBy(x: -1, z: 4)),
        .seat(position: .app.offsetBy(x: 2, z: 4)),
        .seat(position: .app.offsetBy(x: -2, z: 4)),
    ]
}
I hope you can give me a SharePlay button. After pressing it, it should assign every user in the FaceTime call to one of the seats defined in TeamSelectionTemplate. Thank you very much.
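This is as far as I have got with the button itself (a rough sketch; MyActivity is a placeholder GroupActivity, and I do not know how to connect it to TeamSelectionTemplate):

import GroupActivities
import SwiftUI

// MyActivity is a placeholder for the app's own GroupActivity type.
struct MyActivity: GroupActivity {
    var metadata: GroupActivityMetadata {
        var metadata = GroupActivityMetadata()
        metadata.title = "Team Selection"
        metadata.type = .generic
        return metadata
    }
}

struct SharePlayButton: View {
    var body: some View {
        Button("Start SharePlay") {
            Task {
                // Activates the activity for everyone in the FaceTime call.
                _ = try? await MyActivity().activate()
            }
        }
    }
}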
How to display the user's own persona in a view
Please treat me as a Unity beginner.
I want to learn to develop visionOS VR apps with Unity. I have tried to find a relatively complete learning route to start from, but Unity's official website does not explain much about visionOS VR apps, so I hope you can give me a reasonably complete learning route. Thank you!
Topic:
Graphics & Games
SubTopic:
General
Tags:
Games
Apple Unity Plug-Ins
WWDC23 Community
visionOS
I want to test in-app purchases in the visionOS Simulator. I logged in to my Sandbox account in the Simulator and was prompted to send a verification code to the phone number I had entered in the previous step. I entered the verification code accurately, but I couldn't log in normally. I tried many times.
Topic:
App & System Services
SubTopic:
StoreKit
Tags:
StoreKit Test
In-App Purchase
visionOS
StoreKit
I would like to have my earnings paid out to my Apple Cash account (Apple Cash can display a virtual card number in the iOS 17.4 beta). Is this possible?
If it is, how should I fill in fields such as the ABA Routing Number in Agreements, Tax, and Banking? And will Apple Cash pay me interest? (Apple Cash is the only account I have.)
Topic:
App Store Distribution & Marketing
SubTopic:
App Store Connect
Tags:
Wallet
App Store Connect
Analytics & Reporting
In visionOS, I want to show 3D content. I can use either RealityView or Model3D, but the results they achieve look similar. What is the difference between them, and which one should be used in which situation?
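For context, these are the two versions I am comparing (a minimal sketch; "Scene" in realityKitContentBundle is the same asset in both cases):

import RealityKit
import RealityKitContent
import SwiftUI

// Version 1: Model3D simply loads and displays the model asynchronously.
struct ModelVersion: View {
    var body: some View {
        Model3D(named: "Scene", bundle: realityKitContentBundle) { model in
            model
                .resizable()
                .scaledToFit()
        } placeholder: {
            ProgressView()
        }
    }
}

// Version 2: RealityView exposes the entity hierarchy, so entities,
// components, and systems can be manipulated directly.
struct RealityVersion: View {
    var body: some View {
        RealityView { content in
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
            }
        }
    }
}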
In order to improve my app's sales, I would like to promote it. So far, I only know from the visionOS App Submission Guide that by submitting my app for consideration I may get the chance to be featured as an editor's choice. Is there any other way to promote a visionOS app?
Topic:
App Store Distribution & Marketing
SubTopic:
App Store Connect API
Tags:
App Store Connect API
App Store Server Library
visionOS
We can use AnchorEntity to fix content in a RealityView to one place, such as a wall. Now I would like it not only to be fixed in one place, but also to fill the whole surface by stretching. For example: the content of my RealityView is a picture that is anchored to the wall, and I would like ARKit to stretch the picture according to the size of the wall so that it covers the whole wall. How can I achieve this? Thank you!
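For context, this is the direction I have been experimenting with (a rough sketch; pictureEntity is a placeholder for my 1 m x 1 m picture plane, and I am not sure this is the intended approach):

import ARKit
import RealityKit

let session = ARKitSession()
let planeDetection = PlaneDetectionProvider(alignments: [.vertical])

// Rough sketch: scale a 1 m x 1 m picture entity to the measured extent of a detected wall.
func coverWall(with pictureEntity: Entity, under rootEntity: Entity) async throws {
    try await session.run([planeDetection])
    for await update in planeDetection.anchorUpdates {
        let anchor = update.anchor
        guard anchor.classification == .wall else { continue }

        // Move the picture onto the wall plane...
        pictureEntity.transform = Transform(matrix: anchor.originFromAnchorTransform)
        // ...and stretch it to the plane's measured size.
        let extent = anchor.geometry.extent
        pictureEntity.scale = [extent.width, extent.height, 1]

        rootEntity.addChild(pictureEntity)
    }
}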
Apple sells the Developer Strap on the developer website, which lets developers connect Vision Pro to a Mac over USB-C, but I can't get one in China. So I would like to ask developers who already have a Vision Pro, or anyone who knows about this: can a Vision Pro without the Developer Strap be debugged wirelessly? If so, how is the wireless debugging experience, and how does it compare with a wired connection? Thank you!