I successfully set a picture as the background of an ImmersiveSpace (in the .full immersion style) with the following code.
import SwiftUI
import RealityKit

struct MainBackground: View {
    var body: some View {
        RealityView { content in
            // Load the background image as a texture.
            guard let resource = try? await TextureResource(named: "Image_Name") else {
                fatalError("Unable to load the background texture.")
            }
            var material = UnlitMaterial()
            material.color = .init(texture: .init(resource))
            // Map the texture onto a very large sphere surrounding the user.
            let entity = Entity()
            entity.components.set(ModelComponent(
                mesh: .generateSphere(radius: 1000),
                materials: [material]
            ))
            // Invert the sphere so the texture faces inward.
            entity.scale *= .init(x: -1, y: 1, z: 1)
            content.add(entity)
        }
    }
}
However, when running it, I found that when the user moves, the background is not fixed but follows the user's movement, which feels unrealistic. How can I fix the background at the place where it first appears, so that the user feels like they are really walking through the world, instead of having the background follow them?
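One hedged fix, assuming the intent is to pin the skybox to a fixed point in the real world: parent the sphere to a world-space AnchorEntity instead of adding it to the content directly. A minimal sketch:

    // Inside the RealityView closure, replacing content.add(entity):
    let worldAnchor = AnchorEntity(world: .zero)  // anchored to a fixed world-space transform
    worldAnchor.addChild(entity)
    content.add(worldAnchor)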
I created a RealityKitContent package in the Packages folder of my visionOS app project. At first I added a USDA model directly to its rkassets bundle, and Model3D(named: "ModelName", bundle: realityKitContentBundle) displayed the model normally. But after I created a folder inside rkassets and moved the USDA model into that folder, the same Model3D(named: "ModelName", bundle: realityKitContentBundle) call can no longer display the model. What should I do?
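A minimal sketch of one likely fix, assuming assets inside an rkassets subfolder must be referenced by their path relative to the rkassets root ("FolderName" is a placeholder for the folder created above):

    Model3D(named: "FolderName/ModelName", bundle: realityKitContentBundle)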
If you know how to solve the problem above, please let me know, and please also check whether you know how to solve the following problems. If you do, please tell me those answers as well. Thank you!
The USDA model I mentioned in the previous question contains an animation, but when I used Model3D(named: "ModelName", bundle: realityKitContentBundle), I found that the animation was not played by default and apparently needs additional code. Is there any documentation, video, or sample code on this?
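A minimal sketch of the usual RealityKit approach, under the assumption that the model is loaded as an Entity in a RealityView (Model3D itself exposes no animation API) and that the first bundled animation is the one to play:

    import SwiftUI
    import RealityKit
    import RealityKitContent

    struct AnimatedModelView: View {
        var body: some View {
            RealityView { content in
                // "ModelName" is the asset from the question above.
                if let entity = try? await Entity(named: "ModelName", in: realityKitContentBundle) {
                    content.add(entity)
                    // Play the first animation bundled with the model, looping it.
                    if let animation = entity.availableAnimations.first {
                        entity.playAnimation(animation.repeat())
                    }
                }
            }
        }
    }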
Some features, such as hand tracking, cannot be tested in the visionOS Simulator, and I need to test them. Besides applying for the developer kit and attending a lab, how can I test these features?
Question 1:
The manipulationState variable is used many times in the 3D gestures part of the wwdc2023-10113 video. What type should this variable be declared with?
I tried the following code:
@State private var manipulationState: GestureState
Xcode reports the error "Reference to generic type 'GestureState' requires arguments in <...>" and offers to insert '<Any>', but I don't know what type parameter belongs inside the angle brackets.
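A minimal sketch of how such a property is typically declared, assuming the session wraps its per-gesture data in a custom struct; the ManipulationState type below is a placeholder, not the session's exact code:

    import SwiftUI

    // Placeholder for whatever per-gesture data the session tracks.
    struct ManipulationState {
        var active = false
    }

    struct GestureExampleView: View {
        // @GestureState (not @State) resets the value when the gesture ends.
        @GestureState private var manipulationState = ManipulationState()

        var body: some View {
            Color.clear
                .gesture(
                    DragGesture()
                        .updating($manipulationState) { _, state, _ in
                            state.active = true
                        }
                )
        }
    }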
Question 2:
If you have all the code snippets (or anything similar) from the 3D gestures part of that session, please send them to me. Thank you.
How does a visionOS app play MP4 audio as Spatial Audio through SwiftUI or RealityKit?
Note: Since I can only test the app in the Simulator, to make sure my Spatial Audio plays from the correct position in space, please also tell me how to display the Spatial Audio source's location with a View, and how to remove that View after the test. Thank you!
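A minimal sketch under two assumptions: that "sound.mp4" (a placeholder name) contains an audio track AudioFileResource can read, and that a simple marker sphere is enough to visualize the emitter's position (delete the marker lines after testing):

    import SwiftUI
    import RealityKit

    struct SpatialAudioTestView: View {
        var body: some View {
            RealityView { content in
                // Audio emitter positioned ahead of and above the user.
                let emitter = Entity()
                emitter.position = [0, 1.5, -2]
                emitter.components.set(SpatialAudioComponent())

                if let resource = try? await AudioFileResource(named: "sound.mp4") {
                    emitter.playAudio(resource)
                }

                // Debug marker that shows where the audio comes from; remove after testing.
                let marker = ModelEntity(mesh: .generateSphere(radius: 0.05),
                                         materials: [SimpleMaterial(color: .red, isMetallic: false)])
                emitter.addChild(marker)

                content.add(emitter)
            }
        }
    }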
How do I add a light source to a view in a visionOS app?
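A minimal sketch of one common RealityKit approach, image-based lighting; the "Sunlight" environment resource name is a placeholder and the async EnvironmentResource initializer is an assumption:

    import SwiftUI
    import RealityKit

    struct LitModelView: View {
        var body: some View {
            RealityView { content in
                // Load an environment texture to act as the light source.
                guard let environment = try? await EnvironmentResource(named: "Sunlight") else { return }

                let lightEntity = Entity()
                lightEntity.components.set(ImageBasedLightComponent(source: .single(environment)))
                content.add(lightEntity)

                // Any entity that should receive the light opts in explicitly.
                let model = ModelEntity(mesh: .generateSphere(radius: 0.2),
                                        materials: [SimpleMaterial(color: .white, isMetallic: true)])
                model.components.set(ImageBasedLightReceiverComponent(imageBasedLight: lightEntity))
                content.add(model)
            }
        }
    }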
How does a visionOS app play Spatial Audio, Ambient Audio, and Channel Audio?
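A minimal sketch of the three RealityKit audio components, assuming the code runs inside a RealityView closure and that "track.m4a" (a placeholder) exists in the bundle; the rendering style comes from whichever component the emitting entity carries:

    // Spatial: full 3D positional rendering.
    let spatialEntity = Entity()
    spatialEntity.components.set(SpatialAudioComponent())

    // Ambient: directional but not distance-attenuated.
    let ambientEntity = Entity()
    ambientEntity.components.set(AmbientAudioComponent())

    // Channel: plays straight to the output channels, ignoring position.
    let channelEntity = Entity()
    channelEntity.components.set(ChannelAudioComponent())

    if let resource = try? await AudioFileResource(named: "track.m4a") {
        spatialEntity.playAudio(resource)
    }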
I want to play the Spatial Audio of a RealityKitContent USDA model. I use this code:
RealityView { content in
    do {
        let entity = try await Entity(named: "isWateringBasin", in: RealityKitContent.realityKitContentBundle)
        let audio = entity.spatialAudio
        entity.playAudio(audio)
        content.add(entity)
    } catch {
        print("Entity encountered an error while loading the model.")
        return
    }
}
The entity.playAudio(audio) call expects an 'AudioResource' argument rather than what I am passing as audio. Excuse me, what should that AudioResource be?
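A minimal sketch of one way to obtain that AudioResource, assuming the audio is stored in the USDA under a prim path such as "/Root/AudioName" (a placeholder):

    // Inside the do block, after loading the entity:
    if let resource = try? await AudioFileResource(named: "/Root/AudioName",
                                                   from: "isWateringBasin.usda",
                                                   in: RealityKitContent.realityKitContentBundle) {
        entity.playAudio(resource)
    }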
In a visionOS app project, Plane classifications let you target a wall / floor / ceiling / table / seat / window / door. How can I lock a View to one of them, and set a Bool variable to true once the lock succeeds?
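A minimal sketch of the anchoring half, using a classified plane target; detecting that the anchor actually attached (to flip the Bool) is not covered here and would likely require ARKit plane detection, as in the later post:

    // Inside a RealityView { content in ... } closure:
    let tableAnchor = AnchorEntity(.plane(.horizontal,
                                          classification: .table,
                                          minimumBounds: [0.3, 0.3]))
    tableAnchor.addChild(someEntity)  // someEntity is a placeholder
    content.add(tableAnchor)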
In a visionOS app project, how do I fix a view to the user's hand through ARKit?
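A minimal sketch of the RealityKit shortcut, a hand-target anchor (a full ARKit HandTrackingProvider setup is more involved and cannot run in the Simulator):

    // Inside a RealityView { content in ... } closure:
    let handAnchor = AnchorEntity(.hand(.left, location: .palm))
    handAnchor.addChild(someEntity)  // someEntity is a placeholder
    content.add(handAnchor)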
I implemented plane detection using ARKit in my visionOS app:
import SwiftUI
import RealityKit
import ARKit

struct ContentView: View {
    // True once a plane classified as a table has been detected.
    @State private var ok = false
    let session = ARKitSession()
    let planeData = PlaneDetectionProvider(alignments: [.horizontal, .vertical])

    var body: some View {
        Group {
            if ok {
                TableView()
            } else {
                SwiftUIView()
            }
        }
        .onAppear {
            Task {
                try await session.run([planeData])
                for await update in planeData.anchorUpdates {
                    // Only react to planes classified as tables.
                    guard update.anchor.classification == .table else { continue }
                    switch update.event {
                    case .added, .updated:
                        ok = true
                    case .removed:
                        ok = false
                    }
                }
            }
        }
    }
}
When I ran it, Xcode told me that this does not work in the Simulator, so how can I test this program? If there is no way other than applying for the Vision Pro developer kit and attending a lab, I hope you can at least tell me whether this View can do the following:
When there is no table in the user's sight, SwiftUIView() is displayed at coordinates x: 0, y: 0, z: 0. When there is a table in the user's sight, TableView() is displayed on the table.
If the above cannot be done, I hope you can give me some advice. Thank you!
I implemented multiple languages through Localizable.strings in my visionOS app. Now I want to test whether the localizations work properly, so I tried to change the system language in Settings before testing, but I can't find the relevant page in Settings. What should I do?
Once a visionOS app is developed, when is it submitted to a reviewer?
I want to display a USDA model from Reality Composer Pro and play its Spatial Audio. I used RealityView to implement this:
RealityView { content in
    do {
        let entity = try await Entity(named: "isWateringBasin", in: RealityKitContent.realityKitContentBundle)
        content.add(entity)
        guard let entity = entity.findEntity(named: "SpatialAudio"),
              let resource = try? await AudioFileResource(named: "/Root/isWateringBasinAudio_m4a",
                                                          from: "isWateringBasin.usda",
                                                          in: RealityKitContent.realityKitContentBundle) else { return }
        let audioPlaybackController = entity.prepareAudio(resource)
        audioPlaybackController.play()
    } catch {
        print("Entity encountered an error while loading the model.")
        return
    }
}
But when I ran it, the model displayed normally while the Spatial Audio failed to play. I hope to get some guidance, thank you!
I didn't find any errors in my program, and Xcode didn't report any errors in the code, but when I ran it, the build inexplicably failed with:
Command CompileAssetCatalog failed with a nonzero exit code
What should I do?