
Spatial Audio
How do I play an MP4 audio file as Spatial Audio in visionOS through SwiftUI or RealityKit? Note: since I can only test the app in the Simulator, in order to verify that my Spatial Audio plays at the correct position in space, please tell me how to display the location of the Spatial Audio source in the space, and how to delete that debug view after testing. Thank you!
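A minimal sketch of one approach with RealityKit, assuming an MP4 file bundled with the app (the file name "ambience.mp4" and the positions are placeholders): load the file as an AudioFileResource, attach it to an entity carrying a SpatialAudioComponent, and parent a small debug sphere to that entity so the source is visible in the Simulator; deleting the marker entity (or the whole marker block) afterwards removes the debug visualization.

import SwiftUI
import RealityKit

struct SpatialAudioView: View {
    var body: some View {
        RealityView { content in
            // Entity that emits the sound; SpatialAudioComponent makes
            // playback positional rather than ambient.
            let source = Entity()
            source.components.set(SpatialAudioComponent())
            source.position = [0, 1.5, -2]
            content.add(source)

            // "ambience.mp4" is a placeholder name for your bundled file.
            if let resource = try? await AudioFileResource(named: "ambience.mp4") {
                source.playAudio(resource)
            }

            // Debug marker so the source position is visible in the
            // Simulator; delete this entity after testing.
            let marker = ModelEntity(
                mesh: .generateSphere(radius: 0.05),
                materials: [SimpleMaterial(color: .red, isMetallic: false)]
            )
            source.addChild(marker)
        }
    }
}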
Replies: 0 · Boosts: 0 · Views: 622 · Dec ’23
Play Spatial Audio
I want to play the Spatial Audio of a RealityKitContent USDA model. I use this code:

RealityView { content in
    do {
        let entity = try await Entity(named: "isWateringBasin", in: RealityKitContent.realityKitContentBundle)
        let audio = entity.spatialAudio
        entity.playAudio(audio)
        content.add(entity)
    } catch {
        print("Entity encountered an error while loading the model.")
        return
    }
}

The call entity.playAudio(audio) requires an AudioResource as its argument. Excuse me, what should that AudioResource be?
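For context, entity.spatialAudio is the entity's SpatialAudioComponent, not a playable resource; playAudio(_:) expects an AudioResource such as an AudioFileResource loaded from the scene. A sketch under the assumption that the clip lives in the same USDA file (the prim path and file name below are illustrative, mirroring the naming in the next post):

RealityView { content in
    do {
        let entity = try await Entity(named: "isWateringBasin", in: RealityKitContent.realityKitContentBundle)
        content.add(entity)

        // An AudioResource is a loaded audio asset; here one is pulled
        // out of the USDA scene by its prim path (illustrative name).
        let resource = try await AudioFileResource(
            named: "/Root/isWateringBasinAudio_m4a",
            from: "isWateringBasin.usda",
            in: RealityKitContent.realityKitContentBundle
        )
        entity.playAudio(resource)
    } catch {
        print("Entity or audio failed to load: \(error)")
    }
}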
Replies: 0 · Boosts: 0 · Views: 543 · Dec ’23
Can't play Audio in RealityComposerPro
I hope to display a USDA model from RealityComposerPro and play its Spatial Audio. I used RealityView to implement this:

RealityView { content in
    do {
        let entity = try await Entity(named: "isWateringBasin", in: RealityKitContent.realityKitContentBundle)
        content.add(entity)
        guard let entity = entity.findEntity(named: "SpatialAudio"),
              let resource = try? await AudioFileResource(
                  named: "/Root/isWateringBasinAudio_m4a",
                  from: "isWateringBasin.usda",
                  in: RealityKitContent.realityKitContentBundle
              ) else { return }
        let audioPlaybackController = entity.prepareAudio(resource)
        audioPlaybackController.play()
    } catch {
        print("Entity encountered an error while loading the model.")
        return
    }
}

But when I ran it, the model displayed normally while the Spatial Audio failed to play. I hope to get guidance, thank you!
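One way to narrow this down, offered only as a debugging sketch: the guard above fails silently if either the child entity or the audio resource can't be found, so splitting it apart inside the RealityView closure and logging each step shows where playback is lost (the entity and resource names are taken from the question):

guard let audioEntity = entity.findEntity(named: "SpatialAudio") else {
    print("No child entity named 'SpatialAudio' in the loaded scene.")
    return
}
do {
    let resource = try await AudioFileResource(
        named: "/Root/isWateringBasinAudio_m4a",
        from: "isWateringBasin.usda",
        in: RealityKitContent.realityKitContentBundle
    )
    // A thrown error here usually means the prim path or file name
    // does not match what Reality Composer Pro generated.
    audioEntity.prepareAudio(resource).play()
} catch {
    print("Audio resource failed to load: \(error)")
}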
Replies: 1 · Boosts: 0 · Views: 616 · Jan ’24
ARKit and Simulator
I developed plane detection using ARKit in a visionOS app:

import SwiftUI
import RealityKit
import ARKit

struct ContentView: View {
    @State private var ok = false
    let session = ARKitSession()
    let planeData = PlaneDetectionProvider(alignments: [.horizontal, .vertical])

    var body: some View {
        Group {
            if !ok {
                TabelView()
            } else {
                SwiftUIView()
            }
        }
        .onAppear {
            Task {
                try await session.run([planeData])
                for await update in planeData.anchorUpdates {
                    if update.anchor.classification == .table { continue }
                    switch update.event {
                    case .added, .updated:
                        ok = true
                    case .removed:
                        ok = false
                    }
                }
            }
        }
    }
}

When I ran it, Xcode told me that this is not supported in the Simulator, so how can I test this program? If there is no way other than applying for the Vision Pro Developer Kit or joining a lab, please tell me whether this view can achieve the following behavior: when there is no table in the user's view, display SwiftUIView() at the coordinates x: 0, y: 0, z: 0; when there is a table in the user's view, display TabelView() on the table. If the above can't be achieved, I'd appreciate any advice. Thank you!
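As for testing: PlaneDetectionProvider genuinely requires device data and is unsupported in the Simulator, but the code can be guarded with the provider's real isSupported check so the same build still runs there. A minimal sketch (the fallback behavior is an assumption):

import ARKit

// Guard on the capability check before running the provider, so the
// Simulator takes a fallback path instead of hitting the error.
func startPlaneDetection(_ session: ARKitSession,
                         _ planeData: PlaneDetectionProvider) async throws {
    guard PlaneDetectionProvider.isSupported else {
        // The Simulator (and unsupported devices) land here; keep
        // showing the default view at the origin.
        print("Plane detection unavailable; showing fallback view.")
        return
    }
    try await session.run([planeData])
}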
Replies: 1 · Boosts: 0 · Views: 1.1k · Jan ’24
TipView can't show
import SwiftUI
import TipKit

struct ChatRoomView: View {
    @StateObject private var socketManager = SocketIOManager()
    @State private var inputText: String = ""
    @StateObject var viewModel = SignInWithAppleViewModel()
    @Binding var isCall: Bool
    @State private var isSheet = false
    @State private var ShowView = false
    var learnlisttip = KeyTip()
    @Binding var showShareSheet: Bool
    @Binding var codeshar: String

    var body: some View {
        NavigationStack {
            VStack {
                if let roomCode = socketManager.roomCode {
                    ZStack {
                        VStack {
                            HStack {
                                Text("Room Key: \(roomCode)")
                                    .font(.title)
                                    .onAppear {
                                        codeshar = roomCode
                                        self.isCall = true
                                    }
                                Button(action: {
                                    self.showShareSheet = true
                                }, label: {
                                    Image(systemName: "square.and.arrow.up.fill")
                                        .accessibilityLabel("Share")
                                })
                            }
                            .padding(20)
                            TipView(learnlisttip, arrowEdge: .top)
                                .glassBackgroundEffect()
                                .offset(z: 20)
                            Spacer()
                        }
                        List(socketManager.messages, id: \.self) { message in
                            Text(message)
                        }
                        TextField("input", text: $inputText)
                        Button("send") {
                            socketManager.sendMessage(roomCode: roomCode, message: inputText)
                            inputText = ""
                        }
                    }
                    .sheet(isPresented: $showShareSheet) {
                        let shareContent = "Open SpatialCall, Join this Room, Key is: \(codeshar)"
                        ActivityView(activityItems: [shareContent])
                    }
                } else {
                    HStack {
                        Button(action: {
                            withAnimation {
                                socketManager.createRoom()
                            }
                        }, label: {
                            VStack {
                                Image(systemName: "phone.circle.fill")
                                    .symbolRenderingMode(.multicolor)
                                    .symbolEffect(.appear, isActive: !ShowView)
                                    .font(.largeTitle)
                                Text("Add Room")
                                    .font(.title3)
                            }
                        })
                        .buttonStyle(.borderless)
                        .buttonBorderShape(.roundedRectangle)
                        .padding(.horizontal, 30)
                        .glassBackgroundEffect()
                        .offset(z: 20)
                        .scaleEffect(1.5)
                        .padding(60)

                        Button(action: {
                            withAnimation {
                                self.isSheet = true
                            }
                        }, label: {
                            VStack {
                                Image(systemName: "phone.badge.checkmark")
                                    .symbolRenderingMode(.multicolor)
                                    .symbolEffect(.appear, isActive: !ShowView)
                                    .font(.largeTitle)
                                Text("Join Room")
                                    .font(.title3)
                            }
                        })
                        .buttonStyle(.borderless)
                        .buttonBorderShape(.roundedRectangle)
                        .padding(.horizontal, 30)
                        .glassBackgroundEffect()
                        .offset(z: 20)
                        .scaleEffect(1.5)
                        .padding(70)
                    }
                }
            }
            .onAppear {
                DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
                    withAnimation {
                        self.ShowView = true
                    }
                }
            }
            .sheet(isPresented: $isSheet) {
                VStack {
                    Text("Join Room")
                        .font(.largeTitle)
                    Text("You need to get the key to the room.")
                    TextField("Key", text: $inputText)
                        .padding(30)
                        .textFieldStyle(.roundedBorder)
                    Button(action: {
                        socketManager.joinRoom(roomCode: inputText)
                        self.isSheet = false
                    }, label: {
                        Text("Join Room")
                            .font(.title3)
                    })
                    .padding(50)
                }
                .padding()
            }
            .sheet(isPresented: $socketManager.showRoomNotFoundAlert) {
                Text("The room does not exist. Please check whether the Key you entered is correct.")
                    .font(.title)
                    .frame(width: 500)
                    .padding()
                Button(action: {
                    self.socketManager.showRoomNotFoundAlert = false
                }, label: {
                    Text("OK")
                        .font(.title3)
                })
                .padding()
            }
        }
    }
}

In the above code (this is a visionOS project), when I tap the Share button the sheet doesn't appear, and the TipView can't be displayed either. Why?
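On the TipView side, one common cause, offered as an assumption since the app entry point isn't shown: TipKit must be configured once at launch before any TipView can appear. A minimal sketch (the app type name is hypothetical):

import SwiftUI
import TipKit

@main
struct SpatialCallApp: App {  // hypothetical name for the app's entry point
    init() {
        // Without this one-time call, every TipView stays invisible.
        try? Tips.configure()
    }

    var body: some Scene {
        WindowGroup {
            ContentView()  // placeholder for the app's root view
        }
    }
}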
Replies: 1 · Boosts: 0 · Views: 911 · Jan ’24
Follow the body
In a visionOS project, how can a model in RealityView track a specific part of the body, such as the left foot, right leg, or left arm, and how can the model's position as it follows that body part be stored in a variable (so that body-part information can be transmitted during a call)? Also, please let me know if Apple publishes a document about the Persona virtual camera.
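For what it's worth, visionOS's ARKit does not publicly expose leg, foot, or arm tracking; the closest shipping API is hand tracking. A sketch, under that substitution, of keeping a tracked hand's world position in a variable ready to transmit during a call:

import ARKit

@MainActor
final class HandPositionStore: ObservableObject {
    // Latest left-hand position in world space, ready to send.
    @Published var leftHandPosition: SIMD3<Float>?

    private let session = ARKitSession()
    private let handTracking = HandTrackingProvider()

    func start() async throws {
        try await session.run([handTracking])
        for await update in handTracking.anchorUpdates {
            let anchor = update.anchor
            guard anchor.chirality == .left else { continue }
            // Translation column of the anchor's world transform.
            let t = anchor.originFromAnchorTransform.columns.3
            leftHandPosition = SIMD3(t.x, t.y, t.z)
        }
    }
}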
Replies: 1 · Boosts: 0 · Views: 843 · Jan ’24