In a visionOS app project, how can I fix a view to the user's hand using ARKit?
I implemented plane detection with ARKit in my visionOS app:
import SwiftUI
import RealityKit
import ARKit

struct ContentView: View {
    @State private var ok = false
    let session = ARKitSession()
    let planeData = PlaneDetectionProvider(alignments: [.horizontal, .vertical])

    var body: some View {
        Group {
            if !ok {
                TabelView()
            } else {
                SwiftUIView()
            }
        }
        .onAppear {
            Task {
                // Start plane detection and react to anchor updates.
                try await session.run([planeData])
                for await update in planeData.anchorUpdates {
                    if update.anchor.classification == .table { continue }
                    switch update.event {
                    case .added, .updated:
                        ok = true
                    case .removed:
                        ok = false
                    }
                }
            }
        }
    }
}
When I ran it, Xcode told me that this is not supported in the Simulator. How can I test this program? If there is no way other than applying for the Vision Pro developer kit or attending a developer lab, I hope you can at least tell me whether this view can achieve the following behavior:
When there is no table in the user's view, SwiftUIView() is displayed at the coordinates x: 0, y: 0, z: 0. When there is a table in the user's view, TabelView() is displayed on the table.
If this can't be achieved, I hope you can give me some advice. Thank you!
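For reference, a minimal sketch of how the "table present" state could be tracked, assuming the intent is to flip ok based on whether any table anchors are currently detected (tableIDs is an illustrative name, not from the original code):

// A minimal sketch (assumption): keep the IDs of detected table anchors and
// treat "a table is in view" as that set being non-empty.
var tableIDs = Set<UUID>()
for await update in planeData.anchorUpdates where update.anchor.classification == .table {
    switch update.event {
    case .added, .updated:
        tableIDs.insert(update.anchor.id)
    case .removed:
        tableIDs.remove(update.anchor.id)
    }
    ok = !tableIDs.isEmpty
}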
I implemented multiple languages through Localizable.strings in my visionOS app. Now I want to test whether the localizations work properly, so I want to change the system language in Settings before testing, but I can't find the relevant page in Settings. What should I do?
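As a side note for testing during development, a minimal sketch of overriding the locale for a single view hierarchy so one localization can be previewed without changing the system language ("zh-Hans" is only an example identifier); Xcode's scheme Run options also offer an App Language setting for test runs:

// A minimal sketch (assumption: "zh-Hans" is one of the app's supported localizations).
ContentView()
    .environment(\.locale, Locale(identifier: "zh-Hans"))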
Once a visionOS app is finished and submitted, when will it reach a reviewer?
Topic:
App Store Distribution & Marketing
SubTopic:
App Store Connect
Tags:
App Review
App Store Connect
Swift
visionOS
I want to display a USDA model from Reality Composer Pro and play its spatial audio. I used RealityView to implement this:
RealityView { content in
    do {
        let entity = try await Entity(named: "isWateringBasin", in: RealityKitContent.realityKitContentBundle)
        content.add(entity)
        guard let entity = entity.findEntity(named: "SpatialAudio"),
              let resource = try? await AudioFileResource(named: "/Root/isWateringBasinAudio_m4a",
                                                          from: "isWateringBasin.usda",
                                                          in: RealityKitContent.realityKitContentBundle) else { return }
        let audioPlaybackController = entity.prepareAudio(resource)
        audioPlaybackController.play()
    } catch {
        print("Entity encountered an error while loading the model.")
        return
    }
}
But when I ran it, the model displayed normally while the spatial audio failed to play. I would appreciate some guidance, thank you!
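For reference, a minimal sketch of one thing to check, assuming (not confirmed) that the scene exposes an emitter entity named "SpatialAudio" and that the resource path in the .usda is correct: the audio is prepared on that emitter, and the emitter is given a SpatialAudioComponent so playback is positional.

// A minimal sketch (assumptions noted above; this is not the confirmed cause).
if let emitter = entity.findEntity(named: "SpatialAudio"),
   let resource = try? await AudioFileResource(named: "/Root/isWateringBasinAudio_m4a",
                                               from: "isWateringBasin.usda",
                                               in: RealityKitContent.realityKitContentBundle) {
    // Ensure the emitter is configured for spatial (positional) playback.
    emitter.components.set(SpatialAudioComponent())
    emitter.prepareAudio(resource).play()
}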
I can't find any errors in my program, and Xcode doesn't flag any issues in the code, but when I build and run it, the build inexplicably fails with:
Command CompileAssetCatalog failed with a nonzero exit code
What should I do?
Topic:
Programming Languages
SubTopic:
Swift
Tags:
Swift
Xcode
Xcode Sanitizers and Runtime Issues
visionOS
We just heard the exciting news that Vision Pro is about to be released! Are there any programs for developers, such as discounts or rentals? If so, can I use them from China?
import SwiftUI
import TipKit

struct ChatRoomView: View {
    @StateObject private var socketManager = SocketIOManager()
    @State private var inputText: String = ""
    @StateObject var viewModel = SignInWithAppleViewModel()
    @Binding var isCall: Bool
    @State private var isSheet = false
    @State private var ShowView = false
    var learnlisttip = KeyTip()
    @Binding var showShareSheet: Bool
    @Binding var codeshar: String

    var body: some View {
        NavigationStack {
            VStack {
                if let roomCode = socketManager.roomCode {
                    ZStack {
                        VStack {
                            HStack {
                                Text("Room Key: \(roomCode)")
                                    .font(.title)
                                    .onAppear {
                                        codeshar = roomCode
                                        self.isCall = true
                                    }
                                Button(action: {
                                    self.showShareSheet = true
                                }, label: {
                                    Image(systemName: "square.and.arrow.up.fill")
                                        .accessibilityLabel("Share")
                                })
                            }
                            .padding(20)
                            TipView(learnlisttip, arrowEdge: .top)
                                .glassBackgroundEffect()
                                .offset(z: 20)
                            Spacer()
                        }
                        List(socketManager.messages, id: \.self) { message in
                            Text(message)
                        }
                        TextField("input", text: $inputText)
                        Button("send") {
                            socketManager.sendMessage(roomCode: roomCode, message: inputText)
                            inputText = ""
                        }
                    }
                    .sheet(isPresented: $showShareSheet) {
                        let shareContent = "Open SpatialCall, Join this Room, Key is: \(codeshar)"
                        ActivityView(activityItems: [shareContent])
                    }
                } else {
                    HStack {
                        Button(action: {
                            withAnimation {
                                socketManager.createRoom()
                            }
                        }, label: {
                            VStack {
                                Image(systemName: "phone.circle.fill")
                                    .symbolRenderingMode(.multicolor)
                                    .symbolEffect(.appear, isActive: !ShowView)
                                    .font(.largeTitle)
                                Text("Add Room")
                                    .font(.title3)
                            }
                        })
                        .buttonStyle(.borderless)
                        .buttonBorderShape(.roundedRectangle)
                        .padding(.horizontal, 30)
                        .glassBackgroundEffect()
                        .offset(z: 20)
                        .scaleEffect(1.5)
                        .padding(60)
                        Button(action: {
                            withAnimation {
                                self.isSheet = true
                            }
                        }, label: {
                            VStack {
                                Image(systemName: "phone.badge.checkmark")
                                    .symbolRenderingMode(.multicolor)
                                    .symbolEffect(.appear, isActive: !ShowView)
                                    .font(.largeTitle)
                                Text("Join Room")
                                    .font(.title3)
                            }
                        })
                        .buttonStyle(.borderless)
                        .buttonBorderShape(.roundedRectangle)
                        .padding(.horizontal, 30)
                        .glassBackgroundEffect()
                        .offset(z: 20)
                        .scaleEffect(1.5)
                        .padding(70)
                    }
                }
            }
            .onAppear {
                DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
                    withAnimation {
                        self.ShowView = true
                    }
                }
            }
            .sheet(isPresented: $isSheet) {
                VStack {
                    Text("Join Room")
                        .font(.largeTitle)
                    Text("You need to get the key to the room.")
                    TextField("Key", text: $inputText)
                        .padding(30)
                        .textFieldStyle(.roundedBorder)
                    Button(action: {
                        socketManager.joinRoom(roomCode: inputText)
                        self.isSheet = false
                    }, label: {
                        Text("Join Room")
                            .font(.title3)
                    })
                    .padding(50)
                }
                .padding()
            }
            .sheet(isPresented: $socketManager.showRoomNotFoundAlert) {
                Text("The room does not exist. Please check whether the Key you entered is correct.")
                    .font(.title)
                    .frame(width: 500)
                    .padding()
                Button(action: {
                    self.socketManager.showRoomNotFoundAlert = false
                }, label: {
                    Text("OK")
                        .font(.title3)
                })
                .padding()
            }
        }
    }
}
In the code above (this is a visionOS project), when I tap Share, the sheet doesn't appear, and the TipView isn't displayed either. Why?
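Regarding the TipView part only, one thing worth checking, offered as an assumption rather than a confirmed cause: TipKit tips only appear after the Tips datastore has been configured at launch. A minimal sketch (the app struct name and root view are placeholders):

// A minimal sketch (assumption: Tips.configure has not been called anywhere in the app yet).
import SwiftUI
import TipKit

@main
struct SpatialCallApp: App {   // placeholder name
    init() {
        // Configure TipKit once at launch so TipView can display tips.
        try? Tips.configure([.displayFrequency(.immediate)])
    }

    var body: some Scene {
        WindowGroup {
            ContentView()      // placeholder root view
        }
    }
}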
In RealityView, anchoring with AnchorEntity to a wall works normally:
AnchorEntity(.plane(.vertical, classification: .wall, minimumBounds: .one))
But when I anchor to a location other than a wall, the view is not displayed. Why is that?
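For comparison, a minimal sketch of anchoring to a horizontal table surface instead of a wall (the 0.3 m x 0.3 m minimum bounds, myEntity, and content are placeholder assumptions, with content being the RealityView content parameter):

// A minimal sketch (assumption): anchor content to a detected table instead of a wall.
let tableAnchor = AnchorEntity(.plane(.horizontal,
                                      classification: .table,
                                      minimumBounds: [0.3, 0.3]))
tableAnchor.addChild(myEntity)   // myEntity stands in for your own content entity
content.add(tableAnchor)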
I found in App Store Connect that on the 18th (the Vision Pro pre-order day), my visionOS app was purchased by someone in the United States. But that day was pre-orders only, so no one actually had a Vision Pro yet, and this app is only compatible with visionOS. Why did this happen?
Topic:
Programming Languages
SubTopic:
Swift
Tags:
Swift
App Store Connect API
App Submission
visionOS
We can use AnchorEntity to fix a RealityView's content in one place, such as on a wall. Now I want it not only fixed in place, but also stretched to fill the whole surface. For example: my RealityView shows a picture that is anchored to the wall, and I want ARKit to stretch the picture to match the size of the wall. How can I make it cover the whole wall (see the sketch below)? Thank you!
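For reference, a minimal sketch of one possible approach using ARKit plane anchors rather than AnchorEntity, under these assumptions: planeData is a running PlaneDetectionProvider (as in the earlier plane-detection post), pictureEntity is a flat entity whose unscaled size is 1 m x 1 m and which is added directly to the RealityView content, and no extra rotation is needed for its orientation on the wall.

// A minimal sketch (assumptions noted above; orientation tweaks may still be needed
// depending on how the picture entity is modeled).
for await update in planeData.anchorUpdates where update.anchor.classification == .wall {
    let wall = update.anchor
    // Move the picture onto the detected wall plane...
    pictureEntity.transform = Transform(matrix: wall.originFromAnchorTransform)
    // ...and scale it to the wall's detected extent.
    let extent = wall.geometry.extent
    pictureEntity.scale = [extent.width, extent.height, 1]
}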
To improve sales, I want to promote my app. At present, I only know from the visionOS app submission guidance that by sharing my app I may get the chance to be featured as an Editor's Choice. Is there any other way to promote a visionOS app?
Topic:
App Store Distribution & Marketing
SubTopic:
App Store Connect API
Tags:
App Store Connect API
App Store Server Library
visionOS
I would like to have my proceeds paid out to my Apple Cash account (Apple Cash can show a virtual card number in the iOS 17.4 beta). Is this possible?
If it is, how should I fill in fields such as the ABA routing number under Agreements, Tax, and Banking? And does Apple Cash pay interest (Apple Cash is the only account I have)?
Topic:
App Store Distribution & Marketing
SubTopic:
App Store Connect
Tags:
Wallet
App Store Connect
Analytics & Reporting
How can I display the user's own Persona in a view?
Xcode 16 beta 4 includes Predictive Code Completion, and the Predictive Code Completion model is offered for download alongside the other SDKs on the page Xcode shows the first time it is opened.
However, I want to know: 1. What is Predictive Code Completion? 2. I didn't download the Predictive Code Completion model on that first-launch download page. Where can I download it later?