Just now, I accidentally deleted Xcode 15.1 Beta 3. I want to download it again, but Xcode 15.1 Release Candidate has replaced Xcode 15.1 Beta 3 at https://developer.apple.com/download/applications/, and the Release Candidate does not include the visionOS SDK, so I really need Xcode 15.1 Beta 3. Is there any way to download it again?
In Reality Composer Pro, when I import a USDZ model and insert it into a scene, Reality Composer Pro removes the model's own material by default, but I don't want that. How can I keep Reality Composer Pro from removing the model's original material?
Some features cannot be tested in the visionOS Simulator, such as hand tracking, and I need to test them. Apart from applying for a Dev Kit or attending a developer lab, how can I test these features?
How do I add a light source to a View in a visionOS app?
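To make the question concrete, this is the kind of thing I'm trying, based on image-based lighting in a RealityView; the view name, model name, and the environment resource name "Sunlight" are placeholders, and I'm not sure this is the intended approach:
import SwiftUI
import RealityKit

struct LitModelView: View {
    var body: some View {
        RealityView { content in
            // Load a model to be lit (the entity name is a placeholder).
            guard let model = try? await Entity(named: "MyModel") else { return }

            // Load an image-based light from an environment resource in the bundle.
            guard let environment = try? await EnvironmentResource(named: "Sunlight") else { return }

            // An entity that emits the image-based light.
            let lightEntity = Entity()
            lightEntity.components.set(ImageBasedLightComponent(source: .single(environment)))

            // The model opts in to receiving light from that entity.
            model.components.set(ImageBasedLightReceiverComponent(imageBasedLight: lightEntity))

            content.add(lightEntity)
            content.add(model)
        }
    }
}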
In a visionOS project, plane classifications can identify a wall / floor / ceiling / table / seat / window / door. How can I lock a View to one of these planes, and set a Bool variable to true once the lock succeeds?
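For the "lock to a plane" part, this is the kind of anchoring I have in mind, using an AnchorEntity with a plane target inside an immersive space (the view and entity names are placeholders). What I haven't figured out is how to flip a Bool when the anchor actually attaches, which is why I'm also looking at ARKit's PlaneDetectionProvider:
import SwiftUI
import RealityKit

struct WallLockedView: View {
    var body: some View {
        RealityView { content in
            // Anchor that follows a vertical plane classified as a wall.
            let wallAnchor = AnchorEntity(.plane(.vertical,
                                                 classification: .wall,
                                                 minimumBounds: [0.3, 0.3]))

            // Placeholder content; a SwiftUI attachment could be used here instead.
            let marker = ModelEntity(mesh: .generateBox(size: 0.1),
                                     materials: [SimpleMaterial(color: .blue, isMetallic: false)])
            wallAnchor.addChild(marker)
            content.add(wallAnchor)
        }
    }
}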
I implemented plane detection with ARKit in my visionOS app:
import SwiftUI
import RealityKit
import ARKit

struct ContentView: View {
    @State private var ok = false
    let session = ARKitSession()
    let planeData = PlaneDetectionProvider(alignments: [.horizontal, .vertical])

    var body: some View {
        Group {
            if ok {
                // A table has been detected.
                TableView()
            } else {
                SwiftUIView()
            }
        }
        .onAppear {
            Task {
                try await session.run([planeData])
                for await update in planeData.anchorUpdates {
                    // Only react to planes classified as tables.
                    guard update.anchor.classification == .table else { continue }
                    switch update.event {
                    case .added, .updated:
                        ok = true
                    case .removed:
                        ok = false
                    }
                }
            }
        }
    }
}
When I ran it, Xcode told me that this is not supported in the Simulator, so how can I test this program? If there is no way other than applying for the Vision Pro Dev Kit or attending a lab, I would at least like to know whether this View can achieve the following behavior:
When there is no table in the user's sight, SwiftUIView() is displayed at the coordinates x: 0, y: 0, z: 0. When there is a table in the user's sight, TableView() is displayed on the table.
If this can't be achieved, I hope you can give me some advice. Thank you!
I implemented multiple languages through Localizable.strings in my visionOS app. Now I want to test whether the localizations work properly, so I tried to change the system language in Settings before testing, but I can't find the relevant page in Settings. What should I do?
Once a visionOS app has been developed, when will it be submitted to the reviewers?
Topic: App Store Distribution & Marketing
SubTopic: App Store Connect
Tags: App Review, App Store Connect, Swift, visionOS
I want to display a USDA model authored in Reality Composer Pro and play its Spatial Audio. I used RealityView to implement this:
RealityView { content in
    do {
        // Load the scene from the Reality Composer Pro package.
        let entity = try await Entity(named: "isWateringBasin", in: RealityKitContent.realityKitContentBundle)
        content.add(entity)

        // Find the Spatial Audio entity and the audio resource authored in the USDA file.
        guard let audioEntity = entity.findEntity(named: "SpatialAudio"),
              let resource = try? await AudioFileResource(named: "/Root/isWateringBasinAudio_m4a",
                                                          from: "isWateringBasin.usda",
                                                          in: RealityKitContent.realityKitContentBundle) else { return }

        let audioPlaybackController = audioEntity.prepareAudio(resource)
        audioPlaybackController.play()
    } catch {
        print("Entity encountered an error while loading the model.")
        return
    }
}
But when I ran it, the model displayed normally while the Spatial Audio failed to play. I hope to get some guidance, thank you!
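For reference, the two things I plan to double-check are that the resource path matches the audio file's actual path inside the USDA, and that the entity that plays the sound actually has a SpatialAudioComponent. A minimal sketch of the second check, reusing the names above:
import RealityKit

// Ensure spatial playback is configured before playing the resource.
func playSpatialAudio(on audioEntity: Entity, resource: AudioFileResource) {
    // If Reality Composer Pro did not add a SpatialAudioComponent, add a default one.
    if audioEntity.components[SpatialAudioComponent.self] == nil {
        audioEntity.components.set(SpatialAudioComponent())
    }
    // Prepare and play the audio directly on the spatialized entity.
    let controller = audioEntity.prepareAudio(resource)
    controller.play()
}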
I found that my visionOS Simulator is very strange: many functions and features are missing. For example, I learned online that the immersive Environments scenes can be opened in other developers' visionOS Simulators, but nothing happens when I click them. And it's not just that; many other system features I've seen other developers have are missing for me. I'm worried this will affect testing my app. Why is this happening?
Some information:
My Xcode version is the latest Xcode 15.1 beta.
Device: iMac (2021)
Simulator build number: 21N305
import SwiftUI
import TipKit

struct ChatRoomView: View {
    @StateObject private var socketManager = SocketIOManager()
    @State private var inputText: String = ""
    @StateObject var viewModel = SignInWithAppleViewModel()
    @Binding var isCall: Bool
    @State private var isSheet = false
    @State private var ShowView = false
    var learnlisttip = KeyTip()
    @Binding var showShareSheet: Bool
    @Binding var codeshar: String

    var body: some View {
        NavigationStack {
            VStack {
                if let roomCode = socketManager.roomCode {
                    ZStack {
                        VStack {
                            HStack {
                                Text("Room Key: \(roomCode)")
                                    .font(.title)
                                    .onAppear {
                                        codeshar = roomCode
                                        self.isCall = true
                                    }
                                Button(action: {
                                    self.showShareSheet = true
                                }, label: {
                                    Image(systemName: "square.and.arrow.up.fill")
                                        .accessibilityLabel("Share")
                                })
                            }
                            .padding(20)
                            TipView(learnlisttip, arrowEdge: .top)
                                .glassBackgroundEffect()
                                .offset(z: 20)
                            Spacer()
                        }
                        List(socketManager.messages, id: \.self) { message in
                            Text(message)
                        }
                        TextField("input", text: $inputText)
                        Button("send") {
                            socketManager.sendMessage(roomCode: roomCode, message: inputText)
                            inputText = ""
                        }
                    }
                    .sheet(isPresented: $showShareSheet) {
                        let shareContent = "Open SpatialCall, Join this Room, Key is: \(codeshar)"
                        ActivityView(activityItems: [shareContent])
                    }
                } else {
                    HStack {
                        Button(action: {
                            withAnimation {
                                socketManager.createRoom()
                            }
                        }, label: {
                            VStack {
                                Image(systemName: "phone.circle.fill")
                                    .symbolRenderingMode(.multicolor)
                                    .symbolEffect(.appear, isActive: !ShowView)
                                    .font(.largeTitle)
                                Text("Add Room")
                                    .font(.title3)
                            }
                        })
                        .buttonStyle(.borderless)
                        .buttonBorderShape(.roundedRectangle)
                        .padding(.horizontal, 30)
                        .glassBackgroundEffect()
                        .offset(z: 20)
                        .scaleEffect(1.5)
                        .padding(60)
                        Button(action: {
                            withAnimation {
                                self.isSheet = true
                            }
                        }, label: {
                            VStack {
                                Image(systemName: "phone.badge.checkmark")
                                    .symbolRenderingMode(.multicolor)
                                    .symbolEffect(.appear, isActive: !ShowView)
                                    .font(.largeTitle)
                                Text("Join Room")
                                    .font(.title3)
                            }
                        })
                        .buttonStyle(.borderless)
                        .buttonBorderShape(.roundedRectangle)
                        .padding(.horizontal, 30)
                        .glassBackgroundEffect()
                        .offset(z: 20)
                        .scaleEffect(1.5)
                        .padding(70)
                    }
                }
            }
            .onAppear {
                DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
                    withAnimation {
                        self.ShowView = true
                    }
                }
            }
            .sheet(isPresented: $isSheet) {
                VStack {
                    Text("Join Room")
                        .font(.largeTitle)
                    Text("You need to get the key to the room.")
                    TextField("Key", text: $inputText)
                        .padding(30)
                        .textFieldStyle(.roundedBorder)
                    Button(action: {
                        socketManager.joinRoom(roomCode: inputText)
                        self.isSheet = false
                    }, label: {
                        Text("Join Room")
                            .font(.title3)
                    })
                    .padding(50)
                }
                .padding()
            }
            .sheet(isPresented: $socketManager.showRoomNotFoundAlert) {
                Text("The room does not exist. Please check whether the Key you entered is correct.")
                    .font(.title)
                    .frame(width: 500)
                    .padding()
                Button(action: {
                    self.socketManager.showRoomNotFoundAlert = false
                }, label: {
                    Text("OK")
                        .font(.title3)
                })
                .padding()
            }
        }
    }
}
In the above code (this is a visionOS project), when I tap Share, the sheet is not presented, and the TipView isn't displayed either. Why?
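For context, my current guesses are that the TipView stays hidden because I never call Tips.configure() at launch, and that the share sheet may not present because its .sheet modifier is attached inside the conditional roomCode branch. A minimal sketch of the TipKit setup I'm planning to try (the type names here are placeholders):
import SwiftUI
import TipKit

@main
struct SpatialCallApp: App {
    init() {
        // TipKit tips are only evaluated after the framework has been configured.
        try? Tips.configure()
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}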
In a visionOS project, how can a model in RealityView track a certain part of the body, such as the left foot, right leg, or left arm, and store the tracked position in a variable (in order to transmit body-part information during a call)?
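As far as I know, visionOS's ARKit only exposes hand tracking, not full-body joints such as feet or legs, so the closest sketch I can write is reading a hand joint's world position with HandTrackingProvider and storing it in a variable (the joint choice and names are placeholders):
import ARKit
import simd

// Runs hand tracking and keeps the latest left-wrist position in world space.
final class HandPositionTracker {
    private let session = ARKitSession()
    private let handTracking = HandTrackingProvider()
    var leftWristPosition: SIMD3<Float>?   // value to transmit during a call

    func start() async throws {
        try await session.run([handTracking])
        for await update in handTracking.anchorUpdates {
            let anchor = update.anchor
            guard anchor.chirality == .left,
                  let wrist = anchor.handSkeleton?.joint(.wrist) else { continue }
            // Combine the anchor transform with the joint transform to get world coordinates.
            let transform = anchor.originFromAnchorTransform * wrist.anchorFromJointTransform
            leftWristPosition = SIMD3(transform.columns.3.x,
                                      transform.columns.3.y,
                                      transform.columns.3.z)
        }
    }
}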
Please let me know if Apple publishes a document about the Persona virtual camera.
I found in App Store Connect that on the 18th (the Vision Pro pre-order day), my visionOS app was purchased by someone in the United States. But that day was pre-orders only, nobody had actually received a Vision Pro yet, and this app is only compatible with visionOS. Why could this happen?
Topic: Programming Languages
SubTopic: Swift
Tags: Swift, App Store Connect API, App Submission, visionOS
In visionOS, I want to show 3D content. I can use either RealityView or Model3D, but the results look similar. What is the difference between them, and which one should I use?
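To make the comparison concrete, this is roughly how I'm using each of them right now (the asset name "toy" is just an example):
import SwiftUI
import RealityKit

struct ComparisonView: View {
    var body: some View {
        HStack {
            // Model3D: a lightweight SwiftUI view that just displays a model file.
            Model3D(named: "toy") { model in
                model
                    .resizable()
                    .scaledToFit()
            } placeholder: {
                ProgressView()
            }

            // RealityView: a full RealityKit scene; entities can be added,
            // modified, and updated over time.
            RealityView { content in
                if let entity = try? await Entity(named: "toy") {
                    content.add(entity)
                }
            }
        }
    }
}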
Please treat me as a Unity beginner.
I want to learn to develop a visionOS VR app with Unity. I've tried to find a relatively complete learning route, but Unity's official website doesn't explain much about visionOS VR apps, so I hope you can give me a complete route. Thank you!
Topic: Graphics & Games
SubTopic: General
Tags: Games, Apple Unity Plug-Ins, WWDC23 Community, visionOS