Post

Replies

Boosts

Views

Activity

Can SSC entries be run on a physical iPad connected to Xcode during judging?
👋 Hi, this question is about the Swift Student Challenge (SSC). I remember that last year's submission form asked whether the entry was built with Xcode or as an App Playground, and it also seemed to mention:
• If you submit an Xcode project, the judges may run it in the Simulator.
• If you submit an App Playground, it is run on a real device.
My entry this year runs into two limitations: it uses frameworks/APIs only available in iPadOS 26 (so it may not run in a Playgrounds environment, since Playgrounds can't use the iPadOS 26 SDK), and it relies on features that require real hardware (such as ARKit), which rules out the Xcode Simulator. So I would like to ask: does this year's review allow building the project in Xcode and running it on a physical iPad? Or am I misremembering last year's rules? If the judges' environment is fixed (for example, Playgrounds only, or Simulator only), how should I adjust my submission method or implementation? Looking forward to your reply, thank you.
2
3
250
4d
Some questions about Apple Intelligence
Apple Intelligence is here, but I have run into some problems. First, the Settings page often shows that something is being downloaded. Is this normal? Also, the Predictive Code Completion model in Xcode seems to have been suddenly removed and needs to be re-downloaded, and it fails with the error "The operation couldn't be completed. (ModelCatalog.CatalogErrors.AssetErrors error 1.)". Detailed log:
The operation couldn't be completed. (ModelCatalog.CatalogErrors.AssetErrors error 1.)
Domain: ModelCatalog.CatalogErrors.AssetErrors
Code: 1
User Info: { DVTErrorCreationDateKey = "2024-08-27 14:42:54 +0000"; }
--
Failed to find asset: com.apple.fm.code.generate_small_v1.base - no asset
Domain: ModelCatalog.CatalogErrors.AssetErrors
Code: 1
--
System Information
macOS Version 15.1 (Build 24B5024e)
Xcode 16.0 (23049) (Build 16A5230g)
Timestamp: 2024-08-27T22:42:54+08:00
2
3
940
Aug ’24
About Apple Vision Pro Developer Kit
It has been more than a month since I applied for the Apple Vision Pro Developer Kit in July, and there is still no answer. I didn't get much from Apple Developer Support either; they just told me to keep waiting. I'm hoping to get some information from everyone here, thank you 🙏! The Dev Kit application page still shows: "We've received your application. Thank you for your interest. We'll get back to you soon with your status. If you wish to withdraw your application, you may do so."
1
2
918
Sep ’23
Object Tracking with RealityView
When I want to load a Reality Composer Pro scene that contains Object Tracking, I tried the following code:
RealityView { content in
    if let model = try? await Entity(named: "Scene", in: realityKitContentBundle) {
        content.add(model)
    }
}
Obviously, this alone is not enough; some configuration that enables Object Tracking needs to be added alongside the RealityView. What do I need to add? Note: I have watched https://developer.apple.com/videos/play/wwdc2024/10101/, but I don't understand it very well.
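For reference, one approach shown in that session is to run an ARKitSession with an ObjectTrackingProvider alongside the RealityView rather than relying on the scene alone. A minimal sketch, assuming a .referenceobject file is bundled with the app; the file name "MyObject", the function name, and the root parameter are placeholders, and world-sensing authorization is assumed to have been granted:

import Foundation
import ARKit
import RealityKit

// Runs object tracking for one bundled reference object and keeps `root`
// aligned with the tracked object.
func runObjectTracking(placing root: Entity) async throws {
    guard let url = Bundle.main.url(forResource: "MyObject",
                                    withExtension: "referenceobject") else { return }
    let referenceObject = try await ReferenceObject(from: url)

    let provider = ObjectTrackingProvider(referenceObjects: [referenceObject])
    let session = ARKitSession()
    try await session.run([provider])

    // Follow the tracked object as ARKit delivers anchor updates.
    for await update in provider.anchorUpdates {
        guard update.anchor.isTracked else { continue }
        root.transform = Transform(matrix: update.anchor.originFromAnchorTransform)
    }
}

You would call this from a task attached to the RealityView that loads the Reality Composer Pro content and passes in the entity to place.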
3
1
1.1k
Sep ’24
All SystemSoundID
The AudioServicesPlaySystemSound function in AudioToolbox takes a SystemSoundID and plays the corresponding built-in system sound. However, I can't tell which sound each number corresponds to, so I would like to know all of the sound effects available in visionOS and their corresponding SystemSoundID values.
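In the absence of a public catalog, one way to explore is to sweep a range of IDs and note which ones actually produce audio. A minimal sketch; the 1000–1350 range is an assumption based on the IDs commonly documented for iOS, and whether each one plays on visionOS is untested here:

import Foundation
import AudioToolbox

// Plays each SystemSoundID in the given range with a pause in between,
// printing the ID so you can note which numbers produce which sounds.
func auditSystemSounds(in range: ClosedRange<SystemSoundID> = 1000...1350) {
    for soundID in range {
        print("Playing SystemSoundID \(soundID)")
        AudioServicesPlaySystemSound(soundID)
        Thread.sleep(forTimeInterval: 1.5) // leave time to hear each sound
    }
}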
0
1
669
Jul ’24
Can't Tap
I am attempting to run actions after tapping an entity in a RealityView using the Behaviors component. I have added the Input Target component and attached the tap gesture as follows:
TapGesture().targetedToAnyEntity()
    .onEnded { value in
        _ = value.entity.applyTapForBehaviors()
    }
However, during testing the entity does not appear to recognize the tap gesture. Could you kindly provide any relevant documentation or guidance on this matter?
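For comparison, a minimal sketch of how these pieces are usually wired together, assuming the default visionOS app template (the "Scene" entity name and realityKitContentBundle come from that template). The key detail is that a gesture only reaches an entity that has both an input target and collision shapes:

import SwiftUI
import RealityKit
import RealityKitContent

struct TapBehaviorView: View {
    var body: some View {
        RealityView { content in
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                // A gesture can only hit an entity that has an InputTargetComponent
                // AND collision shapes; the Input Target component alone does
                // nothing without collision geometry.
                scene.components.set(InputTargetComponent())
                scene.generateCollisionShapes(recursive: true)
                content.add(scene)
            }
        }
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Forward the tap to the entity's Behaviors component.
                    _ = value.entity.applyTapForBehaviors()
                }
        )
    }
}

If the Reality Composer Pro asset already carries collision shapes, the generateCollisionShapes call is redundant and can be dropped.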
1
1
656
Aug ’24
Code with Swift Assist
Hello, I would like to inquire about the release date of Swift Assist’s beta version. Apple has stated that it will be released later this year, but they have not provided a specific date or time. Could you please provide information on the beta version’s release date? Additionally, is there a trial version available? If so, when was it released? Thank you for your assistance.
2
1
2.5k
Jan ’25
Send messages to the scene
In the Behaviors component in Reality Composer Pro I saw the OnNotification trigger, which asks me to enter a notification name. That means Swift code in Xcode has to send a message containing that name, so I hope you can show me how to post such a message from Swift. (There is an answer in https://developer.apple.com/forums/thread/756978, but it doesn't work.) Also, the timeline in Reality Composer Pro has a Notification action, which is used to send messages back to Swift. How can Swift detect that the Notification action has sent a message? (There is an answer in https://developer.apple.com/videos/play/wwdc2024/10102/, but it doesn't work.) I have asked this before (https://developer.apple.com/forums/thread/756978); those answers used to work, but they are all broken on the latest system. I hope you can help me. Thank you.
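For reference, a sketch of the patterns from the thread and session linked above; the "RealityKit.NotificationTrigger" name and the userInfo keys are assumptions taken from those materials, and the identifier should match whatever name was typed into the OnNotification trigger in Reality Composer Pro:

import Foundation
import Combine
import RealityKit

// Swift -> Reality Composer Pro: post a notification to fire an OnNotification
// behavior in the given scene.
func triggerBehavior(named identifier: String, in scene: RealityKit.Scene) {
    NotificationCenter.default.post(
        name: Notification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": scene,
            "RealityKit.NotificationTrigger.Identifier": identifier
        ]
    )
}

// Reality Composer Pro -> Swift: listen for the timeline's Notification action.
let timelineNotifications = NotificationCenter.default.publisher(
    for: Notification.Name("RealityKit.NotificationTrigger"))
// Attach inside a SwiftUI view, e.g.:
// .onReceive(timelineNotifications) { note in
//     let identifier = note.userInfo?["RealityKit.NotificationTrigger.Identifier"] as? String
// }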
1
1
845
Oct ’24
Winner's visa issue
It was mentioned in the Swift Student Challenge that outstanding winners will have the opportunity to visit Apple Park in the United States. However, as a challenger from China who is not currently in the U.S., this means that if I receive the outstanding award, I will need to apply for a visa to travel to Apple Park. Since I am under 18, my guardian would also need to apply for a visa. Therefore, I would like to know if Apple provides visa assistance for outstanding winners and their guardians from China, or if we are responsible for applying for the visas on our own.
1
1
524
Feb ’25
visionOS plane anchor rotation and wall direction are inconsistent
I have a problem with wall plane detection using visionOS/ARKit: I am using ARKitSession's PlaneDetectionProvider to detect walls in a visionOS immersive space. I record the position and rotation of the first detected plane, but I found that the rotation value depends on the direction the user is facing when the space starts, so it deviates from run to run. In other words, even if the plane lies on the same wall, the rotation quaternion comes out different. I would like to obtain the true orientation of the wall correctly no matter which direction the user is facing when scanning starts, so that virtual content can be accurately aligned with the wall. I have tried using anchor.originFromAnchorTransform and Transform.rotation directly, but the rotation value is still affected by the user's initial orientation. In addition, I would like to know whether the user's initial orientation also affects the position information. If so, please provide a solution. Thank you!
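One way to sanity-check this is to derive the wall's orientation from the plane normal itself rather than from the anchor's own in-plane axes, which can differ between anchors even on the same wall. A minimal sketch, assuming the ARKit convention that a plane anchor's local +Y axis is its normal (the function name is illustrative):

import ARKit
import simd

// Builds a rotation for a vertical plane anchor from its world-space normal.
func wallRotation(for anchor: PlaneAnchor) -> simd_quatf {
    let transform = anchor.originFromAnchorTransform
    // Column 1 of the 4x4 transform is the anchor's local +Y axis in world space,
    // i.e. the plane normal under the assumed convention.
    var normal = simd_make_float3(transform.columns.1)
    // Keep only the horizontal heading of the wall.
    normal.y = 0
    normal = simd_normalize(normal)
    // Rotation that turns world +Z to face the wall. Note that the world frame's
    // yaw is still fixed when the session starts, so values are only comparable
    // within a single session.
    return simd_quatf(from: SIMD3<Float>(0, 0, 1), to: normal)
}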
1
0
519
Sep ’25