When I select a folder with fileImporter on iOS, the returned URL always points inside a private folder, and I cannot access those files from my app. I do not want to copy them into the sandbox, either. What steps can I take to resolve this issue?
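From what I understand, the picked URL lies outside your sandbox, so you must access it as a security-scoped resource rather than copying it. A minimal sketch (the view and bookmark key names are illustrative):

import SwiftUI
import UniformTypeIdentifiers

// Sketch: read the user's original folder in place. The URL from
// fileImporter must be wrapped in security-scoped access, and a
// bookmark preserves that access across launches.
struct FolderPicker: View {
    @State private var showImporter = false

    var body: some View {
        Button("Choose Folder") { showImporter = true }
            .fileImporter(isPresented: $showImporter,
                          allowedContentTypes: [.folder]) { result in
                guard case .success(let url) = result,
                      url.startAccessingSecurityScopedResource() else { return }
                defer { url.stopAccessingSecurityScopedResource() }
                // Persist access for later launches.
                if let bookmark = try? url.bookmarkData() {
                    UserDefaults.standard.set(bookmark, forKey: "folderBookmark")
                }
                // Enumerate the folder's contents while access is active.
                let items = try? FileManager.default.contentsOfDirectory(
                    at: url, includingPropertiesForKeys: nil)
                print(items ?? [])
            }
    }
}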
On macOS, I am encountering an issue where a system API fails to open a file even though I have enabled the necessary Read/Write permissions under the App Sandbox. Could you please explain the reason behind this behavior? Thanks!
import AppKit

/// Asks the system (via NSWorkspace) to open the file at the given path.
func finderOpenFileSystem(at path: String) {
    let fileURL = URL(fileURLWithPath: path)
    guard FileManager.default.fileExists(atPath: path) else {
        print("Error: File does not exist at path: \(path)")
        return
    }
    // Fails in a sandboxed app unless the app has been granted access to this URL.
    let success = NSWorkspace.shared.open(fileURL)
    if success {
        print("File opened successfully: \(fileURL)")
    } else {
        print("Error: Failed to open file: \(fileURL)")
    }
}
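For what it's worth, my understanding is that the sandbox's Read/Write entitlement ("User Selected File") only covers URLs the user picks through the system open panel, or security-scoped bookmarks derived from them; opening an arbitrary path still fails. A minimal sketch of the panel route:

import AppKit

// Sketch: obtain a user-granted URL via NSOpenPanel; the sandbox
// extends access to exactly this URL, so opening it then succeeds.
func openUserSelectedFile() {
    let panel = NSOpenPanel()
    panel.canChooseFiles = true
    panel.canChooseDirectories = false
    panel.allowsMultipleSelection = false
    if panel.runModal() == .OK, let url = panel.url {
        NSWorkspace.shared.open(url)
    }
}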
In visionOS, there are existing modifiers that can completely conceal the hands. However, I am interested in learning how to achieve the effect of only one hand disappearing while the other hand remains visible.
.upperLimbVisibility(.hidden)
It was mentioned in the Swift Student Challenge that outstanding winners will have the opportunity to visit Apple Park in the United States. However, as a challenger from China who is not currently in the U.S., this means that if I receive the outstanding award, I will need to apply for a visa to travel to Apple Park. Since I am under 18, my guardian would also need to apply for a visa. Therefore, I would like to know if Apple provides visa assistance for outstanding winners and their guardians from China, or if we are responsible for applying for the visas on our own.
Topic:
Community
SubTopic:
Swift Student Challenge
Tags:
Swift Student Challenge
Swift
Swift Playground
DocC
I am interested in participating in the Swift Student Challenge. My submission contains a significant amount of augmented reality (AR) content, which requires camera access. Evidently, if the reviewer uses a simulator or runs the app on a Mac, they will not be able to experience the AR functionality; a real iPad is required for the camera-based AR experience.
However, it is mentioned in https://developer.apple.com/forums/thread/773530 that the plan is to evaluate Xcode app playgrounds within the simulator. Additionally, I observed the statement “Note: Xcode app playgrounds are executed in Simulator” on the submission page. Consequently, it is clear that the reviewers are limited to using a simulator or running my application on a Mac.
In light of this, I am seeking guidance on how to enable the reviewer to use a real iPad to access the camera-based AR experience. Alternatively, I may need to reconsider my strategy and drop AR altogether.
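If I do keep AR, I am considering a runtime fallback so the app still runs in Simulator; a sketch, assuming ARWorldTrackingConfiguration is the session configuration in use:

import ARKit

// Sketch: gate the AR experience on real hardware support and fall
// back to a non-AR demo mode in Simulator or on unsupported devices.
var isARAvailable: Bool {
    #if targetEnvironment(simulator)
    return false          // Simulator has no camera-backed AR session.
    #else
    return ARWorldTrackingConfiguration.isSupported
    #endif
}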
Topic:
Community
SubTopic:
Swift Student Challenge
Tags:
Swift Student Challenge
Swift
Swift Playground
Swans Quest
I am preparing a submission for the Swift Student Challenge. I have created a RealityContent folder using Reality Composer Pro. How can I import this folder into my Swift Playgrounds (.swiftpm) project so that it becomes a usable package?
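My current idea is to treat the Reality Composer Pro output as a local Swift package placed next to the .swiftpm project and declare it as a dependency; a sketch of the relevant Package.swift fragment (the names "RealityContent", "MyChallengeApp", and "AppModule" are assumptions from my project, not fixed conventions):

// swift-tools-version: 5.9
import PackageDescription

// Sketch: declare the Reality Composer Pro package as a local
// dependency of the app playground's main target. The folder
// "RealityContent" is assumed to contain its own Package.swift.
let package = Package(
    name: "MyChallengeApp",
    dependencies: [
        .package(path: "RealityContent")
    ],
    targets: [
        .executableTarget(
            name: "AppModule",
            dependencies: [
                .product(name: "RealityContent", package: "RealityContent")
            ]
        )
    ]
)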
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
Tags:
Swift Student Challenge
Swift Playground
RealityKit
Reality Composer Pro
I intend to participate in the Swift Student Challenge. My application contains a link that directs users to an HTML web page on the Internet.
Link(destination: URL(string: "https://url.com")!) {
    Label("Developer Website - .....com", systemImage: "arrow.right")
        .shadow(color: .white, radius: 50)
}
This URL points to my personal web page. Although it is not directly related to the interactive experience within the application, I decided to include it because it serves as a logo of sorts and demonstrates my proficiency in HTML. However, the challenge's rules stipulate that the evaluation environment cannot connect to the Internet, so I am concerned that my work may be judged incomplete or broken. Should I keep the link? Thanks!
Topic:
Community
SubTopic:
Swift Student Challenge
Tags:
Swift Student Challenge
Swift Playground
Swans Quest
SwiftUI
I am currently filling out the Swift Student Challenge form, and I have two questions that I hope to get clarified:
One of the options asks, “Did you use open source software, other than Swift?” I would like to know what is meant by “open source software” in this context. Does it refer to IDEs (like Xcode) or programming languages and frameworks (such as Python, ARKit)? Are Apple frameworks (e.g., SwiftUI, ARKit, etc.) and certain third-party tools (such as Xcode, Blender, etc.) considered “open source software”?
I would like to provide a demo video to ensure that the reviewers can use the app properly and experience all of its features in the shortest amount of time. For certain reasons, I do not plan to play the video directly in the App Playground. Instead, I intend to include a link in the “Comments” section at the end of the form, which will redirect to a webpage (requiring an internet connection) containing the demo video. Will the reviewers be able to view the link and access the video as intended?
I would greatly appreciate any responses to these questions!
Topic:
Community
SubTopic:
Swift Student Challenge
Tags:
Swift Student Challenge
Swift
Swift Playground
Swans Quest
Hello,
I am developing a visionOS application and am interested in obtaining detailed hand data through ARKit, including but not limited to the transform and rotation angle. I have reviewed the Happy Beam sample, but it appears to cover only the recognition of specific gestures.
Could you please advise on how to obtain the Transform and rotation angle of the user’s hand?
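Here is a minimal sketch of the approach I am considering, assuming HandTrackingProvider is the right API (run inside an ImmersiveSpace, with hand-tracking permission granted):

import ARKit
import simd

// Sketch: stream HandAnchor updates and extract the wrist transform
// plus its rotation as a quaternion.
let session = ARKitSession()
let handTracking = HandTrackingProvider()

func trackHands() async throws {
    try await session.run([handTracking])
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        guard anchor.isTracked else { continue }
        // 4x4 transform of the hand (wrist) relative to the world origin.
        let transform = anchor.originFromAnchorTransform
        // Rotation component extracted as a quaternion.
        let rotation = simd_quatf(transform)
        print(anchor.chirality, transform.columns.3, rotation)
    }
}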
Thank you.
Thank you very much for choosing me to visit Apple Park and participate in WWDC; I am looking forward to the event. May I ask a few questions? I am a young Apple Developer Program member from China and a winner of the 2024 Swift Student Challenge. I am over 13 years old, and I used my own developer account, not my parents', to apply for WWDC activities and for everything else I do.
Since I am under 18 years old, my parents may need to sign the Special Event Parental Permission Statement. Where can I find it? My parents will sign it.
At the same time, I noticed that the bottom of the RSVP form requires me to affirm that I am at least 18 years old, but I am not, and I applied for WWDC with my own account, so I would like to know how to satisfy this requirement.
I need a non-immigrant visa to travel to the United States, so I must show the visa officer that I have received an invitation from Apple. Could Apple send me a formal invitation letter as proof? Also, as a minor I will travel with my mother, so could you include my guardian's (my mother's) information in the invitation letter?
Ps: I am very independent. I am well aware that the number of people in WWDC is limited. My mother will not enter the venue unless otherwise required.
Topic:
Developer Tools & Services
SubTopic:
Apple Developer Program
Tags:
Swans Quest
Developer Program
Developer ID
I have discovered that RemoteImmersiveSpace is limited to content conforming to the CompositorContent protocol, precluding direct use of RealityView. Consequently, I am interested in understanding the appropriate way to integrate CompositorContent within RemoteImmersiveSpace. Thanks.
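For reference, this is the shape I expect (a sketch only, assuming CompositorLayer conforms to CompositorContent on this platform; startRenderLoop is a hypothetical placeholder, not a real API):

import SwiftUI
import CompositorServices

// Sketch: a RemoteImmersiveSpace whose content is a CompositorLayer,
// which hands over a LayerRenderer for a Metal render loop.
struct RemoteScene: Scene {
    var body: some Scene {
        RemoteImmersiveSpace(id: "remoteSpace") {
            CompositorLayer { layerRenderer in
                // Hypothetical entry point for your Metal rendering code.
                startRenderLoop(layerRenderer)
            }
        }
    }
}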
Hello! I’m excited to see that Look to Scroll has been included in visionOS 26 Beta. I’m aiming to achieve a feature where the user’s gaze at a specific edge automatically scrolls to that position. However, I’ve experimented with ScrollView and haven’t been able to trigger this functionality. Could you advise if additional API modifiers are necessary? Thank you!
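For context, this is what I tried; I assumed scrollInputBehavior(_:for:) with the .look input is the modifier that opts a ScrollView into gaze scrolling:

import SwiftUI

// Sketch: opt the scroll view into eyes-only ("look to scroll") input.
struct LookScrollList: View {
    var body: some View {
        ScrollView {
            LazyVStack(alignment: .leading) {
                ForEach(0..<100, id: \.self) { i in
                    Text("Row \(i)")
                }
            }
        }
        .scrollInputBehavior(.enabled, for: .look)
    }
}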
I have a problem with the wall plane detection using visionOS/ARKit:
I am using an ARKitSession with a PlaneDetectionProvider configured for wall (vertical plane) detection in a visionOS space. I record the position and rotation of the first detected plane, but the rotation value depends on which direction the user is facing when the space starts; in other words, even for planes on the same wall, the rotation quaternion differs between launches.
I want to obtain the true orientation of the wall no matter which direction the user is facing when the scan begins, so that virtual content can be accurately aligned with the wall.
I have tried using anchor.originFromAnchorTransform and Transform.rotation directly, but the rotation value is still affected by the user's initial orientation.
In addition, I would like to know whether the user's initial orientation also affects position information. If so, please suggest a solution.
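For reference, my current workaround idea (a sketch, assuming the plane anchor's local Y axis is the wall normal): derive the wall's heading from the normal alone, so alignment no longer depends on the anchor's full quaternion:

import ARKit
import Foundation
import simd

// Sketch: compute a yaw-only rotation for a wall plane from its
// world-space normal (assumed to be the anchor's local Y axis).
func wallYaw(for anchor: PlaneAnchor) -> Float {
    let m = anchor.originFromAnchorTransform
    let normal = simd_make_float3(m.columns.1)   // plane normal in world space
    return atan2(normal.x, normal.z)             // heading of the wall normal
}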
Thank you!
While developing a widget for the visionOS 26 beta, I found that it fails to run on a real visionOS device, showing the following error:
Please adopt container background api
Notably, this problem does not occur in the visionOS Simulator.
Does anyone know the cause and a solution, or whether this is a visionOS bug that needs a Feedback report? Thank you!
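For reference, my current attempt at a fix, assuming the message refers to SwiftUI's containerBackground(for: .widget) modifier that widget root views must adopt (SimpleEntry is an illustrative timeline entry type):

import SwiftUI
import WidgetKit

// Sketch: declare the widget's background via containerBackground.
struct SimpleEntry: TimelineEntry {
    let date: Date
}

struct MyWidgetEntryView: View {
    var entry: SimpleEntry

    var body: some View {
        Text(entry.date, style: .time)
            .containerBackground(for: .widget) {
                Color.clear
            }
    }
}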
Hey, I'm working on a macOS app that detects the MacBook lid / hinge angle (i.e., how far the screen is open) by directly reading the internal sensor via HID / IOKit (a private / undocumented API). I came across this project: LidAngleSensor —
GitHub: https://github.com/samhenrigold/LidAngleSensor?tab=readme-ov-file
Before investing too much effort, I’d like to ask the community:
Has anyone succeeded in getting such an app accepted on the Mac App Store when it includes sensor-level, private API access like this?
What was the reviewer feedback, or what were the rejection reasons (if any)?
Are there documented cases (positive or negative) where Apple approved or rejected apps for accessing non-public hardware sensors?
What’s the risk of getting banned or permanently rejected for integrating this kind of functionality?
If you have direct experience (whether it passed or failed), I’d love to hear your stories, strategies, or pointers. Thanks in advance!