Apple asked me today to add the compliance information for the EU's Digital Services Act. I tried to do so, but ran into a major issue.
When I created the developer account many years ago, it was a personal account used by me as a natural person / freelancer in Germany. When I later founded my US company, I converted the existing developer account into a business account for that company. As part of that process, I obtained a DUNS number, which is linked to the company's business address in California.
However, it seems this US address never made it into App Store Connect. It still shows my personal address in Germany, which is incorrect, and I cannot modify it either. The address page says I have to update it at DUNS, but in their system everything is already correct.
The problem seems to lie in the transfer of the address data between DUNS and App Store Connect. I opened a ticket in the DUNS system, but I need to publish a new version of our app soon. Is there a faster way to get this resolved?
We want to use QR codes to open and activate certain features in our app.
We don't want these QR codes to be hard-coded into our app (as with image tracking). Instead, we want them to work the way they typically do with a smartphone camera: detected whenever the user looks at them.
And if a code encodes a certain URL pointing to our app, the app should be launched via a URL handler and handed the link, as is possible on iOS.
I tried this in the regular Shared Space and also with the Camera app, but neither recognized a QR code.
Is this feasible with the Vision Pro?
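For reference, the URL-handling half of this flow is plain SwiftUI; here is a minimal sketch, assuming a custom scheme such as myapp:// is registered in the target's URL Types (the scheme and routing helper are hypothetical). Whether visionOS ever delivers such a URL from a passively scanned QR code is exactly the open question above:

```swift
import SwiftUI

@main
struct MyApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
                // Invoked when the system opens the app via our scheme,
                // e.g. myapp://feature/unlock?id=42 (hypothetical).
                .onOpenURL { url in
                    handleDeepLink(url)
                }
        }
    }

    private func handleDeepLink(_ url: URL) {
        // Hypothetical routing: inspect host/path/query and
        // activate the corresponding feature.
        print("Activate feature for:", url)
    }
}
```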
We want to overlay a SwiftUI attachment on a RealityView, as is done in the Diorama sample. By default, attachments seem to be placed centered at their position. For our use case, however, we need to set a different anchor point, so that the attachment is always aligned by one of its corners, e.g. the lower-left corner should coincide with the attachment's position. Is this possible?
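One workaround I can imagine, as a sketch rather than confirmed API behavior: measure the attachment entity and offset it by half its extents, so the corner rather than the center lands on the target position (the attachment id "infoCard" is hypothetical):

```swift
import SwiftUI
import RealityKit

struct AnchoredAttachmentView: View {
    var body: some View {
        RealityView { content, attachments in
            if let card = attachments.entity(for: "infoCard") {
                content.add(card)
                // Attachments are centered on their position by default.
                // Measure the entity and shift it so its lower-left
                // corner, not its center, sits at the intended position.
                let bounds = card.visualBounds(relativeTo: nil)
                card.position = [bounds.extents.x / 2, bounds.extents.y / 2, 0]
            }
        } attachments: {
            Attachment(id: "infoCard") {
                Text("Hello").padding().glassBackgroundEffect()
            }
        }
    }
}
```

If the attachment's size changes dynamically, the bounds would presumably have to be re-queried in the update closure.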
I really love the way you can add SwiftUI views as attachments to a RealityView on visionOS. As I am now porting my app to iOS as well, I was wondering whether something like this is possible with ARView too? So far I have only seen custom libraries trying to mimic UI elements.
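As far as I know there is no built-in equivalent on iOS; the common substitute is a screen-space overlay. A rough sketch (entity placement and the label content are my own invention): overlay a UIHostingController's view on top of the ARView and reposition it every frame by projecting the entity's world position with ARView.project(_:):

```swift
import UIKit
import SwiftUI
import RealityKit
import Combine

final class AttachmentOverlayController: UIViewController {
    let arView = ARView(frame: .zero)
    // Hypothetical SwiftUI content to "attach" to an entity.
    let host = UIHostingController(rootView: Text("Label").padding())
    // The entity the overlay should follow (hypothetical placement).
    let trackedEntity = ModelEntity(mesh: .generateSphere(radius: 0.05))
    var updateSubscription: Cancellable?

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        let anchor = AnchorEntity(world: [0, 0, -0.5])
        anchor.addChild(trackedEntity)
        arView.scene.addAnchor(anchor)

        addChild(host)
        host.view.backgroundColor = .clear
        host.view.frame.size = host.view.intrinsicContentSize
        arView.addSubview(host.view)
        host.didMove(toParent: self)

        // Re-project the overlay into screen space on every frame.
        updateSubscription = arView.scene.subscribe(to: SceneEvents.Update.self) { [weak self] _ in
            guard let self,
                  let point = self.arView.project(self.trackedEntity.position(relativeTo: nil))
            else { return }
            self.host.view.center = point
        }
    }
}
```

The obvious drawback compared to visionOS attachments: the overlay is flat screen-space UI, so it neither occludes nor scales with the scene.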
With quite some excitement I read about visionOS 2's new feature to automatically turn regular 2D photos into spatial photos, using machine learning. It's briefly mentioned in this WWDC video:
https://developer.apple.com/wwdc24/10166
My question is: Can developers use this feature via an API, so we can turn any image into a spatial image, even if it is not in the device photo library?
We would like to download an image from our server, convert it on the Vision Pro on the fly, and display it as a spatial photo.
We are building an app that uses ARKit occasionally, but not always.
We would like to test the non-ARKit parts in the simulator, since it offers more debugging features (e.g. SwiftUI previews or the Thread Sanitizer).
However, we can't even build the app for the simulator, since the simulator SDK does not know about certain classes (e.g. AnchorEntity). This also means that none of the SwiftUI previews work, even for views that don't use ARKit.
What is the best approach to test such an app in the simulator, without using any ARKit features?
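The usual workaround, sketched below under the assumption that the offending types really are missing from the simulator SDK, is to fence the ARKit-dependent code behind conditional compilation and keep AR-only types out of shared signatures (makeWorldAnchor is a hypothetical helper):

```swift
import RealityKit
#if !targetEnvironment(simulator)
import ARKit
#endif

/// Hypothetical helper: returns an anchored entity on device and nil in
/// the simulator. Returning the generic Entity keeps AR-only types out
/// of the signature, so shared code and previews compile everywhere.
func makeWorldAnchor() -> Entity? {
    #if targetEnvironment(simulator)
    // No ARKit in the simulator; callers fall back to a non-AR scene.
    return nil
    #else
    // On device: anchor content to a detected horizontal plane.
    return AnchorEntity(plane: .horizontal)
    #endif
}
```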
On my MacBook Pro (2019) with an Intel processor, I could run apps in the visionOS simulator without any problems while I was on macOS Ventura. But since I upgraded the Mac to Sonoma, the visionOS simulator seems to be broken.
The display in Xcode sticks to "Loading visionOS 1.0", and the simulator page under "Devices and Simulators" says "No runtime".
This is independent of which Xcode version I am using. I used Xcode 15 beta 2, but also tried more recent versions.
Could it be that visionOS development on Intel Macs was dropped with macOS Sonoma without any notice? I can see that the Xcode 15.1 specs state you need an Apple Silicon Mac, but the Xcode 15 specs don't. And it worked for me, at least on Ventura. The "only" change I made since then was upgrading the OS to Sonoma.
I'd like to immerse the user in one of my views by projecting its content onto the inner side of a sphere surrounding them. Think of a video player app that surrounds the user with video previews they can select, like a 3D version of the Netflix home screen. The view should be fully interactive, not just read-only.
Is this possible?
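The rendering half, at least, can be sketched (assuming visionOS, an ImmersiveSpace, and a hypothetical bundled image named "preview"): a sphere flipped inside out via a negative scale shows its material on the inner surface. A flat SwiftUI view can't be bent onto the sphere, though; for interactivity, attachments positioned around the user are probably the way to go.

```swift
import SwiftUI
import RealityKit

struct SurroundSphereView: View {
    var body: some View {
        RealityView { content in
            // Large sphere centered on the user's position.
            let mesh = MeshResource.generateSphere(radius: 5)
            var material = UnlitMaterial()
            // "preview" is a hypothetical image in the app bundle.
            if let texture = try? TextureResource.load(named: "preview") {
                material.color = .init(texture: .init(texture))
            }
            let sphere = ModelEntity(mesh: mesh, materials: [material])
            // A negative X scale turns the mesh inside out, so the
            // material is rendered on the inner surface of the sphere.
            sphere.scale = [-1, 1, 1]
            content.add(sphere)
        }
    }
}
```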
For a couple of days (or maybe even weeks) now, I cannot use App Store Connect in Chrome anymore. I get to the page where the apps should appear, but there are none. If I open the same page in Safari, it works. But I dislike Safari, since it asks me for my password every time I visit that site.
We are developing apps for visionOS and need the following capabilities for a consumer app:
- access to the main camera, to let users shoot photos and videos
- reading of QR codes, to trigger the download of additional content
So I was really happy when I noticed that visionOS 2.0 has these features.
However, I was shocked when I also realized that these capabilities are restricted to enterprise customers only:
https://developer.apple.com/videos/play/wwdc2024/10139/
I think Apple is shooting itself in the foot with these restrictions. I can understand that privacy is important, but these limitations drastically restrict the potential use cases for this platform, even in the consumer space.
IMHO Apple should decide whether it wants to target consumers in the first place, or whether it wants to go the HoloLens / Magic Leap route and mainly serve enterprise customers and their respective devs. With the current setup, Apple risks pushing devs away to other platforms where they have more freedom to create great apps.