Post

Replies

Boosts

Views

Activity

Apple: please unlock "enterprise features" for all visionOS devs!
We are developing apps for visionOS and need the following capabilities for a consumer app:

access to the main camera, to let users shoot photos and videos
reading QR codes, to trigger the download of additional content

So I was really happy when I noticed that visionOS 2.0 offers these features. However, I was shocked when I realized that these capabilities are restricted to enterprise customers only: https://developer.apple.com/videos/play/wwdc2024/10139/ I think that Apple is shooting itself in the foot with these restrictions. I can understand that privacy is important, but these limitations drastically restrict the potential use cases for this platform, even in the consumer space. IMHO, Apple should decide whether it wants to target consumers in the first place, or whether it wants to go the HoloLens / Magic Leap route and mainly serve enterprise customers and their devs. With the current setup, Apple risks pushing devs away to other platforms where they have more freedom to create great apps.
1
3
792
Jun ’24
visionOS simulator broken on Intel MacBook since upgrade to Sonoma
On my MacBook Pro 2019 with an Intel processor, I could run apps in the visionOS simulator without any problems while I was running macOS Ventura. But since I upgraded the Mac to Sonoma, the visionOS simulator seems to be broken. The display in Xcode is stuck at "Loading visionOS 1.0", and the simulator page under "Devices and Simulators" says "No runtime". This is independent of which Xcode version I use. I used Xcode 15 beta 2, but also tried more recent versions. Could it be that developing on Intel Macs was dropped in macOS Sonoma without any notice? I can see that the Xcode 15.1 specs state you need an Apple Silicon Mac, but the Xcode 15 specs don't. And it worked for me, at least on Ventura. The "only" change I made since then was upgrading the OS to Sonoma.
3
2
2.0k
Jan ’24
Lots of "garbage" in the Xcode logs, like "Decoding completed without errors"
Hi, if I run an app on the visionOS simulator, I get tons of "garbage" messages in the Xcode logs. Please find some samples below. Because of these messages, I can hardly see the really relevant logs. Is there any way to get rid of them?

```
[0x109015000] Decoding completed without errors
[0x1028c0000] Decoding: C0 0x01000100 0x00003048 0x22111100 0x00000000 11496
[0x1028c0000] Options: 1x-1 [FFFFFFFF,FFFFFFFF] 00054060
[0x1021f3200] Releasing session
[0x1031dfe00] Options: 1x-1 [FFFFFFFF,FFFFFFFF] 00054060
[0x1058eae00] Releasing session
[0x10609c200] Decoding: C0 0x01000100 0x00003048 0x22111100 0x00000000 10901
[0x1058bde00] Decoding: C0 0x01000100 0x0000304A 0x22111100 0x00000000 20910
[0x1028d5200] Releasing session
[0x1060b3600] Releasing session
[0x10881f400] Decoding completed without errors
[0x1058e2e00] Decoding: C0 0x01000100 0x0000304A 0x22111100 0x00000000 9124
[0x1028d1e00] Decoding: C0 0x01000100 0x0000304A 0x22111100 0x00000000 20778
[0x1031dfe00] Decoding completed without errors
[0x1031fe000] Decoding completed without errors
[0x1058e2e00] Options: 256x256 [FFFFFFFF,FFFFFFFF] 00025060
```
0
1
448
Jan ’24
Map view in SwiftUI on visionOS: Bad performance with many markers
I am trying to build a visionOS app that uses a map as its central user interface. This works fine at high zoom levels, when only a couple of markers are present. But as soon as I zoom out and the number of markers reaches the hundreds or even thousands, performance becomes very bad: it takes seconds for the map to render, and panning is laggy too. What makes things worse is that the SwiftUI map does not support clustering yet. Has anyone found a solution to this? I found this example by Apple on how to implement clustering: https://developer.apple.com/documentation/mapkit/mkannotationview/decluttering_a_map_with_mapkit_annotation_clustering It works, but it uses UIKit and storyboards, and I could not transform it into SwiftUI-compatible code. I also found this blog post that creates a neat SwiftUI integration for a clusterable map: https://www.linkedin.com/pulse/map-clustering-swiftui-dmitry-%D0%B2%D0%B5l%D0%BEv-j3x7f/ However, I wasn't able to adapt it so the map would update itself in a reactive way. I want to retrieve new data from our server whenever the user changes the visible region of the map by panning or zooming. I have no clue how to transfer my .onChange(of:) and .onMapCameraChange() modifiers to the UIKit world.
2
1
908
Mar ’24
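For reference, a minimal sketch of the UIKit-bridging approach described in the post above: wrap MKMapView (whose marker views support clustering natively) in a UIViewRepresentable, with the region-change delegate callback standing in for .onMapCameraChange(). Type and callback names here are my own, not from any Apple sample:

```swift
import SwiftUI
import MapKit

// Sketch: bridge MKMapView's built-in clustering into SwiftUI.
struct ClusteredMapView: UIViewRepresentable {
    var annotations: [MKPointAnnotation]
    var onRegionChange: (MKCoordinateRegion) -> Void

    func makeUIView(context: Context) -> MKMapView {
        let mapView = MKMapView()
        mapView.delegate = context.coordinator
        // Register the default marker view; setting clusteringIdentifier
        // in the delegate below enables automatic clustering.
        mapView.register(MKMarkerAnnotationView.self,
                         forAnnotationViewWithReuseIdentifier:
                            MKMapViewDefaultAnnotationViewReuseIdentifier)
        return mapView
    }

    func updateUIView(_ mapView: MKMapView, context: Context) {
        // Naive refresh; diffing annotations would be more efficient.
        mapView.removeAnnotations(mapView.annotations)
        mapView.addAnnotations(annotations)
    }

    func makeCoordinator() -> Coordinator { Coordinator(parent: self) }

    final class Coordinator: NSObject, MKMapViewDelegate {
        let parent: ClusteredMapView
        init(parent: ClusteredMapView) { self.parent = parent }

        func mapView(_ mapView: MKMapView,
                     viewFor annotation: MKAnnotation) -> MKAnnotationView? {
            // Let MapKit render cluster bubbles with its default view.
            guard !(annotation is MKClusterAnnotation) else { return nil }
            let view = mapView.dequeueReusableAnnotationView(
                withIdentifier: MKMapViewDefaultAnnotationViewReuseIdentifier,
                for: annotation) as? MKMarkerAnnotationView
            view?.clusteringIdentifier = "marker" // groups nearby markers
            return view
        }

        // UIKit counterpart of .onMapCameraChange(): fires when the
        // visible region changes, e.g. to trigger a server fetch.
        func mapView(_ mapView: MKMapView, regionDidChangeAnimated animated: Bool) {
            parent.onRegionChange(mapView.region)
        }
    }
}
```

The SwiftUI side would then observe `onRegionChange` and reload data, replacing the .onChange(of:) / .onMapCameraChange() modifiers mentioned above.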
DSA compliance in the EU: How to update address from DUNS?
Apple asked me today to add the compliance information for the Digital Services Act in the EU. I tried to do so, but ran into a major issue. When I created the developer account many years ago, it was a personal account used by me as a natural person / freelancer in Germany. When I later founded my US company, I converted the existing developer account into a business account for that company. In the process, I obtained a DUNS number, which is linked to the company's business address in California. However, it seems this US address never made it into App Store Connect: it still shows my personal address in Germany, which is not correct, and I cannot modify it either. The address page says I have to update it at DUNS, but in their system everything is OK. The problem seems to lie in the transfer of the address data between DUNS and App Store Connect. I opened a ticket in the DUNS system, but I need to publish a new version of our app soon. So I am wondering if there is a faster way to get this resolved?
3
1
2.2k
Oct ’24
Can the Vision Pro scan QR codes?
We want to use QR codes to open and activate certain features in our app. We don't want these QR codes to be hard-coded into our app (i.e. image tracking). Instead, we want to use them the way you typically would with your smartphone camera: just detect them when the user looks at them, and if they encode a certain URL pointing to our app, start the app via a URL handler and hand over the link, as is possible on iOS. I tried this in the normal Shared Space and also with the camera app, but neither recognized a QR code. Is this feasible with the Vision Pro?
0
1
1.2k
Mar ’24
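A hypothetical sketch of what the detection side could look like with the Vision framework, *if* a camera frame were available — note that, per the enterprise-features post above, main-camera access on visionOS currently requires an enterprise entitlement, so this is illustrative only. The function name is my own:

```swift
import Vision
import CoreVideo

// Sketch: extract QR payload strings from a camera frame using Vision.
// Availability of the CVPixelBuffer itself is the open question on visionOS.
func qrPayloads(in pixelBuffer: CVPixelBuffer) throws -> [String] {
    let request = VNDetectBarcodesRequest()
    request.symbologies = [.qr] // restrict detection to QR codes
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer)
    try handler.perform([request])
    // Each observation's payloadStringValue holds the decoded URL/text.
    return (request.results ?? []).compactMap(\.payloadStringValue)
}
```

A decoded payload URL could then be routed through the app's URL handling, mirroring the iOS flow described above.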
Set anchor point for SwiftUI attachment
We want to overlay a SwiftUI attachment on a RealityView, as is done in the Diorama sample. By default, attachments seem to be placed centered at their position. For our use case, however, we need to set a different anchor point, so that the attachment is always aligned to one of the corners of the attachment view; e.g., its lower left corner should be aligned with the attachment's position. Is this possible?
0
1
617
May ’24
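One workaround sketch (my own idea, not an official anchor-point API): offset the attachment entity by half its visual bounds, so that a chosen corner, rather than the center, ends up at the target position. The function name is an assumption:

```swift
import RealityKit

// Sketch: align the lower-left corner of an attachment entity (e.g. one
// obtained via attachments.entity(for:) in a RealityView) to `target`.
// Since attachments are centered on their position, shifting by half the
// extents moves the lower-left corner onto the target point.
func alignLowerLeft(of attachment: Entity, to target: SIMD3<Float>) {
    let bounds = attachment.visualBounds(relativeTo: attachment.parent)
    attachment.position = target + SIMD3<Float>(bounds.extents.x / 2,
                                                bounds.extents.y / 2,
                                                0)
}
```

This would need to be re-applied if the attachment view changes size; a more robust variant could run in a RealityView update closure.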
API for turning regular photos into spatial photos?
With quite some excitement, I read about visionOS 2's new feature to automatically turn regular 2D photos into spatial photos using machine learning. It's briefly mentioned in this WWDC video: https://developer.apple.com/wwdc24/10166 My question is: can developers use this feature via an API, so we can turn any image into a spatial image, even if it is not in the device's photo library? We would like to download an image from our server, convert it on the Vision Pro on the fly, and display it as a spatial photo.
3
1
1.2k
Jun ’24
App Store Connect not working properly in Chrome
For a couple of days (or maybe even weeks) now, I haven't been able to use App Store Connect in Chrome. I get to the page where the apps should appear, but there are none. If I open the same page in Safari, it works. But I dislike Safari, since it asks me for my password every time I visit the site.
2
2
2.5k
Mar ’22
How to project a SwiftUI view onto a 3D object in Immersive Space?
I'd like to immerse the user in one of my views by projecting its content onto the inner side of a sphere surrounding them. Think of a video player app that surrounds the user with video previews they can select, like a 3D version of the Netflix home screen. The view should be fully interactive, not just read-only. Is this possible?
0
2
415
Feb ’24
Clustering of map markers / annotations using SwiftUI
I love the new SwiftUI APIs for Apple Maps. However, I am missing (or haven't found) quite a number of features, particularly on visionOS. Besides an easy way to zoom maps, the most important feature for me is marker clustering. If you have a lot of markers on a map, this is an absolute must. Is there any way to accomplish this?
0
1
452
Jan ’24
Fully Immersive on visionOS: Show map or WKWebView
Is it possible to show a map (or a WKWebView) in a fully immersive AR or VR view, so it surrounds the user like a panorama?
0
1
405
Feb ’24
visionOS: Project SwiftUI view onto a 3D curved plane?
I'd like to map a SwiftUI view (in my case: a map) onto a 3D curved plane in an immersive view, so the user can literally immerse themselves in the map. The user should also be able to interact with the map by panning it around and selecting markers. Is this possible?
0
1
776
Feb ’24
Reality Composer Pro: How to add text to a scene (and manipulate it with code)
I would like to add text to a Reality Composer Pro scene and set the actual text via code. How can I achieve this? I haven't seen any "Text" element in the editor.
1
0
1.6k
Mar ’24
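A common workaround sketch, since Reality Composer Pro indeed has no text primitive: generate the text mesh in code with MeshResource.generateText and add the resulting entity to the scene (e.g. parented to a placeholder entity from the Reality Composer Pro file). The sizes and color here are illustrative:

```swift
import RealityKit
import UIKit

// Sketch: build a 3D text entity entirely in code. The string can be
// changed later by regenerating the mesh and replacing the entity's model.
func makeTextEntity(_ string: String) -> ModelEntity {
    let mesh = MeshResource.generateText(
        string,
        extrusionDepth: 0.01,            // depth in meters
        font: .systemFont(ofSize: 0.1),  // glyph height in meters
        containerFrame: .zero,           // size to fit the string
        alignment: .center,
        lineBreakMode: .byWordWrapping)
    let material = SimpleMaterial(color: .white, isMetallic: false)
    return ModelEntity(mesh: mesh, materials: [material])
}
```

To update the text at runtime, one could call generateText again and assign a new ModelComponent to the same entity.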
Is there an equivalent for attachments in iOS' ARView?
I really love the way you can add SwiftUI views as attachments to a RealityView on visionOS. As I am now porting my app to iOS as well, I was wondering: is something like this possible in ARView too? So far, I've only seen custom libraries trying to mimic UI elements.
3
1
897
Jun ’24