Post · Replies · Boosts · Views · Activity

How to integrate Apple Immersive Video into the app you are developing.
Hello, I have a question about Apple Immersive Video. https://www.apple.com/newsroom/2024/07/new-apple-immersive-video-series-and-films-premiere-on-vision-pro/

I am considering adding a feature to my app that plays Apple Immersive Video as a background scene, using 3DCG content converted into the Apple Immersive Video format. First, is it possible to integrate Apple Immersive Video into an app at all? Could you provide information about the required software and the integration process, along with any helpful website resources?

I am also considering producing Apple Immersive Video content myself, and would like to know what equipment and software are needed for both live-action footage and 3DCG animation videos.

As mentioned above, I plan to play Apple Immersive Video as a background in the app. I would also like to place some 3D models as RealityKit entities together with spatial audio elements, and I'm planning to build the visionOS app as a Full Space mixed experience. Is an immersive viewing experience with Apple Immersive Video possible in Full Space mixed mode? In other words, does Apple Immersive Video support Full Space mixed?

That's all my questions for now. Thank you in advance!
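Apple Immersive Video is a proprietary 180° stereoscopic format, and third-party playback support has been limited; a common fallback for an immersive video background is mapping an ordinary 360° video onto the inside of a sphere with RealityKit's VideoMaterial. A minimal sketch, assuming a bundled placeholder asset named "background.mp4":

```swift
import AVFoundation
import RealityKit

// Sketch: play a 360-degree video as an immersive background by mapping it
// onto the inside of a large sphere surrounding the viewer.
@MainActor
func makeVideoBackground() -> ModelEntity {
    let url = Bundle.main.url(forResource: "background", withExtension: "mp4")!
    let player = AVPlayer(url: url)

    // VideoMaterial renders the player's output as a RealityKit material.
    let material = VideoMaterial(avPlayer: player)

    let sphere = ModelEntity(
        mesh: .generateSphere(radius: 1000),
        materials: [material]
    )
    // Negative x scale flips the mesh so the texture faces inward.
    sphere.scale = SIMD3<Float>(x: -1, y: 1, z: 1)

    player.play()
    return sphere
}
```

This entity would be added to the content of an ImmersiveSpace's RealityView; it is a sketch of the general approach, not the Apple Immersive Video pipeline itself.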
2 replies · 1 boost · 743 views · 3w
How can I share space in Volumes?
"Volumes allow an app to display 3D content in defined bounds, sharing the space with other apps." What does it mean for Volumes to share space? What are the benefits of being able to do this? Does "space" here mean the Shared Space? I don't understand the Shared Space very well to begin with.

The documentation also says volumes "can be viewed from different angles." Does this simply mean that because the content is 3D and has depth, I can see that depth when I change my viewing angle? That seems obvious for 3D content. How is this specifically related to Volumes?
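For context, a volume is declared as an ordinary WindowGroup with the .volumetric window style; in the Shared Space it appears alongside windows and volumes from other running apps, which is what "sharing the space" refers to. A minimal sketch, assuming a placeholder USDZ asset named "Globe":

```swift
import SwiftUI
import RealityKit

// Sketch: a volume is a WindowGroup with the .volumetric style. The user
// can walk around it and see its depth, while other apps' windows and
// volumes remain visible in the same Shared Space.
@main
struct GlobeApp: App {
    var body: some Scene {
        WindowGroup(id: "globe") {
            Model3D(named: "Globe")
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.4, height: 0.4, depth: 0.4, in: .meters)
    }
}
```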
1 reply · 0 boosts · 704 views · Jun ’23
About Immersion Style progressive
Hi, I have a question about the progressive immersion style. I understand that specifying .progressive makes it possible to mix the mixed and full styles, but when is this used? For example, is it the case shown in the WWDC23 video, where a person watching a movie on a screen sees the room gradually change into space, or where the room gets darker and darker as the Digital Crown is adjusted until it goes completely dark?

Please let me know if there is a video, sample code, or explanation that shows an example of progressive immersion. By the way, is it possible for the application to receive events when the Digital Crown is operated?

Thanks.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
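A minimal sketch of opting into the progressive style: with .progressive, the system presents the immersive space inside a portal whose extent the user controls by turning the Digital Crown, rather than jumping straight to full immersion.

```swift
import SwiftUI

// Sketch: an ImmersiveSpace restricted to the .progressive immersion
// style. The Digital Crown then controls how much of the passthrough
// view is replaced by the space's content.
struct ProgressiveSpace: Scene {
    @State private var style: ImmersionStyle = .progressive

    var body: some Scene {
        ImmersiveSpace(id: "progressive") {
            // Immersive RealityView content goes here.
        }
        .immersionStyle(selection: $style, in: .progressive)
    }
}
```

On the Digital Crown question: visionOS 1 does not deliver raw crown events to apps; later releases added a SwiftUI modifier (onImmersionChange) that reports changes in the immersion amount, which is the closest app-visible signal.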
1 reply · 0 boosts · 1.3k views · Jun ’23
GeometryReader3D and Scene Phases do not work properly.
Hi. First, about scene phases: no event is issued when an Alert is presented. Is this a known bug? https://developer.apple.com/videos/play/wwdc2023/10111/?time=784

Second, in the following video the center value is obtained, but the code as shown does not compile because `center` is not found. https://developer.apple.com/videos/play/wwdc2023/10111/?time=861

```swift
GeometryReader3D { proxy in
    ZStack {
        Earth(
            earthConfiguration: model.solarEarth,
            satelliteConfiguration: [model.solarSatellite],
            moonConfiguration: model.solarMoon,
            showSun: true,
            sunAngle: model.solarSunAngle,
            animateUpdates: animateUpdates
        )
        .onTapGesture {
            if let translation = proxy.transform(in: .immersiveSpace)?.translation {
                model.solarEarth.position = Point3D(translation)
            }
        }
    }
}
```

Also, `model.solarEarth.position` is assigned a Point3D here, so `solarEarth` is not a plain Entity, is it? I'm quite confused because the code in the session is fragmented and I'm not even sure it works. Since I can't tell whether this is a bug or not, investigating and verifying it is taking me anywhere from a few days to a week.
1 reply · 0 boosts · 800 views · Aug ’23
How to place a 3D model in front of you in the Full Space app.
Hi, I am currently developing a Full Space app. I have a question about how to display an Entity or ModelEntity in front of the user. I want to move the entity to a position in front of the user not only at initial display, but also whenever the user takes an action such as tapping. (Animation is not required.) Specifically, I want to run this place-in-front-of-the-user process when a reset button is tapped.

Thanks.
Sadao Tokuyama
https://twitter.com/tokufxug
https://www.linkedin.com/in/sadao-tokuyama/
https://1planet.co.jp/tech-blog/category/applevisionpro
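One common approach is to query the headset's current pose with ARKit's WorldTrackingProvider and place the entity a fixed distance along the device's forward axis. A sketch, assuming the app already runs in an immersive space (names like `Placement` and `resetPosition` are illustrative):

```swift
import ARKit
import RealityKit
import QuartzCore

// Sketch: query the device anchor and move an entity 1.5 m in front of
// the user. Call resetPosition(of:) from a reset-button handler.
@MainActor
final class Placement {
    let session = ARKitSession()
    let worldTracking = WorldTrackingProvider()

    func start() async throws {
        try await session.run([worldTracking])
    }

    func resetPosition(of entity: Entity, distance: Float = 1.5) {
        guard let device = worldTracking.queryDeviceAnchor(
            atTimestamp: CACurrentMediaTime()) else { return }

        let transform = Transform(matrix: device.originFromAnchorTransform)
        // Forward is -Z in RealityKit's convention; columns.2 is the
        // transform's z axis.
        let forward = -transform.matrix.columns.2
        entity.position = transform.translation
            + SIMD3<Float>(forward.x, forward.y, forward.z) * distance
    }
}
```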
1 reply · 0 boosts · 729 views · Oct ’23
How to set the default size of WindowGroup volumetric in SwiftUI to fit the size of Model Entity loaded in USDZ
Hello. I am posting this in the hope of getting some advice on what I would like to achieve: download a USDZ 3D model from a web server within a visionOS app, and display it in a Shared Space volume (volumetric window) sized to fit the downloaded model.

Currently, after downloading the USDZ and creating a ModelEntity from it, I open a volumetric WindowGroup with openWindow, and the ModelEntity is added to the RealityViewContent of a RealityView in the view presented by openWindow. The USDZ downloaded this way appears in the volume without any problems. However, the size of the downloaded models is not uniform, so a model may not fit in the volume.

I am trying to pass an appropriate size value to the WindowGroup's defaultSize via a Binding when calling openWindow, but I am not sure which property of ModelEntity provides an appropriate value for defaultSize. The position in the attached image is also not correct, and I would like to place the model lower if possible. I would appreciate advice on sizing and positioning the downloaded USDZ so that it fits in the volume.

Incidentally, I tried the plain window style and found that it displayed the USDZ ModelEntity at a much larger scale than the volume did, so I have decided not to support a plain-style window. If there is any information on how to properly set the position and size of USDZ content in visionOS and RealityKit, I would appreciate that as well.

Best regards.
Sadao Tokuyama
https://twitter.com/tokufxug
https://1planet.co.jp/tech-blog/category/applevisionpro
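The measurement the question is looking for is visualBounds(relativeTo:), which returns the entity's bounding box in meters. Since a WindowGroup's size is fixed when the scene is declared, scaling the model to the volume is usually easier than resizing the volume to the model. A sketch (the `fit` helper is illustrative):

```swift
import RealityKit

// Sketch: scale a downloaded ModelEntity so its largest dimension fits a
// volume with the given edge length, then bottom-align it in the volume.
@MainActor
func fit(_ model: ModelEntity, intoVolumeOfSide side: Float) {
    let bounds = model.visualBounds(relativeTo: nil)
    let extents = bounds.extents          // width, height, depth in meters
    let maxExtent = max(extents.x, extents.y, extents.z)
    guard maxExtent > 0 else { return }

    let scale = side / maxExtent
    model.scale = SIMD3<Float>(repeating: scale)

    // Recenter the model on the volume's origin, then shift it down so
    // it rests at the bottom of the volume instead of floating.
    let center = bounds.center * scale
    model.position = SIMD3<Float>(
        -center.x,
        -center.y + (extents.y * scale) / 2 - side / 2,
        -center.z)
}
```

The same extents could instead be fed into defaultSize before opening the window, but that only helps when the size is known before the scene is created.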
1 reply · 0 boosts · 994 views · Oct ’23
Rendering bug when layering transparent textures front and back
If I apply an alpha image texture to a model created in Blender and run it in Reality Composer Pro or on visionOS, the front-to-back rendering of the alpha areas comes out wrong. Details are below.

I exported a Blender-created cylindrical object with a PNG (with alpha) texture applied to its inside as a USDC file, and then imported it into Reality Composer Pro. When multiple objects that make extensive use of transparent textures are placed in front of and behind each other, the following behaviors were observed in the transparent areas:

・The transparent areas do not become transparent
・The transparent areas become transparent together with the image behind them
・The drawing order of the images becomes incorrect

Best regards.
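Transparent surfaces are generally not depth-sorted per pixel, so overlapping alpha textures can draw in the wrong order. One possible workaround, sketched here under the assumption that the affected surfaces are separate entities, is RealityKit's ModelSortGroup, which assigns an explicit draw order:

```swift
import RealityKit

// Sketch: force a draw order for two overlapping transparent surfaces.
// Lower order values draw first, so the farther (inner) surface is
// rendered before the nearer (outer) one.
@MainActor
func applySortOrder(outer: ModelEntity, inner: ModelEntity) {
    let group = ModelSortGroup(depthPass: nil)
    inner.components.set(ModelSortGroupComponent(group: group, order: 0))
    outer.components.set(ModelSortGroupComponent(group: group, order: 1))
}
```

This only helps when the problem is ordering between whole meshes; sorting errors within a single mesh usually require splitting the geometry or reworking the material.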
1 reply · 0 boosts · 737 views · Nov ’24
How to disable the pinch gesture for the left hand only
Hi, I have a question. In visionOS, when a user looks at a button and performs a pinch gesture with their index finger and thumb, the button responds. By default, this works with both the left and right hands. However, I want to disable the pinch gesture when performed with the left hand while keeping it functional with the right hand. I understand that the system settings allow users to configure input for both hands, the left hand only, or the right hand only. However, I would like to control this behavior within the app itself. Is this possible? Best regards.
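Within the app's own gesture handling, SpatialEventGesture reports which hand produced each event via its chirality value, so left-hand events can be filtered out. A sketch:

```swift
import SwiftUI

// Sketch: react only to right-hand pinches on a custom view by checking
// each spatial event's chirality.
struct RightHandOnlyView: View {
    var body: some View {
        Circle()
            .frame(width: 200, height: 200)
            .gesture(
                SpatialEventGesture()
                    .onChanged { events in
                        for event in events where event.chirality == .right {
                            // Handle right-hand input here; left-hand
                            // events are simply ignored.
                            print("Right-hand pinch at \(event.location)")
                        }
                    }
            )
    }
}
```

Note this filters input inside the app's own gesture handlers; it does not change how system controls such as standard Buttons respond to gaze-and-pinch.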
1 reply · 0 boosts · 275 views · Feb ’25
Why don’t the dinosaurs in “Encounter Dinosaurs” respond to real-world light intensity?
I have a question about Apple’s preinstalled visionOS app “Encounter Dinosaurs.” In this app, the dinosaurs are displayed over the real-world background, but the PhysicallyBasedMaterial (PBM) in RealityKit doesn’t appear to respond to the actual brightness of the environment. Even when I change the lighting in the room, the dinosaurs’ brightness and shading remain almost the same. If this behavior is intentional — for example, if the app disables real-world lighting influence or uses a fixed lighting setup — could someone explain how and why it’s implemented that way?
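Apple has not documented how "Encounter Dinosaurs" is lit, but the behavior described is consistent with a fixed image-based light: RealityKit lets an app attach its own environment lighting to entities, decoupling their shading from the real room. A sketch, where "StudioEnvironment" is a placeholder EnvironmentResource name:

```swift
import RealityKit

// Sketch: light an entity with a bundled environment texture so its
// shading stays constant regardless of real-world brightness.
@MainActor
func applyFixedLighting(to entity: Entity) async throws {
    let environment = try await EnvironmentResource(named: "StudioEnvironment")

    entity.components.set(
        ImageBasedLightComponent(source: .single(environment)))
    entity.components.set(
        ImageBasedLightReceiverComponent(imageBasedLight: entity))
}
```

A plausible reason for a fixed setup is artistic control: cinematic content reads the same in a bright living room and a dark one.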
1 reply · 0 boosts · 561 views · Nov ’25
How to speed up build time when placing large USDZ files in RCP scenes
I’m currently developing a visionOS app that includes an RCP scene containing a large USDZ file (around 2GB). Each time I adjust the CG model in Blender, I export it as USDZ again, place it in the RCP scene, and build the app with Xcode. Because the USDZ file is so large, the build takes a long time, significantly slowing down my development speed. I’d like to know if there are any effective ways to:

Improve overall build performance
Reduce the time between updating the USDZ file and completing the build

Any advice or best practices for optimizing this workflow would be greatly appreciated.

Best regards,
Sadao
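One workaround worth considering during iteration: keep the 2GB asset out of the app bundle entirely, so Xcode never has to repackage it, and load it from disk at runtime instead. A sketch, assuming the file has been copied (once, e.g. via the Files app or a download) to Application Support as "scene.usdz":

```swift
import Foundation
import RealityKit

// Sketch: load a large USDZ from Application Support at runtime instead
// of bundling it, so editing the model in Blender does not force a
// full Xcode rebuild.
@MainActor
func loadExternalModel() async throws -> Entity {
    let support = try FileManager.default.url(
        for: .applicationSupportDirectory,
        in: .userDomainMask,
        appropriateFor: nil,
        create: true)
    let modelURL = support.appendingPathComponent("scene.usdz")

    // Entity(contentsOf:) loads a USD file from an arbitrary file URL.
    return try await Entity(contentsOf: modelURL)
}
```

This trades RCP-scene integration for iteration speed; the bundled RCP scene can keep a lightweight proxy while the heavy asset is swapped in at runtime.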
1 reply · 0 boosts · 157 views · Nov ’25
Does an app using WKWebView require encryption compliance for worldwide App Store release?
I have two questions about releasing an app that uses an in-app browser (WKWebView) on the App Store worldwide.

Question 1: Encryption usage. Our app uses WKWebView and relies only on standard encryption. Should this be declared as using encryption during App Store submission?

Question 2: If the answer to Question 1 is yes, do we need to prepare and upload additional documentation when submitting the app in France? Would this also require redoing the entire build-and-upload process, even for an app version that has already been uploaded?

Goal: we want to release an app using WKWebView worldwide, including France, and would like to understand all the necessary steps and requirements for completing the App Store release without unexpected rework.

Best regards,
Sadao

P.S.: A similar question was posted a few years ago, but it seems it never received a response. https://developer.apple.com/forums/thread/725047
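For what it's worth, apps whose only cryptography is the system-provided HTTPS/TLS (which covers WKWebView's web traffic) generally fall under the standard exemption, and that status can be declared once in Info.plist so App Store Connect stops asking the export-compliance question on every upload:

```xml
<!-- Declares that the app uses only exempt encryption (e.g. the
     system TLS used by WKWebView and URLSession). -->
<key>ITSAppUsesNonExemptEncryption</key>
<false/>
```

Whether France's separate declaration requirements apply depends on the specific cryptography used, so confirming against Apple's current export-compliance documentation before submission is advisable.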
1 reply · 0 boosts · 83 views · Nov ’25
How to determine access from Safari in visionOS
Hi, I have one question. When creating a web page, is there a way to detect that it is being accessed from Safari on visionOS? I would also like to know the user agent string of Safari on visionOS. If there is more than one way to determine this, for example in JavaScript or on the web server, please share them all.

Use cases include changing the page layout for Safari on visionOS, changing the processing when dynamically generating HTML pages on a web server, and detecting Quick Look support.

Best regards.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
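One complication: Safari on visionOS requests desktop sites with a macOS-style user agent, so the UA string alone cannot reliably distinguish it from a Mac (which also makes server-side detection unreliable). A client-side heuristic, borrowed from iPadOS desktop-mode detection, is to combine the Mac UA with the presence of touch points, which real Macs do not report. A sketch (`isLikelyVisionOSSafari` is an illustrative name, and the heuristic cannot tell visionOS apart from an iPad in desktop mode):

```javascript
// Sketch: heuristic detection of visionOS Safari from the UA string plus
// navigator.maxTouchPoints. Macs report 0 touch points; visionOS (like
// iPadOS in desktop mode) reports a nonzero value.
function isLikelyVisionOSSafari(userAgent, maxTouchPoints) {
  const looksLikeMacSafari =
    /Macintosh/.test(userAgent) &&
    /Safari/.test(userAgent) &&
    !/Chrome|Chromium/.test(userAgent);
  return looksLikeMacSafari && maxTouchPoints > 0;
}

// In the browser it would be called as:
// isLikelyVisionOSSafari(navigator.userAgent, navigator.maxTouchPoints)
```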
0 replies · 1 boost · 935 views · Jun ’23
Is Unity's Unbounded Volume Shared Space or Full Space?
Hi, I am currently watching the "Create immersive Unity apps" video from WWDC23, and a question arose while watching it. First, look at the following explanations from the session:

"Because you're using Unity to create volumetric content that participates in the shared space, a new concept called a volume camera lets you control how your scene is brought into the real world. A volume camera can create two types of volumes, bounded and unbounded, each with different characteristics. Your application can switch between the two at any time."
https://developer.apple.com/videos/play/wwdc2023/10088/?time=465

"Your unbounded volume displays in a full space on this platform and allows your content to fully blend with passthrough for a more immersive experience."
https://developer.apple.com/videos/play/wwdc2023/10088/?time=568

At first, the session explains that there are two types of volumetric content in the Shared Space: bounded volumes and unbounded volumes. However, in the description of the unbounded volume, the wording changes to Full Space. Is Full Space, rather than Shared Space, correct for an unbounded volume?

Best regards.

P.S. I felt uncomfortable with the title "Create immersive Unity apps." The first half of the presentation was about Unity development and the Shared Space's bounded volume, and bounded-volume apps feel far from immersive to me. Apple's definition of "immersive" in spatial computing seems vague.

Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
0 replies · 0 boosts · 864 views · Jun ’23