Post

Replies

Boosts

Views

Activity

The AR Quick Look button cannot be pressed on some iOS devices.
The AR Quick Look button cannot be pressed on some iOS devices. On the iPhone 12 Pro running iOS 15, the button cannot be pressed when the page is opened from the standard iOS QR code reader, but it can be pressed on the same device running iOS 14. On other devices, the button works when the page is launched from Safari but not when it is opened from the standard QR code reader. The problem also seems to occur on pages other than https://konabeer.jp/beach-diorama-ar/. Is there any way to avoid the AR Quick Look button becoming unpressable? Any background information beyond a direct fix would be very helpful. Our test results:
iPhone 7 (iOS 13): OK
iPhone X (iOS 15), QR code reader: NG
iPhone 11 (iOS 14): OK
iPhone 12 Pro (iOS 14): OK
iPhone 12 Pro (iOS 15), QR code reader: NG
iPhone 12 (iOS 15.1), QR code reader: NG
iPad Air (iOS 15), QR code reader: NG
iPhone SE (iOS 15), QR code reader: NG
From our own verification, the AR Quick Look button is disabled on iOS 15 when the QR code is scanned from the Control Center code scanner. If you know of any known bugs or workarounds, or of any sites with information on this issue, I would appreciate it.
7
0
3.6k
Jan ’22
Apple Vision Pro Specifications
Hi, I have a question about the Apple Vision Pro specifications. What are the vertical, horizontal, and diagonal FOV of Vision Pro, in degrees? How many centimeters away is Vision Pro's near clipping plane? How many meters away is its far clipping plane? Do eye tracking, authentication, and similar features work properly for people wearing contact lenses? From what age can Vision Pro be worn? Best regards. Sadao Tokuyama https://1planet.co.jp/ https://twitter.com/tokufxug
5
1
4.8k
Jun ’23
Problem with black background when taking a photo with AR Quick Look feature in iOS 15
Has the problem of the black background when taking a picture with the AR Quick Look capture feature in iOS 15 been resolved? I think it is a rather serious bug. On an iOS 15 device, open the Quick Look page below and display any of the 3D models in AR Quick Look; the background turns black when you take a picture with AR Quick Look's capture feature. https://developer.apple.com/augmented-reality/quick-look/ There are similar reports below, but they do not seem to have been addressed at all. https://developer.apple.com/forums/thread/691784
6
0
2.3k
Jun ’22
Does the Apple Vision Pro have GPS?
Hi, after watching the WWDC23 session Meet Core Location for spatial computing, I was wondering: does the Apple Vision Pro have GPS? Or does it provide Core Location functionality via Wi-Fi? Also, in Unity we use Input.location to get latitude and longitude. When developing for Apple Vision Pro in Unity, do we also use Input.location to get latitude and longitude? Best regards. Sadao Tokuyama https://1planet.co.jp/ https://twitter.com/tokufxug
7
0
3.9k
Feb ’24
Manual for Shader Graph in Reality Composer Pro
Hi, I would like to learn how to create custom materials using Shader Graph in Reality Composer Pro. I would like to know more about Shader Graph in general, including node descriptions and how the material's display changes when nodes are connected. However, I cannot find a manual for Shader Graph in Reality Composer Pro. This leaves me totally clueless on how to create custom materials. Thanks. Sadao Tokuyama https://1planet.co.jp/ https://twitter.com/tokufxug
7
2
4.9k
Mar ’24
How to determine access from Safari in visionOS
Hi, I have one question. When creating a web page, is there a way to detect that it is being accessed from Safari on visionOS? I would also like to know the user agent string for Safari on visionOS. If there is more than one way to determine this, for example in JavaScript or on the web server, please tell us all of them. Use cases include changing the page layout for Safari on visionOS, changing how HTML pages are dynamically generated on a web server, and detecting Quick Look support. Best regards. Sadao Tokuyama https://1planet.co.jp/ https://twitter.com/tokufxug
0
1
935
Jun ’23
Play spatial video shot on iPhone 15 Pro in visionOS simulator
I heard that the iPhone 15 Pro and iPhone 15 Pro Max can shoot spatial video. However, I also understand that the iPhone 15 Pro will not support spatial video capture at launch. Once the iPhone 15 Pro can shoot spatial video, can that footage be played back in the visionOS simulator? And when it is played back, is it rendered in three dimensions as spatial video in the simulator? I would like to play back spatial video shot with the iPhone 15 Pro using RealityKit's VideoPlayerComponent. I am concerned that if the visionOS simulator cannot be used to verify playback of the captured spatial video, verification will take a long time, because I do not have an Apple Vision Pro device.
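For reference, the basic VideoPlayerComponent setup mentioned above can be sketched as follows, independent of whether the simulator renders the result spatially; the file name "spatial.mov" is a placeholder, not a real asset:

```swift
import AVFoundation
import RealityKit

// Sketch: attach a VideoPlayerComponent to an entity.
// "spatial.mov" is a placeholder path for the captured spatial video.
let player = AVPlayer(url: URL(fileURLWithPath: "spatial.mov"))
let videoComponent = VideoPlayerComponent(avPlayer: player)

let videoEntity = Entity()
videoEntity.components.set(videoComponent)

// Start playback once the entity is added to a RealityView's content.
player.play()
```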
0
1
2k
Sep ’23
How to integrate Apple Immersive Video into the app you are developing.
Hello, I have a question about Apple Immersive Video. https://www.apple.com/newsroom/2024/07/new-apple-immersive-video-series-and-films-premiere-on-vision-pro/ I am considering a feature that plays Apple Immersive Video as a background scene in the app I am developing, using 3DCG content converted into the Apple Immersive Video format. First, I would like to know whether it is possible to integrate Apple Immersive Video into an app. Could you provide information about the required software and the integration process? It would be great if you could also share any helpful website resources. I am also considering producing Apple Immersive Video content myself, and would like to know what equipment and software are needed to produce both live-action footage and 3DCG animation. As mentioned above, I plan to play Apple Immersive Video as a background. In doing so, I would also like to place some 3D models as RealityKit entities, along with spatial audio. I plan to develop the visionOS app as a Full Space Mixed experience. Is an immersive viewing experience with Apple Immersive Video possible in Full Space Mixed mode? Does Apple Immersive Video support Full Space Mixed? That is all for now. Thank you in advance!
2
1
743
3w
When placing a TextField within a RealityViewAttachment, the virtual keyboard does not appear in front of the user as expected.
Hello, Thank you for your time. I have a question regarding visionOS app development. When placing a SwiftUI TextField inside RealityView.attachments, we found that focusing on the field does not bring up the virtual keyboard in front of the user. Instead, the keyboard appears around the user’s lower abdomen area. However, when placing the same TextField in a regular SwiftUI layer outside of RealityView, the keyboard appears in the correct position as expected. This suggests that the issue is specific to RealityView.attachments. We are currently exploring ways to have the virtual keyboard appear directly in front of the user when using TextField inside RealityViewAttachments. If there is any method to explicitly control the keyboard position or any known workarounds—including alternative UI approaches—we would greatly appreciate your guidance. Best regards, Sadao Tokuyama
3
1
625
Jul ’25
How to Achieve Volumetric Lighting (Light Shafts) in RealityKit on visionOS?
Hello everyone, I am currently developing an experience for visionOS using RealityKit and I would like to achieve volumetric light effects, such as visible light rays or shafts through fog or dust. I found this GitHub project: https://github.com/robcupisz/LightShafts, which demonstrates the kind of visual style I am aiming for. I would like to know if there is a way to create similar effects using RealityKit on visionOS. So far, I have experimented with DirectionalLight, SpotLight, ImageBasedLight, and custom materials (e.g., additive blending on translucent meshes), but none of these approaches can replicate the volumetric light shaft look shown in the repository above. Questions: Is there a recommended technique or workaround in RealityKit to simulate light shafts or volumetric lighting? Is creating a custom mesh (e.g., cone or volume geometry with gradient alpha and additive blending) the only feasible method? Are there any examples, best practices, or sample projects from Apple or other developers that showcase a similar visual style? Any advice or hints would be greatly appreciated. Thank you in advance!
9
1
836
Aug ’25
Simple question about visionOS and Shared Space
Hi, I have one question: you explained the Shared Space at the beginning of the session video, but I didn't really understand it. Is the Shared Space like the Dock on a Mac? Are applications placed in the Shared Space, and is the basic operation to launch an application placed there? Why is the word "Shared" included? Is there some sharing feature? The session says, "By default, apps launch into the Shared Space." What is the default state, and what is the non-default state? It also says, "People remain connected to their surroundings through passthrough." What does that mean on visionOS? Also, is an application that runs in the Shared Space something like the Clock app, or does Safari also run in the Shared Space? What kinds of applications can only run in a Full Space? I do not yet have a clear picture of the role of each of these features in visionOS. If possible, it would be easier to understand with an actual image of an application running, not just a diagram. Best regards. Sadao Tokuyama https://1planet.co.jp/
6
0
3.6k
Jun ’23
How can I share space in Volumes?
The session says that volumes allow an app to display 3D content within defined bounds, sharing the space with other apps. What does it mean to share the space in a volume, and what are the benefits of being able to do this? Does this refer to the Shared Space? I do not understand the Shared Space very well to begin with. The session also says volumes "can be viewed from different angles." Does this mean that because the content is 3D and has depth, I can see that depth when I change my viewing angle? That seems obvious for 3D content, so is this specifically related to volumes?
1
0
704
Jun ’23
Value of type '(@escaping (TapGesture.Value) -> Void) -> _EndedGesture<TapGesture>' has no member 'targetedToAnyEntity'
Hi, if you write the sample code for the gesture from the following document and try to build it, an error occurs. https://developer.apple.com/documentation/visionos/adding-3d-content-to-your-app
Value of type '(@escaping (TapGesture.Value) -> Void) -> _EndedGesture<TapGesture>' has no member 'targetedToAnyEntity'
Is something missing? Xcode 15.0 beta 2 (15A516b), visionOS 1.0 beta.
import SwiftUI
import RealityKit
import RealityKitContent

struct SphereSecondView: View {
    @State var scale = false

    /*var tap: some Gesture {
        TapGesture()
            .onEnded { _ in
                print("Tap")
                scale.toggle()
            }
    }*/

    var body: some View {
        RealityView { content in
            let model = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .yellow, isMetallic: true)])
            content.add(model)
        } update: { content in
            if let model = content.entities.first {
                model.transform.scale = scale ? [2.0, 2.0, 2.0] : [1.0, 1.0, 1.0]
            }
        }
        .gesture(TapGesture().onEnded
            .targetedToAnyEntity() { _ in
                scale.toggle()
            })
    }
}

#Preview {
    SphereSecondView()
}
Best regards. Sadao Tokuyama https://1planet.co.jp/ https://twitter.com/tokufxug
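For what it's worth, the error message suggests a modifier-ordering problem rather than a missing import: `targetedToAnyEntity()` is a method on the gesture itself, so it must come before `onEnded`, which then receives an `EntityTargetValue<TapGesture.Value>`. A sketch of the gesture modifier in that order, assuming the same `scale` state as above:

```swift
// targetedToAnyEntity() wraps the gesture first; onEnded is then
// called with the targeted (entity-aware) gesture value.
.gesture(
    TapGesture()
        .targetedToAnyEntity()
        .onEnded { _ in
            scale.toggle()
        }
)
```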
2
0
1.2k
Jul ’23
How to place a 3D model in front of you in the Full Space app.
Hi, I am currently developing a Full Space app, and I have a question about how to display an Entity or ModelEntity in front of the user. I want to move the entity to a position in front of the user, not only at initial display but also when the user takes an action such as tapping. (Animation is not required.) I want to run the same placement logic, moving the entity in front of the user, when a reset button is tapped. Thanks. Sadao Tokuyama https://twitter.com/tokufxug https://www.linkedin.com/in/sadao-tokuyama/ https://1planet.co.jp/tech-blog/category/applevisionpro
1
0
729
Oct ’23