How to determine access from Safari in visionOS
Hi, I have one question. When creating a web page, is there a way to detect that it is being accessed from Safari on visionOS? I would also like to know the user agent string for Safari on visionOS. If there is more than one way to make this determination, for example in JavaScript on the client or on the web server, please share them all. My use cases include changing the page layout for Safari on visionOS, changing the processing when dynamically generating HTML pages on a web server, and deciding whether to offer Quick Look.

Best regards.

Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
0 replies · 1 boost · 935 views · Jun ’23
Is Unity's Unbounded Volume Shared Space or Full Space?
Hi, I am currently watching the "Create immersive Unity apps" video from WWDC23, and a question arose while watching it. First, consider the following explanations from the session:

"Because you're using Unity to create volumetric content that participates in the shared space, a new concept called a volume camera lets you control how your scene is brought into the real world. A volume camera can create two types of volumes, bounded and unbounded, each with different characteristics. Your application can switch between the two at any time."
https://developer.apple.com/videos/play/wwdc2023/10088/?time=465

"Your unbounded volume displays in a full space on this platform and allows your content to fully blend with passthrough for a more immersive experience."
https://developer.apple.com/videos/play/wwdc2023/10088/?time=568

The session first explains that there are two types of volumetric content participating in the Shared Space: bounded and unbounded volumes. However, when it reaches the description of the unbounded volume, the wording changes to Full Space. Is Full Space, rather than Shared Space, correct for an unbounded volume?

Best regards.

P.S. The title "Create immersive Unity apps" felt off to me. The first half of the presentation covered Unity development and bounded volumes in the Shared Space, and bounded-volume apps seem far from immersive. Apple's definition of "immersive" in spatial computing strikes me as vague.

Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
0 replies · 0 boosts · 864 views · Jun ’23
Apple Vision Pro Support for Unity Programmable Shaders
Hi, I have a question about Apple Vision Pro's support for Unity programmable shaders. I have seen the following stated:

Shaders applied to materials are not supported.
RenderTextures are supported (they can be used as texture input to Shader Graph for display through RealityKit).

Do the statements above apply to all of Shared Space, Full Space, and full immersive space? Or is full immersive space unaffected because it renders with Metal rather than RealityKit?

Best regards.

Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
0 replies · 0 boosts · 1.5k views · Jun ’23
Full immersive behavior of the simulator
Hi, I run my app in the visionOS simulator with the full immersion style enabled, but passthrough remains visible; the behavior is the same as the mixed style. Is there a bug in the visionOS simulator that prevents passthrough from being disabled in a fully immersive space?

Thanks.

Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
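For reference, this is roughly how I request full immersion. It is a minimal sketch; FullImmersiveApp, ImmersiveContentView, and the space identifier are placeholders from my project:

import SwiftUI

@main
struct FullImmersiveApp: App {
    // Requesting only the .full style; on device this should replace
    // passthrough entirely, yet the simulator still shows the room.
    @State private var immersionStyle: ImmersionStyle = .full

    var body: some Scene {
        ImmersiveSpace(id: "Immersive") {
            ImmersiveContentView() // placeholder content view
        }
        .immersionStyle(selection: $immersionStyle, in: .full)
    }
}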
0 replies · 0 boosts · 479 views · Jun ’23
About the shadow modifier
Hi, I have one question. In visionOS, the window shadow shown in the simulator looks the same whether or not I apply the following shadow modifier to the Window's ContentView.
https://developer.apple.com/documentation/SwiftUI/View/shadow(color:radius:x:y:)
Even if I define the shadow in the ContentView and change the color, radius, x, and y, there is no change at all, so it appears the shadow modifier is not taking effect. Is this because it is the visionOS simulator?

Best regards.

Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
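Here is a minimal sketch of what I am trying; the values are deliberately loud just to make any effect obvious:

import SwiftUI

struct ContentView: View {
    var body: some View {
        Text("Hello, visionOS")
            .padding()
            // Exaggerated values; changing color, radius, x, or y
            // produces no visible difference in the simulator.
            .shadow(color: .red, radius: 20, x: 10, y: 10)
    }
}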
0 replies · 0 boosts · 474 views · Jul ’23
About metersPerUnit in USDZ
Hi, I watched the WWDC23 session video "Create 3D models for Quick Look spatial experiences."
https://developer.apple.com/videos/play/wwdc2023/10274/
From the video, I understand that the scale of models displayed using visionOS's AR Quick Look is determined by the "metersPerUnit" value in the USDZ file. I tried to find 3D software that can set "metersPerUnit," or tools that can view the "metersPerUnit" of a USDZ file, but I couldn't find any. I believe adjusting "metersPerUnit" in USDZ is crucial to achieving real-world scale when displaying models through visionOS's AR Quick Look. If anyone knows of apps or tools that can inspect a USDZ file's "metersPerUnit," or 3D editor apps or tools that can export with the "metersPerUnit" value properly reflected, I would greatly appreciate the information.

Best regards.

Sadao Tokuyama
https://twitter.com/tokufxug
https://www.linkedin.com/in/sadao-tokuyama/
0 replies · 0 boosts · 872 views · Jul ’23
OrbitAnimation does not work.
Hi, I implemented the following as shown in the linked session, but it does not animate.
https://developer.apple.com/videos/play/wwdc2023/10080/?time=1220
Instead, this message is displayed:

No bind target found for played animation.

import SwiftUI
import RealityKit

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            if let entity = try? await ModelEntity(named: "toy_biplane_idle") {
                // Give the biplane collision, hover, and input components.
                let bounds = entity.model!.mesh.bounds.extents
                entity.components.set(CollisionComponent(shapes: [.generateBox(size: bounds)]))
                entity.components.set(HoverEffectComponent())
                entity.components.set(InputTargetComponent())

                if let toy = try? await ModelEntity(named: "toy_drummer_idle") {
                    // Orbit the drummer around the y-axis over 30 seconds.
                    let orbit = OrbitAnimation(
                        name: "orbit",
                        duration: 30,
                        axis: [0, 1, 0],
                        startTransform: toy.transform,
                        bindTarget: .transform,
                        repeatMode: .repeat)
                    if let animation = try? AnimationResource.generate(with: orbit) {
                        toy.playAnimation(animation)
                    }
                    content.add(toy)
                }
                content.add(entity)
            }
        }
    }
}
0 replies · 0 boosts · 912 views · Aug ’23
Play spatial video shot on iPhone 15 Pro in visionOS simulator
I heard that the iPhone 15 Pro and iPhone 15 Pro Max can shoot spatial video, although I also know that this feature was not available on the iPhone 15 Pro at launch. Once the iPhone 15 Pro can shoot spatial video, can that video be played back in the visionOS simulator? And when it is played back, is it rendered in three dimensions as spatial video in the simulator? I would like to play spatial video shot with the iPhone 15 Pro using RealityKit's VideoPlayerComponent. I am concerned that if the visionOS simulator cannot be used to verify playback of the captured spatial video, verification will take a long time, because I do not have an Apple Vision Pro device.
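This is roughly the playback path I plan to use. It is a minimal sketch, and the video URL is a placeholder:

import AVFoundation
import RealityKit

// Wrap an AVPlayer in a VideoPlayerComponent and attach it to an
// entity, which can then be added to a RealityView's content.
func makeVideoEntity(url: URL) -> Entity {
    let player = AVPlayer(url: url)
    let entity = Entity()
    entity.components.set(VideoPlayerComponent(avPlayer: player))
    player.play()
    return entity
}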
0 replies · 1 boost · 2k views · Sep ’23
How to access Persona Virtual Camera features
How do I access the Persona Virtual Camera features from an app? I would appreciate details on any required permissions or a simple implementation example. I know this feature is probably only available on the Apple Vision Pro device, but it would also be helpful to share information about the Persona Virtual Camera in general, including whether it works in the visionOS simulator, along with a solid description of how it works. If you have a page or video that explains the Persona Virtual Camera well, please share it as well.

Best Regards.

Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
0 replies · 0 boosts · 1.1k views · Dec ’23
GroupActivity: Dropping activity as there is no active conversation:
Hi, I am having trouble getting SharePlay to work. When I build and run the GroupActivity sample from the tutorial below, I get the following message and the GroupActivity does not start.
https://mitemmetim.medium.com/shareplay-tutorial-share-custom-data-between-ios-and-macos-a50bfecf6e64

Dropping activity as there is no active conversation: <TUMutableConversationActivityCreateSessionRequest 0x2836731c0 activityIdentifier=jp.co.1planet.sample.SharePlayTutorial.SharePlayActivity applicationContext={length = 42, bytes = 0x62706c69 73743030 d0080000 00000000 ... 00000000 00000009 } metadata=<TUConversationActivityMetadata 0x28072d380 context=CPGroupActivityGenericContext title=SharePlay Example sceneAssociationBehavior=<TUConversationActivitySceneAssociationBehavior 0x28237a740 targetContentIdentifier=(null) shouldAssociateScene=1 preferredSceneSessionRole=(null)>> UUID=3137DDE4-F5B2-46B2-9097-30DD6CAE79A3>

I tried running it on Mac and iOS, but it did not work as expected. I am also trying the suggestions in the following thread:
https://developer.apple.com/forums/thread/683624

I have little knowledge of GroupActivity; I do have Group Activities set in Capabilities. Do I need to set anything else? Please let me know if you can see any solution to this message. For reference, I am using Xcode 15.2 beta, iOS 17.1.1 and iOS 17.3 beta, and macOS 14.2.1 (23C71).

Best Regards.
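For reference, the shape of my activity and activation code is roughly the following. This is a simplified sketch of the tutorial project; the title matches what appears in the log message above:

import GroupActivities

struct SharePlayActivity: GroupActivity {
    var metadata: GroupActivityMetadata {
        var metadata = GroupActivityMetadata()
        metadata.title = "SharePlay Example"
        metadata.type = .generic
        return metadata
    }
}

func startSharePlay() async {
    let activity = SharePlayActivity()
    // prepareForActivation() checks whether a FaceTime call or other
    // conversation is available; my guess is that the "no active
    // conversation" message means activation happened outside a call.
    let result = await activity.prepareForActivation()
    if case .activationPreferred = result {
        _ = try? await activity.activate()
    }
}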
0 replies · 0 boosts · 883 views · Jan ’24
How to create luminous (emissive) expressions
Hi, I am investigating how to achieve emissive (glowing) effects like the following in my visionOS app.
https://www.hiroakit.com/archives/1432
https://blog.terresquall.com/2020/01/getting-your-emission-maps-to-work-in-unity/
Right now I am experimenting with Shader Graph in Reality Composer Pro, but the official documentation and WWDC session videos do not make clear what the individual Shader Graph nodes do or how they combine, so I am having a hard time understanding their effects. I am starting to suspect that such luminous materials and expressions are simply not possible in visionOS. If there is a way to achieve this, please let me know.

Thanks.
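In case it is relevant, this is how I apply the Shader Graph material I am experimenting with. It is a minimal sketch; the material path, scene file name, and the RealityKitContent bundle are placeholders from my Reality Composer Pro project:

import RealityKit
import RealityKitContent // package generated by the Reality Composer Pro project

// Load a Shader Graph material authored in Reality Composer Pro and
// apply it to a model entity; any emissive look has to come from the
// material's node graph itself.
func applyGlowMaterial(to model: ModelEntity) async throws {
    let material = try await ShaderGraphMaterial(
        named: "/Root/GlowMaterial", // placeholder material path
        from: "Scene.usda",          // placeholder scene file
        in: realityKitContentBundle
    )
    model.model?.materials = [material]
}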
0 replies · 0 boosts · 610 views · Mar ’24
How can I share space in Volumes?
"Volumes allow an app to display 3D content in defined bounds, sharing the space with other apps."

What does it mean for Volumes to share the space? What are the benefits of being able to do this? Does this refer to the Shared Space? I don't understand the Shared Space very well to begin with.

"they can be viewed from different angles."

Does this mean that because it is 3D content with depth, I can see that depth by changing my viewing angle? That seems obvious for 3D content. How is this related to Volumes?
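For context, my current understanding is that a volume is a volumetric window scene, roughly like the following sketch; the app name and model name are placeholders:

import SwiftUI
import RealityKit

@main
struct VolumeApp: App {
    var body: some Scene {
        // A volumetric window clips its 3D content to a bounded box
        // that sits alongside other apps' windows in the Shared Space.
        WindowGroup {
            Model3D(named: "Globe") // placeholder model name
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.5, height: 0.5, depth: 0.5, in: .meters)
    }
}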
1 reply · 0 boosts · 704 views · Jun ’23
About the progressive immersion style
Hi, I have a question about the progressive immersion style. I understand that specifying progressive makes it possible to mix the mixed and full styles, but when is this used? For example, is it for cases like the WWDC23 video where a person watching a movie on a screen has the room gradually replaced by an environment, or where the room gets darker and darker, perhaps completely dark, as the Digital Crown is adjusted? Please let me know if there is a video, sample code, or explanation that shows an example of progressive immersion. Also, is it possible for an application to receive events when the Digital Crown is operated?

Thanks.

Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
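For reference, my understanding is that the style is requested roughly as in this sketch, with the Digital Crown then controlling the amount of immersion; ProgressiveContentView and the space identifier are placeholders:

import SwiftUI

@main
struct ProgressiveApp: App {
    @State private var style: ImmersionStyle = .progressive

    var body: some Scene {
        ImmersiveSpace(id: "Progressive") {
            ProgressiveContentView() // placeholder content view
        }
        // With .progressive, the space opens partially immersive and
        // the user can turn the Digital Crown to expand or shrink it.
        .immersionStyle(selection: $style, in: .progressive)
    }
}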
1 reply · 0 boosts · 1.3k views · Jun ’23
GeometryReader3D and Scene Phases do not work properly.
Hi, I have two problems. First, regarding scene phases: no phase-change event is issued when an Alert is presented. Is this a known bug?
https://developer.apple.com/videos/play/wwdc2023/10111/?time=784

Second, in the following video a center value is obtained, but the code does not compile for me because center cannot be found.
https://developer.apple.com/videos/play/wwdc2023/10111/?time=861

GeometryReader3D { proxy in
    ZStack {
        Earth(
            earthConfiguration: model.solarEarth,
            satelliteConfiguration: [model.solarSatellite],
            moonConfiguration: model.solarMoon,
            showSun: true,
            sunAngle: model.solarSunAngle,
            animateUpdates: animateUpdates
        )
        .onTapGesture {
            // Convert the proxy's transform into immersive-space coordinates.
            if let translation = proxy.transform(in: .immersiveSpace)?.translation {
                model.solarEarth.position = Point3D(translation)
            }
        }
    }
}

Also, model.solarEarth.position is assigned a Point3D here, so solarEarth is not a simple Entity, is it? I am quite confused because the code shown in the session is fragmented and I am not even sure it works. Since I cannot even tell whether this is a bug, investigating and verifying it is taking me several days to a week.
1 reply · 0 boosts · 800 views · Aug ’23