I'm trying to evaluate whether we can support AR navigation with MapKit. The feature is supposed to be available to users in the US.
I tried to run the sample on my iPhone: https://developer.apple.com/documentation/arkit/tracking-geographic-locations-in-ar?language=objc
But I'm in a location where ARGeoTrackingConfiguration.checkAvailabilityWithCompletionHandler: always returns false. I think ARGeoAnchor isn't supported in my location.
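For reference, here is roughly the availability check I'm calling, as a minimal Swift sketch of the Objective-C selector above (the helper name and the logging are my own):
import ARKit

func checkGeoTrackingAvailability() {
    // Device support is a separate question from location coverage.
    guard ARGeoTrackingConfiguration.isSupported else {
        print("Geo tracking is not supported on this device")
        return
    }
    // This check appears to use the device's real location, which may be why the
    // simulated GPX location does not change the result.
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        print("Geo tracking available: \(available), error: \(String(describing: error))")
    }
}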
I tried to use simulated locations by:
- Adding a GPX file when launching the app.
- Enabling Xcode -> Debug -> Simulate Location -> New York, NY, US
But the availability for ARGeoAnchor is still false.
Is it possible for me to develop the ARGeoAnchor feature outside of the covered areas?
I can generate a ShapeResource from a RealityKit entity's extents. Can I apply some scaling to the generated shape? Is there a way to do that?
// model is a ModelResource and bounds is a BoundingBox
var shape = ShapeResource.generateConvex(from: model.mesh)
shape = shape.offsetBy(translation: bounds.center)
// How can I scale the shape to fit within the bounds?
The following API only provides rotation and translation support, and I cannot find scale support.
offsetBy(rotation: simd_quatf = simd_quatf(ix: 0, iy: 0, iz: 0, r: 1), translation: SIMD3<Float> = SIMD3<Float>())
I can put the ShapeResource on an entity and scale the entity, but I would like to know if it is possible to scale the ShapeResource itself without attaching it to an entity.
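For context, the entity-scaling workaround I mentioned looks roughly like this (the helper entity, the parentEntity name, and the scale factor are just placeholders of mine):
import RealityKit

// Put the shape on a helper entity and scale the entity instead of the shape.
let collisionEntity = Entity()
collisionEntity.components.set(CollisionComponent(shapes: [shape]))   // shape from the code above
collisionEntity.scale = SIMD3<Float>(repeating: 0.5)                  // placeholder scale factor
parentEntity.addChild(collisionEntity)                                // parentEntity owns the model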
I have a MeshResource and I would like to create a collision component from it.
let childBounds = child.visualBounds(relativeTo: self)
var childShape: ShapeResource
do {
    // Crashes on the following line instead of throwing a Swift error
    childShape = try await ShapeResource.generateConvex(from: childModel.mesh)
} catch {
    childShape = ShapeResource.generateBox(size: childBounds.extents)
    childShape = childShape.offsetBy(translation: childBounds.center)
}
Based on this document: https://developer.apple.com/documentation/realitykit/shaperesource/generateconvex(from:)-6upj9
Will throw an error if mesh does not define a nonempty convex volume. For example, will fail if all the vertices in mesh are coplanar.
But the method crashes the app instead of throwing a Swift error:
Incident Identifier: 35CD58F8-FFE3-48EA-85D3-6D241D8B0B4C
CrashReporter Key: FE6790CA-6481-BEFD-CB26-F4E27652BEAE
Hardware Model: Mac15,11
...
Version: 1.0 (1)
Code Type: ARM-64 (Native)
Role: Foreground
Parent Process: launchd_sim [2057]
Coalition: com.apple.CoreSimulator.SimDevice.85A2B8FA-689F-4237-B4E8-DDB93460F7F6 [1496]
Responsible Process: SimulatorTrampoline [910]
Date/Time: 2025-01-26 16:13:17.5053 +0800
Launch Time: 2025-01-26 16:13:09.5755 +0800
OS Version: macOS 15.2 (24C101)
Release Type: User
Report Version: 104
Exception Type: EXC_BREAKPOINT (SIGTRAP)
Exception Codes: 0x0000000000000001, 0x00000001abf841d0
Termination Reason: SIGNAL 5 Trace/BPT trap: 5
Terminating Process: exc handler [17316]
Triggered by Thread: 0
Thread 0 Crashed:
0 CoreRE 0x1abf841d0 REAssetManagerCollisionShapeAssetCreateConvexPolyhedron + 232
1 CoreRE 0x1abf845f0 REAssetManagerCollisionShapeAssetCreateConvexPolyhedronFromMesh + 868
2 RealityFoundation 0x1d25613bc static ShapeResource.generateConvex(from:) + 148
Here is the message in the app console from Xcode:
/Library/Caches/com.apple.xbs/Sources/REKit_Sim/ThirdParty/PhysX/physx/source/physxcooking/src/convex/QuickHullConvexHullLib.cpp (935) : internal error : QuickHullConvexHullLib::findSimplex: Simplex input points appers to be coplanar.
Failed to cook convex mesh (0x3)
assertion failure: 'convexPolyhedronShape != nullptr' (REAssetManagerCollisionShapeAssetCreateConvexPolyhedron:line 356) Bad parameters passed for convex mesh creation.
The above crash happened on a visionOS simulator (visionOS 2.2 (22N840)).
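In case it helps, here is a rough workaround sketch I'm considering: skip generateConvex when the mesh is essentially flat along an axis, since a coplanar vertex set seems to crash rather than throw. The helper name and epsilon value are my own guesses, and this would not catch a tilted coplanar mesh.
import RealityKit

func convexOrBoxShape(for model: ModelComponent, fallbackBounds: BoundingBox) async -> ShapeResource {
    // If the mesh is essentially flat along any axis, assume convex hull generation
    // would hit the coplanar case and fall back to a box right away.
    let extents = model.mesh.bounds.extents
    let epsilon: Float = 1e-4
    guard extents.x > epsilon, extents.y > epsilon, extents.z > epsilon else {
        return ShapeResource.generateBox(size: fallbackBounds.extents)
            .offsetBy(translation: fallbackBounds.center)
    }
    do {
        return try await ShapeResource.generateConvex(from: model.mesh)
    } catch {
        return ShapeResource.generateBox(size: fallbackBounds.extents)
            .offsetBy(translation: fallbackBounds.center)
    }
}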
I'm trying to implement a prototype that renders virtual objects in a mixed immersive space onto the camera frames captured by CameraFrameProvider.
Here is what I have done:
- Get the camera intrinsics from frame.primarySample.parameters.intrinsics
- Get the camera extrinsics from frame.primarySample.parameters.extrinsics
- Get the device anchor via worldTrackingProvider.queryDeviceAnchor(atTimestamp: CACurrentMediaTime())
- Set up a RealityKit.RealityRenderer to render virtual objects onto the captured camera frames:
let realityRenderer = try RealityKit.RealityRenderer()
realityRenderer.cameraSettings.colorBackground = .outputTexture()
let cameraEntity = PerspectiveCamera()
// see https://developer.apple.com/forums/thread/770235
let cameraTransform = deviceAnchor.originFromAnchorTransform * extrinsics.inverse
cameraEntity.setTransformMatrix(cameraTransform, relativeTo: nil)
cameraEntity.camera.near = 0.01
cameraEntity.camera.far = 100
cameraEntity.camera.fieldOfViewOrientation = .horizontal
// manually calculated based on camera intrinsics
cameraEntity.camera.fieldOfViewInDegrees = 105
realityRenderer.entities.append(cameraEntity)
realityRenderer.activeCamera = cameraEntity
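For reference, the 105-degree value above comes from a calculation roughly like this (a sketch, treating the intrinsics as a pinhole model with fx at column 0, row 0, and passing the image width from the camera sample; both are assumptions on my side):
import Foundation
import simd

// Horizontal field of view from pinhole intrinsics: fov = 2 * atan(width / (2 * fx)).
func horizontalFOVInDegrees(intrinsics: simd_float3x3, imageWidth: Float) -> Float {
    let fx = intrinsics.columns.0.x                 // focal length in pixels
    let fovRadians = 2 * atan(imageWidth / (2 * fx))
    return fovRadians * 180 / Float.pi
}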
With this setup, virtual objects that should be visible in the camera frames are clipped out by the camera transform.
If I use deviceAnchor.originFromAnchorTransform as the camera transform, the virtual objects are rendered on the camera frames, but at the wrong positions (I think this is because the camera extrinsics aren't used to adjust the camera to the correct position).
My question is: how should the camera extrinsic matrix be used for this purpose?
Do the camera extrinsics point in an orientation similar to the device anchor's, with some minor rotation and position change? Here is the extrinsics matrix from one camera frame. It seems that the directions of the Y-axis and Z-axis are flipped by the extrinsics, so the camera points in the wrong direction.
simd_float4x4([[0.9914258, 0.012555369, -0.13006608, 0.0], // X-axis
[-0.0009778949, -0.9946325, -0.10346654, 0.0], // Y-axis
[-0.13066702, 0.10270659, -0.98609203, 0.0], // Z-axis
[0.024519, -0.019568002, -0.058280986, 1.0]]) // translation
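Here is the kind of correction I'm experimenting with, sketched under my own assumptions: that the extrinsics use a camera convention with Y down and Z forward, while a RealityKit camera looks down -Z with Y up, so a 180-degree rotation about X is applied before composing with the device anchor.
import simd

// 180-degree rotation about X: negates the Y and Z basis vectors.
var flipYZ = matrix_identity_float4x4
flipYZ.columns.1.y = -1
flipYZ.columns.2.z = -1

// deviceAnchor, extrinsics, and cameraEntity are the same values used in the code above.
let deviceFromCamera = extrinsics.inverse
let cameraTransform = deviceAnchor.originFromAnchorTransform * deviceFromCamera * flipYZ
cameraEntity.setTransformMatrix(cameraTransform, relativeTo: nil)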
We have an existing app and we would like to add an additional privacy policy URL, so there would be two privacy policy URLs for the app. Is that possible? If so, how do we add the second privacy policy URL?
The following code connects to the host successfully on the iOS Simulator, but it fails to connect on an iOS device. I have checked that both WLAN and Cellular Data are allowed for the App Clip.
I also tried NSURLSession to access a website; it successfully gets the data on both the iOS Simulator and iOS devices.
Are we allowed to use socket streams in App Clips?
Xcode Version 12.0 beta (12A6159)
iOS 14 Beta: 14.0
Here is the code that fails to connect on an iOS device:
CFReadStreamRef readStream;
CFWriteStreamRef writeStream;
//create socket connection
CFStreamCreatePairWithSocketToHost(NULL, (__bridge CFStringRef)@"www.baidu.com", 443, &readStream, &writeStream);
NSInputStream* inputStream = (__bridge_transfer NSInputStream*)readStream;
NSOutputStream* outputStream = (__bridge_transfer NSOutputStream*)writeStream;
[inputStream setProperty:NSStreamSocketSecurityLevelNegotiatedSSL
forKey:NSStreamSocketSecurityLevelKey];
// and open the streams
[inputStream open];
[outputStream open];
Console output:
[connection] nw_socket_connect [C1.1:1] connectx(5, [srcif=0, srcaddr=<NULL>, dstaddr=61.135.169.121:443], SAE_ASSOCID_ANY, 0, NULL, 0, NULL, SAE_CONNID_ANY) failed: [65: No route to host]
[connection] nw_socket_connect [C1.1:1] connectx failed (fd 5) [65: No route to host]
[] nw_socket_connect connectx failed [65: No route to host]
[connection] nw_socket_connect [C1.2:1] connectx(5, [srcif=0, srcaddr=<NULL>, dstaddr=61.135.169.125:443], SAE_ASSOCID_ANY, 0, NULL, 0, NULL, SAE_CONNID_ANY) failed: [65: No route to host]
[connection] nw_socket_connect [C1.2:1] connectx failed (fd 5) [65: No route to host]
[] nw_socket_connect connectx failed [65: No route to host]
[connection] nw_connection_get_connected_socket [C1] Client called nw_connection_get_connected_socket on unconnected nw_connection
TCP Conn 0x280200370 Failed : error 0:65 [65]
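For comparison, the NSURLSession test that succeeds on both the simulator and the device is roughly the following (sketched in Swift here; the actual test uses the Objective-C equivalent against the same host):
import Foundation

let url = URL(string: "https://www.baidu.com")!
let task = URLSession.shared.dataTask(with: url) { data, response, error in
    if let error = error {
        print("URLSession failed: \(error)")
    } else {
        print("URLSession succeeded with \(data?.count ?? 0) bytes")
    }
}
task.resume()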