Posts under Developer Tools & Services topic

Post · Replies · Boosts · Views · Activity

Bonjour Capabilities
I'm a newbie, but I'm following a tutorial that requires me to add these capabilities to my project: Access Wi-Fi Information, Network Extensions, and Bonjour Services. I have Access Wi-Fi Information and Network Extensions installed, but I could not find Bonjour Services. Any help would be much appreciated! Thanks
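In case it helps: "Bonjour services" may not appear under Signing & Capabilities at all. In recent Xcode versions it is usually an Info.plist entry (the NSBonjourServices key, plus a local-network usage description). A minimal sketch, where the service type _myservice._tcp is a placeholder for whatever your tutorial uses:

<!-- Info.plist (Info tab in Xcode); "_myservice._tcp" is a placeholder service type -->
<key>NSBonjourServices</key>
<array>
    <string>_myservice._tcp</string>
</array>
<key>NSLocalNetworkUsageDescription</key>
<string>This app uses the local network to discover nearby devices.</string>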
0 replies · 0 boosts · 139 views · Jan ’25
Updating Swift Packages via command line (Xcode 16)
We have a tool in our CI which periodically updates our iOS project's Swift Package Manager dependencies. The script does a few things, but generally speaking it first removes the existing Package.resolved at ./ProjectName.xcodeproj/project.xcworkspace/xcshareddata/swiftpm/Package.resolved and then runs: xcodebuild -resolvePackageDependencies -disablePackageRepositoryCache

This regenerates the Package.resolved file to reflect the latest versions of the dependencies. It is the equivalent of doing Xcode > File > Packages > Update to Latest Package Versions.

This had been working perfectly for a long time, but it stopped working when we moved from Xcode 15.4 to Xcode 16. I have also tested with Xcode 16.1 and 16.2, with no luck. I have run the command locally and can confirm it is not an issue with our CI environment. I have also tried swift package update. Both commands appear to work: no errors are thrown, and the logs say "resolved source packages". However, they are not updating the packages and make no changes to the Package.resolved file. By contrast, Xcode > File > Packages > Update to Latest Package Versions in Xcode 16 works as expected: the packages are updated and the .resolved file is updated.

Is there now a different way to update Swift packages via the command line, or is this likely an Xcode 16 bug?

Update: This feels broken, so I have submitted a Feedback Report (FB16108036), but perhaps someone can suggest a workaround for the time being!
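For reference, a condensed sketch of the update step described above, as a shell script; the -project and -scheme values are placeholders for whatever the CI passes, and passing them explicitly is worth trying in case Xcode 16 is resolving against a different container than expected:

# Sketch of the CI update step (project/scheme names are placeholders)
rm -f ./ProjectName.xcodeproj/project.xcworkspace/xcshareddata/swiftpm/Package.resolved
xcodebuild -resolvePackageDependencies \
    -project ProjectName.xcodeproj \
    -scheme ProjectName \
    -disablePackageRepositoryCache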
0 replies · 3 boosts · 900 views · Dec ’24
Multiple issues on Xcode Cloud, WeatherKit and TestFlight after changing bundle id
After changing the bundle identifier of my app, I've encountered several issues that I can't seem to resolve, even though I've followed all the necessary steps. The app with the previous identifier was live on TestFlight and working perfectly fine, but now I'm facing the following problems:

WeatherKit Authentication Issue: WeatherKit has stopped working, and I'm getting authentication errors. I've updated the app in the Developer Portal to reflect the new bundle ID, but it still doesn't authenticate properly.

Xcode Cloud Configuration Issue: Xcode is asking me to set up Xcode Cloud again, which I understand is expected after a bundle ID change. However, during the setup process, it fails to recognize my remote repository, even though the repository is correctly added and visible under the Source Control tab.

TestFlight Internal Testing Issue: I manually uploaded a build to TestFlight, but internal testers cannot use it because the invitation appears as invalid. This wasn't an issue with the app's previous identifier.

It seems like the bundle ID change has caused some fundamental issues that I can't resolve despite following all the usual instructions. Has anyone experienced this before or knows how to resolve these problems? I'm using the latest Xcode 16.2 on macOS Sequoia 15.2.
0 replies · 0 boosts · 531 views · Jan ’25
Xcconfig variables don't get loaded automatically in project settings
Hi, I am using an Xcode build that receives its configuration from xcconfig files; those add some new definitions to the project, like the location of the OpenSSL library. If an Xcode environment variable includes a prefix that matches one of the fields in the project settings, it is automatically treated as if you had added it to that field. For example: the variable HEADER_SEARCH_PATHS_openssl_libopenssl has a value (the OpenSSL headers' path) that should be automatically added to the Header Search Paths field under project settings. For some reason it stopped working for me and I'm not sure why (I've tried releasing the xcconfig files). Any idea why? Thanks!
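For comparison, the documented xcconfig mechanism is to assign to the build setting's own name (optionally through an intermediate variable) rather than relying on a prefix match. A minimal sketch, where OPENSSL_ROOT and the file name are placeholders:

// OpenSSL.xcconfig (hypothetical file name)
OPENSSL_ROOT = /usr/local/opt/openssl
// Append to the inherited value so existing search paths are preserved
HEADER_SEARCH_PATHS = $(inherited) $(OPENSSL_ROOT)/include
LIBRARY_SEARCH_PATHS = $(inherited) $(OPENSSL_ROOT)/lib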
0 replies · 1 boost · 341 views · Jan ’25
[Xcode:BuildSettings] Keep some warnings as warnings while the rest as errors
We have a big iOS project and we are using .xcconfig files to define our compiler options and build settings. We have SWIFT_TREAT_WARNINGS_AS_ERRORS set to YES so that all Swift-related warnings are reported as errors. Now we are trying to migrate to Xcode 16.1 and set the Strict Concurrency Checking flag to 'targeted'. This produces some errors that are related to Swift's concurrency checks. We are planning an approach where we keep SWIFT_TREAT_WARNINGS_AS_ERRORS as is, but all concurrency-related warnings are still treated as warnings while the rest are reported as errors. We found this new compiler option: https://forums.swift.org/t/warnings-as-errors-exceptions/72925. It looks like the one we want, but I think it is not out yet and we need to wait until Swift 6.1 (correct me if I'm wrong). Or is there any other way to achieve what we want?
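If the per-group diagnostic flags from that thread do ship (they were proposed for Swift 6.1), the setup might look roughly like the sketch below. The group name here is a placeholder of mine, not a confirmed identifier; the real group names and flag availability depend on the compiler release:

// Project.xcconfig (sketch; only meaningful once a toolchain that supports -Wwarning ships)
SWIFT_TREAT_WARNINGS_AS_ERRORS = YES
// Downgrade one diagnostic group back to warnings; "StrictConcurrency" is a placeholder group name
OTHER_SWIFT_FLAGS = $(inherited) -Wwarning StrictConcurrency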
0 replies · 0 boosts · 357 views · Jan ’25
Xcode git commit showing other projects
I imported a few files into my Xcode project from other projects using drag and drop. Even though the files are copied into the new project and there are no soft links pointing to the location of the other projects, whenever I do a git commit and push, Xcode keeps showing all the projects to commit to rather than just the current project. There seems to be no setting to permanently remove the git dependency on the other projects. Is there any way to remove the references to the other projects?
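One thing that may help with diagnosis: the commit sheet lists every repository referenced by the open project or workspace, so it is worth checking whether the workspace data or project file still points at the other projects. A quick sketch using plain git/grep (file names are generic placeholders):

# List every project/folder the workspace references
cat MyApp.xcworkspace/contents.xcworkspacedata
# (or the one embedded in the project: MyApp.xcodeproj/project.xcworkspace/contents.xcworkspacedata)
# Look for relative or absolute references that point outside the current repo
grep -n "\.\./" MyApp.xcodeproj/project.pbxproj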
0 replies · 0 boosts · 290 views · Jan ’25
Is it possible to send pushes through the Apple production server to an app running in Xcode?
I can successfully send pushes to an app (which has been installed/run via Xcode) when the pushes go through the Apple sandbox server. However, I want to test that the server is configured correctly to send them through the Apple production server. In the Xcode scheme I tried changing the build configuration to Release (and unticked Debug executable), but the pushes still only work when sent through the sandbox. Is there a way of installing/running the app using Xcode such that it's compatible with the push production environment? Does the APS Environment entitlement come into play here? It only ever says development. (The app is on behalf of a 3rd-party company; they've added me to their Apple Developer account but with limited powers, so I can't upload to TestFlight nor make an ad-hoc release to test with.)
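For what it's worth, my understanding is that the aps-environment entitlement is what decides this, and it comes from the provisioning profile used at signing time (development profiles give "development", distribution profiles give "production"), not from the Debug/Release build configuration. You can confirm what a given build actually carries with codesign (the path is a placeholder):

# Print the entitlements baked into a built app
codesign -d --entitlements - /path/to/MyApp.app
# Look for the aps-environment key: "development" means sandbox APNs,
# "production" means tokens are only valid against the production APNs host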
0 replies · 0 boosts · 285 views · Jan ’25
Cannot Revoke iOS Development Certificate – "The specified resource does not exist"
I'm facing an issue with revoking an iOS Development certificate in the Apple Developer Console. The certificate is in "Pending Approval" status, and when I attempt to revoke it, I receive the following error: "The specified resource does not exist. There is no certificate with ID 'XXXXXXXXX' on this team." Despite this error, the certificate still appears in my list, and the only available action remains "Revoke."

Steps I've tried so far:
Refreshed the page and cleared the browser cache
Logged out and back into my Developer account
Tried different browsers (Safari, Chrome)
Checked Xcode > Preferences > Accounts for the certificate status
Contacted Apple Support (no response in over 3 weeks)

Additional info: The certificate type is iOS Development, not Distribution. The status has been "Pending Approval" since creation.
0 replies · 0 boosts · 277 views · Feb ’25
How does DerivedData really work on Xcode Cloud?
Currently I'm trying to save a few files in it, but on every run this folder is empty. I have the following script in ci_post_clone.sh:

mkdir ${CI_DERIVED_DATA_PATH}
cd ${CI_DERIVED_DATA_PATH}
ls -als
touch test
return 1

My expectation is that on the second run it would show the test file in DerivedData, or fail at creating the directory, but the file is not there. Does it need a successful build for this folder to be saved? In the Xcode Cloud workflow environment I have unchecked "Clean build (Xcode Cloud will not restore derived data or caches for your builds, which may take significantly longer as a result)." One more question here: what is meant by caches? Are there other folders being saved?

Also, a bit of context: I'm trying to build a Kotlin Multiplatform project, but it fails with:

Could not resolve all files for configuration ':composeApp:iosArm64CompileKlibraries'.
> Could not download lifecycle-viewmodel.klib (androidx.lifecycle:lifecycle-viewmodel-iosarm64:2.9.0-alpha03)
> Could not get resource 'https://dl.google.com/dl/android/maven2/androidx/lifecycle/lifecycle-viewmodel-iosarm64/2.9.0-alpha03/lifecycle-viewmodel-iosarm64-2.9.0-alpha03.klib'.
> Could not GET 'https://dl.google.com/dl/android/maven2/androidx/lifecycle/lifecycle-viewmodel-iosarm64/2.9.0-alpha03/lifecycle-viewmodel-iosarm64-2.9.0-alpha03.klib'.
> Got socket exception during request. It might be caused by SSL misconfiguration
> Connection reset by peer

My guess is that Xcode Cloud or Google's servers have some limitation, so if I cache those libraries my project will hopefully compile again.
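If derived data is indeed only persisted after a successful build, one workaround people try for the Gradle side (I have not verified this end-to-end on Xcode Cloud) is to keep the Gradle cache under CI_DERIVED_DATA_PATH so downloaded klibs can survive between builds. A sketch:

#!/bin/sh
# ci_scripts/ci_post_clone.sh — a sketch, assuming Xcode Cloud restores this folder at all
set -e
GRADLE_CACHE="${CI_DERIVED_DATA_PATH}/gradle-home"
mkdir -p "${GRADLE_CACHE}"
# Point the default Gradle home (~/.gradle) at the cached location
rm -rf "${HOME}/.gradle"
ln -s "${GRADLE_CACHE}" "${HOME}/.gradle"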
0 replies · 1 boost · 521 views · Dec ’24
Spatial audio personalised profile access entitlement "not available on iOS"
I followed this guide and added com.apple.developer.spatial-audio.profile-access as an entitlement to the app (via the + Capability button – Spatial Audio Profile). I have an audio graph that outputs to AVAudioEngine. However, the Xcode Cloud build ended up with this error: Invalid Code Signing Entitlements. Your application bundle's signature contains code signing entitlements that are not supported on iOS. Specifically, key 'com.apple.developer.spatial-audio.profile-access' in 'Payload/…' is not supported. The guide says it's available on iOS. Does that mean it's not available on iOS 17? In which case, how can I provide a fallback for iOS 17?
0 replies · 0 boosts · 440 views · Jan ’25
NSItemProvider turns a .jpg image into .jpeg
Hi, I'm not sure why, but when my fileURL is a .jpg file and I drop the file from my app into a Finder folder, the dropped file ends up as .jpeg. Is there a way to fix it?

.onDrag {
    if FileManager.default.fileExists(atPath: file.path) {
        // Provide the file as an item for dragging
        let fileURL = URL(fileURLWithPath: file.path)
        let itemProvider = NSItemProvider(contentsOf: fileURL)
        // Remove the file extension in the suggestedName
        let baseNameWithoutExtension = fileURL.deletingPathExtension().lastPathComponent
        itemProvider?.suggestedName = baseNameWithoutExtension
        return itemProvider ?? NSItemProvider()
    } else {
        // Handle the case where the file no longer exists
        print("File no longer exists at path: \(file.path)")
        return NSItemProvider()
    }
}
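Not a confirmed fix, but since the code strips the extension from suggestedName, Finder may be falling back to the canonical extension for the JPEG type (.jpeg). One thing worth trying is keeping the original file name, extension included, as the suggested name; a sketch of the same closure:

.onDrag {
    let fileURL = URL(fileURLWithPath: file.path)
    guard FileManager.default.fileExists(atPath: file.path),
          let itemProvider = NSItemProvider(contentsOf: fileURL) else {
        return NSItemProvider()
    }
    // Keep the real extension so the receiver doesn't substitute ".jpeg"
    itemProvider.suggestedName = fileURL.lastPathComponent   // e.g. "photo.jpg", not "photo"
    return itemProvider
}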
0 replies · 0 boosts · 358 views · Dec ’24
How to Export OBJ with Texture (JPG + MTL) from ARKit LiDAR Scan in iOS?
I am using ARKit with RealityKit to scan objects using LiDAR on iOS. I can generate an OBJ file from ARMeshAnchors, but I am missing the texture export (JPG + MTL).

What I have so far:
Successfully capturing the mesh using ARMeshAnchor.
Converting the mesh into an MDLAsset and exporting .obj.
I need help generating the .jpg texture and linking it to the .mtl file.

private func exportScannedObject() {
    guard let camera = arView.session.currentFrame?.camera else { return }

    func convertToAsset(meshAnchors: [ARMeshAnchor]) -> MDLAsset? {
        guard let device = MTLCreateSystemDefaultDevice() else { return nil }
        let asset = MDLAsset()
        for anchor in meshAnchors {
            let mdlMesh = anchor.geometry.toMDLMesh(device: device, camera: camera, modelMatrix: anchor.transform)
            // Apply a gray material to the mesh
            let material = MDLMaterial(name: "GrayMaterial", scatteringFunction: MDLScatteringFunction())
            material.setProperty(MDLMaterialProperty(name: "baseColor", semantic: .baseColor, float3: SIMD3(0.5, 0.5, 0.5))) // Gray color
            if let submeshes = mdlMesh.submeshes as? [MDLSubmesh] {
                for submesh in submeshes {
                    submesh.material = material
                }
            }
            asset.add(mdlMesh)
        }
        return asset
    }

    func export(asset: MDLAsset) throws -> URL {
        let directory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
        let url = directory.appendingPathComponent("scaned.obj")
        if MDLAsset.canExportFileExtension("obj") {
            do {
                try asset.export(to: url)
                return url
            } catch let error {
                fatalError(error.localizedDescription)
            }
        } else {
            fatalError("Can't export OBJ")
        }
    }

    if let meshAnchors = arView.session.currentFrame?.anchors.compactMap({ $0 as? ARMeshAnchor }),
       let asset = convertToAsset(meshAnchors: meshAnchors) {
        do {
            let url = try export(asset: asset)
            showScanPreview(url)
        } catch {
            print("export error")
        }
    }
}

extension ARMeshGeometry {
    func vertex(at index: UInt32) -> SIMD3<Float> {
        assert(vertices.format == MTLVertexFormat.float3, "Expected three floats (twelve bytes) per vertex.")
        let vertexPointer = vertices.buffer.contents().advanced(by: vertices.offset + (vertices.stride * Int(index)))
        let vertex = vertexPointer.assumingMemoryBound(to: SIMD3<Float>.self).pointee
        return vertex
    }

    // helps from StackOverflow:
    // https://stackoverflow.com/questions/61063571/arkit-3-5-how-to-export-obj-from-new-ipad-pro-with-lidar
    func toMDLMesh(device: MTLDevice, camera: ARCamera, modelMatrix: simd_float4x4) -> MDLMesh {
        func convertVertexLocalToWorld() {
            let verticesPointer = vertices.buffer.contents()
            for vertexIndex in 0..<vertices.count {
                let vertex = self.vertex(at: UInt32(vertexIndex))
                var vertexLocalTransform = matrix_identity_float4x4
                vertexLocalTransform.columns.3 = SIMD4<Float>(x: vertex.x, y: vertex.y, z: vertex.z, w: 1)
                let vertexWorldPosition = (modelMatrix * vertexLocalTransform).columns.3
                let vertexOffset = vertices.offset + vertices.stride * vertexIndex
                let componentStride = vertices.stride / 3
                verticesPointer.storeBytes(of: vertexWorldPosition.x, toByteOffset: vertexOffset, as: Float.self)
                verticesPointer.storeBytes(of: vertexWorldPosition.y, toByteOffset: vertexOffset + componentStride, as: Float.self)
                verticesPointer.storeBytes(of: vertexWorldPosition.z, toByteOffset: vertexOffset + (2 * componentStride), as: Float.self)
            }
        }
        convertVertexLocalToWorld()

        let allocator = MTKMeshBufferAllocator(device: device)
        let data = Data(bytes: vertices.buffer.contents(), count: vertices.stride * vertices.count)
        let vertexBuffer = allocator.newBuffer(with: data, type: .vertex)
        let indexData = Data(bytes: faces.buffer.contents(), count: faces.bytesPerIndex * faces.count * faces.indexCountPerPrimitive)
        let indexBuffer = allocator.newBuffer(with: indexData, type: .index)
        let submesh = MDLSubmesh(indexBuffer: indexBuffer,
                                 indexCount: faces.count * faces.indexCountPerPrimitive,
                                 indexType: .uInt32,
                                 geometryType: .triangles,
                                 material: nil)
        let vertexDescriptor = MDLVertexDescriptor()
        vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition, format: .float3, offset: 0, bufferIndex: 0)
        vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: vertices.stride)
        let mesh = MDLMesh(vertexBuffer: vertexBuffer, vertexCount: vertices.count, descriptor: vertexDescriptor, submeshes: [submesh])
        return mesh
    }
}

What I need help with:
How do I generate the JPG texture from the AR scene?
How do I save an MTL file linking the OBJ model to the texture?
How can I correctly apply the texture when viewing the OBJ in an external 3D viewer?

I appreciate any guidance, including sample code or resources! If you have a complete working solution, I'd love to discuss further via private channels.
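Not a full solution, but for the first two questions a rough sketch of the mechanics might look like the following: grab the current camera frame, encode it as JPEG, and write a minimal .mtl that points the OBJ at that image via map_Kd. Note this does not compute per-vertex texture coordinates (UVs), which you would still need (for example by projecting each world-space vertex through the camera) before an external viewer can map the photo onto the mesh correctly. The function name saveTextureAndMTL is mine, not an existing API.

import ARKit
import CoreImage
import CoreGraphics

// Sketch: write <name>.jpg from the current camera image and a minimal <name>.mtl that references it.
// Assumes `frame` is the ARFrame you scanned from and `directory` is where the OBJ was written.
func saveTextureAndMTL(frame: ARFrame, objName: String, in directory: URL) throws {
    // 1. Encode the camera's pixel buffer as JPEG.
    let ciImage = CIImage(cvPixelBuffer: frame.capturedImage)
    let context = CIContext()
    let colorSpace = CGColorSpace(name: CGColorSpace.sRGB)!
    guard let jpegData = context.jpegRepresentation(of: ciImage, colorSpace: colorSpace) else {
        throw NSError(domain: "TextureExport", code: 1)
    }
    try jpegData.write(to: directory.appendingPathComponent("\(objName).jpg"))

    // 2. Write a minimal MTL file whose diffuse map points at the JPEG.
    let mtl = """
    newmtl scannedMaterial
    Ka 1.0 1.0 1.0
    Kd 1.0 1.0 1.0
    map_Kd \(objName).jpg
    """
    try mtl.write(to: directory.appendingPathComponent("\(objName).mtl"), atomically: true, encoding: .utf8)

    // 3. The OBJ itself must reference the MTL ("mtllib <name>.mtl" near the top and
    //    "usemtl scannedMaterial" before the faces), and each vertex needs a "vt" UV line.
    //    Without UVs, viewers will load the material but cannot place the photo on the mesh.
}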
0 replies · 0 boosts · 443 views · Feb ’25
How to make the Instruments App Launch template wait for a process to launch rather than launching it from Instruments
Hi there, how can I make the Instruments App Launch template wait for a process to launch, rather than actively launching it from Instruments? I need to profile app launch performance when tapping a push notification or cold launching via a URL link. How can I get Instruments to wait for the process and collect the data? Currently, I tried the command:

xctrace record --template "App Launch" --attach MyApp --device-name 'Phone-Dev' --output mytrace.trace

But it soon failed with 'Cannot find process matching name: MyApp'. How can I make it work?
0 replies · 0 boosts · 508 views · Jan ’25
dyld: Symbol not found ... certain MeshResource APIs on iOS 17.x
I submitted feedback as FB16463501; posting here for others to see, or maybe for Apple to share any help if there are workarounds, etc.

Targets below iOS 18.x fail to launch the app due to dyld[xxxxx]: Symbol not found: errors when referencing:

MeshResource.init(from:) async - https://developer.apple.com/documentation/realitykit/meshresource/init(from:)-b7hb
i.e. dyld[61511]: Symbol not found: _$s10RealityKit12MeshResourceC0A10FoundationE4fromACSayAD0C10DescriptorVG_tYaKcfC

MeshResource.replace(with:) async - https://developer.apple.com/documentation/realitykit/meshresource/replace(with:)-8uvri
i.e. dyld[78830]: Symbol not found: _$s10RealityKit12MeshResourceC0A10FoundationE7replace4withyAcDE8ContentsV_tYaKF

Targets tested that exhibit the issue (dyld errors):
Device: iOS 17.7.2, iPhone 14 Pro Max
Simulator: iOS 17.5 (21F79), iPhone 15

System information:
macOS Version 15.3 (Build 24D60)
Xcode 16.2 (23507) (Build 16C5032a)

MRE -- include this code in your app (no need to invoke, just reference):

static func addOrUpdateEntityModel_MRE(_ entity: ModelEntity) async {
    let descriptor = MeshDescriptor(name: "MyDescriptor")
    do {
        if let modelComponent = entity.model {
            // update existing ModelComponent
            if let model = try? MeshResource.Model(id: "MyModelId", descriptors: [descriptor]) {
                var contents = MeshResource.Contents()
                contents.models = .init([model])
                try await modelComponent.mesh.replace(with: contents)
                /// `dyld[78830]: Symbol not found: _$s10RealityKit12MeshResourceC0A10FoundationE7replace4withyAcDE8ContentsV_tYaKF`
            }
        } else {
            // create new ModelComponent
            /// Comment out the 2 lines below == dyld error for the above `MeshResource.replace(with:)`
            let meshRes = try await MeshResource(from: [descriptor])
            /// `dyld[61511]: Symbol not found: _$s10RealityKit12MeshResourceC0A10FoundationE4fromACSayAD0C10DescriptorVG_tYaKcfC`
            entity.model = .init(mesh: meshRes, materials: [SimpleMaterial()])
        }
    } catch {
        fatalError()
    }
}
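In case it helps anyone hitting the same crash while waiting on the feedback: since the missing symbols appear to exist only in the newer RealityKit binary, one mitigation pattern sometimes used for this class of launch failure (an assumption on my part, not verified for this specific case) is to weak-link the framework and gate the newer calls behind a runtime availability check, falling back to the long-available generator API. Sketch:

// 1. In Build Settings ▸ Other Linker Flags, add: -weak_framework RealityKit
//    so an unresolved symbol does not abort the launch on older iOS.
// 2. Guard the newer calls at runtime. The iOS 18.0 cutoff below is my assumption.
import RealityKit

@MainActor
static func addOrUpdateEntityModel_guarded(_ entity: ModelEntity) async throws {
    let descriptor = MeshDescriptor(name: "MyDescriptor")
    if #available(iOS 18.0, *) {
        // Newer async initializer — only reached on OS versions that export the symbol.
        let meshRes = try await MeshResource(from: [descriptor])
        entity.model = .init(mesh: meshRes, materials: [SimpleMaterial()])
    } else {
        // Fallback that has shipped since iOS 15.
        let meshRes = try MeshResource.generate(from: [descriptor])
        entity.model = .init(mesh: meshRes, materials: [SimpleMaterial()])
    }
}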
0 replies · 0 boosts · 503 views · Feb ’25
Xcode 16.2 - error: unable to open dependencies file
Hello, I am encountering an "unable to open dependencies file" error in Xcode that started after updating to Xcode 16.2 and macOS 15.2. The error message I receive is as follows:

error: unable to open dependencies file (/Users/user/Library/Developer/Xcode/DerivedData/MyProject-cwpcmnebzjpgkzcuoauxlaeiqrsg/Build/Intermediates.noindex/MyProject.build/Debug-iphoneos/MyProject.build/Objects-normal/arm64/MyProject-master.d) (in target 'MyProject' from project 'MyProject')

This problem didn't occur with Xcode 16.1; the project was building successfully before the update. Now even reverting to Xcode 16.1 doesn't resolve the issue anymore. Here's what I've tried so far without success:

Switched the compilation mode to "Whole Module"
Cleaned the build folder
Cleared Derived Data

Thank you in advance for any suggestions!
0 replies · 4 boosts · 909 views · Dec ’24
Developing my first-ever iOS app - have very specific questions to unblock my testing
I have developed an app that I have been testing on a hardware device with developer-profile-signed builds. I set up a CloudKit container in Development mode, have also tested with Production mode, and both are working as expected. I have also tested StoreKit auto-renewable subscriptions using a StoreKit configuration file, and all of that works on the hardware device with developer-profile-signed builds.

Now comes the fun part: I want to use the distribution profile to test the app for production readiness. I created a distribution profile and set it up under Release in the app's target in Xcode. I have also created a sandbox tester account (which is showing as inactive even after 7 days, though I am logged in with this sandbox tester account on a hardware device, and under Developer settings it shows as a sandbox tester account). All the subscriptions are showing "Ready to Submit" in App Store Connect.

I need help understanding this whole flow: how do I ensure I can test CloudKit and StoreKit for production readiness and then submit my app for review? Thank you.
0 replies · 0 boosts · 298 views · Feb ’25
Apple is not granting access to the developer account, but funds have already been partially charged
Hello, We are facing a serious issue while trying to gain access to our developer account. We have submitted all the required documents (passport scan, payment information), but we still have not received access. At the same time, funds have already been partially charged from our card, yet there has been no progress from Apple. Support is not providing clear answers, and the process is being delayed. This is seriously affecting our work! Has anyone encountered a similar situation? How can we speed up the resolution? Any advice would be greatly appreciated. Thank you!
0 replies · 0 boosts · 124 views · Feb ’25