Build, test, and submit your app using Xcode, Apple's integrated development environment.

Xcode Documentation

Posts under the Xcode subtopic

Each post below is listed with its replies, boosts, views, and latest activity.

Getting tap events from the external text field in a UIInputViewController
I have a custom keyboard built with UIInputViewController, and inside the keyboard there is a custom UITextField. Desired behavior: tapping either the external text field or the keyboard's own text field switches first responder between them, which controls which field the typed text is inserted into. The problem: after tapping the custom field and making it first responder, taps on the external text field no longer produce an event I can receive, so I have no way to resign the internal field's first-responder status.
Replies: 4 · Boosts: 1 · Views: 172 · Feb ’25
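For context, the setup this post describes looks roughly like the sketch below, assuming a keyboard extension target; the field layout and method name are made up:

    import UIKit

    class KeyboardViewController: UIInputViewController {
        let innerField = UITextField()

        override func viewDidLoad() {
            super.viewDidLoad()
            innerField.borderStyle = .roundedRect
            innerField.frame = CGRect(x: 16, y: 8, width: 280, height: 36)
            view.addSubview(innerField)
        }

        // Text either goes to the host app's field (via textDocumentProxy) or,
        // while innerField is first responder, into the keyboard's own field.
        func route(_ text: String) {
            if innerField.isFirstResponder {
                innerField.insertText(text)
            } else {
                textDocumentProxy.insertText(text)
            }
        }
    }

The underlying difficulty is that taps in the host app are delivered to the host process, not to the extension; the extension only observes the external field indirectly, through UITextInputDelegate callbacks such as textDidChange(_:).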
Assistance Needed – Xcode: Command PhaseScriptExecution Failed with a Nonzero Exit Code
Main Issue: While building an iOS project using IL2CPP in Unity, I encountered the following error:

    Command PhaseScriptExecution failed with a nonzero exit code

Sub-Issue: Unity is unable to detect a compatible iPhoneOS SDK, even though Xcode is correctly installed and functioning as expected.

Error Message:

    Unity.IL2CPP.Bee.BuildLogic.ToolchainNotFoundException: IL2CPP C++ code builder is unable to build C++ code. In order to build C++ code for Mac, you must have Xcode installed. Building for Apple Silicon requires Xcode 9.4 and Mac 10.12 SDK. Xcode needs to be installed in the /Applications directory and have a name matching Xcode*.app. Or be selected using xcode-select. It's also possible to use /Library/Developer/CommandLineTools if those match the listed requirements. More information and installation instructions can be found here: https://developer.apple.com/support/xcode/ Specific Xcode versions can be downloaded here: https://developer.apple.com/download/more/
    Unable to detect any compatible iPhoneOS SDK!

Additional Errors & Observations:

1. bee_backend Not Found

    chmod: /Users/vaunicacalji/Desktop/DinoKite/Il2CppOutputProject/IL2CPP/build/deploy_x86_64/bee_backend/mac-x86_64/bee_backend: No such file or directory

Issue: Checking manually, bee_backend does exist and is executable, but the build process still reports it as missing.

2. IL2CPP Build Failure

    Build failed with 0 successful nodes and 0 failed ones
    Error: Internal build system error. BuildProgram exited with code 1.

3. Xcode Not Detected by Unity

    Unable to detect any compatible iPhoneOS SDK!

Issue: Running xcode-select -p confirms that Xcode is installed at /Applications/Xcode.app/Contents/Developer, and xcodebuild -showsdks lists available SDKs, including iOS 18.2.

Environment Details:
macOS Version: Sonoma 14.5
Mac Model: MacBook Air (Retina, 13-inch, 2018)
Processor: 1.6 GHz Dual-Core Intel Core i5
Memory: 8GB 2133 MHz LPDDR3
Graphics: Intel UHD Graphics 617 1536MB
Unity Version: 2022.2.21f1

Installed Unity Modules:
✅ iOS Build Support
✅ Mac Build Support (IL2CPP)
✅ IL2CPP

Current Status & Key Issues:
✅ Xcode is properly installed, and xcode-select -p shows the correct path.
✅ xcodebuild -showsdks confirms that iOS SDK 18.2 is available.
✅ bee_backend is present and executable when checked manually.
❌ Unity still reports "Unable to detect any compatible iPhoneOS SDK!" during the build process.
❌ Unable to successfully build the iOS project.

Could you please provide guidance on how to resolve this issue?

Xcode Build Settings (for reference):

    ALWAYS_SEARCH_USER_PATHS = NO
    ARCHS = arm64
    ASSETCATALOG_COMPILER_APPICON_NAME = AppIcon
    CLANG_CXX_LANGUAGE_STANDARD = c++0x
    CLANG_CXX_LIBRARY = libc++
    CLANG_ENABLE_OBJC_ARC = YES
    CODE_SIGN_IDENTITY[sdk=iphoneos*] = iPhone Distribution
    CODE_SIGN_IDENTITY[config=Debug] = Apple Development
    CODE_SIGN_IDENTITY[config=Release] = Apple Distribution
    CODE_SIGN_STYLE = Manual
    DEVELOPMENT_TEAM[sdk=iphoneos*] = 4429TL28T7
    IPHONEOS_DEPLOYMENT_TARGET = 15.6
    LD_RUNPATH_SEARCH_PATHS = $(inherited) @executable_path/Frameworks
    PRODUCT_BUNDLE_IDENTIFIER = com.torihiplay.DinoKite
    PROVISIONING_PROFILE_SPECIFIER[sdk=iphoneos*] = DinoKite.
    SDKROOT = iphoneos
    SUPPORTED_PLATFORMS = iphoneos
    TARGETED_DEVICE_FAMILY = 1,2
    UNITY_RUNTIME_VERSION = 2022.2.21f1
    UNITY_SCRIPTING_BACKEND = il2cpp

This build is intended for a production release. I would appreciate any help in resolving this issue. Thank you!
Replies: 1 · Boosts: 0 · Views: 707 · Feb ’25
Compiling old code with Xcode gives the error [XSym <- Unknown type name 'XSym']
When I try to compile an app that is about 10 years old with Xcode, I get the following compiler errors in the JSONModel.h header file. There are only 4 lines in that file. Content of the file, with the error messages:

    XSym                                  <- Unknown type name 'XSym'
    0075                                  <- Expected identifier or '('
    a7b090c047283ff76fc7f1def7ba7425
    ../../../JSONModel/JSONModel/JSONModel/JSONModel.h

The app was originally compiled with Xcode 3.2, targeting iPhone 7. Now I am using Xcode 16, targeting iPhone 12. The original coder is unreachable. I am very new to this environment and any guidance / assistance is greatly appreciated. Please let me know if it is more appropriate to post this somewhere else.
Replies: 1 · Boosts: 0 · Views: 216 · Feb ’25
Unincluded dependencies failing to build for watchOS companion app.
When I try to use the watchOS preview, the simulator fails to load and throws about 15 errors. The previews are evidently trying to build all of the iOS-only packages for the watchOS simulator, even though those packages aren't included in the watchOS app target. Both apps build successfully for the regular simulators; only the previews fail. Having a list of errors that simply shouldn't be there is a pretty big annoyance when something else goes wrong. How do I fix this?
Replies: 0 · Boosts: 0 · Views: 306 · Feb ’25
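One way to keep iOS-only packages out of a watchOS build, as a hedged sketch: SwiftPM can scope a target dependency to specific platforms. The package and product names below are placeholders:

    // swift-tools-version: 5.9
    // Package.swift
    import PackageDescription

    let package = Package(
        name: "MyFeature",
        platforms: [.iOS(.v16), .watchOS(.v9)],
        dependencies: [
            .package(url: "https://example.com/some-ios-lib.git", from: "1.0.0")
        ],
        targets: [
            .target(
                name: "MyFeature",
                dependencies: [
                    // Linked only when building for iOS; watchOS builds skip it.
                    .product(name: "SomeiOSOnlyLib",
                             package: "some-ios-lib",
                             condition: .when(platforms: [.iOS]))
                ]
            )
        ]
    )

Whether this helps the previews depends on how the dependency reached the watchOS scheme in the first place; it does not address target-membership mistakes inside the Xcode project itself.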
Xcode won't symbolicate .ips crash log
It was my understanding that you're supposed to be able to open a .ips crash log in Xcode and see pretty much what you would see if the app had been running in the debugger when it crashed. But the addresses in my app don't get symbolicated. I opened the .ips in the same project and same version of Xcode that was used to create the app. The .dSYM file is around, and I can use it to symbolicate with the atos tool. What am I missing?
Replies: 3 · Boosts: 1 · Views: 2.4k · Feb ’25
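For reference, the manual atos workflow this post falls back on typically looks like the invocation below, wrapped here in a Swift Process call to keep these notes in one language; every path and address is a placeholder:

    import Foundation

    // -o points at the DWARF binary inside the .dSYM bundle; -l is the image
    // load address taken from the crash report's binary images section.
    let atos = Process()
    atos.executableURL = URL(fileURLWithPath: "/usr/bin/atos")
    atos.arguments = [
        "-arch", "arm64",
        "-o", "MyApp.app.dSYM/Contents/Resources/DWARF/MyApp",
        "-l", "0x104ef8000",
        "0x104f0123c"
    ]
    try atos.run()
    atos.waitUntilExit()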
Swift, kevent, and wth?!?!?
I have this code:

    var eventIn = kevent(ident: UInt(self.socket),
                         filter: Int16(EVFILT_WRITE),
                         flags: UInt16((EV_ADD | EV_ENABLE)),
                         fflags: 0,
                         data: 0,
                         udata: nil)

I looked at it and thought: why do I have those extra parentheses? So I changed it to

    var eventIn = kevent(ident: UInt(self.socket),
                         filter: Int16(EVFILT_WRITE),
                         flags: UInt16(EV_ADD | EV_ENABLE), // changed line!
                         fflags: 0,
                         data: 0,
                         udata: nil)

and then kevent gave me EBADF. Does this make sense to anyone?
Replies: 0 · Boosts: 0 · Views: 242 · Feb ’25
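One thing worth pinning down about this post: grouping parentheses around an expression are a no-op in Swift, so the two flags values are bit-for-bit identical. A quick check:

    import Darwin

    // Both spellings produce the same UInt16; if kevent(2) returns EBADF after
    // the edit, the cause is elsewhere (e.g. the descriptor's state at call time).
    let withParens = UInt16((EV_ADD | EV_ENABLE))
    let withoutParens = UInt16(EV_ADD | EV_ENABLE)
    assert(withParens == withoutParens)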
I have a bone to pick with Xcode
Despite Xcode being the primary and most used IDE for iOS, it is far from perfect. I have used many tools in my career, and here are some features that would make it much better, or that I miss from other IDEs. I encourage others to chip in to the discussion, and let's all hope Apple starts to improve Xcode. I'll put each issue in a separate comment below.

Slow debugger: I'm not sure what bloat Xcode has or what it is doing, but sometimes it can take more than 10 seconds from a breakpoint firing to actually seeing values. Moreover, when debugging SwiftUI or Objective-C, I have to drill down to see the value of a property, or use the po and p commands and hope they work. SwiftUI views and state are a big pain to debug; to see what is changing a value, I always have to use the didSet trick or some other black magic. Is it too hard to make this easier? Breakpoints with conditions can take up to a minute to load, and I have an M1 Max MBP. I tried the Cursor IDE a few days ago, and breakpoints are much faster, without too much bloat in the variable inspector.
Replies: 7 · Boosts: 0 · Views: 361 · Feb ’25
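For readers who haven't met the "didSet trick" mentioned above, a minimal sketch; the model and property names are invented:

    import Combine

    final class CounterViewModel: ObservableObject {
        @Published var count = 0 {
            didSet {
                // Set a breakpoint on this line (or keep the print) to capture
                // the stack trace of whatever code just mutated `count`.
                print("count changed: \(oldValue) -> \(count)")
            }
        }
    }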
Option to download the SDK separately
Hey guys, I think forcing people to download the Simulator alongside the SDK hugely increases the download size, which is not a good experience for developers. Make the simulator download optional; you can keep it ticked by default (under Settings > Components > iOS XX.X Support) for beginner developers, but for advanced developers there should be an option to untick it and reduce the huge download size. If that wouldn't actually reduce the download size, then we have taken a wrong turn somewhere.
Replies: 0 · Boosts: 1 · Views: 251 · Feb ’25
Apple developer
In Xcode I have selected the developer team, but it shows this error:

    Communication with Apple failed. Your team has no devices from which to generate a provisioning profile. Connect a device to use or manually add device IDs in Certificates, Identifiers & Profiles. https://developer.apple.com/account/

and it also shows this error:

    No profiles for 'com.kuntaldoshi.homeautomation' were found. Xcode couldn't find any iOS App Development provisioning profiles matching 'com.kuntaldoshi.homeautomation'.
Replies: 0 · Boosts: 0 · Views: 382 · Feb ’25
Full Disk Access, Run and Debug from Xcode?
I'm working on a macOS app that I want to give "Full Disk Access". When I run from Xcode, I get "permission denied" errors when reading a file in my home directory. What can I do so that I can run and debug from Xcode? I dragged the binary from the derived data folder to the System Preferences list for Full Disk Access, but that seems to do nothing.
Replies: 5 · Boosts: 0 · Views: 3k · Feb ’25
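A hedged way to check what this post describes is to probe a TCC-protected directory from the running process; the sketch assumes ~/Library/Mail is such a location, which it is on recent macOS releases:

    import Foundation

    // If this listing fails, the binary that is actually running (the one in
    // DerivedData when launched from Xcode) most likely lacks Full Disk Access;
    // note that an App Sandbox entitlement can also deny home-directory reads.
    let probe = FileManager.default.homeDirectoryForCurrentUser
        .appendingPathComponent("Library/Mail")
    let entries = try? FileManager.default.contentsOfDirectory(atPath: probe.path)
    print("Full Disk Access appears \(entries != nil ? "granted" : "denied")")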
Sequential Animation on Reality Composer Pro
Dear Developers, I am having some problems with Reality Composer Pro on Mac. Specifically, I don't understand how to manage some timeline functions. I have an object with two animations: an opening animation and a closing animation. On the first tap the object should open through animation 1, while on the second tap it should close through animation 2. But the two animations conflict. In addition, animation 1 does not stop at the last frame; instead it returns the object to the position of frame 0. Do you have any solutions? Thank you all.
Replies: 0 · Boosts: 0 · Views: 350 · Feb ’25
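Reality Composer Pro timelines aside, the tap-to-toggle pattern this post describes looks roughly like the following in plain RealityKit; the entity name and the assumption that the open clip comes first are both made up:

    import RealityKit

    var isOpen = false

    func handleTap(on door: Entity) {
        // Assumes the entity carries two baked animations: open, then close.
        guard door.availableAnimations.count >= 2 else { return }
        let clip = isOpen ? door.availableAnimations[1] : door.availableAnimations[0]
        door.playAnimation(clip, transitionDuration: 0.2, startsPaused: false)
        isOpen.toggle()
    }

Whether the entity holds its final pose or snaps back to frame 0 when a clip ends (the second problem in the post) depends on how the animation was authored, not on this playback call.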
Quality of predictive code completion
I am so perturbed by Xcode's predictive code completion (PCC). It operates like a 5-year-old boy who constantly distracts me by throwing up entirely irrelevant code sections, even when there is nothing to complete. Even in cases where I have a long chain of references (object dot something dot something, etc.) and PCC would be really handy, its list rarely has what is in the immediate scope of my source line among the first couple of suggestions. Additionally, its code suggestions (full lines of code, structs, etc.) sort of skirt what might be possible, mixed with code from elsewhere in my app. The quality of PCC is awful, and at this point, with Copilot doing such a great job and ChatGPT's loose integration with Xcode, what is Apple thinking? Are they serious about their hijacked "AI" term? I am open to suggestions: what (the heck) could I be doing wrong, or should I refresh something? Right now I have turned off PCC; it is not helping.
Replies: 1 · Boosts: 0 · Views: 380 · Feb ’25
How to Export OBJ with Texture (JPG + MTL) from ARKit LiDAR Scan in iOS?
I am using ARKit with RealityKit to scan objects using LiDAR on iOS. I can generate an OBJ file from ARMeshAnchors, but I am missing the texture export (JPG + MTL).

What I Have So Far:
- Successfully capturing the mesh using ARMeshAnchor.
- Converting the mesh into an MDLAsset and exporting .obj.
- I need help generating the .jpg texture and linking it to the .mtl file.

    private func exportScannedObject() {
        guard let camera = arView.session.currentFrame?.camera else { return }

        func convertToAsset(meshAnchors: [ARMeshAnchor]) -> MDLAsset? {
            guard let device = MTLCreateSystemDefaultDevice() else { return nil }
            let asset = MDLAsset()
            for anchor in meshAnchors {
                let mdlMesh = anchor.geometry.toMDLMesh(device: device, camera: camera, modelMatrix: anchor.transform)
                // Apply a gray material to the mesh
                let material = MDLMaterial(name: "GrayMaterial", scatteringFunction: MDLScatteringFunction())
                material.setProperty(MDLMaterialProperty(name: "baseColor", semantic: .baseColor, float3: SIMD3(0.5, 0.5, 0.5))) // Gray color
                if let submeshes = mdlMesh.submeshes as? [MDLSubmesh] {
                    for submesh in submeshes {
                        submesh.material = material
                    }
                }
                asset.add(mdlMesh)
            }
            return asset
        }

        func export(asset: MDLAsset) throws -> URL {
            let directory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
            let url = directory.appendingPathComponent("scaned.obj")
            if MDLAsset.canExportFileExtension("obj") {
                do {
                    try asset.export(to: url)
                    return url
                } catch let error {
                    fatalError(error.localizedDescription)
                }
            } else {
                fatalError("Can't export USD")
            }
        }

        if let meshAnchors = arView.session.currentFrame?.anchors.compactMap({ $0 as? ARMeshAnchor }),
           let asset = convertToAsset(meshAnchors: meshAnchors) {
            do {
                let url = try export(asset: asset)
                showScanPreview(url)
            } catch {
                print("export error")
            }
        }
    }

    extension ARMeshGeometry {
        func vertex(at index: UInt32) -> SIMD3<Float> {
            assert(vertices.format == MTLVertexFormat.float3, "Expected three floats (twelve bytes) per vertex.")
            let vertexPointer = vertices.buffer.contents().advanced(by: vertices.offset + (vertices.stride * Int(index)))
            let vertex = vertexPointer.assumingMemoryBound(to: SIMD3<Float>.self).pointee
            return vertex
        }

        // helps from StackOverflow:
        // https://stackoverflow.com/questions/61063571/arkit-3-5-how-to-export-obj-from-new-ipad-pro-with-lidar
        func toMDLMesh(device: MTLDevice, camera: ARCamera, modelMatrix: simd_float4x4) -> MDLMesh {
            func convertVertexLocalToWorld() {
                let verticesPointer = vertices.buffer.contents()
                for vertexIndex in 0..<vertices.count {
                    let vertex = self.vertex(at: UInt32(vertexIndex))
                    var vertexLocalTransform = matrix_identity_float4x4
                    vertexLocalTransform.columns.3 = SIMD4<Float>(x: vertex.x, y: vertex.y, z: vertex.z, w: 1)
                    let vertexWorldPosition = (modelMatrix * vertexLocalTransform).columns.3
                    let vertexOffset = vertices.offset + vertices.stride * vertexIndex
                    let componentStride = vertices.stride / 3
                    verticesPointer.storeBytes(of: vertexWorldPosition.x, toByteOffset: vertexOffset, as: Float.self)
                    verticesPointer.storeBytes(of: vertexWorldPosition.y, toByteOffset: vertexOffset + componentStride, as: Float.self)
                    verticesPointer.storeBytes(of: vertexWorldPosition.z, toByteOffset: vertexOffset + (2 * componentStride), as: Float.self)
                }
            }
            convertVertexLocalToWorld()

            let allocator = MTKMeshBufferAllocator(device: device)
            let data = Data(bytes: vertices.buffer.contents(), count: vertices.stride * vertices.count)
            let vertexBuffer = allocator.newBuffer(with: data, type: .vertex)
            let indexData = Data(bytes: faces.buffer.contents(), count: faces.bytesPerIndex * faces.count * faces.indexCountPerPrimitive)
            let indexBuffer = allocator.newBuffer(with: indexData, type: .index)
            let submesh = MDLSubmesh(indexBuffer: indexBuffer,
                                     indexCount: faces.count * faces.indexCountPerPrimitive,
                                     indexType: .uInt32,
                                     geometryType: .triangles,
                                     material: nil)
            let vertexDescriptor = MDLVertexDescriptor()
            vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition, format: .float3, offset: 0, bufferIndex: 0)
            vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: vertices.stride)
            let mesh = MDLMesh(vertexBuffer: vertexBuffer,
                               vertexCount: vertices.count,
                               descriptor: vertexDescriptor,
                               submeshes: [submesh])
            return mesh
        }
    }

What I Need Help With:
- How do I generate the JPG texture from the AR scene?
- How do I save an MTL file linking the OBJ model to the texture?
- How can I correctly apply the texture when viewing the OBJ in an external 3D viewer?

I appreciate any guidance, including sample code or resources! If you have a complete working solution, I'd love to discuss further via private channels.
Replies: 0 · Boosts: 0 · Views: 481 · Feb ’25
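On the first question in the post above, one hedged starting point is converting the current camera frame to JPEG data; this is only the capture step, not a full texture-projection pipeline, and the function name is invented:

    import ARKit
    import CoreImage
    import UIKit

    // Grabs the current camera image as JPEG data. Projecting it onto the
    // mesh's UVs is a separate step that is not shown here.
    func captureFrameJPEG(from session: ARSession) -> Data? {
        guard let frame = session.currentFrame else { return nil }
        let ciImage = CIImage(cvPixelBuffer: frame.capturedImage)
        guard let cgImage = CIContext().createCGImage(ciImage, from: ciImage.extent) else { return nil }
        return UIImage(cgImage: cgImage).jpegData(compressionQuality: 0.9)
    }

As for the second question, an .mtl file is plain text: a minimal one contains a newmtl line plus a map_Kd line naming the JPEG, and the OBJ references it with an mtllib line; both file names must match what the external viewer sees on disk.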