I have been wanting to create a .mobileprovision file and a .p12 file. How do I make these files using only an iPad?
Hi everyone, I've been trying since yesterday to set up an Apple Watch app alongside my iOS app, but it's not working at all. I created the app by adding a new target, selected watchOS > App, and linked it to my iOS project. When I run the Apple Watch scheme directly, the app works, but if I try to install it on my Apple Watch via the iOS app, the installation fails: every time I tap Install, I get a message saying that the installation failed. The documentation mentions an Info.plist file that is supposed to be generated with the watchOS app, but when I check, I don't see any .plist file. Is that normal? Thanks for your help.
Hello all, I see an issue when running a Notification Content Extension on the simulator without checking "Copy only when installing" (app target -> Build Phases -> Embed App Extensions).
It only works if I check "Copy only when installing".
Can someone please confirm whether a Notification Content Extension works on the simulator? If so, how can we set that up? Please share the details.
Just wondering if it is possible to configure a secondary MacBook to act as a run destination in Xcode, similar to how you would configure an iPhone as a run destination.
I have tried connecting the device via USB-C, and my MacBook detects the second MacBook over USB, but it does not show up when trying to add devices in Xcode. I suppose this flow might not be supported?
Xcode Version 16.2 (16C5032a).
I want to know how to set up a directory in an Xcode project when developing an iOS app.
Here is the whole thing:
1. I start a new iOS App project, "test_path".
2. Then I right-click and choose "New Folder" to make a new folder, "configx".
3. Then I right-click on the configx folder, choose "Add Files to test_path", and add a file to it.
4. But the folder does not exist in the bundle when I try to access it with the Bundle.main.urls function.
5. When I run ls on the Mac in /data/Containers/Bundle/Application/E4F11903-3FAD-467F-A4CD-60AC68D64934/test_path.app, the file is at the root of test_path.app, with no "configx" folder in front of it.
So, how do I set up a directory or a path in an iOS project?
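For what it's worth, this usually comes down to whether the folder was added as a group (yellow icon) or a folder reference (blue icon): groups are flattened to the root of the built .app, while folder references keep their on-disk structure. A minimal sketch of reading a file from a preserved subdirectory, assuming configx was added as a folder reference and contains a hypothetical config.json:

import Foundation

// Look the file up inside the "configx" subdirectory of the main bundle.
// This only succeeds if configx was added as a folder reference (blue icon),
// which preserves the directory inside the built .app.
if let url = Bundle.main.url(forResource: "config",
                             withExtension: "json",
                             subdirectory: "configx") {
    let data = try? Data(contentsOf: url)
    print("Found config at \(url), \(data?.count ?? 0) bytes")
} else {
    print("configx/config.json not found - the folder was probably added as a group")
}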
Hi,
I'm developing an iOS/watchOS app and have an issue with Previews for the watchOS components.
The error message is
Could not find target description for “ContentView.swift”
Previews for all other views in the iOS app work fine.
Also, the app builds and runs fine on the watch simulator and on a physical watch. The preview worked fine earlier today, and I didn't exclude the file in EXCLUDED_SOURCE_FILE_NAMES.
Does anyone have an idea what causes this issue?
We are trying to create a Screen Time app using the Family Controls and Device Activity frameworks. The build succeeds, but when pushing to an iPhone we get an error that the Info.plist file for DeviceActivity.framework could not be found. (For reference, the Screen Time API requires a physical device, not a simulator.) When we remove the Device Activity framework, the same error occurs for the Family Controls framework. We have added the Family Controls (development) capability and applied for the distribution capability. We have redownloaded Xcode multiple times on the main device, deleted derived data, and redownloaded all of the iPhone SDKs, and the issue still persists.
Environment
Xcode version: Xcode 16.0
iOS device system version: iOS 18.4 / 18.5
Problem description
When debugging on a physical device running iOS 18.4/18.5, the app starts very slowly after clicking Run, usually taking 2-3 minutes, which is much slower than on iOS 18.3. Xcode stays in the Launch stage for a long time, and the application is stuck on the launch screen. If I start the app first and then attach to the process, the problem does not occur.
Investigation
Using lldb to enable packet logging (log enable gdb-remote packets) and observing the logs, I found that the stall mainly occurs while processing two packets: jGetLoadedDynamicLibrariesInfos and qProcessInfo. These two commands involve dynamic-library loading information and process metadata, and their response times are far above normal, far exceeding the other gdb-remote instructions.
Using Instruments / a profiler to analyze, the CPU peak of the on-device debugserver is concentrated in the dynamic-library information loading in _dyld_process_info_create.
Questions
Is there a way to deploy a modified debugserver on the physical device to investigate further?
How can I enable and view the debugserver log to investigate further?
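On the logging question: lldb can write its gdb-remote packet log to a file, which captures the lldb side of the conversation with debugserver and makes the slow packets easier to measure after the fact (a minimal sketch; the file path is just an example):

(lldb) log enable -f /tmp/gdb-remote-packets.log gdb-remote packets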
When debugging on a physical iPad with Xcode 16.2 and using the view highlighted in the attached picture, the Mac freezes completely: the cursor disappears and the computer cannot perform any operation.
When a new application runs on the iOS 18.4 simulator and tries to access the Speech framework, prompting a request for authorization to use speech recognition, the application crashes if the user taps Allow. The same issue occurs in the visionOS 2.4 simulator.
Using Swift 6. Report Identifier: FB17686186
/// Checks speech recognition availability and requests necessary permissions.
@MainActor
func checkAvailabilityAndPermissions() async {
    logger.debug("Checking speech recognition availability and permissions...")

    // 1. Verify that the speechRecognizer instance exists
    guard let recognizer = speechRecognizer else {
        logger.error("Speech recognizer is nil - speech recognition won't be available.")
        reportError(.configurationError(description: "Speech recognizer could not be created."), context: "checkAvailabilityAndPermissions")
        self.isAvailable = false
        return
    }

    // 2. Check recognizer availability (might change at runtime)
    if !recognizer.isAvailable {
        logger.error("Speech recognizer is not available for the current locale.")
        reportError(.configurationError(description: "Speech recognizer not available."), context: "checkAvailabilityAndPermissions")
        self.isAvailable = false
        return
    }
    logger.trace("Speech recognizer exists and is available.")

    // 3. Request Speech Recognition Authorization
    // IMPORTANT: Add `NSSpeechRecognitionUsageDescription` to Info.plist
    let speechAuthStatus = SFSpeechRecognizer.authorizationStatus()
    logger.debug("Current Speech Recognition authorization status: \(speechAuthStatus.rawValue)")

    if speechAuthStatus == .notDetermined {
        logger.info("Requesting speech recognition authorization...")
        // Use structured concurrency to wait for permission result
        let authStatus = await withCheckedContinuation { continuation in
            SFSpeechRecognizer.requestAuthorization { status in
                continuation.resume(returning: status)
            }
        }
        logger.debug("Received authorization status: \(authStatus.rawValue)")
        // Now handle the authorization result
        let speechAuthorized = (authStatus == .authorized)
        handleAuthorizationStatus(status: authStatus, type: "Speech Recognition")
        // If speech is granted, now check microphone
        if speechAuthorized {
            await checkMicrophonePermission()
        }
    } else {
        // Already determined, just handle it
        let speechAuthorized = (speechAuthStatus == .authorized)
        handleAuthorizationStatus(status: speechAuthStatus, type: "Speech Recognition")
        // If speech is already authorized, check microphone
        if speechAuthorized {
            await checkMicrophonePermission()
        }
    }
}
Hi,
I'm working in Unity and have implemented Firebase phone number authentication. Everything works fine when I install the build directly from Xcode: the App Attest screen shows up, the user receives an OTP on their phone, and login works. But when I download the same build from TestFlight, it gets stuck after the user sends the OTP request.
I've added Push Notifications and App Attest in Capabilities. I've also added Remote Notifications.
In the device log I see an error about the mobile provisioning file, but I've added that to my account as well. Is it expected behavior that phone number authentication does not work on TestFlight? If so, how can I get this approved by Apple, since they need to test it before approving it?
Thanks!
I am a bit confused. My understanding previously was that a module map was required in order to have a bridging header generated. Now it has come to my attention that a module map is both a build input and something you can put in the Modules folder of the built product if you so choose.
I have tried reading the Clang module map documentation, but am really struggling to connect most of what it says to the problem at hand.
In a project I am working on, the generation of the module map file is quite problematic. The framework imports C++ libraries and itself provides Objective-C++ wrappers for them. Currently, the module map file is both set as the Module Map File in Build Settings and presumably used when the Swift project later imports the framework.
In this project the module map is a list of the Objective-C++ header files followed by export *.
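For concreteness, a module map in that shape might look something like this (a minimal sketch; the framework and header names are hypothetical):

framework module MyFramework {
    header "WrapperA.h"
    header "WrapperB.h"
    export *
}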
I am trying to understand what I would lose if I did one or both of two things:
What happens if I don't set this module map file in the build settings for the Objective-C++ framework?
What happens if I don't have a module map involved whatsoever in this Objective-C++ framework and it is then imported into Swift?
And does any of this change if it's compiled as a static vs. dynamic library? What if I embed it vs. not embedding it?
Because the build in the real project is so complicated, it's hard to isolate what is going on, so I built a smaller sample app.
There is CFramework, which has an Objective-C++ class, and SwiftProject, which imports that framework and is purely Swift. It imports the module and uses it.
I did not write a module map file, and the Swift project builds just fine. In the timeline it:
Prepares packages
Computes the target dependency graph
Builds a static cache for iPhoneSimulator18.2.sdk
As near as I can tell, even though the Objective-C++ framework is not built with a module map in its build settings, and there is no module map included in the framework, everything works. So is the module map file useless? Perhaps it speeds things up, but what step would theoretically be skippable?
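One plausible explanation for the sample app working, assuming default project settings: when a framework target has Defines Module (DEFINES_MODULE) enabled, Xcode synthesizes a module map from the umbrella header at build time, so Swift can still resolve the import even though nobody wrote one by hand. Under that assumption, the Swift side needs nothing special (CWrapper here is a hypothetical name for the framework's Objective-C++ wrapper class):

import CFramework  // resolved via the module map Xcode synthesizes when Defines Module is on

let wrapper = CWrapper()  // hypothetical wrapper class exposed in the framework's public headers
wrapper.doWork()          // hypothetical method

In that case a handwritten module map mainly buys control: hiding headers, splitting the framework into submodules, or marking modules explicit, rather than exporting everything the umbrella header pulls in.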
Trying to use SwiftUI previews in Xcode 16.2 on a complex project. I have no access to an Apple Developer team; instead I am using manually installed provisioning profiles to test on an iPhone.
SwiftUI previews do not work, failing with the diagnostic error "Failure: Framework Agent preparation failed: Could not find a team ID".
I'm experiencing an issue with a custom font not loading properly in Xcode 16.3. The font files are included in the bundle, listed in Info.plist, and verified for correct names using UIFont.familyNames, but they still don't appear at runtime. Has anyone else run into this with Xcode 16.3? Could this be related to recent changes in asset packaging or font catalogs?
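For anyone debugging the same thing, a quick sanity check is to dump every font the runtime actually registered; a minimal sketch:

import UIKit

// Print every registered font family and its PostScript names, so you can
// confirm whether the custom font actually made it into the runtime
// (UIFont(name:) expects the PostScript name, not the file name).
for family in UIFont.familyNames.sorted() {
    print("Family: \(family)")
    for name in UIFont.fontNames(forFamilyName: family) {
        print("  Font: \(name)")
    }
}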
I'm writing code that performs asynchronous processing using the Promises library (https://github.com/google/promises). When building the app with Xcode 15.4 versus Xcode 16.2, the behavior differs between the two.
I'm using version 2.1.1 of the library. I've also tried the latest version, 2.4.0, but the result was the same.
Has anyone encountered the same issue or know an effective solution?
Here's a simple code that reproduces this issue.
@IBAction func tapButton(_: UIButton) {
    _ = getInfo()
}

func getInfo() -> Promise<Void> {
    Promise(on: .global(qos: .background)) { fulfill, _ in
        self.callApi()
            .then { apiResult -> Promise<ApiResult> in
                print("\(#function), first apiResult: \(apiResult)")
                return self.callApi() // #1
            }
            .then { apiResult in
                print("\(#function), second apiResult: \(apiResult)") // #2
                fulfill(())
            }
    }
}

func callApi() -> Promise<ApiResult> {
    Promise(on: .global(qos: .background)) { fulfill, _ in
        print("\(#function), start")
        self.wait(3)
            .then { _ in
                let apiResult = ApiResult(message: "success")
                print("\(#function), end")
                fulfill(apiResult)
            }
    }
}

struct ApiResult: Codable {
    var message: String

    enum CodingKeys: String, CodingKey {
        case message
    }
}
The Swift Language version in the build settings is 5.0.
The console output when running the above code is as follows:
When built with Xcode 15.4:
2025/03/21 10:10:46.248 callApi(), start
2025/03/21 10:10:46.248 wait 3.0 sec
2025/03/21 10:10:49.515 callApi(), end
2025/03/21 10:10:49.535 getDeviceInfo(), first apiResult: ApiResult(message: "success")
2025/03/21 10:10:49.535 callApi(), start
2025/03/21 10:10:49.536 wait 3.0 sec
2025/03/21 10:10:52.831 callApi(), end
2025/03/21 10:10:52.832 getDeviceInfo(), second apiResult: ApiResult(message: "success")
The process proceeds from #1 to #2 only after the call at code comment #1 completes.
When built with Xcode 16.2:
2025/03/21 09:45:33.399 callApi(), start
2025/03/21 09:45:33.400 wait 3.0 sec
2025/03/21 09:45:36.648 callApi(), end
2025/03/21 09:45:36.666 getDeviceInfo(), first apiResult: ApiResult(message: "success")
2025/03/21 09:45:36.666 callApi(), start
2025/03/21 09:45:36.666 wait 3.0 sec
2025/03/21 09:45:36.677 getDeviceInfo(), second apiResult: Pending: ApiResult
2025/03/21 09:45:39.933 callApi(), end
The process does not wait for the call at code comment #1 to finish, and the #2 print statement fires first, showing a pending promise.
Additionally, even with Xcode 16.2, when changing the #2 line to print("\(#function), second apiResult: \(apiResult.message)"), the output becomes as follows. From this, it seems that referencing the ApiResult type, rather than a String, might have some effect on the behavior.
2025/03/21 10:05:42.129 callApi(), start
2025/03/21 10:05:42.131 wait 3.0 sec
2025/03/21 10:05:45.419 callApi(), end
2025/03/21 10:05:45.437 getDeviceInfo(), first apiResult: ApiResult(message: "success")
2025/03/21 10:05:45.437 callApi(), start
2025/03/21 10:05:45.437 wait 3.0 sec
2025/03/21 10:05:48.706 callApi(), end
2025/03/21 10:05:48.707 getDeviceInfo(), second apiResult: success
Thank you in advance
My app has a companion Watch app and a widget.
Are these settings correct for distributing them to TestFlight and the App Store?
I know there have been issues with SFSpeechRecognizer in iOS 17+ inside the simulator. I'm running into issues with speech not being recognized inside the visionOS 2.4 simulator as well (likely because it borrows from iOS frameworks). Just wondering if anyone has any workarounds or advice for this simulator issue. I can't test on device because I don't have an Apple Vision Pro.
Using Swift 6 on Xcode 16.3. Below are the console logs and the code that I am using.
Console Logs
BACKGROUND SPATIAL TAP (hit BackgroundTapPlane)
SpeechToTextManager.startRecording() called
[0x15388a900|InputElement #0|Initialize] Number of channels = 0 in AudioChannelLayout does not match number of channels = 2 in stream format.
iOSSimulatorAudioDevice-22270-1: Abandoning I/O cycle because reconfig pending
iOSSimulatorAudioDevice-22270-1: Abandoning I/O cycle because reconfig pending
iOSSimulatorAudioDevice-22270-1: Abandoning I/O cycle because reconfig pending
iOSSimulatorAudioDevice-22270-1: Abandoning I/O cycle because reconfig pending
iOSSimulatorAudioDevice-22270-1: Abandoning I/O cycle because reconfig pending
iOSSimulatorAudioDevice-22270-1: Abandoning I/O cycle because reconfig pending
SpeechToTextManager.startRecording() completed successfully and recording is active.
GameManager.onTapToggle received. speechToTextManager.isAvailable: true, speechToTextManager.isRecording: true
GameManager received tap toggle callback. Tapped Object: None
BACKGROUND SPATIAL TAP (hit BackgroundTapPlane)
GESTURE MANAGER - User is already recording, stopping recording
SpeechToTextManager.stopRecording() called
GameManager.onTapToggle received. speechToTextManager.isAvailable: true, speechToTextManager.isRecording: false
Audio data size: 134400 bytes
Recognition task error: No speech detected <---
Code
private(set) var isRecording: Bool = false
private var recognitionRequest: SFSpeechAudioBufferRecognitionRequest?
private var recognitionTask: SFSpeechRecognitionTask?

@MainActor
func startRecording() async throws {
    logger.debug("SpeechToTextManager.startRecording() called")
    guard !isRecording else {
        logger.warning("Cannot start recording: Already recording.")
        throw AppError.alreadyRecording
    }

    currentTranscript = ""
    processingError = nil
    audioBuffer = Data()
    isRecording = true

    do {
        try await configureAudioSession()
        try await Task.detached { [weak self] in
            guard let self = self else {
                throw AppError.internalError(description: "SpeechToTextManager instance deallocated during recording setup.")
            }
            try await self.audioProcessor.configureAudioEngine()

            let (recognizer, request) = try await MainActor.run { () -> (SFSpeechRecognizer, SFSpeechAudioBufferRecognitionRequest) in
                guard let result = self.createRecognitionRequest() else {
                    throw AppError.configurationError(description: "Speech recognition not available or SFSpeechRecognizer initialization failed.")
                }
                return result
            }

            await MainActor.run {
                self.recognitionRequest = request
            }

            await MainActor.run {
                self.recognitionTask = recognizer.recognitionTask(with: request) { [weak self] result, error in
                    guard let self = self else { return }
                    if let error = error {
                        // WE ENTER INTO THIS BLOCK, ALWAYS
                        self.logger.error("Recognition task error: \(error.localizedDescription)")
                        self.processingError = .speechRecognitionError(description: error.localizedDescription)
                        return
                    }
                    . . .
                }
            }
            . . .
        }.value
    } catch {
        . . .
    }
}

@MainActor
func stopRecording() {
    logger.debug("SpeechToTextManager.stopRecording() called")
    guard isRecording else {
        logger.debug("Not recording, nothing to do")
        return
    }
    isRecording = false

    Task.detached { [weak self] in
        guard let self = self else { return }
        await self.audioProcessor.stopEngine()
        let finalBuffer = await self.audioProcessor.getAudioBuffer()
        await MainActor.run {
            self.recognitionRequest?.endAudio()
            self.recognitionTask?.cancel()
        }
        . . .
    }
}
I am on macOS 15.4.1, which ships with an old version of Ruby (2.6.10) and Bundler 1.17.2. I ran pod install after generating the Pods with these versions, and during application compilation several issues are reported against the Pods code.
Test coverage shows "No Coverage Data" in Xcode 16.3. Code Coverage is checked in my scheme. The same configuration worked in Xcode 16.2 but doesn't work in Xcode 16.3. Any update on when this will be fixed?
Hi, I'm trying to plan some roadmaps out and also have some issues with iOS 15.0. Since it's no longer supported by Apple, any word on if/when iOS 15.0 will be removed from the "minimum deployment" version list in Xcode?