App Tester shares crash report over TestFlight, but I get nothing.
I have an app currently in beta testing, and my primary QA person is trying to send me the crash log for one particular bug. The night before, he finished testing the app on an iPhone, left the app running in the background, and left the phone charging; the phone went to sleep. The phone is running the latest release of iOS 18. In the morning he wakes the phone and gets the alert that the app "xxx Crashed. Do you want to share additional information with the developer?" He taps Share, but I never get a crash log in the Organizer. I have received other crash logs in the Organizer as recently as two weeks ago. This particular crash has occurred several times and never produces a report. I have tried reproducing the same steps myself without seeing any crash. Any ideas? Mike
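For what it's worth, one fallback that doesn't depend on TestFlight feedback reaching the Organizer is collecting crash diagnostics inside the app itself with MetricKit. A minimal sketch, assuming a subscriber is registered at launch; the class name CrashDiagnosticCollector is illustrative and not from the original post:

import MetricKit

// Sketch: collect crash diagnostics in-app via MetricKit as a fallback when
// TestFlight feedback never arrives. Call start() early in app launch.
final class CrashDiagnosticCollector: NSObject, MXMetricManagerSubscriber {
    func start() {
        MXMetricManager.shared.add(self)
    }

    // Called by the system, typically on the next launch after a crash.
    func didReceive(_ payloads: [MXDiagnosticPayload]) {
        for payload in payloads {
            for crash in payload.crashDiagnostics ?? [] {
                // jsonRepresentation() is a Data blob you can log or upload
                // to your own endpoint for inspection.
                let json = crash.jsonRepresentation()
                print(String(data: json, encoding: .utf8) ?? "unreadable diagnostic")
            }
        }
    }
}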
Replies: 1 · Boosts: 1 · Views: 77 · Activity: Mar ’25
App rejected: saying that the monthly subscription...was not submitted for review
The monthly subscription is clearly listed in the Monthly/Yearly subscription group. I structured it similarly to the first two items in Apple's Ocean Journal example of multiple subscriptions and durations. The only difference is the order in which I put them; I didn't think that mattered, does it? The Developer Action flag does not allow me to rearrange or edit the Yearly group. I'm stuck. What should I do?
Replies: 0 · Boosts: 0 · Views: 259 · Activity: Feb ’25
MFMessageComposeViewController, SwiftUI, with attachment
My app needs to send iMessages with an attached data file. The compose view comes up modally and the user needs to press the send button. When the button is tapped, an error occurs and neither the attachment nor the message is delivered. I have tried three implementations of UIViewControllerRepresentable for MFMessageComposeViewController that worked for iOS 15 and 16, but not 17. If I make the simplest app to show the problem, it will reliably fail on all iOS versions tested. If I remove the attachment, the message gets sent. In contrast, the equivalent UIViewControllerRepresentable for email works perfectly every time. After struggling for weeks to get it to work, I am beginning to believe this is a timing error in iOS. I have even tried, unsuccessfully, to add dispatch queue delays. Has anybody else written a SwiftUI-based app that can reliably attach a file to an iMessage?

UIViewControllerRepresentable:

import SwiftUI
import MessageUI
import UniformTypeIdentifiers

struct AttachmentData: Codable {
    var data: Data
    var mimeType: UTType
    var fileName: String
}

struct MessageView: UIViewControllerRepresentable {
    @Environment(\.presentationMode) var presentation
    @Binding var recipients: [String]
    @Binding var body: String
    @Binding var attachments: [AttachmentData]
    @Binding var result: Result<MessageComposeResult, Error>?

    func makeUIViewController(context: Context) -> MFMessageComposeViewController {
        let vc = MFMessageComposeViewController()
        print("canSendAttachments = \(MFMessageComposeViewController.canSendAttachments())")
        vc.recipients = recipients
        vc.body = body
        vc.messageComposeDelegate = context.coordinator
        for attachment in attachments {
            vc.addAttachmentData(attachment.data,
                                 typeIdentifier: attachment.mimeType.identifier,
                                 filename: attachment.fileName)
        }
        return vc
    }

    func updateUIViewController(_ uiViewController: MFMessageComposeViewController, context: Context) {
    }

    func makeCoordinator() -> Coordinator {
        return Coordinator(presentation: presentation, result: $result)
    }

    class Coordinator: NSObject, MFMessageComposeViewControllerDelegate {
        @Binding var presentation: PresentationMode
        @Binding var result: Result<MessageComposeResult, Error>?

        init(presentation: Binding<PresentationMode>,
             result: Binding<Result<MessageComposeResult, Error>?>) {
            _presentation = presentation
            _result = result
        }

        func messageComposeViewController(_ controller: MFMessageComposeViewController,
                                          didFinishWith result: MessageComposeResult) {
            defer {
                $presentation.wrappedValue.dismiss()
            }
            switch result {
            case .cancelled:
                print("Message cancelled")
                self.result = .success(result)
            case .sent:
                print("Message sent")
                self.result = .success(result)
            case .failed:
                print("Failed to send")
                self.result = .success(result)
            @unknown default:
                fatalError()
            }
        }
    }
}

SwiftUI interface:

import SwiftUI
import MessageUI

struct ContentView: View {
    @State private var isShowingMessages = false
    @State private var recipients = ["4085551212"]
    @State private var message = "Hello from California"
    @State private var attachment = [AttachmentData(
        data: Data("it is not zip format, however iMessage won't allow the recipient to open it if extension is not a well-known extension, like .zip".utf8),
        mimeType: .zip,
        fileName: "test1.zip")]
    @State var result: Result<MessageComposeResult, Error>? = nil

    var body: some View {
        VStack {
            Button {
                isShowingMessages.toggle()
            } label: {
                Text("Show Messages")
            }
            .sheet(isPresented: $isShowingMessages) {
                MessageView(recipients: $recipients,
                            body: $message,
                            attachments: $attachment,
                            result: $result)
            }
            .onChange(of: isShowingMessages) { newValue in
                if !isShowingMessages {
                    switch result {
                    case .success(let type):
                        switch type {
                        case .cancelled:
                            print("canceled")
                        case .sent:
                            print("sent")
                        default:
                            break
                        }
                    default:
                        break
                    }
                }
            }
        }
    }
}

#Preview {
    ContentView()
}
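A minimal sketch of one variation worth trying, reusing the AttachmentData type above: write the payload to a temporary file and attach it by URL with addAttachmentURL(_:withAlternateFilename:), after checking isSupportedAttachmentUTI. The helper name attachViaTemporaryFile is illustrative and not part of the original code:

import MessageUI
import UniformTypeIdentifiers

// Sketch of an alternative attachment path: attach from a file URL instead
// of raw Data, and bail out early if Messages rejects the UTI.
func attachViaTemporaryFile(_ attachment: AttachmentData,
                            to vc: MFMessageComposeViewController) -> Bool {
    guard MFMessageComposeViewController.isSupportedAttachmentUTI(attachment.mimeType.identifier) else {
        print("UTI \(attachment.mimeType.identifier) is not accepted by Messages")
        return false
    }
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent(attachment.fileName)
    do {
        try attachment.data.write(to: url)
    } catch {
        print("Could not write temporary attachment: \(error)")
        return false
    }
    // Returns false if the composer refuses the attachment.
    return vc.addAttachmentURL(url, withAlternateFilename: attachment.fileName)
}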
Replies: 1 · Boosts: 1 · Views: 720 · Activity: Sep ’24
How do I get iMessage to recognize my custom file format?
I have an app that shares files between users of the app by attaching the file to either an email or an iMessage. The original implementation attached a JSON file, but I was able to build a smaller binary file, allowing me to fit more content within the file-size limit. Mail attachments work just fine. I've modified the Info.plist to define the app-specific file extension, but no matter where I define it, iMessage refuses to put up a preview from which the user can select my app to open the file. I tried putting it in a Document Type with the public.text and public.data roles. I tried adding it to the Exported Type Identifiers with a public MIME type. I made sure the identifiers all use the reverse-URL form (com...). My temporary iMessage workaround is to name the file with .zip, a well-known public extension. The file doesn't contain valid zip data; the name just declares it as a zip file and lets the user pick my app to open it, because I added zip as an exported type identifier. But I don't want the user to think my app can also open zip files. I would be happy if the preview looked similar to what is done for zip files: a file icon, but with my app's name inside the icon. Any help with this configuration nightmare is appreciated. Mike
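A minimal sketch for sanity-checking the declaration at runtime, assuming a hypothetical identifier com.example.myapp.record and extension .myrec standing in for the app's real ones. It assumes the Info.plist has a UTExportedTypeDeclarations entry for that identifier (conforming to public.data, with a UTTypeTagSpecification mapping public.filename-extension to "myrec") plus a CFBundleDocumentTypes entry listing it under LSItemContentTypes; if either print does not resolve as expected, the declaration isn't being picked up:

import UniformTypeIdentifiers

// Sketch: verify that the exported type declared in Info.plist is visible
// to the system. Identifiers here are placeholders.
func verifyExportedType() {
    let recordType = UTType(exportedAs: "com.example.myapp.record")
    print("extension:", recordType.preferredFilenameExtension ?? "none")
    print("conforms to public.data:", recordType.conforms(to: .data))
    print("maps back from extension:",
          UTType(filenameExtension: "myrec")?.identifier ?? "not registered")
}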
Replies: 0 · Boosts: 0 · Views: 289 · Activity: Aug ’24
Need AirPlay capability for Image and Live Photos
The Photos app on iOS is capable of playing Live Photos and showing images on another screen. My SwiftUI app is able to AirPlay videos, and I can make the button appear or not using AVPlayer's isExternalPlaybackActive boolean. However, that boolean is not available for images and Live Photos. I've tried using an AVRoutePickerView, which gives me a button for the user to start/stop AirPlay, but I have not found a way to associate it with SwiftUI's Image or my LivePhotoView:

struct LivePhotoView: UIViewRepresentable {
    var livephoto: PHLivePhoto
    @Binding var playback: Bool
    @AppStorage("playAudio") var playAudio = userDefaults.bool(forKey: "playAudio")

    func makeUIView(context: Context) -> PHLivePhotoView {
        let view = PHLivePhotoView()
        try! AVAudioSession.sharedInstance().setCategory({ playAudio ? .playback : .soloAmbient }())
        view.contentMode = .scaleAspectFit
        return view
    }

    func updateUIView(_ lpView: PHLivePhotoView, context: Context) {
        lpView.livePhoto = livephoto
        if playback {
            lpView.isMuted = false
            try! AVAudioSession.sharedInstance().setCategory({ playAudio ? .playback : .soloAmbient }())
            try! AVAudioSession.sharedInstance().setActive(true)
            lpView.isMuted = false
            lpView.startPlayback(with: .full)
        } else {
            lpView.stopPlayback()
        }
    }
}

struct RouteButtonView: UIViewRepresentable {
    func makeUIView(context: Context) -> AVRoutePickerView {
        let routePickerView = AVRoutePickerView()
        routePickerView.tintColor = .gray
        routePickerView.prioritizesVideoDevices = true
        return routePickerView
    }

    func updateUIView(_ uiView: AVRoutePickerView, context: Context) {
        // No update needed
    }
}

Am I missing something? If it's a system-internal API only available to the Photos app, why? Mike
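A minimal sketch of one workaround, which is not the Photos app's own mechanism: when the user mirrors to an AirPlay display, an extra UIScreen connects, and the app can place its own window on it and render the image there. This uses the older UIScreen notification API for brevity (superseded by window scenes with the external-display role on recent iOS); the name ExternalImageController is a placeholder:

import UIKit
import SwiftUI

// Sketch: show a still image on a connected/mirrored external screen by
// hosting a SwiftUI Image in a window attached to that screen.
final class ExternalImageController {
    private var externalWindow: UIWindow?
    private var observers: [NSObjectProtocol] = []

    func startObserving(image: UIImage) {
        observers.append(NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification, object: nil, queue: .main
        ) { [weak self] note in
            guard let screen = note.object as? UIScreen else { return }
            self?.show(image, on: screen)
        })
        observers.append(NotificationCenter.default.addObserver(
            forName: UIScreen.didDisconnectNotification, object: nil, queue: .main
        ) { [weak self] _ in
            // Tear down the external window when the display goes away.
            self?.externalWindow = nil
        })
    }

    private func show(_ image: UIImage, on screen: UIScreen) {
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen
        window.rootViewController = UIHostingController(
            rootView: Image(uiImage: image).resizable().scaledToFit()
        )
        window.isHidden = false
        externalWindow = window
    }
}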
Replies: 1 · Boosts: 1 · Views: 920 · Activity: Jun ’24