I am trying to follow the guide for automating the creation of a DMG for distributing a macOS application, but I can't figure out how to get the ExportOptions.plist from a manual export. This is the guide:
https://developer.apple.com/documentation/security/customizing-the-xcode-archive-process
What is a 'manual export', and what are the steps for creating one?
# Ask xcodebuild(1) to export the app. Use the export options
# from a previous manual export that used a Developer ID.
/usr/bin/xcodebuild -exportArchive \
    -archivePath "$ARCHIVE_PATH" \
    -exportOptionsPlist "$SRCROOT/ExportOptions.plist" \
    -exportPath "$EXPORT_PATH"
Where is "$SRCROOT" ? presumably I have to copy this ExportOptions.plist to this location.
Thanks - I am sure this must be blindingly obvious because there seems to be no reference as to how you do this 'manual export' or where one finds the resulting options file.
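For anyone landing here: a 'manual export' just means archiving in Xcode (Product > Archive) and then exporting the app from the Organizer with the Developer ID / Direct Distribution option (the exact wording varies by Xcode version). The folder you export into contains an ExportOptions.plist next to the exported app; copy that file into your project folder. $SRCROOT is Xcode's build setting for the directory containing the project file, so "$SRCROOT/ExportOptions.plist" means "ExportOptions.plist sitting next to the .xcodeproj". As a rough sketch only, a Developer ID export options file looks something like the following; the team ID is a placeholder and the file produced by your own manual export is the authoritative version.
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>method</key>
    <string>developer-id</string>
    <key>teamID</key>
    <!-- placeholder: your 10-character team identifier -->
    <string>ABCDE12345</string>
    <key>signingStyle</key>
    <string>automatic</string>
</dict>
</plist>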
Has anyone else noticed that Finder no longer shows most photos' EXIF data when in column mode? It usually appears below the photo's preview image.
Does anyone know where I can find an example of creating an NSView subclass with custom bindings? I need to be able to bind to the object in Interface Builder.
This is the only reference in the Apple documentation, and as is often the case there appear to be no examples:
https://developer.apple.com/documentation/objectivec/nsobject/nskeyvaluebindingcreation
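Not an authoritative answer, but here is a minimal sketch of the pattern described by NSKeyValueBindingCreation: declare a KVO-compliant property, advertise it via exposedBindings, and implement bind(_:to:withKeyPath:options:) and unbind(_:) yourself. The class and binding names below are made up, and as far as I can tell Interface Builder's bindings inspector will not list custom bindings for an arbitrary subclass, so the binding usually has to be established in code.
import Cocoa

class ColorSwatchView: NSView {

    static let swatchColorBinding = NSBindingName("swatchColor")

    // The bindable property; @objc dynamic keeps it KVC/KVO compliant.
    @objc dynamic var swatchColor: NSColor = .white {
        didSet { needsDisplay = true }
    }

    // Advertise the custom binding alongside the inherited ones.
    override var exposedBindings: [NSBindingName] {
        return super.exposedBindings + [Self.swatchColorBinding]
    }

    private var observedObject: NSObject?
    private var observedKeyPath: String?

    override func bind(_ binding: NSBindingName, to observable: Any,
                       withKeyPath keyPath: String,
                       options: [NSBindingOption: Any]?) {
        guard binding == Self.swatchColorBinding, let object = observable as? NSObject else {
            super.bind(binding, to: observable, withKeyPath: keyPath, options: options)
            return
        }
        observedObject = object
        observedKeyPath = keyPath
        // Watch the model object and push changes into the view.
        object.addObserver(self, forKeyPath: keyPath, options: [.initial, .new], context: nil)
    }

    override func unbind(_ binding: NSBindingName) {
        guard binding == Self.swatchColorBinding else { return super.unbind(binding) }
        if let object = observedObject, let keyPath = observedKeyPath {
            object.removeObserver(self, forKeyPath: keyPath)
        }
        observedObject = nil
        observedKeyPath = nil
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?,
                               change: [NSKeyValueChangeKey: Any]?,
                               context: UnsafeMutableRawPointer?) {
        if keyPath == observedKeyPath, let color = change?[.newKey] as? NSColor {
            swatchColor = color
        } else {
            super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
        }
    }

    override func draw(_ dirtyRect: NSRect) {
        swatchColor.setFill()
        dirtyRect.fill()
    }
}
Establishing the binding in code would then look something like swatchView.bind(ColorSwatchView.swatchColorBinding, to: arrayController, withKeyPath: "selection.color", options: nil), where arrayController is whatever controller object you would otherwise have wired up in Interface Builder.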
I am trying to figure out if AppKit provides any support for async/await when presenting view controllers.
Typically one has to use a completion handler that gets called when the view controller is dismissed.
Is there any way to use async/await to achieve the same thing?
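I have not seen a dedicated AppKit API for this, but any completion-handler based presentation can be wrapped in a checked continuation. A minimal sketch using NSWindow's sheet API (newer SDKs may already generate an async overload for it automatically):
import AppKit

extension NSWindow {
    /// Presents `sheetWindow` as a sheet and suspends until the sheet is ended
    /// (e.g. by endSheet(_:returnCode:)), returning the modal response.
    @MainActor
    func presentSheet(_ sheetWindow: NSWindow) async -> NSApplication.ModalResponse {
        await withCheckedContinuation { continuation in
            beginSheet(sheetWindow) { response in
                continuation.resume(returning: response)
            }
        }
    }
}
The same pattern works for view-controller style presentation: add your own completion callback that fires on dismissal and resume the continuation from it.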
The WWDC 2024 custom data store example doesn't provide any details on how one would go about creating a DataStoreSnapshot. The example uses DefaultSnapshot and persists the data, in the DefaultSnapshot format, directly in a JSON file.
There appears to be no documentation or examples of how one might create a DataStoreSnapshot from data from another database.
The Apple documentation for DefaultSnapshot provides no examples of how one might create such a snapshot from data retrieved elsewhere.
Can anyone provide a simple example of how one might create such a snapshot from a remote database, such that it can be returned as part of the response to a fetch request?
For the purpose of this example, let's assume I have a CSV file with rows of data and code to read the data from this file. How would I create a snapshot (or snapshots) for each of the rows of data?
I have suddenly started getting this error when trying to compile an app that has been compiling fine for days under Xcode 16.
I have tried deleting the derived data folder, cleaning the project build folder and restarting Xcode, but all to no avail.
Any other suggestions? Perhaps I need to log a bug report with Apple?!
I am trying to generate a PDF file with certain components drawn with spot colours. Spot colours are used for printing, and I am not clear on how one would do that, but I think the answer involves creating a custom colour space with a specific name, or a colour that has a specific name - our printer looks for the name Spot1 and substitutes the colour green.
Can anyone shed any light on how I might be able to do this? For reference I have attached two PDF files with two different spot colours in them.
I need to be able to create something similar using CGContext and CGPDFDocument. I can already generate the PDF documents using CMYK colors but don't know how to create the equivalent "spot" colors.
At the moment I am loading the page from these attached PDF files and scaling it to fill the page, which gives me a background with the spot colour. This works fine, but I also need to generate text and lines using the same spot colour, and I am not clear how I could do that using the Core Graphics APIs.
My guess is that I need to create a custom colour space with a single colour and then use that colour for drawing.
The only 'custom' option for creating a colour space seems to be the CGColorSpace(propertyListPlist:) initializer, however there does not appear to be any documentation on what needs to be in the property list, nor can I find any examples of its use.
Any pointers would be appreciated.
Regards
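If it helps, the drawing side is the easy part once a suitable CGColorSpace exists: a separation space has a single tint component, so you build a CGColor in that space and set it on the context. This is only a sketch, and it assumes spotSpace has already been obtained from somewhere (for example from the attached PDFs); creating a Separation colour space from scratch is the part I am also not sure Core Graphics exposes publicly.
import CoreGraphics

// Sketch only: `spotSpace` is assumed to be a CGColorSpace representing the
// Separation/Spot1 space. A separation colour has one tint component plus alpha.
func drawSpotRule(in ctx: CGContext, using spotSpace: CGColorSpace) {
    let components: [CGFloat] = [1.0, 1.0]   // full tint, fully opaque
    guard let spotColor = CGColor(colorSpace: spotSpace, components: components) else { return }

    ctx.setStrokeColor(spotColor)
    ctx.setLineWidth(2)
    ctx.move(to: CGPoint(x: 72, y: 72))
    ctx.addLine(to: CGPoint(x: 300, y: 72))
    ctx.strokePath()

    // Text drawn through Core Text/CGContext can use the same colour as the fill colour.
    ctx.setFillColor(spotColor)
}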
I am trying to create a custom CGColorSpace in Swift on macOS but am not sure I really understand the concepts.
I want to use a custom color space called Spot1 and if I extract the spot color from a PDF I get the following:
"ColorSpace<Dictionary>" = {
"Cs2<Array>" = (
Separation,
Spot1,
DeviceCMYK,
{
"BitsPerSample<Integer>" = 8;
"Domain<Array>" = (
0,
1
);
"Filter<Name>" = FlateDecode;
"FunctionType<Integer>" = 0;
"Length<Integer>" = 526;
"Range<Array>" = (
0,
1,
0,
1,
0,
1,
0,
1
);
"Size<Array>" = (
1024
);
}
);
};
How can I create this same color space using the CGColorSpace(propertyListPlist: CFPropertyList) API? This is my attempt:
func createSpot1() -> CGColorSpace? {
    let dict0: NSDictionary = [
        "BitsPerSample": 8,
        "Domain": [0, 1],
        "Filter": "FlateDecode",
        "FunctionType": 0,
        "Length": 526,
        "Range": [0, 1, 0, 1, 0, 1, 0, 1],
        "Size": [1024]
    ]
    let dict: NSDictionary = [
        "Cs2": ["Separation", "Spot1", "DeviceCMYK", dict0]
    ]
    let space = CGColorSpace(propertyListPlist: dict as CFPropertyList)
    if space == nil {
        DebugLog("Spot1 color space is nil!")
    }
    return space
}
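For comparison, my understanding (not confirmed by any documentation I could find) is that CGColorSpace(propertyListPlist:) expects a property list previously produced by CGColorSpaceCopyPropertyList, rather than a hand-built dictionary of raw PDF entries, so a round trip like the sketch below is the usage the API seems designed for. The Swift spelling copyPropertyList() is my assumption for how CGColorSpaceCopyPropertyList is imported; adjust if your SDK differs.
import CoreGraphics

// Sketch: serialise an existing colour space and rebuild it from the plist.
// Whether a Separation space such as Spot1 survives this round trip is
// something that would have to be tested; not every colour space type can
// be serialised this way.
func roundTrip(_ space: CGColorSpace) -> CGColorSpace? {
    guard let plist = space.copyPropertyList() else {
        return nil   // this colour space type cannot be serialised to a plist
    }
    return CGColorSpace(propertyListPlist: plist)
}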
Does anyone know if it is possible to get Core Location data on a MacBook from an iPhone?
That is, either by using a Bluetooth connection between the MacBook and the iPhone, a USB/Lightning cable, or a Wi-Fi hotspot.
Note that the phone will only have a GPS signal - no Wi-Fi or mobile network.
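Not a definitive answer, but one approach that seems plausible: run Core Location on the phone and forward each fix to the Mac over a MultipeerConnectivity session, which can use peer-to-peer Wi-Fi/Bluetooth without any internet connection. This is a sketch of the iPhone side only; session discovery and advertising are omitted, and `session` is assumed to be an already-connected MCSession.
import Foundation
import CoreLocation
import MultipeerConnectivity

final class LocationForwarder: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private let session: MCSession

    init(session: MCSession) {
        self.session = session
        super.init()
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let location = locations.last, !session.connectedPeers.isEmpty else { return }
        // CLLocation conforms to NSSecureCoding, so it can be archived directly
        // and unarchived on the Mac side.
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: location,
                                                        requiringSecureCoding: true) {
            try? session.send(data, toPeers: session.connectedPeers, with: .reliable)
        }
    }
}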
I am trying to use a C++ library (the Sony Remote SDK) in a macOS project. I have successfully created an Objective-C test project, but when I tried to create a SwiftUI version of the same project I get a compiler error complaining that the #include <atomic> file is not found.
This line is in one of the C++ header files, and I have a wrapper Objective-C++ object which imports the C++ header file.
How can I fix this issue?
Any reason the .toolbar modifier doesn't work in the macOS 11 beta? Is there some global setting required in the app to enable it? No toolbar is being shown at all.
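For reference, this is the minimal arrangement I would expect to show a toolbar on macOS 11 (a sketch, not a confirmed answer): .toolbar attached to content inside a NavigationView, with an explicit ToolbarItem placement.
import SwiftUI

struct ContentView: View {
    var body: some View {
        NavigationView {
            Text("Sidebar")
            Text("Detail")
                .toolbar {
                    // An explicit placement; macOS can be pickier than iOS
                    // about where unplaced items end up.
                    ToolbarItem(placement: .primaryAction) {
                        Button("Add") { print("Add tapped") }
                    }
                }
        }
    }
}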
I guess Apple won't respond to this, but does anyone know what the timeline might be for Apple to provide support for Sony Alpha RAW files?
Or what is the typical timeframe for releasing an update to provide this support?
I am trying to create a TextView to use with SwiftUI and have used the code shown below to create it.
This all seems to work fine, except that the TextView retains the binding to the first object for all subsequent updates.
For example, if the TextView is used in a Master-Detail arrangement then it will always update the first object that was selected. It seems the binding does not update to subsequent objects.
I have created a small sample application you can test out here. Run the app, select one of the objects in the left panel list, and then try editing the text view and the text field. The text fields work as expected but the text view does not.
https://duncangroenewald.com/files/SampleApps/TextView.zip
import SwiftUI
// OSTextView is a subclass of NSTextView - but just use NSTextView if required
#if !os(macOS)
struct TextView: UIViewRepresentable {
    @Binding var attributedText: NSAttributedString

    func makeUIView(context: Context) -> OSTextView {
        let textView = OSTextView()
        textView.delegate = context.coordinator
        return textView
    }

    func updateUIView(_ uiView: OSTextView, context: Context) {
        uiView.attributedText = attributedText
    }

    func makeCoordinator() -> Coordinator {
        Coordinator($attributedText)
    }

    class Coordinator: NSObject, UITextViewDelegate {
        var text: Binding<NSAttributedString>

        init(_ text: Binding<NSAttributedString>) {
            self.text = text
        }

        func textViewDidChange(_ textView: UITextView) {
            self.text.wrappedValue = textView.attributedText
        }
    }
}
#endif

#if os(macOS)
struct TextView: NSViewRepresentable {
    @Binding var attributedText: NSAttributedString

    func makeNSView(context: Context) -> OSTextView {
        let textView = OSTextView(frame: .zero)
        textView.delegate = context.coordinator
        return textView
    }

    func updateNSView(_ nsView: OSTextView, context: Context) {
        nsView.textStorage?.setAttributedString(attributedText)
    }

    func makeCoordinator() -> Coordinator {
        return Coordinator($attributedText)
    }

    class Coordinator: NSObject, NSTextViewDelegate {
        var text: Binding<NSAttributedString>

        init(_ text: Binding<NSAttributedString>) {
            self.text = text
            super.init()
        }

        func textDidChange(_ notification: Notification) {
            if let textView = notification.object as? NSTextView {
                self.text.wrappedValue = textView.attributedString()
            }
        }
    }
}
#endif
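For anyone hitting the same thing: the likely cause is that the Coordinator keeps the Binding it was given in makeCoordinator and never sees the new binding when SwiftUI reuses the representable for a different selection. A sketch of one possible fix for the macOS variant (the iOS branch would be analogous) is to refresh the coordinator's binding on every update:
func updateNSView(_ nsView: OSTextView, context: Context) {
    // Point the coordinator at the *current* binding so edits are written
    // back to the currently selected object, not the first one.
    context.coordinator.text = $attributedText

    // Only push the model value in if it actually differs, to avoid
    // clobbering the view while the user is typing.
    if nsView.attributedString() != attributedText {
        nsView.textStorage?.setAttributedString(attributedText)
    }
}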
Why is NSAlert.runModal() crashing when called from within a Task.init { } block?
Is this a bug? I can upload a sample program if required.
@IBAction func start(_ sender: Any) {
    isBusy = true
    log("start")
    Task.init {
        log("sync start")
        let (result, message) = await asyncObj.process(5, progress: progressCallback)
        isBusy = false
        log(message)
        log("sync end") // Reports as running in main thread
        // CRASHES HERE !
        alert = self.dialogOK(question: "Some question", text: "Some thing else")
        log("task end") // Reports as running in main thread
    }
}

func dialogOK(question: String, text: String) -> NSAlert {
    let alert = NSAlert()
    alert.messageText = question
    alert.informativeText = text
    alert.alertStyle = .warning
    alert.addButton(withTitle: "OK")
    return alert //.runModal() == .alertFirstButtonReturn
}
/// This process MUST run on a background thread - it cannot run on the main thread
/// So how do I force it to run on a background thread ?
func process(_ count: Int, progress: @escaping (Double) -> Void) async -> (Bool, String) {
    // Still on the main thread here ??
    log("1 First part on the main thread ??")
    let task = Task.detached { () -> (Bool, String) in
        //Thread.sleep(forTimeInterval: 0.1)
        log("2 Run a loop in the background")
        log("  Thread: \(Thread.isMainThread ? "UI" : "BG")")
        var cnt = 0
        for i in 0..<count {
            Thread.sleep(forTimeInterval: 0.025) // sleep(seconds: 0.1)
            log("  i: \(i)")
            cnt += 1
            progress(Double(cnt)/Double(count)*100)
        }
        log("  >>>>")
        return (true, "Background task Done")
    }
    let result = await task.value
    log("2 First part on the main thread ??")
    return result
}
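In case it is useful, my working theory (an assumption, not a confirmed diagnosis) is that the unstructured Task is not guaranteed to run the AppKit work on the main actor, and NSAlert must be created and run on the main thread. A sketch of the pattern I would try, pinning the task to the main actor explicitly while leaving process() to do its heavy work in its own detached task:
@IBAction func start(_ sender: Any) {
    isBusy = true
    Task { @MainActor in
        let (_, message) = await asyncObj.process(5, progress: progressCallback)
        isBusy = false
        log(message)

        // NSAlert is created and run on the main actor.
        let alert = dialogOK(question: "Some question", text: "Some thing else")
        _ = alert.runModal()
    }
}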
Is it possible to change the drag-and-drop preview image? I am currently using the following code, but the previewImageHandler never gets called and the image is always just a rendering of the visible portion of the grid view.
...
grid
    .drag(if: isDraggable, data: {
        return self.dragData()
    })
...

func dragData() -> NSItemProvider {
    let itemProvider = NSItemProvider(object: fileService)
    itemProvider.previewImageHandler = { (handler, _, _) -> Void in
        os_log("previewImageHandler called")
        if let image = NSImage(named: "film") {
            handler?(image as NSSecureCoding?, nil)
        } else {
            let error = NSError(domain: "", code: 001,
                                userInfo: [NSLocalizedDescriptionKey: "Unable to create preview image"])
            handler?(nil, error)
        }
    }
    return itemProvider
}

struct Draggable: ViewModifier {
    let condition: Bool
    let data: () -> NSItemProvider

    @ViewBuilder
    func body(content: Content) -> some View {
        if condition {
            content.onDrag(data)
        } else {
            content
        }
    }
}

extension View {
    public func drag(if condition: Bool, data: @escaping () -> NSItemProvider) -> some View {
        self.modifier(Draggable(condition: condition, data: data))
    }
}
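As an aside, if the deployment target allows it, newer SwiftUI has an onDrag overload that takes a preview view, which avoids NSItemProvider's previewImageHandler entirely. I believe this overload arrived around iOS 15 / macOS 12, so treat the availability as an assumption to verify, and "film" is assumed to be an image in the asset catalog. A sketch of the modifier body using it:
@ViewBuilder
func body(content: Content) -> some View {
    if condition {
        // Supply an explicit drag preview instead of the default snapshot.
        content.onDrag(data) {
            Image("film")
                .resizable()
                .frame(width: 80, height: 60)
        }
    } else {
        content
    }
}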