Any reason the .toolbar modifier doesn't work in the macOS 11 beta? Is there some global setting required in the app to enable it? No toolbar is being shown at all.
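For comparison, here is a minimal arrangement that is expected to produce a toolbar on macOS 11 without any app-level configuration. This is a sketch; the view, frame, and button are illustrative, not taken from the project above:

```swift
import SwiftUI

// Minimal macOS toolbar sketch: attach .toolbar to the window's root view
// and give the item an explicit placement.
struct ContentView: View {
    var body: some View {
        Text("Hello, toolbar")
            .frame(minWidth: 300, minHeight: 200)
            .toolbar {
                ToolbarItem(placement: .primaryAction) {
                    Button("Refresh") {
                        // Illustrative action.
                        print("refresh tapped")
                    }
                }
            }
    }
}
```

If even this shows nothing, one thing worth checking is whether the view carrying .toolbar is actually part of the window's content hierarchy; a toolbar attached to a view that never appears in the window is silently ignored.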
I guess Apple won't respond to this, but does anyone know what the timeline might be for Apple to provide support for Sony Alpha RAW files?
Or what is the typical timeframe for releasing an update that provides this support?
I have an application that uses a Transformable property in Core Data to store NSAttributedStrings, and I get this compiler warning:
Object.property is using a nil or insecure value transformer. Please switch to NSSecureUnarchiveFromDataTransformerName or a custom NSValueTransformer subclass of NSSecureUnarchiveFromDataTransformer [2]
NSSecureUnarchiveFromDataTransformerName does not support archiving and unarchiving of NSAttributedString, so as I understand it I have to create a custom transformer, register it in the AppDelegate, and enter the transformer class name in the Core Data model's property details.
Below is the custom transformer class; however, I get an error when trying to decode existing attributed strings. Can anyone shed any light on this? Why is the new unarchiver unable to handle the NSFileWrapper, given that it is a property of the NSTextAttachment and works fine with the deprecated unarchiver?
Is this a bug or intentional?
Is there some way to add support for unarchiving the NSFileWrapper?
@implementation NSAttributedStringValueTransformer

+ (Class)transformedValueClass {
    return [NSAttributedString class];
}

+ (void)initialize {
    [NSValueTransformer setValueTransformer:[[self alloc] init]
                                    forName:@"NSAttributedStringValueTransformer"];
}

+ (BOOL)allowsReverseTransformation {
    return YES;
}

- (NSData *)transformedValue:(NSAttributedString *)value {
    NSError *error;
    NSData *stringAsData = [NSKeyedArchiver archivedDataWithRootObject:value
                                                 requiringSecureCoding:NO
                                                                 error:&error];
    if (error != nil) {
        NSLog(@"Error encoding attributed string: %@", error.localizedDescription);
        return nil;
    }
    return stringAsData;
}

- (NSAttributedString *)reverseTransformedValue:(NSData *)value {
    NSError *error;
    /* This works: */
    /* NSAttributedString *string = [NSKeyedUnarchiver unarchiveObjectWithData:value]; */
    /* This fails with "The data couldn't be read because it isn't in the correct format.": */
    NSAttributedString *string = [NSKeyedUnarchiver unarchivedObjectOfClass:[NSAttributedString class]
                                                                   fromData:value
                                                                      error:&error];
    if (error != nil) {
        NSLog(@"Error decoding attributed string: %@", error);
        return nil;
    }
    return string;
}

@end
Resulting Error:
Error decoding attributed string:
Error Domain=NSCocoaErrorDomain Code=4864 "value for key 'NSFileWrapper' was of unexpected class 'NSFileWrapper (0x1c9d40d48) [/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/Foundation.framework]'. Allowed classes are '{(
"NSURL (0x1c9d1b988) [/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/CoreFoundation.framework]",
"NSAttributedString (0x1c9d36668) [/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/Foundation.framework]",
"NSFont (0x1c9e1a3c8) [/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/UIFoundation.framework]",
"NSDictionary (0x1c9d1adf8) [/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/CoreFoundation.framework]",
"NSArray (0x1c9d1ab28) [/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/CoreFoundation.framework]",
"NSColor (0x1c9ec2198) [/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/UIKitCore.framework]",
"NSTextAttachment (0x1c9e1aad0) [/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/UIFoundation.framework]",
"NSGlyphInfo (0x1c9e198d8) [/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/UIFoundation.framework]",
...
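For completeness, one possible workaround, not a confirmed fix: the allowed-class list shown in the error is baked into the framework's decode of NSTextAttachment, so until it includes NSFileWrapper you can drive NSKeyedUnarchiver manually with requiresSecureCoding turned off, which mirrors what the deprecated unarchiveObjectWithData: was doing. A sketch in Swift (decodeAttributedString is an illustrative name; this trades away secure coding on the decode path):

```swift
import Foundation

// Hedged workaround sketch: decode with a manually configured unarchiver
// whose requiresSecureCoding is off, so nested objects such as
// NSFileWrapper are not class-checked during decoding.
func decodeAttributedString(from data: Data) -> NSAttributedString? {
    guard let unarchiver = try? NSKeyedUnarchiver(forReadingFrom: data) else {
        return nil
    }
    unarchiver.requiresSecureCoding = false
    defer { unarchiver.finishDecoding() }
    // Non-secure decode: accepts whatever class was archived for the root key.
    return unarchiver.decodeObject(forKey: NSKeyedArchiveRootObjectKey) as? NSAttributedString
}
```

The archiving direction can keep using archivedDataWithRootObject:requiringSecureCoding:NO as above. A stricter alternative would be a NSSecureUnarchiveFromDataTransformer subclass, but that still runs into the missing NSFileWrapper entry shown in the error.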
I would like to get arrays of red, green and blue histogram data from the output of the CIAreaHistogram filter. My current approach is not working.
According to the docs, CIAreaHistogram returns an image with width equal to the bin count (256 in my case) and height 1,
so each pixel contains the counts of the RGB values for that bin.
if let areahistogram = self.areaHistogramFilter(ciImage) {
    let attrs = [kCVPixelBufferCGImageCompatibilityKey: kCFBooleanTrue,
                 kCVPixelBufferCGBitmapContextCompatibilityKey: kCFBooleanTrue] as CFDictionary
    var pixelBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     Int(areahistogram.extent.size.width),
                                     Int(areahistogram.extent.size.height),
                                     kCVPixelFormatType_32ARGB,
                                     attrs,
                                     &pixelBuffer)
    guard status == kCVReturnSuccess else {
        return
    }
    self.hContext.render(areahistogram, to: pixelBuffer!)
    CVPixelBufferLockBaseAddress(pixelBuffer!, CVPixelBufferLockFlags(rawValue: 0))
    let int32Buffer = unsafeBitCast(CVPixelBufferGetBaseAddress(pixelBuffer!),
                                    to: UnsafeMutablePointer<UInt32>.self)
    let int32PerRow = CVPixelBufferGetBytesPerRow(pixelBuffer!)
    var data = [Int]()
    for i in 0..<256 {
        /* Get the BGRA value for each pixel */
        let BGRA = int32Buffer[i]
        data.append(Int(BGRA))
        let red = (BGRA >> 16) & 0xFF
        let green = (BGRA >> 8) & 0xFF
        let blue = BGRA & 0xFF
        os_log("data[\(i)]:\(BGRA) red: \(red) green: \(green) blue: \(blue)")
    }
    CVPixelBufferUnlockBaseAddress(pixelBuffer!, CVPixelBufferLockFlags(rawValue: 0))
}
results in zeros.
data[0]:0 red: 0 green: 0 blue: 0
data[1]:0 red: 0 green: 0 blue: 0
...
data[255]:134678783 red: 7 green: 7 blue: 7
The following similarly produces a bunch of zeros:
let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer!)
let buffer = baseAddress!.assumingMemoryBound(to: UInt8.self)
for i in stride(from: 0, to: 256 * 4, by: 4) {
    let blue = buffer[i]
    let green = buffer[i + 1]
    let red = buffer[i + 2]
    os_log("data[\(i)]: red: \(red) green: \(green) blue: \(blue)")
}
Or this variation, which seems simpler:
var red = [UInt8]()
var green = [UInt8]()
var blue = [UInt8]()
for i in 0..<256 {
    // Get the BGRA value for each pixel
    let BGRA = int32Buffer[i]
    withUnsafeBytes(of: BGRA.bigEndian) {
        red.append($0[0])
        green.append($0[1])
        blue.append($0[2])
    }
}
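One likely culprit, offered as a guess: CIAreaHistogram writes floating-point bin values, so rendering into an 8-bit kCVPixelFormatType_32ARGB buffer can quantise nearly everything to zero. A sketch that reads the output as Float32 instead, assuming `histogram` is the filter's 256x1 outputImage and `context` is your hContext:

```swift
import CoreImage

// Hedged sketch: render CIAreaHistogram output into a Float32 RGBA bitmap
// instead of an 8-bit pixel buffer, then split out the channel arrays.
func histogramChannels(from histogram: CIImage,
                       context: CIContext,
                       bins: Int = 256) -> (red: [Float], green: [Float], blue: [Float]) {
    var bitmap = [Float](repeating: 0, count: bins * 4)
    context.render(histogram,
                   toBitmap: &bitmap,
                   rowBytes: bins * 4 * MemoryLayout<Float>.stride,
                   bounds: CGRect(x: 0, y: 0, width: bins, height: 1),
                   format: .RGBAf,
                   colorSpace: nil)
    var red = [Float](), green = [Float](), blue = [Float]()
    for i in 0..<bins {
        red.append(bitmap[i * 4])
        green.append(bitmap[i * 4 + 1])
        blue.append(bitmap[i * 4 + 2])
    }
    return (red, green, blue)
}
```

Depending on inputScale, the bin values can be small fractions well below 1.0, which survive in a float bitmap but would floor to 0 in an 8-bit buffer.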
Is it possible to change the drag-and-drop preview image? I am currently using the following code, but the previewImageHandler never gets called and the image is always just a rendering of the visible portion of the grid view.
...
grid
.drag(if: isDraggable, data: {
return self.dragData()
})
...
func dragData() -> NSItemProvider {
    let itemProvider = NSItemProvider(object: fileService)
    itemProvider.previewImageHandler = { (handler, _, _) -> Void in
        os_log("previewImageHandler called")
        if let image = NSImage(named: "film") {
            handler?(image as NSSecureCoding?, nil)
        } else {
            let error = NSError(domain: "", code: 1,
                                userInfo: [NSLocalizedDescriptionKey: "Unable to create preview image"])
            handler?(nil, error)
        }
    }
    return itemProvider
}
struct Draggable: ViewModifier {
    let condition: Bool
    let data: () -> NSItemProvider

    @ViewBuilder
    func body(content: Content) -> some View {
        if condition {
            content.onDrag(data)
        } else {
            content
        }
    }
}

extension View {
    public func drag(if condition: Bool, data: @escaping () -> NSItemProvider) -> some View {
        self.modifier(Draggable(condition: condition, data: data))
    }
}
Is it possible to create a SwiftUI toolbar item that refreshes its content in response to model changes? I want to display some text in the toolbar on macOS, but the Text() view does not update its content when the model changes.
Is there some other way to achieve this with SwiftUI toolbars on macOS?
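One pattern that may help, sketched with illustrative names: drive the toolbar text from an ObservableObject that is observed by the same view declaring the toolbar, so SwiftUI re-evaluates the toolbar content when the model publishes a change:

```swift
import SwiftUI

// Hedged sketch: the view observes the model, so the ToolbarItem's Text
// is re-rendered whenever model.status changes.
final class StatusModel: ObservableObject {
    @Published var status = "Idle"
}

struct StatusView: View {
    @ObservedObject var model: StatusModel

    var body: some View {
        Text("Content")
            .toolbar {
                ToolbarItem(placement: .status) {
                    // Re-evaluated on every published change to the model.
                    Text(model.status)
                }
            }
    }
}
```

If the model is observed only by a parent view and the toolbar is declared elsewhere, the toolbar content may never be invalidated; keeping the observation and the .toolbar declaration in the same view avoids that.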
According to the Core Image documentation, the following API should return the RAW image's native output size.
let filter = CIFilter(imageURL: url, options: nil)
let value = filter.value(forKey: CIRAWFilterOption.outputNativeSize.rawValue) as? CIVector
However, this seems to always return the camera's native resolution rather than the actual image size contained in the RAW file.
For example, if I use this API on a RAW file that was shot at a 16:9 aspect ratio on a Sony 19, the image size should be 6000 x 3376, but this API call returns 6000 x 4000.
Is this a bug, or am I missing something? Is there another API call to get the actual image size?
Note that the EXIF data does contain the correct image size.
I get a lot of HEIF images with diagonal lines through them when using this API to generate output files. The same files output in other formats show no sign of the diagonal lines.
This problem only seems to occur when the input CIImage is cropped, and then only for certain crop dimensions.
Can someone confirm they see the same issue? I have created a test Playground that includes two sample RAW files. You will need to change the path to the input files, and you can change the crop dimensions to test different combinations. It seems the diagonal lines are only caused by certain width values; see the few crop examples in the Playground. I have logged a bug with Apple (FB9096406), but it would be good to get confirmation that this is a bug.
I have tried cropping with the CICrop filter and with the CIImage.cropped(to:) API and get the same results.
A zip file with the test playground for reproducing the issue is at the link below. I can't post the actual link, so replace xxxx with https to download the file:
xxxx://duncangroenewald.com/files/CoreImageHEIFExporterBug.zip
I see some others have posted links on this forum; why is it I can't seem to do that?
How do I go about debugging this crash?
Crash log - https://developer.apple.com/forums/content/attachment/fb4f1046-e867-4c87-ae6f-1d8ce690fdf7
I am trying to get an image drawn of a CALayer containing a number of sublayers positioned at specific points, but at the moment it does not honour the zPosition of the sublayers when I use CALayer.render(in:). It works fine on screen, but when rendering to PDF it seems to render them in the order they were created.
The sublayers are positioned (x, y, angle) on the drawing layer.
One solution seems to be to override the render(in:) method on the drawing layer, which seems to work except that the sublayers are rendered in the incorrect position: they are all in the bottom-left corner (0, 0) and not rotated correctly.
If I don't override this method then they are positioned correctly but in the wrong zPosition order, i.e. ones that should be at the bottom (zPosition = 0) are at the top.
What am I missing here? It seems I need to position the sublayers correctly somehow in the render(in:) function.
How do I do this? These sublayers have already been positioned on screen, and all I am trying to do is generate an image of the drawing using a single rendering function.
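One approach worth trying, offered strictly as a sketch: walk the sublayers yourself in zPosition order and re-apply each sublayer's geometry by hand, since CALayer.render(in:) does not honour zPosition ordering. This assumes each sublayer's placement is fully described by its frame origin plus affineTransform(); anchorPoint, 3D transforms, and geometryFlipped are ignored and may need handling for your layers:

```swift
import QuartzCore

// Hedged sketch, not a verified fix: render sublayers sorted by zPosition,
// translating to each sublayer's frame origin and applying its 2D transform
// before asking the sublayer to render itself.
func renderSublayersByZPosition(of layer: CALayer, in ctx: CGContext) {
    let ordered = (layer.sublayers ?? []).sorted { $0.zPosition < $1.zPosition }
    for sub in ordered {
        ctx.saveGState()
        // Move to the sublayer's position, then apply its rotation/scale.
        ctx.translateBy(x: sub.frame.origin.x, y: sub.frame.origin.y)
        ctx.concatenate(sub.affineTransform())
        sub.render(in: ctx)
        ctx.restoreGState()
    }
}
```

This also explains the bottom-left-corner symptom: when you call render(in:) on each sublayer yourself, nothing applies the sublayer's position or transform for you, so the context must be set up before each call.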
Why is NSAlert.runModal() crashing when called from within a Task.init { } block?
Is this a bug? I can upload a sample program if required.
@IBAction func start(_ sender: Any) {
    isBusy = true
    log("start")
    Task.init {
        log("sync start")
        let (result, message) = await asyncObj.process(5, progress: progressCallback)
        isBusy = false
        log(message)
        log("sync end") // Reports as running on the main thread
        // CRASHES HERE!
        alert = self.dialogOK(question: "Some question", text: "Some thing else")
        log("task end") // Reports as running on the main thread
    }
}

func dialogOK(question: String, text: String) -> NSAlert {
    let alert = NSAlert()
    alert.messageText = question
    alert.informativeText = text
    alert.alertStyle = .warning
    alert.addButton(withTitle: "OK")
    return alert // .runModal() == .alertFirstButtonReturn
}

/// This process MUST run on a background thread - it cannot run on the main thread.
/// So how do I force it to run on a background thread?
func process(_ count: Int, progress: @escaping (Double) -> Void) async -> (Bool, String) {
    // Still on the main thread here??
    log("1 First part on the main thread??")
    let task = Task.detached { () -> (Bool, String) in
        // Thread.sleep(forTimeInterval: 0.1)
        log("2 Run a loop in the background")
        log("   Thread: \(Thread.isMainThread ? "UI" : "BG")")
        var cnt = 0
        for i in 0..<count {
            Thread.sleep(forTimeInterval: 0.025) // sleep(seconds: 0.1)
            log("   i: \(i)")
            cnt += 1
            progress(Double(cnt) / Double(count) * 100)
        }
        log("   >>>>")
        return (true, "Background task Done")
    }
    let result = await task.value
    log("3 Second part, back on the main thread??")
    return result
}
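For what it's worth, one pattern that keeps AppKit calls off arbitrary executors is to hop to the main actor explicitly before presenting. This is a sketch rather than a confirmed fix for the crash; presentAlert and startWork are illustrative stand-ins for dialogOK and the IBAction above:

```swift
import AppKit

// Present the alert only from the main actor.
@MainActor
func presentAlert(question: String, text: String) {
    let alert = NSAlert()
    alert.messageText = question
    alert.informativeText = text
    alert.alertStyle = .warning
    alert.addButton(withTitle: "OK")
    _ = alert.runModal()
}

func startWork() {
    Task.detached {
        // Stand-in for: await asyncObj.process(5, progress: progressCallback)
        let message = "Background task Done"
        // Hop to the main actor before any AppKit call.
        await presentAlert(question: "Some question", text: message)
    }
}
```

Task.detached also bears on the second question in the code above: unlike Task.init, a detached task does not inherit the caller's actor context, so its body starts off the main thread without any extra work.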
I am trying to create a TextView for use with SwiftUI and have used the code shown below.
This all seems to work fine, except that the TextView retains the binding to the first object for all subsequent updates.
For example, if the TextView is used in a Master-Detail arrangement, it will always update the first object that was selected; the binding does not seem to update to subsequent objects.
I have created a small sample application you can test out here. Run the app, select one of the objects in the left panel list, and then try editing the text view and the text field. The text fields work as expected, but the text view does not.
https://duncangroenewald.com/files/SampleApps/TextView.zip
import SwiftUI
// OSTextView is a subclass of NSTextView - but just use NSTextView if required
#if !os(macOS)
struct TextView: UIViewRepresentable {
@Binding var attributedText: NSAttributedString
func makeUIView(context: Context) -> OSTextView {
let textView = OSTextView()
textView.delegate = context.coordinator
return textView
}
func updateUIView(_ uiView: OSTextView, context: Context) {
uiView.attributedText = attributedText
}
func makeCoordinator() -> Coordinator {
Coordinator($attributedText)
}
class Coordinator: NSObject, UITextViewDelegate {
var text: Binding<NSAttributedString>
init(_ text: Binding<NSAttributedString>) {
self.text = text
}
func textViewDidChange(_ textView: UITextView) {
self.text.wrappedValue = textView.attributedText
}
}
}
#endif
#if os(macOS)
struct TextView: NSViewRepresentable {
@Binding var attributedText: NSAttributedString
func makeNSView(context: Context) -> OSTextView {
let textView = OSTextView(frame: .zero)
textView.delegate = context.coordinator
return textView
}
func updateNSView(_ nsView: OSTextView, context: Context) {
nsView.textStorage?.setAttributedString(attributedText)
}
func makeCoordinator() -> Coordinator {
return Coordinator($attributedText)
}
class Coordinator: NSObject, NSTextViewDelegate {
var text: Binding<NSAttributedString>
init(_ text: Binding<NSAttributedString>) {
self.text = text
super.init()
}
func textDidChange(_ notification: Notification) {
if let textView = notification.object as? NSTextView {
self.text.wrappedValue = textView.attributedString()
}
}
}
}
#endif
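One change that may address the stale binding, offered as a sketch rather than a verified fix: have updateNSView reassign the coordinator's binding on every update, since the coordinator otherwise keeps whichever binding it captured when it was created. A condensed macOS-only version with an illustrative name (RefreshingTextView, using NSTextView directly):

```swift
import SwiftUI

#if os(macOS)
// Hedged sketch: same shape as the TextView above, but updateNSView
// re-points the coordinator at the current binding each time SwiftUI
// updates the view, so edits flow to the currently selected object.
struct RefreshingTextView: NSViewRepresentable {
    @Binding var attributedText: NSAttributedString

    func makeNSView(context: Context) -> NSTextView {
        let textView = NSTextView(frame: .zero)
        textView.delegate = context.coordinator
        return textView
    }

    func updateNSView(_ nsView: NSTextView, context: Context) {
        // Refresh the coordinator's binding for the current model object.
        context.coordinator.text = $attributedText
        if nsView.attributedString() != attributedText {
            nsView.textStorage?.setAttributedString(attributedText)
        }
    }

    func makeCoordinator() -> Coordinator { Coordinator($attributedText) }

    final class Coordinator: NSObject, NSTextViewDelegate {
        var text: Binding<NSAttributedString>
        init(_ text: Binding<NSAttributedString>) { self.text = text }

        func textDidChange(_ notification: Notification) {
            if let textView = notification.object as? NSTextView {
                text.wrappedValue = textView.attributedString()
            }
        }
    }
}
#endif
```

The equality check before setAttributedString also avoids resetting the text storage (and the insertion point) on updates triggered by the coordinator's own writes.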
I am trying to use a C++ library (the Sony Remote SDK) in a macOS project. I successfully created an Objective-C test project, but when I tried to create a SwiftUI version of the same project I get a compiler error complaining that the #include <atomic> file is not found.
This line is in one of the C++ header files, and I have a wrapper Objective-C++ object which imports the C++ header file.
How can I fix this issue?
Does anyone know if it is possible to get Core Location data on a MacBook from an iPhone?
That is, by using a Bluetooth connection between the MacBook and iPhone, a USB/Lightning cable, or a Wi-Fi hotspot.
Note that the phone will only have a GPS signal: no Wi-Fi or mobile network.
I am trying to generate a PDF file with certain components drawn in spot colours. Spot colours are used for printing, and I am not clear on how to set this up, but I think I need to create a custom colour space with a specific name, or a colour that has a specific name: our printer looks for the name Spot1 and substitutes the colour green for it.
Can anyone shed any light on how I might be able to do this? For reference I have attached two PDF files, each containing a different spot colour.
I need to be able to create something similar using CGContext and CGPDFDocument. I can already generate the PDF documents using CMYK colours, but I don't know how to create the equivalent "spot" colours.
At the moment I am loading the page from the attached PDF files and scaling it to fill the page to get a background with the spot colour. This works fine, but I also need to draw text and lines using the same spot colour, and I am not clear how I could do that with the Core Graphics APIs.
My guess is that I need to create a custom colour space containing a single colour and then draw with that colour.
The only 'custom' option for creating a CGColorSpace seems to be the CGColorSpace(propertyListPlist:) initializer, but there does not appear to be any documentation on what needs to be in the property list, nor can I find any examples.
Any pointers would be appreciated.
Regards