Reply to macOS Shortcut stuck halfway
Answering my own post here. It was something extremely simple: I hadn't actually implemented - (nullable id)application:(NSApplication *)application handlerForIntent:(INIntent *)intent in the app delegate. I had implemented something like it, but had misspelled the selector. I still don't know how an "App Intent" differs from an intent in a .intentdefinition file.
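For anyone landing here with the same symptom, this is roughly what that delegate method looks like in Swift (macOS 12 and later). A minimal sketch, assuming an intent defined in a .intentdefinition file whose generated class and handler are named MyIntent and MyIntentHandler here (hypothetical names; substitute your own):

```swift
import AppKit
import Intents

class AppDelegate: NSObject, NSApplicationDelegate {
    // Shortcuts asks the app for an object that can handle the intent.
    // If this method is missing (or the Objective-C selector is misspelled),
    // the shortcut stalls partway through, as described above.
    func application(_ application: NSApplication, handlerFor intent: INIntent) -> Any? {
        if intent is MyIntent {            // MyIntent: class generated from the .intentdefinition
            return MyIntentHandler()       // MyIntentHandler: conforms to the generated MyIntentHandling protocol
        }
        return nil
    }
}
```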
Topic: App & System Services SubTopic: Core OS Tags:
Oct ’22
Reply to NoSleep for Mac
You haven't said why you want to do this, but here are some suggestions:
- System Preferences > Battery: set the machine to not sleep and not send the display to sleep
- 'man pmset' in Terminal
- favorite search engine: "Caffeine macOS"
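If the reason is that you want to keep the Mac awake from your own code (which is essentially what Caffeine-style utilities do), a power-management assertion is the usual route. A minimal Swift sketch, with a hypothetical reason string:

```swift
import IOKit.pwr_mgt

// Hold an assertion while doing work that must not be interrupted by idle sleep.
var assertionID: IOPMAssertionID = 0
let result = IOPMAssertionCreateWithName(
    kIOPMAssertionTypePreventUserIdleSystemSleep as CFString,
    IOPMAssertionLevel(kIOPMAssertionLevelOn),
    "Long-running export" as CFString,   // hypothetical reason string; shows up in `pmset -g assertions`
    &assertionID
)

if result == kIOReturnSuccess {
    // ... do the work that must not be interrupted by sleep ...
    IOPMAssertionRelease(assertionID)    // always release the assertion when finished
}
```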
Topic: Safari & Web SubTopic: General Tags:
Oct ’22
Reply to Save items from userdefault are not deleted even-though deleting the macOS app.
You are correct. The behavior might be different for sandboxed apps (I don't know; I don't write or maintain any). But why is this a problem for you?

During testing, you should delete any old preferences or user data, to simulate the first-install scenario. Users may install your app, try it out, delete it and then re-install it years later. During that time, your preferences may have changed, and you have to account for that.

If you want every install for end users to be like the first install, you can use an installer rather than drag-and-drop. The installer script can delete any lingering preferences before installation.

You might also be able to compare the 'date added' of your app to a timestamp you write into your preferences at first launch. If your app is now in /Applications, and was put there after the timestamp in your preferences, it has been run before (from somewhere), and you may want to clear the preferences to first-run default values. I believe the 'date added' attribute is only readable by parsing the result of a Spotlight query.
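A rough sketch of that last idea in Swift. It reads the Spotlight kMDItemDateAdded attribute through the MDItem API rather than a full metadata query; the preference key name "firstLaunchDate" is a hypothetical example:

```swift
import Foundation
import CoreServices

// Read the Spotlight 'date added' attribute for a file or bundle.
func dateAdded(forItemAt url: URL) -> Date? {
    guard let item = MDItemCreateWithURL(kCFAllocatorDefault, url as CFURL) else { return nil }
    return MDItemCopyAttribute(item, kMDItemDateAdded) as? Date
}

let defaults = UserDefaults.standard

if let firstLaunch = defaults.object(forKey: "firstLaunchDate") as? Date,
   let added = dateAdded(forItemAt: Bundle.main.bundleURL),
   added > firstLaunch {
    // The bundle landed in its current location after the recorded first launch,
    // so this is probably a re-install: reset preferences to first-run defaults here.
    print("Probable re-install (first launch \(firstLaunch), bundle added \(added))")
    // defaults.removePersistentDomain(forName: Bundle.main.bundleIdentifier!)  // one way to clear them
}

// Record the first-launch timestamp if it isn't there yet.
if defaults.object(forKey: "firstLaunchDate") == nil {
    defaults.set(Date(), forKey: "firstLaunchDate")
}
```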
Topic: Programming Languages SubTopic: Swift Tags:
Sep ’22
Reply to Windows Developer moving to macOS – please help me to start
You certainly can use C++ in macOS apps. By default, Xcode will compile files with a .m extension as Objective-C, and files with a .mm extension as Objective-C++. You can also select individual files in your project and tell Xcode to use a different compiler than the default, but I'd advise against this.

However, you'll notice that nearly all current Apple sample code is in Swift. Most of the hints and tips you'll find on the Internet will be for Swift, and any samples in Objective-C are often out of date. It isn't impossible to figure out what to do, but it is an additional hurdle.

There is a macOS GUI API called AppKit, and a similar one for iOS called UIKit. Again, you'll find the majority of posts on Apple programming are for iOS and UIKit, because there are far more phones out there than desktops. You can call into AppKit from ObjC, ObjC++ or Swift. If you have cross-platform library code written in C++ you can use that in a Swift or ObjC/C++ app, but you'll need to give it a C interface (see the sketch below).

I don't think VS Code can generate a macOS app (it can generate an iOS app), but I could be wrong. You can certainly run VS for Mac on your Mac, and use its editor if you're more comfortable with that. You can run Xcode from the command line if you want to fire off a build from VS, but Xcode integrates a debugger and I don't know how that would work in VS Code (for a macOS app).

A lot of the work in producing a shippable app isn't in the code itself; it is in localization, testing, packaging, signing and notarizing. Xcode can help with that.

Swift was introduced in 2014 and is quite mature now. If I were starting something brand new for macOS, I'd certainly write it in Swift.

If you're used to one platform and move to another, you'll find many things are different. The platforms are different, the user expectations are different and the tools and frameworks are different. Embrace that.
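Here's the sort of thing I mean by a C interface, as a minimal sketch with hypothetical names (MyLibShim.h, mylib_compute): the C++ stays behind a plain C header, and Swift only ever sees the C function.

```swift
// Assumed C header, exposed to Swift via the target's bridging header (hypothetical):
//
//   // MyLibShim.h
//   #ifdef __cplusplus
//   extern "C" {
//   #endif
//   double mylib_compute(double input);   // implemented in a .mm/.cpp file that calls into the C++ library
//   #ifdef __cplusplus
//   }
//   #endif
//
// With the shim visible, the Swift call site needs no knowledge of the C++ behind it:
let answer = mylib_compute(21.0)
print("mylib_compute returned \(answer)")
```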
Topic: App & System Services SubTopic: Core OS Tags:
Sep ’22
Reply to Trying to make an expandable table in Swift using API calls
My first advice would be: don't try to work with expandable table view cells if you're new to Swift, table views, etc. Tables with varying height and content are a pain.

The request is that a tap on a school shows more information about that school. So add a detail view segue when you tap on a table item, and have that detail view display all the detailed school information. On the phone, that will take up an entire screen, with a back button at the top left; you'll put the name of the school in the top middle, and you have a whole lot of space to display the per-school details on their own, outwith the table.

There are a whole lot of tutorials on displaying varying levels of database detail for iOS using more prosaic information display methods.
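A bare-bones sketch of that approach, assuming a hypothetical School model, a SchoolDetailViewController, and a storyboard segue named "ShowSchoolDetail" (all names are placeholders):

```swift
import UIKit

struct School {                              // hypothetical model filled from your API calls
    let name: String
}

class SchoolDetailViewController: UIViewController {
    var school: School?                      // the detail screen lays out all per-school fields from this
}

class SchoolsTableViewController: UITableViewController {
    var schools: [School] = []

    override func tableView(_ tableView: UITableView, didSelectRowAt indexPath: IndexPath) {
        // Push a full-screen detail view instead of expanding the cell in place.
        performSegue(withIdentifier: "ShowSchoolDetail", sender: schools[indexPath.row])
    }

    override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
        if segue.identifier == "ShowSchoolDetail",
           let detail = segue.destination as? SchoolDetailViewController,
           let school = sender as? School {
            detail.school = school
            detail.title = school.name       // shows in the navigation bar's top middle
        }
    }
}
```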
Topic: UI Frameworks SubTopic: UIKit Tags:
Jul ’22
Reply to Detecting background volume level in Swift
Read up on "noise gates". The audio samples are PCM float values between -1.0 and 1.0. That's the instantaneous value of the signal amplitude in the channel; it cannot exceed maximum volume. Values close to 0.0 are very quiet.

Assuming your microphone is correctly set up, when you speak loudly, close to the microphone, the peak signal amplitude will be close to 1.0. Background noise might be 30 to 40 dB below this, or at a level of about 0.03 to 0.01. A reasonable noise gate would measure the input values and cut any sound at or below this level, if it persists, while permitting sound above this level, again if it persists. A reasonable noise gate usually turns on quickly (so you don't cut off the beginning of speech) and turns off slowly (so you don't have abrupt cut-offs, which can sound jarring).

The first sample uses vDSP_meamgv, which is documented here: https://developer.apple.com/documentation/accelerate/1449731-vdsp_meamgv It isn't clear what LEVEL_LOWPASS_TRIG is supposed to be. It might be a value between 0 and 1. Its name suggests it is a trigger value (i.e. a threshold for the noise gate), but it seems to be used here to adjust how rapidly the averagePowerForChannel variables approach the measured value in a single buffer.

The second sample is easier to comprehend. It takes every incoming sample (a floating point value between -1 and +1) and finds its absolute value. If the current rectified sample is larger than the 'envelopeState', the envelopeState is increased by 0.16 of the difference, while if it is smaller the envelopeState is decreased by 0.03 of the difference. So 'envelopeState' will rise rapidly if there is input, and fall slowly if there is none, or if the volume of that input falls below 'envelopeState'. The routine uses the peak value of an array of 'envelopeState' values as a gate: the buffer is considered full of background noise only if that peak value is less than 0.015. Remember envelopeState isn't an absolute signal level; it is a filtered difference value.

Hope this helps.
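To make the second approach concrete, here is a small Swift sketch of that envelope follower and gate, using the same constants (0.16 attack, 0.03 release, 0.015 threshold). `samples` stands in for one buffer of PCM floats, e.g. copied out of an AVAudioPCMBuffer's floatChannelData:

```swift
let samples: [Float] = []        // stand-in for one buffer of PCM samples in -1.0...1.0

let attack: Float = 0.16         // fast rise, so the start of speech isn't clipped
let release: Float = 0.03        // slow fall, so the gate doesn't chatter
let threshold: Float = 0.015     // peak envelope below this => treat the buffer as background noise

var envelopeState: Float = 0
var peakEnvelope: Float = 0

for sample in samples {
    let rectified = abs(sample)                              // full-wave rectify the sample
    let diff = rectified - envelopeState
    envelopeState += diff * (diff > 0 ? attack : release)    // rise fast, fall slowly
    peakEnvelope = max(peakEnvelope, envelopeState)
}

let bufferIsBackgroundNoise = peakEnvelope < threshold
print("background noise only: \(bufferIsBackgroundNoise)")
```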
Topic: Programming Languages SubTopic: Swift Tags:
Jul ’22
Reply to Is there a way to remove system extension without rebooting the device?
I certainly haven't found a way to replace a running system extension (in my case, a Camera Extension) without rebooting. I don't know how this is supposed to be communicated to end users, because the impression given by Apple is that if they update your app (with its built-in extensions), the extensions are automatically updated. They are, but only at the next reboot. If you make a protocol change such that App version N only works with Extension version N, but not Extension version N-1, you'll have to tell the user that they need to reboot. The OS won't tell the user.
Topic: App & System Services SubTopic: Drivers Tags:
Jul ’22
Reply to Univeral installer for M1/Intel app
Why? The easiest thing to do is build a Universal application, at some (usually small) cost to the user in disk space.

If you really must do this, you'll need to read up on the documentation for installers: https://developer.apple.com/library/archive/documentation/DeveloperTools/Reference/DistributionDefinitionRef/Chapters/Distribution_XML_Ref.html#//apple_ref/doc/uid/TP40005370-CH100-SW7 and consult the man pages for pkgbuild and/or productbuild. Somewhere in there, you'll probably find that you can restrict what is installed based on the host architecture.
Topic: App & System Services SubTopic: Core OS Tags:
Jul ’22
Reply to is XPC from app to CMIOExtension possible?
As we learned in the "Create camera extensions with Core Media IO" presentation at WWDC 2022, a Camera Extension can present a sink interface to the system, accessible to an app via the CMIOHardwareObject.h and CMIOHardwareStream.h APIs. The Camera Extension template only offers a source stream, but it is pretty easy to add a sink stream. That's one way to get video into a Camera Extension; it works on macOS 12.3 and it doesn't require a helper daemon to enable the app and extension to find one another. In Ventura, the plan is to have the app's extension be owned by the app's owner, so XPC would be an option. Easier again is to have the extension itself handle the video sourcing, while the accompanying app only provides control and status reporting, using the property interfaces of the extension. That's the intent behind the design.
Topic: App & System Services SubTopic: Drivers Tags:
Jun ’22
Reply to is XPC from app to CMIOExtension possible?
I'll type this in the answer box then. If you can require 10.12 or later, it is much simpler than this. As I found out, an IOSurfaceRef cannot be directly transported over NSXPCConnection, but an IOSurface can. No need to do the CreateXPCObject or LookupFromXPCObject stuff unless you're on 10.7 to 10.11. The two types are toll-free bridged. The only thing I'm not sure about is which flavor of __bridge to use at each end. I'm thinking I need to write

```objc
IOSurfaceRef surface = CVPixelBufferGetIOSurface(pixelBuffer);
[remoteObjectProxy sendIOSurfaceToExtension:(__bridge_transfer IOSurface *)surface];
```

and at the receiving end

```objc
CVPixelBufferRef pixelBuffer;
CVPixelBufferCreateWithIOSurface(kCFAllocatorDefault, (__bridge_retained IOSurfaceRef)surface, ...);
```

I guess I'll find out...
Topic: App & System Services SubTopic: Drivers Tags:
May ’22