Is it possible to share multiple links at once to Messages on Mac Catalyst? When I provide multiple URLs via the UIActivityItemSource API, Messages just picks one of the links.
The Mail activity handles multiple links without a problem but I'd like this to work with Messages too. I know for sure this is possible in native AppKit but can't seem to figure out how to get this to work on Catalyst.
I tried providing the links to UIActivityViewController with a UIActivityItemsConfiguration object instead of using the UIActivityItemSource API but that didn't work either.
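Roughly what I tried with UIActivityItemsConfiguration looked like this (a sketch; the URLs are placeholders):

NSArray<NSURL *> *urls = @[[NSURL URLWithString:@"https://example.com/one"],
                           [NSURL URLWithString:@"https://example.com/two"]];
//NSURL conforms to NSItemProviderWriting, so the URLs can be handed to the configuration directly.
UIActivityItemsConfiguration *configuration = [[UIActivityItemsConfiguration alloc] initWithObjects:urls];
UIActivityViewController *activityViewController = [[UIActivityViewController alloc] initWithActivityItemsConfiguration:configuration];
[self presentViewController:activityViewController animated:YES completion:nil];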
Thanks in advance
Is there a way to programmatically determine the default size of an NSToolbarItem when creating one on Mac Catalyst like so:
UIImage *downArrowImage = [UIImage systemImageNamed:@"arrow.down"];
UIBarButtonItem *goDownUIBarButtonItem = [[UIBarButtonItem alloc] initWithImage:downArrowImage style:UIBarButtonItemStylePlain target:nil action:@selector(navigateDownward:)];
NSToolbarItem *nsToolbarItem = [NSToolbarItem itemWithItemIdentifier:itemIdentifier barButtonItem:goDownUIBarButtonItem];
Why am I asking? I need to create a toggle toolbar item (which requires me to change the toolbar item's image when the toggle is flipped).
I can do this by using NSUIViewToolbarItem and embedding a UIButton. But when you feed an SF Symbol image to a UIButton and call sizeToFit on it, it doesn't produce a view that matches the size of the UIBarButtonItem-based NSToolbarItems next to it. I can hard-code size constraints like this to make it size properly:
NSLayoutConstraint *width = [theUIButtonToEmbedInToolbarItem.widthAnchor constraintEqualToConstant:32.0];
NSLayoutConstraint *height = [theUIButtonToEmbedInToolbarItem.heightAnchor constraintEqualToConstant:32.0];
//activate these constraints and embed the UIButton in the NSToolbarItem.
And that works, but I'm guessing at the size, so the layout could break in an OS update.
I tried just feeding an SF Symbol image to NSToolbarItem's image property directly but then the toolbar item doesn't draw at all (edit: this is only true when using an NSToolbarItem subclass, which I created to change the toolbar item's image itself when the toggle property is flipped)
//Inside my NSToolbarItem subclass.
- (void)setOn:(BOOL)isOn
{
    if (_on != isOn)
    {
        _on = isOn;
        self.image = (isOn) ? self.onImage : self.offImage;
    }
}
So when using NSToolbarItem's image property directly (on a plain NSToolbarItem, not my subclass) setting the image works fine, but this means I have to track the toggle state externally, which is possible but definitely not as nice.
Edit: Actually I forgot to modify my toolbar item subclass to subclass NSToolbarItem directly (it was still subclassing NSUIViewToolbarItem from when I was first experimenting with using UIButton).
Using the image property on NSToolbarItem directly works for me, so I'm happy. I still think it'd be useful for clients that need to embed custom views inside an NSToolbarItem to get some kind of size recommendation.
I'm trying to enable the Web Inspector on WKWebView in a Mac Catalyst app. I'm only doing this for debugging purposes; in the released app the Web Inspector will not be enabled.
Doing this under Mac Catalyst does not work:
WKPreferences *prefs = [[WKPreferences alloc] init];
[prefs _setDeveloperExtrasEnabled:YES];
//Assign the WKPreferences to a WKWebViewConfiguration and create the web view.
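For reference, there's also a KVC-based variant of the same idea that I've seen suggested elsewhere (I haven't confirmed it behaves any differently under Catalyst; "developerExtrasEnabled" is a private preference key, so it may stop working at any point):

WKWebViewConfiguration *config = [[WKWebViewConfiguration alloc] init];
@try {
    //Private key; wrapped in @try in case the key disappears in a future release.
    [config.preferences setValue:@YES forKey:@"developerExtrasEnabled"];
} @catch (NSException *exception) {
    NSLog(@"developerExtrasEnabled is not available: %@", exception);
}
WKWebView *webView = [[WKWebView alloc] initWithFrame:CGRectZero configuration:config];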
Is there any way to do this?
Thanks in advance.
I'd like to ensure certain content doesn't overlap UISheetPresentationController's grabber.
In the view controller that is being presented as a sheet via UISheetPresentationController, I tried inspecting self.view.safeAreaInsets.top, hoping that it would account for the UISheetPresentationController's grabber, but safeAreaInsets.top is 0 unfortunately.
Right now I'm just using a hard-coded value (shown below), but is there an API for this that I overlooked?
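For reference, the workaround looks roughly like this (the 22-point inset is a guess, which is exactly what I'd like to avoid):

//Inside the presented view controller; pad the top so content clears the grabber.
//22.0 is a hard-coded guess, not a value derived from any API.
self.additionalSafeAreaInsets = UIEdgeInsetsMake(22.0, 0.0, 0.0, 0.0);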
I'm using a custom UIPresentationController. I'd like to replicate the behavior of UISheetPresentationController and allow user interaction with the presenting view controller when the presented view controller does not cover the entire screen.
I tried this:
- (void)presentationTransitionDidEnd:(BOOL)completed
{
    UIView *theView = self.presentingViewController.view;
    theView.userInteractionEnabled = YES;
}
But it has no effect. Is it possible to achieve this behavior? Thanks in advance.
I'm having a hard time finding samples that clearly explain how to use WKContentRuleList objects. I read that WKContentRuleList uses the same format as Safari content blocker extensions; however, when I try to compile a content blocker from the sample app AdoptingDeclarativeContentBlockingInSafariWebExtensions, it complains about a missing "trigger". I get:
Error Domain=WKErrorDomain Code=6 "(null)" UserInfo={NSHelpAnchor=Rule list compilation failed: Invalid trigger object.}
When I try to use the rules from the AdoptingDeclarativeContentBlockingInSafariWebExtensions sample project:
[
{
"id": 1,
"priority": 1,
"action": { "type": "block" },
"condition": {"regexFilter": ".*", "resourceTypes": [ "image" ] }
},
{
"id": 2,
"priority": 1,
"action": { "type": "allow" },
"condition": {"regexFilter": "wikipedia", "resourceTypes": [ "image" ] }
}
]
So if WKContentRuleList requires different key/value pairs than Safari content blockers, are those differences documented anywhere? I can't really find any good info on this.
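For reference, the rule shape I believe WKContentRuleListStore actually compiles (the Safari content blocker format, with top-level "trigger"/"action" keys rather than "id"/"priority"/"condition") looks roughly like this:

[
    {
        "trigger": { "url-filter": ".*", "resource-type": [ "image" ] },
        "action": { "type": "block" }
    }
]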
Thanks in advance.
When trying to compile a rule list via WKContentRuleListStore's -compileContentRuleListForIdentifier:encodedContentRuleList:completionHandler: method I'm getting the following error:
Rule list compilation failed: Too many rules in JSON array.
The max number of rules allowed in a WKContentRuleList doesn't seem to be documented (or I couldn't find it). Does anyone know what the limit is?
Thanks
Now that live previews are available in UIKit (source: https://developer.apple.com/wwdc23/10252), I was wondering how to get a live preview of a UIViewController written in Objective-C. The Swift macro looks like this:
#Preview {
    let controller = SomeViewController()
    return controller
}
Is there a way to get a live preview for UIViewControllers/UIViews written in Objective-C (other than wrapping them as child view controllers in an empty Swift view controller)?
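One direction I'm considering (a sketch; MyObjCViewController is a placeholder, and I haven't confirmed this is the intended approach) is keeping the #Preview itself in a small Swift file that just instantiates the Objective-C class:

// In a .swift file in the same target; assumes MyObjCViewController is visible to Swift
// via the bridging header or the target's generated header.
#Preview("MyObjCViewController") {
    MyObjCViewController()
}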
Is there a way to programmatically launch a macOS Action Extension? (Related documentation: https://developer.apple.com/library/archive/documentation/General/Conceptual/ExtensibilityPG/Action.html)
I'm aware of how to create "Action Extensions", but they only seem to be activated from NSTextView's "rollover" button.
Say I know there is an action extension that works on a particular type of data: can I launch it from my app? Obviously we can launch "regular" apps with NSWorkspace by bundle ID, but I was wondering if there is any API where my app could directly launch an action extension as the "host app" programmatically.
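For comparison, launching a regular app by bundle ID is straightforward (com.example.SomeApp is a placeholder); what I'm after is an equivalent entry point for an action extension:

//Look up the app by bundle identifier and launch it.
NSURL *appURL = [[NSWorkspace sharedWorkspace] URLForApplicationWithBundleIdentifier:@"com.example.SomeApp"];
if (appURL != nil)
{
    [[NSWorkspace sharedWorkspace] openApplicationAtURL:appURL
                                          configuration:[NSWorkspaceOpenConfiguration configuration]
                                      completionHandler:^(NSRunningApplication *app, NSError *error) {
        if (error != nil) { NSLog(@"Launch failed: %@", error); }
    }];
}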
I'm aware that signing in to an Apple ID on a macOS virtual machine is not supported, unfortunately. But is there a way to download Xcode from within the VM? I know the developer website used to have links where you could directly download Xcode outside the Mac App Store, but I can't seem to find them anymore.
I just tried submitting an app to be notarized. This app is actually only used by me internally (but I have other apps this question is relevant to) and I can't submit it for notarization. I get the following error:
"Hardened Runtime is not enabled."
Is the Hardened Runtime now required? I know it used to be optional (I believe the last time I submitted an app update outside the Mac App Store, a few months ago, I got no such error).
I have a PCM audio buffer (AVAudioPCMFormatInt16). When I try to play it using AVAudioPlayerNode / AVAudioEngine, an exception is thrown:
"[[busArray objectAtIndexedSubscript:(NSUInteger)element] setFormat:format error:&nsErr]: returned false, error Error Domain=NSOSStatusErrorDomain Code=-10868
(related thread https://forums.developer.apple.com/forums/thread/700497?answerId=780530022#780530022)
If I convert the buffer to AVAudioPCMFormatFloat32, playback works (conversion sketch below).
My questions are:
1. Does AVAudioEngine / AVAudioPlayerNode require AVAudioPCMBuffer to be in the Float32 format? Is there a way I can configure it to accept another format instead for my application?
2. If 1 is YES, is this documented anywhere?
3. If 1 is YES, is this required format subject to change at any point?
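For context, this is roughly how I'm converting the Int16 buffer to Float32 to make playback work (a sketch using AVAudioConverter; int16Buffer is a placeholder for my existing buffer):

//Build a Float32 format matching the original buffer's sample rate and channel count.
AVAudioFormat *floatFormat = [[AVAudioFormat alloc] initWithCommonFormat:AVAudioPCMFormatFloat32
                                                              sampleRate:int16Buffer.format.sampleRate
                                                                channels:int16Buffer.format.channelCount
                                                             interleaved:NO];
AVAudioConverter *converter = [[AVAudioConverter alloc] initFromFormat:int16Buffer.format toFormat:floatFormat];
AVAudioPCMBuffer *floatBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:floatFormat
                                                              frameCapacity:int16Buffer.frameCapacity];
NSError *conversionError = nil;
__block BOOL providedInput = NO;
[converter convertToBuffer:floatBuffer
                     error:&conversionError
        withInputFromBlock:^AVAudioBuffer * _Nullable(AVAudioPacketCount inNumberOfPackets,
                                                      AVAudioConverterInputStatus *outStatus) {
    //Hand the source buffer over exactly once, then signal end of stream.
    if (providedInput) {
        *outStatus = AVAudioConverterInputStatus_EndOfStream;
        return nil;
    }
    providedInput = YES;
    *outStatus = AVAudioConverterInputStatus_HaveData;
    return int16Buffer;
}];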
Thanks!
I was looking to watch the "AVAudioEngine in Practice" session video from WWDC 2014 but I can't find it anywhere (https://forums.developer.apple.com/forums/thread/747008).
I'm using AVAudioEngine to play AVAudioPCMBuffers. I'd like to synchronize some events with the playback. For example, if the audio's frame position is >= some point and < some other point, trigger some code.
So I'm looking at - (void)installTapOnBus:(AVAudioNodeBus)bus bufferSize:(AVAudioFrameCount)bufferSize format:(AVAudioFormat * __nullable)format block:(AVAudioNodeTapBlock)tapBlock;
Now I have the frame positions calculated (predetermined before the audio is scheduled; I've already made all the necessary computations). So I just need to fire code at certain points during playback:
[playerNode installTapOnBus:bus
bufferSize:bufferSize
format:format
block:^(AVAudioPCMBuffer * _Nonnull buffer, AVAudioTime * _Nonnull when) {
//Inspect current audio here and fire...
}];
[playerNode scheduleBuffer:fullbuffer
atTime:startTime
options:0
completionCallbackType:AVAudioPlayerNodeCompletionDataPlayedBack
completionHandler:^(AVAudioPlayerNodeCompletionCallbackType callbackType)
{
// some code is here, not important to this question.
}];
The problem I'm having is figuring out what point in the full buffer I'm at within the tap block. The tap block passes chunks (not the full audio buffer). I tried using the when parameter of the block to calculate the frame position relative to the entire audio but have been unsuccessful so far. I'm assuming the when parameter is relative to the buffer passed into the tap block (not to the entire audio buffer I scheduled).
Not installing a tap and just using a timer before scheduling my fullBuffer has given me good results, but I'd rather avoid using a timer if possible and use sample time instead.
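For what it's worth, the direction I've been exploring (a sketch; someStartFrame / someEndFrame stand in for my precomputed frame positions, and I haven't verified this interpretation of the when parameter) is converting the tap's node time into player time, which should be relative to when the player started playing:

[playerNode installTapOnBus:bus
                 bufferSize:bufferSize
                     format:format
                      block:^(AVAudioPCMBuffer * _Nonnull buffer, AVAudioTime * _Nonnull when) {
    //Translate the node's render time into the player's timeline.
    AVAudioTime *playerTime = [playerNode playerTimeForNodeTime:when];
    if (playerTime == nil) { return; } //Player isn't playing yet.
    AVAudioFramePosition framePosition = playerTime.sampleTime;
    if (framePosition >= someStartFrame && framePosition < someEndFrame) {
        //Fire the code for this region.
    }
}];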
Topic: Media Technologies
SubTopic: Audio
Tags: AVAudioNode, AVAudioSession, AVAudioEngine, AVFoundation
I was wondering if there is a quick way to convert a model trained with the open-source CRFSuite for use with NLTagger. It seems like retraining should be possible, but I was wondering if automatic conversion is supported.
I have a simple little Mac app that embeds a Python interpreter. I wrote this app almost ten years ago and completely forgot about it. Anyway, I submitted an update to it with a new version of Python, but it's being rejected by App Review for the following reason:
Your app uses or references the following non-public or deprecated APIs:
Symbols:
• _Tcl_NewByteArrayObj
• _Tcl_ResetResult
• _Tcl_MutexLock
• _Tcl_GetBooleanFromObj
• _Tcl_SetObjResult
• _Tcl_CreateInterp
• _Tcl_ThreadQueueEvent
• _Tcl_UnsetVar2
• _Tcl_GetBignumFromObj
• _TclBN_mp_to_unsigned_bin_n
• _Tcl_ListObjLength
• _Tcl_ConditionWait
• _Tcl_GetDouble
• _Tcl_GetDouble
• _Tcl_DeleteFileHandler
• _Tcl_SetVar
• _Tcl_SetVar
• _Tcl_SetVar
• _Tcl_DoOneEvent
• _TclFreeObj
• _Tcl_Eval
• _Tcl_Eval
• _Tcl_Eval
• _Tcl_FindExecutable
• _Tcl_NewLongObj
• _Tcl_CreateTimerHandler
• _Tcl_Init
• _Tcl_ConditionFinalize
• _Tcl_GetByteArrayFromObj
• _Tcl_ListObjIndex
• _Tcl_ExprLong
• _Tcl_NewDoubleObj
• _Tcl_GetDoubleFromObj
• _Tcl_ExprString
• _TclBN_mp_read_radix
• _Tcl_DeleteTimerHandler
• _Tcl_CreateFileHandler
• _Tcl_GetVar
• _Tcl_GetVar
• _Tcl_CreateObjCommand
• _Tcl_SetVar2Ex
• _Tcl_GetStringFromObj
• _Tcl_NewStringObj
• _Tcl_GetObjType
• _Tcl_MutexUnlock
• _Tcl_DeleteCommand
• _TclBN_mp_init
• _Tcl_GetCurrentThread
• _Tcl_ExprDouble
• _Tcl_AddErrorInfo
• _Tcl_Free
• _Tcl_GetStringResult
• _Tcl_SetVar2
• _Tcl_SetVar2
• _Tcl_GetBoolean
• _Tcl_GetBoolean
• _Tcl_RecordAndEval
• _Tcl_EvalFile
• _Tcl_GetLongFromObj
• _TclBN_mp_clear
• _Tcl_ThreadAlert
• _Tcl_ExprBoolean
• _Tcl_DeleteInterp
• _TclBN_mp_unsigned_bin_size
• _Tcl_AttemptAlloc
• _Tcl_GetObjResult
• _Tcl_GetWideIntFromObj
• _Tcl_NewListObj
• _Tcl_ConditionNotify
• _Tcl_NewBooleanObj
• _Tcl_SplitList
• _Tcl_EvalObjv
• _Tcl_GetThreadData
• _Tcl_GetVar2Ex
• _Tcl_NewWideIntObj
• _Tcl_NewBignumObj
• _Tcl_ListObjGetElements
• _Tcl_GetString
• _Tcl_GetString
• _Tcl_GetString
The use of non-public or deprecated APIs is not permitted on the App Store, as they can lead to a poor user experience should these APIs change and are otherwise not supported on Apple platforms.
I read online that this is a widespread issue right now with apps that embed Python (I would share links, but then my post would have to be approved by a moderator). Does anyone have a workaround?
Topic: App Store Distribution & Marketing
SubTopic: App Review
Tags: Developer Tools, App Store, App Review, Entitlements