I just created a new project in the newest version of Xcode as a sample project for a feedback report.
Bug 1
So simply what I do in every new project is create a "Supporting Files" group (not a folder because I don't want to move these files on the file system). I put the following files in this group:
- the .entitlements file
- the Info.plist (which apparently new projects don't create anymore, because I don't see one)
- main.m
- Assets.xcassets
In previous versions of Xcode this was done with the "New Group without Folder" action (though back in the day I believe you'd get yellow folders with "New Group" and blue folders with "New Folder", and they were separate actions... which was actually better and much less insane IMO, but that's not really important to this).
In any case, "New Group without Folder" is nowhere to be found in the context menu. I was finally able to get "New Group" to appear, as long as I wasn't right-clicking underneath any directory. But "New Group" actually creates a new folder on disk, just like "New Folder". So I put the .entitlements file in the Supporting Files group (which is not a group but a directory), and the app won't compile unless I fix the path in the project settings, because I moved the file, which is most definitely not what I wanted.
So we can no longer group files in the project navigator without moving them to new directories? Is this intentional behavior? It can't be, right?
Bug 2
I noticed that dragging and dropping to reorder files in the project navigator no longer seems to work. In previous versions of Xcode I could drag and drop to reorder files (this worked both in groups and in folders). It no longer appears to work. Do I just have to accept the way Xcode orders my project files?
I have an NSWindow subclass. The window has a custom shape, and thus has a custom contentView that overrides drawRect: to do its drawing.
Since the window has a custom shape it cannot use the system provided titlebar.
The problem I'm having is with multiple screens: if my window is on the inactive screen (not the mainScreen with the menu bar) and I move the mouse over to the second monitor and click the window, the menu bar doesn't travel to the screen my app is on after the window is clicked.
This does not happen with any other window. For all other windows, the menu bar moves to a screen once you click a window on that screen. So it appears this is because my window is not using NSWindowStyleMaskTitled. As far as I know, I can't use the system title bar and draw my custom window shape. Abandoning the custom window shape is not an option.
Without going into too many details as to why I care, the menu bar really should travel with the first click on my window, like it does for other apps. Is there a way to tell the system (other than using NSWindowStyleMaskTitled) that clicking on my window should make that screen the "main screen" (i.e., bring the menu bar over)?
I tried programmatically activating the application, ordering the window to the front, etc. but none of this works. This forces the user to click outside my app window, say on the desktop, to move the menu bar over, which feels wrong.
Thanks in advance if anyone has any suggestions.
My app, which has been on the Mac App Store for many years, has an update being blocked by App review. The only change made is a bug fix (documented in the release notes).
First rejection:
Said I was using an entitlement I didn't need.
My response: I explained the feature that required the entitlement.
The app goes back into review and gets rejected again for completely different reasons. They don't want me to write files in my App Sandbox container; they want me to write them in a more traditional, user-facing location (like the Documents folder). They keep sending me a link to the "App Sandbox Design Guide" in the Documentation Archive (which appears to redirect to a different page?) and are quoting a section that is nowhere to be found on the page I'm redirected to. I keep explaining to them that I cannot write outside my sandbox container and that this isn't my choice.
And they keep rejecting my app and sending me a broken link to the "App Sandbox design guide." It isn't my fault that I have to write to my sandboxed container by default or have a non-functioning app.
In any case, I don't understand why a bug-fix update is being held up, and I'm getting vague instructions about possibly having to design some long-winded explanation to the user in some ridiculously complicated onboarding process: choose a folder in a save panel, here's why you have to choose the folder in the save panel (because I need your permission), or just quit the app you bought, because it will otherwise do nothing if you don't choose a folder. Users have enough panels to deal with. At the very least, App Review shouldn't send me broken links from the Documentation Archive.
So I'm using my sandbox container by default (because by default I cannot do anything else). I've been doing this for a long time and I don't understand why it is suddenly a problem. What is my sandboxed container for if I can't write to it?
If documentation such as the "App Sandbox Design Guide" is still relevant and important, why is it being archived anyway? The link redirects, and I cannot find the section the reviewer is citing in the provided link.
I don't mind being asked to do something to improve the app, but I've wasted a lot of time in the past trying to satisfy App Review, only to misinterpret what they were actually asking me to do, causing more wasted time and energy without getting much in return.
And I don't think it's fair to block a bug fix update.
Topic:
App Store Distribution & Marketing
SubTopic:
App Review
Tags:
macOS
App Review
App Sandbox
Mac App Store
So I get JPEG data in my app. Previously I was using the higher-level NSBitmapImageRep API and just feeding the JPEG data to it.
But now I've noticed that on Sonoma, if I get a JPEG in the CMYK color space, the NSBitmapImageRep renders mostly black and is corrupted. So I'm trying to drop down to the lower-level APIs. Specifically, I grab a CGImageRef and am trying to use the Accelerate API to convert it to another format (to hopefully work around the issue):
CGImageRef sourceCGImage = CGImageCreateWithJPEGDataProvider(jpegDataProvider,
                                                             NULL,
                                                             shouldInterpolate,
                                                             kCGRenderingIntentDefault);
Now I use vImageConverter_CreateWithCGImageFormat with the following values for the source and destination formats:
Source format: (derived from sourceCGImage)
bitsPerComponent = 8
bitsPerPixel = 32
colorSpace = (kCGColorSpaceICCBased; kCGColorSpaceModelCMYK; Generic CMYK Profile)
bitmapInfo = kCGBitmapByteOrderDefault
version = 0
decode = 0x000060000147f780
renderingIntent = kCGRenderingIntentDefault
Destination format:
bitsPerComponent = 8
bitsPerPixel = 24
colorSpace = (DeviceRGB)
bitmapInfo = 8197
version = 0
decode = 0x0000000000000000
renderingIntent = kCGRenderingIntentDefault
But vImageConverter_CreateWithCGImageFormat fails with kvImageInvalidImageFormat. If I change the destination format to 32 bitsPerPixel and include alpha in the bitmapInfo, vImageConverter_CreateWithCGImageFormat does not return an error, but I get a black image, just like with NSBitmapImageRep.
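As a side note, the destination bitmapInfo value of 8197 can be decoded with plain bit masks. A minimal sketch (the mask values below match kCGBitmapAlphaInfoMask and kCGBitmapByteOrderMask from CGImage.h; the helper names are my own):

```c
/* Decode a CGBitmapInfo value into its alpha-info and byte-order
   components. 0x1F matches kCGBitmapAlphaInfoMask and 0x7000 matches
   kCGBitmapByteOrderMask in CGImage.h. */
enum {
    AlphaInfoMask = 0x1F,
    ByteOrderMask = 0x7000
};

static unsigned alpha_info(unsigned bitmapInfo) {
    return bitmapInfo & AlphaInfoMask;
}

static unsigned byte_order(unsigned bitmapInfo) {
    return bitmapInfo & ByteOrderMask;
}
```

8197 (0x2005) decodes to alpha info 5 (kCGImageAlphaNoneSkipLast) with byte order 0x2000 (kCGBitmapByteOrder32Little). A skip-last alpha slot implies a 32-bit pixel, so it may simply conflict with bitsPerPixel = 24, though that's my guess rather than anything documented.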
Does anyone know why the following call fails?
CGPDFOperatorTableSetCallback(operatorTable, "ID", &callback);
The PDF specification seems to indicate that ID is an operator?
BTW what is the proper topic/subtopic for questions about Quartz? Wasn't sure what topic on the new forums to post this under.
So with SKStoreReviewController now deprecated, I'm wondering: what API is recommended for UIKit apps?
I have an XPC service that embeds Python. It executes a python script on behalf of the main app.
The app and XPC service are sandboxed. Everything seems to work just fine in the development environment, but the script fails in the released version.
I disabled writing pycache by setting the PYTHONDONTWRITEBYTECODE environment variable, because pycache tries to write inside my app bundle, which fails. (I believe I can redirect the pycache directory with PYTHONPYCACHEPREFIX and may experiment with that later.)
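For what it's worth, redirecting the cache rather than disabling it can be sketched as below. The helper name and the idea of calling it before Py_Initialize() are my own; in a sandboxed app the cache path would need to be somewhere writable inside the container:

```c
#include <stdlib.h>

/* Hypothetical setup, to be called before Py_Initialize(): redirect
   Python's bytecode cache out of the read-only app bundle instead of
   disabling it entirely. */
static void configure_pycache(const char *cacheDir) {
    /* Write .pyc files under cacheDir rather than next to the sources. */
    setenv("PYTHONPYCACHEPREFIX", cacheDir, 1);
    /* Make sure bytecode writing isn't still disabled. */
    unsetenv("PYTHONDONTWRITEBYTECODE");
}
```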
Specifically, this line fails in the release version only (not when run from Xcode):
PyObject *pModule = PyImport_Import(moduleNameHere);
if (pModule == NULL)
{
    // pModule is NULL in release builds only.
}
Any ideas what can be going wrong? Thanks in advance.
I have a simple little Mac app that embeds a Python interpreter. I wrote this app almost ten years ago and completely forgot about it. Anyway, I submitted an update to it with a new version of Python, but it's being rejected by App Review for the following reason:
Your app uses or references the following non-public or deprecated APIs:
Symbols:
• _Tcl_NewByteArrayObj
• _Tcl_ResetResult
• _Tcl_MutexLock
• _Tcl_GetBooleanFromObj
• _Tcl_SetObjResult
• _Tcl_CreateInterp
• _Tcl_ThreadQueueEvent
• _Tcl_UnsetVar2
• _Tcl_GetBignumFromObj
• _TclBN_mp_to_unsigned_bin_n
• _Tcl_ListObjLength
• _Tcl_ConditionWait
• _Tcl_GetDouble
• _Tcl_GetDouble
• _Tcl_DeleteFileHandler
• _Tcl_SetVar
• _Tcl_SetVar
• _Tcl_SetVar
• _Tcl_DoOneEvent
• _TclFreeObj
• _Tcl_Eval
• _Tcl_Eval
• _Tcl_Eval
• _Tcl_FindExecutable
• _Tcl_NewLongObj
• _Tcl_CreateTimerHandler
• _Tcl_Init
• _Tcl_ConditionFinalize
• _Tcl_GetByteArrayFromObj
• _Tcl_ListObjIndex
• _Tcl_ExprLong
• _Tcl_NewDoubleObj
• _Tcl_GetDoubleFromObj
• _Tcl_ExprString
• _TclBN_mp_read_radix
• _Tcl_DeleteTimerHandler
• _Tcl_CreateFileHandler
• _Tcl_GetVar
• _Tcl_GetVar
• _Tcl_CreateObjCommand
• _Tcl_SetVar2Ex
• _Tcl_GetStringFromObj
• _Tcl_NewStringObj
• _Tcl_GetObjType
• _Tcl_MutexUnlock
• _Tcl_DeleteCommand
• _TclBN_mp_init
• _Tcl_GetCurrentThread
• _Tcl_ExprDouble
• _Tcl_AddErrorInfo
• _Tcl_Free
• _Tcl_GetStringResult
• _Tcl_SetVar2
• _Tcl_SetVar2
• _Tcl_GetBoolean
• _Tcl_GetBoolean
• _Tcl_RecordAndEval
• _Tcl_EvalFile
• _Tcl_GetLongFromObj
• _TclBN_mp_clear
• _Tcl_ThreadAlert
• _Tcl_ExprBoolean
• _Tcl_DeleteInterp
• _TclBN_mp_unsigned_bin_size
• _Tcl_AttemptAlloc
• _Tcl_GetObjResult
• _Tcl_GetWideIntFromObj
• _Tcl_NewListObj
• _Tcl_ConditionNotify
• _Tcl_NewBooleanObj
• _Tcl_SplitList
• _Tcl_EvalObjv
• _Tcl_GetThreadData
• _Tcl_GetVar2Ex
• _Tcl_NewWideIntObj
• _Tcl_NewBignumObj
• _Tcl_ListObjGetElements
• _Tcl_GetString
• _Tcl_GetString
• _Tcl_GetString
The use of non-public or deprecated APIs is not permitted on the App Store, as they can lead to a poor user experience should these APIs change and are otherwise not supported on Apple platforms.
I read online that this is a widespread issue right now with apps that embed Python (I would share links, but then my post would have to be approved by a moderator). Does anyone have a workaround?
Topic:
App Store Distribution & Marketing
SubTopic:
App Review
Tags:
Developer Tools
App Store
App Review
Entitlements
I was wondering if there is a quick way to convert a model trained with the open-source CRFSuite for use with NLTagger?
It seems like retraining should be possible, but I was wondering if automatic conversion is supported?
I'm using AVAudioEngine to play AVAudioPCMBuffers. I'd like to synchronize some events with the playback. For example: if the audio's frame position is >= some point and < some other point, trigger some code.
So I'm looking at:
- (void)installTapOnBus:(AVAudioNodeBus)bus bufferSize:(AVAudioFrameCount)bufferSize format:(AVAudioFormat * __nullable)format block:(AVAudioNodeTapBlock)tapBlock;
Now, I have the frame positions calculated (predetermined before the audio is scheduled; I've already made all the necessary computations). So I just need to fire code at certain points during playback:
[playerNode installTapOnBus:bus
                 bufferSize:bufferSize
                     format:format
                      block:^(AVAudioPCMBuffer * _Nonnull buffer, AVAudioTime * _Nonnull when) {
    // Inspect the current audio here and fire...
}];
[playerNode scheduleBuffer:fullBuffer
                    atTime:startTime
                   options:0
    completionCallbackType:AVAudioPlayerNodeCompletionDataPlayedBack
         completionHandler:^(AVAudioPlayerNodeCompletionCallbackType callbackType)
{
    // Some code is here; not important to this question.
}];
The problem I'm having is figuring out where in the full buffer I am within the tap block. The tap block passes chunks, not the full audio buffer. I tried using the when parameter of the block to calculate the frame position relative to the entire audio, but I've been unsuccessful so far. I'm assuming the when parameter is relative to the buffer passed into the tap block (not the entire audio buffer I scheduled).
Not installing a tap and just using a timer before scheduling my fullBuffer has given me good results, but I'd rather avoid using a timer if possible and use sample time instead.
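Once each chunk's absolute start frame is known (for example via -[AVAudioPlayerNode playerTimeForNodeTime:] applied to the tap's when parameter), the remaining bookkeeping is just range arithmetic. A minimal sketch, with hypothetical names:

```c
#include <stdbool.h>
#include <stdint.h>

/* Given the absolute frame position of the first sample in a tap chunk
   and the chunk's frame count, report whether a predetermined trigger
   frame falls inside this chunk. */
static bool chunk_contains_frame(int64_t chunkStartFrame,
                                 uint32_t chunkFrameCount,
                                 int64_t triggerFrame) {
    return triggerFrame >= chunkStartFrame &&
           triggerFrame < chunkStartFrame + (int64_t)chunkFrameCount;
}
```

The half-open range means a trigger frame that lands exactly on a chunk boundary fires in exactly one chunk, never twice.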
Topic:
Media Technologies
SubTopic:
Audio
Tags:
AVAudioNode
AVAudioSession
AVAudioEngine
AVFoundation
Configuration:
- I have an NSTextField (multiline) inside an NSWindow.
- I have another NSTextField (single line) inside an NSBox (which is in the same window).
- The multiline text field is first responder and is editing.
- I click on the single-line text field inside the NSBox to edit it.
- The NSWindow just closes.
This is on Sonoma 14.2.1.
I subclassed NSWindow, overrode the close method, and put a breakpoint in it.
Here's the call stack that leads to the window suddenly closing:
#1 0x0000000189c73d90 in -[NSWindow __close] ()
#2 0x000000032343 in -[NSApplication(NSResponder) sendAction:to:from:] ()
#3 0x0000000189b543ac in -[NSControl sendAction:to:] ()
#4 0x0000000189b542f0 in __26-[NSCell _sendActionFrom:]_block_invoke ()
#5 0x0000000189b54218 in -[NSCell _sendActionFrom:] ()
#6 0x0000000189b5413c in -[NSButtonCell _sendActionFrom:] ()
#7 0x0000000189c4c508 in __29-[NSButtonCell performClick:]_block_invoke ()
#8 0x0000000189c4c264 in -[NSButtonCell performClick:] ()
#9 0x0000000189b545a8 in -[NSApplication(NSResponder) sendAction:to:from:] ()
#10 0x0000000189b543ac in -[NSControl sendAction:to:] ()
#11 0x0000000189befb48 in -[NSTextField textDidEndEditing:] ()
#12 ___CFNOTIFICATIONCENTER_IS_CALLING_OUT_TO_AN_OBSERVER__ ()
#13 0x000000018625c65c in ___CFXRegistrationPost_block_invoke ()
#14 0x000000018625c5a4 in _CFXRegistrationPost ()
#15 0x00000001861971dc in _CFXNotificationPost ()
#16 0x0000000187289ff0 in -[NSNotificationCenter postNotificationName:object:userInfo:] ()
#17 0x0000000189bef754 in -[NSTextView(NSSharing) resignFirstResponder] ()
#18 0x0000000189a9fab8 in -[NSWindow _realMakeFirstResponder:] ()
#19 0x0000000189b4f18c in -[NSWindow(NSEventRouting) _handleMouseDownEvent:isDelayedEvent:] ()
#20 0x0000000189ada79c in -[NSWindow(NSEventRouting) _reallySendEvent:isDelayedEvent:] ()
#21 0x0000000189ada45c in -[NSWindow(NSEventRouting) sendEvent:] ()
#22 0x000000018a1879f4 in -[NSApplication(NSEventRouting) sendEvent:] ()
#23 0x0000000189dd6908 in -[NSApplication _handleEvent:] ()
#24 0x00000001899a1d74 in -[NSApplication run] ()
The mouse click is nowhere near the close button in the title bar.
I have a PCM audio buffer (AVAudioPCMFormatInt16). When I try to play it using AVAudioPlayerNode / AVAudioEngine, an exception is thrown:
"[[busArray objectAtIndexedSubscript:(NSUInteger)element] setFormat:format error:&nsErr]: returned false, error Error Domain=NSOSStatusErrorDomain Code=-10868"
(related thread https://forums.developer.apple.com/forums/thread/700497?answerId=780530022#780530022)
If I convert the buffer to AVAudioPCMFormatFloat32 playback works.
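For context, the underlying sample conversion is simple scaling. A minimal sketch of interleaved Int16-to-Float32 conversion (in practice AVAudioConverter would do this; the helper below is just illustrative):

```c
#include <stddef.h>

/* Convert signed 16-bit PCM samples to 32-bit float in roughly
   [-1.0, 1.0), assuming interleaved samples. */
static void int16_to_float32(const short *src, float *dst, size_t n) {
    for (size_t i = 0; i < n; i++) {
        dst[i] = (float)src[i] / 32768.0f;
    }
}
```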
My questions are:
1. Does AVAudioEngine / AVAudioPlayerNode require AVAudioPCMBuffer to be in the Float32 format? Is there a way I can configure it to accept another format instead for my application?
2. If 1 is yes, is this documented anywhere?
3. If 1 is yes, is this required format subject to change at any point?
Thanks!
I was looking to watch the "AVAudioEngine in Practice" session video from WWDC 2014 (session 502), but I can't find it anywhere (https://forums.developer.apple.com/forums/thread/747008).
Does anyone have a link to this session video? I can only find the slides. Thanks.
I just tried submitting an app to be notarized. This app is actually only used by me internally (but I have other apps this question would be relevant to), and I can't submit it for notarization. I get the following error:
"Hardened Runtime is not enabled."
Is the Hardened Runtime now required? I know it used to be optional (I believe the last time I submitted an app update outside the Mac App Store, a few months ago, I got no such error).
For a while I've wrapped OSLog in my own macros to include the line number and source file of the logging statement in debug mode, because OSLog didn't include that info in console output (until recently).
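A macro along these lines can capture the call site with __FILE__ and __LINE__. This sketch builds just the prefix string (the actual os_log call is omitted, and the names are hypothetical):

```c
#include <stdio.h>

/* Build a "[file:line] " prefix for a log message; returns the number
   of characters written (excluding the NUL), as snprintf does. */
static int format_log_prefix(char *buf, size_t cap,
                             const char *file, int line) {
    return snprintf(buf, cap, "[%s:%d] ", file, line);
}

/* Expands to the prefix for the current source location. */
#define LOG_PREFIX(buf, cap) format_log_prefix((buf), (cap), __FILE__, __LINE__)
```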
Now I've noticed the source file and line number of the logging statement aren't being shown (I have all the metadata switches turned on in Xcode's console "Metadata Options" popover). Then I realized they are being shown, but only on mouse hover over the logging statement, in very tiny text.
The text is barely readable (on mouse hover). Why would viewing the line number require me to move the mouse cursor over a logging statement? Hiding information behind mouse hover doesn't look pretty at all, and even if it did, this is the console for programmers; we don't care about such nonsense.