I have a UTI for "public.directory" and can drag-drop folders onto my app and open them. I also added this to the Info.plist to say the app supports directories.
But the default "Open" command seems to pop up an NSOpenPanel with folders not selectable. The "Open" button stays disabled.
How do I change this? I tried implementing "openDocument", but then it lets through any file type, not just the ones in my Info.plist. So I'd like to keep the default implementation, but I need a way to customize its NSOpenPanel.
- (IBAction)openDocument:(id)sender
{
    NSOpenPanel *panel = [NSOpenPanel openPanel];
    [panel setCanChooseFiles:YES];
    [panel setCanChooseDirectories:YES];
    [panel setAllowsMultipleSelection:NO];
    ...
}
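One approach that might work (a sketch, not the original code) is to keep the default openDocument: flow and instead subclass NSDocumentController, which owns the panel, so the declared types from Info.plist still apply but directories become selectable:

import Cocoa

// Hypothetical sketch: let the default Open flow run, but allow directory
// selection by adjusting the panel the document controller presents.
class DirectoryDocumentController: NSDocumentController {
    override func beginOpenPanel(_ openPanel: NSOpenPanel,
                                 forTypes inTypes: [String]?,
                                 completionHandler: @escaping (Int) -> Void) {
        openPanel.canChooseDirectories = true
        openPanel.canChooseFiles = true
        openPanel.allowsMultipleSelection = false
        super.beginOpenPanel(openPanel, forTypes: inTypes,
                             completionHandler: completionHandler)
    }
}

The subclass has to be instantiated before any other NSDocumentController (for example, early in the app delegate) so it becomes the shared document controller.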
My build generates 10 errors, and sometimes it halts "build and run" as expected.
Most of the time, though, it just runs the previously successful build anyway. It seems like a basic tenet of an IDE not to do this unless I've explicitly enabled running a stale build.
It seems that metal-shaderconverter can build a metallib, but I need .air files, which I then link into a single metallib and metallibdsym file.
HLSL -> dxc -> DXIL -> metal-shaderconverter -> .metallib
But there's no way to link multiple metallibs together into a single metallib, is there?
Visual Studio, Stadia, and JetBrains all support natvis files, while Xcode is still stuck on lldb Python scripts that even Xcode itself no longer uses to debug data structures. Apple has converted all of its scripts to native code, so there are no samples of how to write complex lldb visualizer rules.
There is already an lldb-eval project that can bring in natvis files, so something like it should be brought to Xcode. C++ packages like EASTL ship with only a natvis file, and natvis is far simpler to write and edit than lldb rules and Python scripting.
I have the following set, but the WKWebView loses the page content I'm updating whenever the user presses the Delete key. This is in SwiftUI, and I see no way to intercept and block that key.
webView.allowsBackForwardNavigationGestures = false
Isn't the whole point of setting this flag to stop not just "swipe" navigation, but all back/forward support? And since the backForwardList is immutable, I can't delete an entry from it either.
I have an app hosting a page with a drop operation, where the page accepts file drops. I can't easily intercept that, nor do I seem to be able to block it in JavaScript. Yet WKWebView has no setting to disable the drop operation, and no way to intercept the URL of the dropped file. Either would be useful here.
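For reference, the kind of user script one might try looks like the following (a sketch under that assumption, not the app's code; as noted above, this sort of approach hasn't reliably blocked the page's own drop handling, and it still gives no access to the dropped file's URL):

import WebKit

// Sketch (assumption): inject a capture-phase handler that cancels drag/drop
// events before the page handles them.
let blockDrops = """
window.addEventListener("dragover", function (e) { e.preventDefault(); }, true);
window.addEventListener("drop", function (e) {
    e.preventDefault();
    e.stopPropagation();
}, true);
"""

let script = WKUserScript(source: blockDrops,
                          injectionTime: .atDocumentStart,
                          forMainFrameOnly: false)
let configuration = WKWebViewConfiguration()
configuration.userContentController.addUserScript(script)
let webView = WKWebView(frame: .zero, configuration: configuration)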
Trying to create a List that sorts by some criteria and supports searchable, without any samples, is made even harder by the text field constantly refocusing itself, so I can't even tab away from it. This is some awful bug in SwiftUI.
class FileSearcher: ObservableObject {
    @Published var searchIsActive = false
    @Published var searchText = ""
    var files: [File] = []
    ...
}
NavigationSplitView {
}
.searchable(text: $fileSearcher.searchText,
            isPresented: $fileSearcher.searchIsActive,
            placement: .sidebar,
            prompt: "Filter")
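For context, here is a minimal, self-contained sketch of how the fragments above might be wired together (File, filteredFiles, and the detail pane are assumptions for illustration, not the original code):

import SwiftUI

struct File: Identifiable {
    let id = UUID()
    let name: String
}

class FileSearcher: ObservableObject {
    @Published var searchIsActive = false
    @Published var searchText = ""
    var files: [File] = []

    // Assumed filtering used by the sidebar list below.
    var filteredFiles: [File] {
        searchText.isEmpty ? files
            : files.filter { $0.name.localizedCaseInsensitiveContains(searchText) }
    }
}

struct FileBrowserView: View {
    @StateObject private var fileSearcher = FileSearcher()

    var body: some View {
        NavigationSplitView {
            List(fileSearcher.filteredFiles) { file in
                Text(file.name)
            }
            .searchable(text: $fileSearcher.searchText,
                        isPresented: $fileSearcher.searchIsActive,
                        placement: .sidebar,
                        prompt: "Filter")
        } detail: {
            Text("Select a file")
        }
    }
}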
I have a webpage that needs to receive keypresses to zoom and scroll, and overriding performKeyEquivalent (below) seems to be the only way to prevent the annoying NSBeep. Return true for too many keys, though, and command keys stop working, so you need to be explicit about which keys you want. There's also no way to block Delete/Shift+Delete from going forward/back through the history using this same mechanism.
func isKeyHandled(_ event: NSEvent) -> Bool {
    // Can't block Delete or Shift+Delete, so the WKWebView goes back/forward
    // through its one-page history, and that loses all context for the user.
    // Prevent the super-annoying bonk/NSBeep. If we don't check the modifier
    // flags (can't check isEmpty, since 256 is often set), then Cmd+S stops
    // working.
    if !(event.modifierFlags.contains(.command) ||
         event.modifierFlags.contains(.control) ||
         event.modifierFlags.contains(.option)) {
        // wasd
        if event.keyCode == Keycode.w || event.keyCode == Keycode.a ||
           event.keyCode == Keycode.s || event.keyCode == Keycode.d {
            return true
        }
    }
    return false
}
// Apple doesn't want this overridden by apps, but key handling just doesn't
// work atop the WKWebView without it. KeyUp/KeyDown overrides don't matter,
// since we need the WKWebView to forward them.
override func performKeyEquivalent(with event: NSEvent) -> Bool {
    if !isKeyHandled(event) {
        return super.performKeyEquivalent(with: event)
    }
    return true
}
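The snippet assumes a small Keycode helper along these lines; the type itself isn't from the original post, but the values are the standard macOS ANSI virtual key codes:

// Assumed helper: macOS virtual key codes (see kVK_ANSI_* in Carbon's
// HIToolbox/Events.h) for the keys the handler cares about.
enum Keycode {
    static let a: UInt16 = 0x00  // kVK_ANSI_A
    static let s: UInt16 = 0x01  // kVK_ANSI_S
    static let d: UInt16 = 0x02  // kVK_ANSI_D
    static let w: UInt16 = 0x0D  // kVK_ANSI_W
}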
I create @State holding a WKWebView (a heavyweight object), wrap that in a VKViewRepresentable, and then:
@State var myWebView = newWebView(request: URLRequest(url: URL(string: request)!))
This is the only way I can then reference the webView later on to run JavaScript queries on it; otherwise it's buried in the View/ContentView hierarchy.
So when I moved from Window to WindowGroup, only one of these WKWebViews is created, and it looks bad to have an empty detail panel in the other window. The docs state that WindowGroup creates new state for each window in the group, but that's not what happens here.
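A stripped-down sketch of the structure described above (the WebView representable, newWebView factory, and URL are placeholders, not the original code):

import SwiftUI
import WebKit

// Placeholder factory for the heavyweight WKWebView held in @State.
func newWebView(request: URLRequest) -> WKWebView {
    let webView = WKWebView()
    webView.load(request)
    return webView
}

// Thin representable that just hosts the externally owned WKWebView.
struct WebView: NSViewRepresentable {
    let webView: WKWebView
    func makeNSView(context: Context) -> WKWebView { webView }
    func updateNSView(_ nsView: WKWebView, context: Context) {}
}

struct BrowserContentView: View {
    @State var myWebView = newWebView(
        request: URLRequest(url: URL(string: "https://example.com")!))

    var body: some View {
        WebView(webView: myWebView)
    }
}

@main
struct BrowserApp: App {
    var body: some Scene {
        // Per the WindowGroup docs, each window should get its own state,
        // but only one WKWebView ends up being created in practice here.
        WindowGroup {
            BrowserContentView()
        }
    }
}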
The following generates a prior-definition warning from #include <__config>, even though that header wraps the macro in an #ifndef. I'm on C++17 with the latest Xcode (14.5). Is this behavior documented anywhere?
-D_LIBCPP_ENABLE_ASSERTIONS=1
I have a game controller connected to my M1, and the Simulator won't announce it via .GCControllerDidConnect. This works fine on iOS and macOS.
I have the Simulator set to "Send Game Controller to Device", which it does. If I disable that, I can control the Simulator view with the controller; but once it's enabled, the Simulator never tells the app about the controller.
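For reference, the connection is observed with the usual notification, along these lines (a sketch, not the exact app code):

import GameController

// Sketch: log controllers as they connect. In the Simulator with
// "Send Game Controller to Device" enabled, this never fires.
let token = NotificationCenter.default.addObserver(
    forName: .GCControllerDidConnect,
    object: nil,
    queue: .main
) { note in
    if let controller = note.object as? GCController {
        print("Controller connected: \(controller.vendorName ?? "unknown")")
    }
}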
First I get this:
ar_world_tracking_provider_query_device_anchor_at_timestamp <0x302b9c0a0>: The device_anchor can only be queried when the world tracking provider is running.
This all seemed to break with the auto-update to 2.0.1; the Simulator runs the code fine.
I seem to see an infinite stall here:
frameLayer.endUpdate()
// Pace frames by waiting for the optimal prediction time.
try await LayerRenderer.Clock().sleep(until: timing.optimalInputTime, tolerance: nil)
// Start submitting the updated frame.
frameLayer.startSubmission() // <- stalls here
This has been broken for over 5 years now, and I see two different behaviors in two different SwiftUI apps. It makes SwiftUI not ready for prime-time apps, though mine are just tools right now.
A VStack { List } doesn't scroll to the selected item in a long list: the selection moves to the next item, but you can't see it. This is basic list functionality; UIListView doesn't have this issue.
A NavigationView { List { NavigationLink } } wraps back around to the top of the list when pressing the down arrow past the last visible item, even though there are plenty more items further down.
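The usual workaround suggestion is a ScrollViewReader, which shouldn't be necessary for something this basic. A sketch under assumed names (Item, selection; not the original app's code):

import SwiftUI

struct Item: Identifiable, Hashable {
    let id = UUID()
    let name: String
}

struct ScrollingList: View {
    let items: [Item]
    @State private var selection: Item.ID?

    var body: some View {
        ScrollViewReader { proxy in
            List(items, selection: $selection) { item in
                Text(item.name).id(item.id)
            }
            .onChange(of: selection) { id in
                // Keep the keyboard-driven selection visible.
                if let id { proxy.scrollTo(id) }
            }
        }
    }
}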
Xcode has projects, and I break up my code/libraries/targets into them. So why, after 20 years of Xcode, can a workspace that holds projects not display a project that's also open in another workspace? By default, the workspace also redirects all the output the project would normally generate in DerivedData to a completely different folder.
What this means is that I have to close one workspace and then open the other workspace with a different set of projects just to see the ones that are shared. The open workspace can't build, because it can't open a project that another open workspace is using. VS .sln files don't have this issue, so why do workspace files?
I was working on a macOS ObjC++ tool that allocated and then replaced a single MTLTexture, and noticed that all of the textures were leaked. The project was built with CMake, and I found out that the generated Xcode project defaults ARC to off.
ARC has been around long enough that I can't think of many ObjC or Swift projects that would work without it. This default should probably be changed.
The workaround for now is to set the following in every CMakeLists.txt:
XCODE_ATTRIBUTE_CLANG_ENABLE_OBJC_ARC YES