Too many empty "required" UIView.init(coder:) methods
Hi,

I have a lot of UIViews where the compiler forces me to add an init(coder:) initializer, like this:

    class FooView: UIView /* or a UIView subclass */ {
        ...
        required init?(coder: NSCoder) {
            fatalError("init(coder:) has not been implemented")
        }
        ...
    }

It claims the initializer is required, but my program runs fine without it — I never create these views from an archive.

This makes me wonder if something is wrong here with the design of the library, or with the concept of a 'required' initializer. What do people think? Does it make sense, or is this a wart? If so, can it be fixed?

Rob
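For illustration, here's the same rule in a plain, non-UIKit sketch, the way I understand 'required' to propagate (types here are made up for the example):

    class Base {
        required init() {}          // 'required': every subclass must provide this
    }

    class Sub: Base {
        let x: Int

        init(x: Int) {              // adding my own designated initializer...
            self.x = x
            super.init()
        }

        required init() {           // ...forces me to restate the required one
            self.x = 0
            super.init()
        }
    }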
Topic: UI Frameworks · SubTopic: UIKit
Replies: 5 · Boosts: 1 · Views: 2.8k · Created: Mar ’17
HelloTriangle in Swift: error about depth attachment pixel format
Hi,

I'm trying to port this sample to Swift: https://developer.apple.com/documentation/metal/hello_triangle

To get it to run I had to add the following line to the renderer class. Any idea why it's not needed in the Objective-C version?

    pipelineStateDescriptor.depthAttachmentPixelFormat = metalView.depthStencilPixelFormat

Without it I get an error:

    -[MTLDebugRenderCommandEncoder validateFramebufferWithRenderPipelineState:]:1232: failed assertion
    `For depth attachment, the render pipeline's pixelFormat (MTLPixelFormatInvalid) does not match
    the framebuffer's pixelFormat (MTLPixelFormatDepth32Float).'

Rob
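For context, here's roughly where that line sits in the pipeline setup (simplified sketch; shader function names are approximate, not taken from the sample):

    import MetalKit

    // Simplified pipeline setup; the depth attachment format has to match the MTKView's,
    // otherwise the debug layer reports MTLPixelFormatInvalid for the depth attachment.
    func makePipelineState(device: MTLDevice,
                           library: MTLLibrary,
                           metalView: MTKView) throws -> MTLRenderPipelineState {
        let descriptor = MTLRenderPipelineDescriptor()
        descriptor.vertexFunction = library.makeFunction(name: "vertexShader")     // name approximate
        descriptor.fragmentFunction = library.makeFunction(name: "fragmentShader") // name approximate
        descriptor.colorAttachments[0].pixelFormat = metalView.colorPixelFormat
        descriptor.depthAttachmentPixelFormat = metalView.depthStencilPixelFormat
        return try device.makeRenderPipelineState(descriptor: descriptor)
    }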
Replies: 3 · Boosts: 0 · Views: 5.8k · Created: Mar ’18
How to save and load MTLTexture, getting back same RGB values?
Hi,

I need to save and load Metal textures to a file. Example code is below. I'm noticing that the RGB values change as a texture gets saved and reloaded:

    metal texture pixel, RGBA: 42, 79, 12, 95
    after save and reload:     66, 88, 37, 95

I thought it might be a colorspace issue, but the colorspaces I've tried all had the same problem. genericRGBLinear got close, but there's got to be a way to save the RGB data and get it back exactly, no?

thanks,
Rob

Code:

    // saving...
    let ciCtx = CIContext()
    let ciImage = CIImage(mtlTexture: metalTexture, options: [:])
    // [ ... transform to flip y-coordinate ... ]
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let cgImage = ciCtx.createCGImage(ciImage, from: fullRect,
                                      format: kCIFormatRGBA8, colorSpace: colorSpace)!
    let imageDest = CGImageDestinationCreateWithData(mData, kUTTypePNG, 1, nil)!
    CGImageDestinationAddImage(imageDest, cgImage, nil)
    CGImageDestinationFinalize(imageDest)

    // loading...
    let src = CGImageSourceCreateWithData(imgData, nil)
    let img = CGImageSourceCreateImageAtIndex(src, 0, nil)
    let loader = MTKTextureLoader(device: self.metalDevice)
    let texture = try! loader.newTexture(cgImage: img, options: [:])
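For reference, here's one way to read the raw pixel values straight out of the texture to compare, before any CGImage/colorspace conversion is involved (sketch only; it assumes the texture is CPU-accessible, e.g. .shared storage or synchronized via a blit first):

    import Metal

    // Read one pixel's RGBA bytes directly from an 8-bit-per-channel texture.
    func pixelBytes(of texture: MTLTexture, x: Int, y: Int) -> [UInt8] {
        var bytes = [UInt8](repeating: 0, count: 4)
        bytes.withUnsafeMutableBytes { buffer in
            texture.getBytes(buffer.baseAddress!,
                             bytesPerRow: 4,
                             from: MTLRegionMake2D(x, y, 1, 1),
                             mipmapLevel: 0)
        }
        return bytes
    }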
Replies: 8 · Boosts: 1 · Views: 6.9k · Created: Apr ’18
Upgrading app's UIDocument format from Data to FileWrapper?
Hi,

I had a document-based iOS app working, but I want to change it so it saves to a package — that seems better when big chunks of a file may not be changing. In Xcode, under Target > Info > Exported UTI > Conforms To, I had "public.data, public.content". If I change that to "com.apple.package", then I can't open my old files to upgrade them. But if I instead *add* "com.apple.package", the app opens both kinds as desired. I wonder if having the type conform to all three of those types is going to cause other problems.

Rob
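For context, this is the kind of change I mean on the UIDocument side (sketch only; "payload.dat" is just a placeholder name for one chunk of the package):

    import UIKit

    // A document that always saves a package, but can still load the old flat-Data files.
    class MyDocument: UIDocument {
        var payload = Data()

        override func contents(forType typeName: String) throws -> Any {
            let file = FileWrapper(regularFileWithContents: payload)
            return FileWrapper(directoryWithFileWrappers: ["payload.dat": file])
        }

        override func load(fromContents contents: Any, ofType typeName: String?) throws {
            if let wrapper = contents as? FileWrapper {
                payload = wrapper.fileWrappers?["payload.dat"]?.regularFileContents ?? Data()
            } else if let data = contents as? Data {
                payload = data   // old flat-file format
            }
        }
    }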
Topic: UI Frameworks · SubTopic: UIKit
Replies: 3 · Boosts: 0 · Views: 1.5k · Created: Jul ’18
UI Test localization screenshots fail - they're between screens
I'd like to use the new localization screenshots and test plans feature to take screenshots in different languages for the App Store. I end up with images that are half one screen and half another, like it's some kind of timing issue. My test code is below. Is it missing something that would give it the right timing on the screenshots?

I wrote a UI test and set "Localization Screenshots" to "On" in the test plan's settings. The UI test walks through a few screens, and the resulting test report has a few image files attached labeled "Localization screenshot". Some images are split so that the left side is one view controller and the right side is another, like it captured a push navigation transition. Another image has two overlaid screens, each half faded.

I'm running in the simulator. My test code looks like:

    func testTakeScreenshots() {
        let app = XCUIApplication()
        app.launch()

        // At workouts page
        app.tables["workouts"].cells.element(boundBy: 0).tap()

        // At first workout
        app.navigationBars.buttons["edit"].tap()

        // At workout edit screen, click first exercise
        app.tables.cells.element(boundBy: 0).tap()
        ...
    }
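One thing I'm unsure about (just a guess, not something I've confirmed fixes it): maybe each step needs an explicit wait before tapping, so the test isn't moving on mid-transition. Something along these lines:

    import XCTest

    class ScreenshotTests: XCTestCase {
        func testTakeScreenshots() {
            let app = XCUIApplication()
            app.launch()

            // Wait for each screen's element to exist before tapping, instead of tapping immediately.
            let firstWorkout = app.tables["workouts"].cells.element(boundBy: 0)
            XCTAssertTrue(firstWorkout.waitForExistence(timeout: 5))
            firstWorkout.tap()

            let editButton = app.navigationBars.buttons["edit"]
            XCTAssertTrue(editButton.waitForExistence(timeout: 5))
            editButton.tap()
        }
    }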
Replies: 1 · Boosts: 0 · Views: 1.4k · Created: Sep ’19
iOS 13, iPad Pro now says hardware does not support read-write texture?
I have some code that used to run on my iPad Pro. Today I compiled it for iOS 13, with Xcode 11, and I get errors like this:

    validateComputeFunctionArguments:834: failed assertion `Compute Function(merge_layer):
    Shader uses texture(outTexture[1]) as read-write, but hardware does not support
    read-write texture of this pixel format.'

The pixel format is showing as MTLPixelFormatBGRA8Unorm. That's what I expected.

The debugger says the device has no support for writeable textures:

    (lldb) p device.readWriteTextureSupport
    (MTLReadWriteTextureTier) $R25 = tierNone

Did some devices lose support for texture writing in iOS 13?

Rob
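For reference, here's how I'm checking the tier in code (minimal sketch):

    import Metal

    // Query the device's read-write texture tier at runtime.
    func logReadWriteTextureSupport() {
        guard let device = MTLCreateSystemDefaultDevice() else { return }
        switch device.readWriteTextureSupport {
        case .tierNone:
            print("No read-write texture support")
        case .tier1:
            print("Tier 1 read-write texture support")
        case .tier2:
            print("Tier 2 read-write texture support")
        @unknown default:
            print("Unknown tier")
        }
    }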
Replies: 12 · Boosts: 0 · Views: 5.1k · Created: Oct ’19
Simple layout still fails
I'm surprised this simple code still doesn't work on iOS 13.3 / Xcode 11.3. On my iPhone SE it's almost all off screen. It's just two pieces of text side by side, and two pickers side by side. Anyone know a workaround?

    struct ContentView: View {
        @State var choice: Int = 10
        @State var choice2: Int = 10

        var body: some View {
            return VStack {
                HStack {
                    Text("Some text here")
                    Spacer()
                    Text("Foo baz")
                }
                HStack {
                    Picker(selection: self.$choice, label: Text("C1")) {
                        ForEach(0..<10) { n in
                            Text("\(n) a").tag(n)
                        }
                    }
                    Picker(selection: self.$choice2, label: Text("C2")) {
                        ForEach(0..<10) { n in
                            Text("\(n) b").tag(n)
                        }
                    }
                }
            }
        }
    }
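One variation that gets suggested for side-by-side wheel pickers (I haven't verified it fixes this exact case) is to give each picker an explicit frame and clip it, so the wheels don't overflow their slots in the HStack:

    import SwiftUI

    struct PickerRow: View {
        @State private var choice = 0
        @State private var choice2 = 0

        var body: some View {
            HStack {
                Picker(selection: $choice, label: Text("C1")) {
                    ForEach(0..<10) { n in Text("\(n) a").tag(n) }
                }
                .frame(maxWidth: .infinity)   // give each wheel its share of the row...
                .clipped()                    // ...and stop it drawing outside that frame

                Picker(selection: $choice2, label: Text("C2")) {
                    ForEach(0..<10) { n in Text("\(n) b").tag(n) }
                }
                .frame(maxWidth: .infinity)
                .clipped()
            }
        }
    }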
Replies: 4 · Boosts: 1 · Views: 4k · Created: Dec ’19
How does SwiftUI update if objectWillChange fires *before* change
I'm wondering how SwiftUI updates work in connection with ObservableObjects. If a SwiftUI View depends on an `ObservableObject`, the object's `objectWillChange` publisher fires — and SwiftUI learns about the change — *before* the change actually happens. At this point, SwiftUI can't re-render the View, right? Because the new property values aren't there yet. So what does SwiftUI do? Does it schedule the change for later? That doesn't make sense either: how can it know when the object will be ready to be used in a new rendering of the UI?

~ Rob
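To make the timing concrete, here's a minimal example of the kind of object I mean (objectWillChange fired manually in willSet, which is roughly what @Published does): the observer runs while the property still holds its old value.

    import Combine

    final class Counter: ObservableObject {
        var value = 0 {
            willSet { objectWillChange.send() }   // notify *before* the mutation
        }
    }

    let counter = Counter()
    let cancellable = counter.objectWillChange.sink { _ in
        print("about to change, value is still \(counter.value)")
    }
    counter.value = 1   // prints: "about to change, value is still 0"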
Replies: 3 · Boosts: 0 · Views: 5.4k · Created: Jan ’20
Can't use protocols with SwiftUI models?
I've been using protocols to help model a hierarchy of different object types. As I try to convert my app to use SwiftUI, I'm finding that protocols don't work with the ObservableObject conformance you need for SwiftUI models. I wonder if there are some techniques to get around this, or if people are just giving up on "protocol oriented programming" when describing their SwiftUI models? Example code is below. The main problem is that it seems impossible to have a View with a model of protocol `P1` that conditionally shows a subview with more properties if that model also conforms to protocol `P2`.

For example, I'm creating a drawing/painting app, so I have "Markers" which draw on the canvas. Markers have different properties like color, size, shape, and the ability to work with gradients. Modeling these properties with protocols seems ideal: you're not restricted to a single-inheritance class hierarchy. But there is no way to test and down-cast the protocol...

    protocol Marker: ObservableObject {
        var name: String { get set }
    }

    protocol MarkerWithSize: Marker {
        var size: Float { get set }
    }

    class BasicMarker: MarkerWithSize {
        init() {}
        @Published var name: String = "test"
        @Published var size: Float = 1.0
    }

    struct ContentView<MT: Marker>: View {
        @ObservedObject var marker: MT

        var body: some View {
            VStack {
                Text("Marker name: \(marker.name)")
                if marker is MarkerWithSize {
                    // This next line fails.
                    // Error: Protocol type 'MarkerWithSize' cannot conform to 'MarkerWithSize'
                    // because only concrete types can conform to protocols
                    MarkerWithSizeSection(marker: marker as! MarkerWithSize)
                }
            }
        }
    }

    struct MarkerWithSizeSection<M: MarkerWithSize>: View {
        @ObservedObject var marker: M

        var body: some View {
            VStack {
                Text("Size: \(marker.size)")
                Slider(value: $marker.size, in: 1...50)
            }
        }
    }

Thoughts?
Replies: 3 · Boosts: 1 · Views: 9.4k · Created: Feb ’20
Images in documentation comments not working?
Does anyone use this? I can't get it working. I'm talking about code in normal Swift files, not playgrounds.

Example:

    /// ![testdiagram](/Users/me/fullpathto/test-image.png)
    /// ![test xcode image](http://devimages.apple.com.edgekey.net/assets/elements/icons/128x128/xcode.png)
    ///

Neither one displays anything in the documentation popover or the Quick Help Inspector.

The docs act like it works: https://developer.apple.com/library/archive/documentation/Xcode/Reference/xcode_markup_formatting_ref/Images.html#//apple_ref/doc/uid/TP40016497-CH17-SW1

A lot of my code would really benefit from diagrams in documentation. I'd love to be able to use this.
Replies: 3 · Boosts: 0 · Views: 1.9k · Created: Mar ’20
Popping back multiple levels?
I have a UI where you can navigate/push views like this: Root view > List of things > View thing > Edit thing.

The "Edit thing" view can also delete the thing. After a delete, I want to pop back to the "List of things". The best I've got now is to call `presentationMode.wrappedValue.dismiss()` in the "Edit thing" view, and then again in the "View thing" view, but that time inside `DispatchQueue.main.async { }`. It works, but the double animation is kind of clunky.

Is there a better way?
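Roughly, the current double-dismiss approach looks like this (simplified sketch; view and callback names are placeholders):

    import SwiftUI

    // "Edit thing": performs the delete, then pops itself.
    struct EditThingView: View {
        @Environment(\.presentationMode) var presentationMode
        var onDelete: () -> Void

        var body: some View {
            Button("Delete") {
                onDelete()
                presentationMode.wrappedValue.dismiss()   // first pop: back to "View thing"
            }
        }
    }

    // "View thing": when told the thing was deleted, pops itself on the next runloop turn.
    struct ViewThingView: View {
        @Environment(\.presentationMode) var presentationMode

        var body: some View {
            NavigationLink("Edit", destination: EditThingView(onDelete: {
                DispatchQueue.main.async {
                    presentationMode.wrappedValue.dismiss()   // second pop: back to "List of things"
                }
            }))
        }
    }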
Replies: 3 · Boosts: 0 · Views: 7.5k · Created: Jun ’20
Still wondering about objectWillChange (vs objectDidChange) magic :)
Hi all,

The WWDC video offers (at about 16:45) a short explanation of why it's "objectWillChange" instead of "objectDidChange". He said it's because SwiftUI needs to coalesce the changes. Okay, but then how does it know when the changes have ended? It seems like you'd need two events, something like:

    self.objectWillChange.send()
    self.foo = ...
    self.bar = ...
    self.objectHasFinishedChanging.send()

or

    self.objectChanges {
        self.foo = ...
        self.bar = ...
    }
Replies: 4 · Boosts: 0 · Views: 2.5k · Created: Jun ’20
Hierarchical List / OutlineGroup - can't change data?
Does anyone use the new hierarchical data List (i.e., OutlineGroup) with a model where the tree data changes? I'm on the Big Sur beta. I have code like this:

    List([treeRoot], children: \.children) { item in
        ...
    }

Say the treeRoot (an observed object) gets a new child. The view tries to re-render and fails:

    2020-09-21 ... [General] NSOutlineView error inserting child indexes
    <_NSCachedIndexSet: 0x600000760000>[number of indexes: 1 (in 1 ranges), indexes: (3)]
    in parent 0x0 (which has 1 children).
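The model is roughly this shape (simplified sketch, not the exact code):

    import SwiftUI

    final class Node: ObservableObject, Identifiable {
        let id = UUID()
        @Published var name: String
        @Published var children: [Node]?   // nil means leaf

        init(name: String, children: [Node]? = nil) {
            self.name = name
            self.children = children
        }
    }

    struct TreeView: View {
        @ObservedObject var treeRoot: Node

        var body: some View {
            List([treeRoot], children: \.children) { item in
                Text(item.name)
            }
        }
    }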
Replies: 1 · Boosts: 0 · Views: 807 · Created: Sep ’20
How to build a replacement for sandbox-exec?
Hi,

I want something like sandbox-exec, so I can run things that I don't trust and restrict their ability to read or write files to only certain locations. Like most software devs I have to download and run lots of code from the internet, and the danger of this really annoys me. Unfortunately sandbox-exec is marked as deprecated, and the APIs in sandbox.h say "No longer supported".

I notice there is some new stuff in the Apple docs about "hypervisors" and "virtualization":

https://developer.apple.com/documentation/hypervisor
https://developer.apple.com/documentation/virtualization

Would these APIs allow me to start and control a virtual copy of my macOS, to serve as a sandbox? Are there other solutions that people use?

As an example, say that I need to download and run a copy of memcached. It's a typical open source project: you unpack a source tgz, then run configure; make, and get a binary. Now I want to run that without worrying that some hacker injected a piece of evil code to copy my files and send them somewhere. So I want to say "run this binary, while disallowing file reads and writes, except for directories X, Y, Z, and disallowing network connections, except for listening on port 1234."
Replies: 12 · Boosts: 0 · Views: 4.8k · Created: Sep ’20
Bad: OutlineGroup is not lazy loading children?
This appears to severely limit the usefulness of hierarchical lists in SwiftUI. I want to use the new hierarchical list/outline to display a filesystem tree. As data to pass to OutlineGroup, I created a class named FileSystemNode and gave it a computed children property: when the getter is first called on a directory node, it reads the directory contents and returns a list of child nodes.

The problem is that when the OutlineGroup is first displayed, even though it is collapsed on screen to a single node, it calls children and recurses over the entire filesystem. Is there a way to stop this? If not, I hope it gets fixed before release. (This is on the macOS Big Sur beta.)
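The node type is roughly like this (simplified sketch; details are approximate, not the exact code):

    import Foundation

    final class FileSystemNode: Identifiable {
        let id = UUID()
        let url: URL

        init(url: URL) { self.url = url }

        // Computed property: reading directory contents here is what makes the eager
        // traversal expensive, since OutlineGroup asks for children up front.
        var children: [FileSystemNode]? {
            guard (try? url.resourceValues(forKeys: [.isDirectoryKey]))?.isDirectory == true else {
                return nil   // not a directory: leaf node
            }
            let urls = (try? FileManager.default.contentsOfDirectory(
                at: url, includingPropertiesForKeys: nil)) ?? []
            return urls.map { FileSystemNode(url: $0) }
        }
    }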
Replies: 3 · Boosts: 3 · Views: 1.9k · Created: Oct ’20