Here's a simple demo. I'm running it on macOS 12.1, compiled with Xcode 13.2.
struct ContentView: View {
    var body: some View {
        VStack {
            Text("ee")
            Text("eé")
            Text("e\u{E9}")    // "e with acute"
            Text("ee\u{301}")  // "combining acute"
        }
        .font(.custom("Avenir", fixedSize: 18))
        .padding(20)
    }
}
In the 4th one, the "e" is rendered in the wrong size/font. Screenshot:
Other fonts do not have the problem. For example, "Avenir Next" and "Helvetica".
Is there a way (in code) to tell which fonts can handle combining chars?
Is this a bug?
I notice the same thing happens when I use Core Text to draw the strings at a low level. So it's not just SwiftUI.
In TextEdit, if I have the Avenir font and type option-e followed by e, I get a nicely rendered letter. Maybe TextEdit is doing what Xcode did in the second line, and using the "e with acute" Unicode character.
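The best idea I have so far for the "which fonts can handle this in code" question (just a sketch; the function name is mine and I haven't verified the heuristic): ask Core Text whether the font's character set even contains U+0301, on the theory that a font without a glyph for the combining mark forces a fallback font for that cluster.

import CoreText
import Foundation

// Rough heuristic, not a guaranteed test: if a font's character set lacks
// U+0301, it presumably can't shape "e" + combining acute itself, and the
// system falls back to another font for that cluster.
func fontCoversCombiningAcute(_ fontName: String) -> Bool {
    let combiningAcute: Unicode.Scalar = "\u{0301}"
    let font = CTFontCreateWithName(fontName as CFString, 18, nil)
    let charSet = CTFontCopyCharacterSet(font) as CharacterSet
    return charSet.contains(combiningAcute)
}

print(fontCoversCombiningAcute("Avenir"))      // expecting false?
print(fontCoversCombiningAcute("Avenir Next")) // expecting true?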
I'm working on a drawing and painting program. When I rotate my iPad, I detect the size change and keep the drawing stable, while allowing the toolbars and other chrome to rotate. But SwiftUI creates a disorienting animation of the main canvas anyway. Is there a way to shut that off?
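The closest thing I've found so far (untested sketch; CanvasView stands in for my real drawing view) is the transaction modifier, to strip the implicit animation from just the canvas while the rest of the layout still animates:

CanvasView()
    .transaction { transaction in
        // Drop whatever animation SwiftUI attached during the rotation,
        // but only for this subtree.
        transaction.animation = nil
    }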
Hi,

I need to save and load Metal textures to a file. Example code is below. I'm noticing the RGB values change as the texture gets saved and reloaded:

metal texture pixel, RGBA: 42, 79, 12, 95
after save and reload:     66, 88, 37, 95

I thought it might be a colorspace issue, but the colorspaces I've tried all had the same problem. genericRGBLinear got close, but there's got to be a way to save the RGB data and get it back exactly. Any ideas?

Thanks,
Rob

Code:

// saving...
let ciCtx = CIContext()
let ciImage = CIImage(mtlTexture: metalTexture, options: [:])
[ … transform to flip y-coordinate … ]
let colorSpace = CGColorSpaceCreateDeviceRGB()
let cgImage = ciCtx.createCGImage(ciImage, from: fullRect, format: kCIFormatRGBA8, colorSpace: colorSpace)!
let imageDest = CGImageDestinationCreateWithData(mData, kUTTypePNG, 1, nil)!
CGImageDestinationAddImage(imageDest, cgImage, nil)
CGImageDestinationFinalize(imageDest)
// loading...
let src = CGImageSourceCreateWithData(imgData, nil)!
let img = CGImageSourceCreateImageAtIndex(src, 0, nil)!
let loader = MTKTextureLoader(device: self.metalDevice)
let texture = try! loader.newTexture(cgImage: img, options: [:])
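In case it helps, the workaround I'm considering (sketch only, assuming a 4-bytes-per-pixel format like .rgba8Unorm/.bgra8Unorm, no mipmaps, and a CPU-accessible texture) is to skip Core Image and ImageIO entirely and write the raw texture bytes, so nothing color-manages them:

import Metal
import Foundation

func rawBytes(of texture: MTLTexture) -> Data {
    // 4 bytes per pixel; adjust for other formats.
    let bytesPerRow = texture.width * 4
    var bytes = [UInt8](repeating: 0, count: bytesPerRow * texture.height)
    texture.getBytes(&bytes,
                     bytesPerRow: bytesPerRow,
                     from: MTLRegionMake2D(0, 0, texture.width, texture.height),
                     mipmapLevel: 0)
    return Data(bytes)
}

// Loading would be the reverse: create a texture of the same size/format
// and call replace(region:mipmapLevel:withBytes:bytesPerRow:).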
I can't find any documentation on this.
I've tried adding PNG files to my project and then adding them with the "+" button here, but Xcode just ignores me. No error message.
Is it looking for a specific file format? This is Xcode 13.
This appears to severely limit the usefulness of hierarchical lists in SwiftUI.
I want to use the new hierarchical list/outline to display a filesystem tree. For data to pass to OutlineGroup, I created a class named FileSystemNode and gave it a computed children property. When the getter is first called, it reads the directory contents and returns them as a list (if the node is a directory).
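The class looks roughly like this (a simplified sketch, not my exact code):

import Foundation

final class FileSystemNode: Identifiable {
    let url: URL
    var id: URL { url }

    init(url: URL) { self.url = url }

    private var isDirectory: Bool {
        (try? url.resourceValues(forKeys: [.isDirectoryKey]))?.isDirectory ?? false
    }

    // OutlineGroup treats nil as a leaf; a directory reads its contents
    // the first time this getter runs.
    var children: [FileSystemNode]? {
        guard isDirectory else { return nil }
        let urls = (try? FileManager.default.contentsOfDirectory(
            at: url, includingPropertiesForKeys: [.isDirectoryKey], options: [])) ?? []
        return urls.map { FileSystemNode(url: $0) }
    }
}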
The problem is that when the OutlineGroup is first displayed, even though it is collapsed on-screen to a single root node, it calls children and recurses over the entire filesystem.
Is there a way to stop this? If not, I hope it gets fixed before release.
(This is on macOS Big Sur beta)
My app crashes and Xcode shows no stack trace. It just pops up some line of assembly language in __pthread_kill, and shows this in the console:
[error] precondition failure: invalid value type for attribute: 230424 (saw PreferenceKeys, expected ViewList)
AttributeGraph precondition failure: invalid value type for attribute: 230424 (saw PreferenceKeys, expected ViewList).
Seems like a bug in SwiftUI. Any ideas? This is on macOS 11.
I know how to add items to the main menu. But what if I want to connect a handler to one that is already there by default? (For example "Select All").
WindowGroup {
    ContentView()
}
.commands {
    CommandGroup(after: CommandGroupPlacement.pasteboard) {
        Button("Select All") { selectAll() }
    }
}
That adds a second "Select All" menu item.
If I use CommandGroup(replacing: ...), it replaces the other items in the group too, not just "Select All".
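The closest I've come up with (a sketch, not confirmed to be the intended approach) is to replace the whole pasteboard group and re-declare the items, wiring my own "Select All" to the standard shortcut:

WindowGroup {
    ContentView()
}
.commands {
    CommandGroup(replacing: .pasteboard) {
        Button("Select All") { selectAll() }
            .keyboardShortcut("a", modifiers: .command)
        // Re-add Cut/Copy/Paste etc. here as needed, since replacing
        // drops the whole group.
    }
}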
Simple code like this gives an error.
struct MyView: View {
    @State private var test: Bool = false
    var body: some View {
        Text("Hello. \(test)")
    }
}
The error:
Instance method 'appendInterpolation(_:formatter:)' requires that 'Bool' inherit from 'NSObject'
What is going on?
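The workaround I've landed on (both lines below are sketches): Text's interpolation goes through LocalizedStringKey, which doesn't accept a Bool directly, so convert it to a String first.

Text("Hello. \(String(describing: test))")
// or
Text("Hello. \(test ? "true" : "false")")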
I think they are, but I got myself into a situation with this error:
// Inside a SwiftUI View struct...
let cancelled: @MainActor () -> Void

var body: some View {
    ...
    Button(action: self.cancelled) { Text("Cancel") } // Error here
}
The compiler reports:
Converting function value of type '@MainActor () -> Void' to '() -> Void' loses global actor 'MainActor'
I originally put that attribute on the cancelled property because, in the passed-in closure, I was doing something that gave me an error saying I had to be on the main actor.
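What got it compiling for me (a sketch; there may be a cleaner way) was to give Button a plain synchronous closure that hops onto the main actor explicitly:

Button {
    Task { @MainActor in
        cancelled()
    }
} label: {
    Text("Cancel")
}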
I created a simple app in Xcode. Installed and ran it in the iOS Simulator. I can also start it from command line like this:
% xcrun simctl launch booted com.example.TestBuildCommands --console
com.example.TestBuildCommands: 52877
%
It just prints that one line and returns my shell prompt. But the help for launch says that --console should: "Block and print the application's stdout and stderr to the current terminal."
In my main() function I have:
int main(int argc, char * argv[]) {
    printf("In main.\n");
    ...
Shouldn't I be seeing that message in my terminal screen?
% xcrun simctl help launch
Launch an application by identifier on a device.
Usage: simctl launch [-w | --wait-for-debugger] [--console|--console-pty] [--stdout=<path>] [--stderr=<path>] [--terminate-running-process] <device> <app bundle identifier> [<argv 1> <argv 2> ... <argv n>]
--console Block and print the application's stdout and stderr to the current terminal.
Signals received by simctl are passed through to the application.
(Cannot be combined with --stdout or --stderr)
[...]
P.S. Anyone know how to turn off syntax highlighting in those triple-backquoted code blocks that are meant to be plain terminal text, not highlighted code?
I'd like to use the new localization screenshots and test plans feature to take screenshots in different languages for the App Store. I end up with images that are half one screen and half another, like it's some kind of timing issue. My test code is below. Is it missing something that would give it the right timing on the screenshots?

I wrote a UI test and set "Localization Screenshots" to "On" in the test plan's settings. The UI test walks through a few screens, and the resulting test report has a few image files attached labeled "Localization screenshot". Some images are split so that the left side is one view controller and the right side is another, like it captured a push navigation transition. Another image has two overlaid screens, each half faded.

I'm running in the simulator. My test code looks like:

func testTakeScreenshots() {
    let app = XCUIApplication()
    app.launch()

    // At workouts page
    app.tables["workouts"].cells.element(boundBy: 0).tap()

    // At first workout
    app.navigationBars.buttons["edit"].tap()

    // At workout edit screen, click first exercise
    app.tables.cells.element(boundBy: 0).tap()
    ...
}
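One thing I'm going to try (just a guess at the timing problem, not verified; the test name is my own variant): wait for each screen's key element before tapping, so the localization screenshot isn't captured mid-transition.

func testTakeScreenshotsWithWaits() throws {
    let app = XCUIApplication()
    app.launch()

    app.tables["workouts"].cells.element(boundBy: 0).tap()

    // Let each screen settle before the next tap.
    XCTAssertTrue(app.navigationBars.buttons["edit"].waitForExistence(timeout: 5))
    app.navigationBars.buttons["edit"].tap()

    XCTAssertTrue(app.tables.cells.element(boundBy: 0).waitForExistence(timeout: 5))
    app.tables.cells.element(boundBy: 0).tap()
}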
I have some code that used to run on my iPad Pro. Today I compiled it for iOS 13, with Xcode 11, and I get errors like this:

validateComputeFunctionArguments:834: failed assertion `Compute Function(merge_layer): Shader uses texture(outTexture[1]) as read-write, but hardware does not support read-write texture of this pixel format.'

The pixel format is showing as MTLPixelFormatBGRA8Unorm. That's what I expected.

The debugger says the device has no support for writeable textures:

(lldb) p device.readWriteTextureSupport
(MTLReadWriteTextureTier) $R25 = tierNone

Did some devices lose support for texture writing in iOS 13?

Rob
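In the meantime I'm guarding the shader path with a runtime check like this (a sketch; which pixel formats need which tier is spelled out in the Metal Feature Set Tables, so treat the tier comments as my assumption):

import Metal

guard let device = MTLCreateSystemDefaultDevice() else { fatalError("no Metal device") }

switch device.readWriteTextureSupport {
case .tier2:
    // The richer set of read-write pixel formats is available here.
    print("use the read_write shader path")
default:
    // .tier1 / .tierNone: bind a separate write-only output texture
    // and a read-only input texture instead of a read_write texture.
    print("use the separate read + write texture path")
}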
Hi,
I want something like sandbox-exec, so I can run things that I don't trust and restrict their ability to read or write files to only certain locations. Like most software devs, I have to download and run lots of code from the internet, and the danger of this really bothers me.
Unfortunately, sandbox-exec is marked as deprecated, and the APIs in sandbox.h say "No longer supported".
I notice there is some new stuff in the Apple docs about "hypervisors" and "virtualization".
https://developer.apple.com/documentation/hypervisor
https://developer.apple.com/documentation/virtualization
Would these APIs allow me to start and control a virtual copy of my macOS, to serve like a sandbox?
Are there other solutions that people use?
As an example, say that I need to download and run a copy of memcached. It's a typical open source project: you unpack a source tgz, run configure and make, and get a binary. Now I want to run that binary without worrying that some hacker injected a piece of evil code to copy my files and send them somewhere. So I want to say "run this binary, while disallowing file reads and writes except for directories X, Y, Z, and disallowing network connections except for listening on port 1234."
Hi,
If I change the AppIcon in Xcode's Assets.xcassets and rerun my app, the image used in the Dock and app switcher does not update. If I "Clean Build Folder" and re-run, then it updates.
This is annoying when I keep tweaking the colors in the icon and want to see how they look. A full rebuild takes a while, because I have a few Swift Package dependencies.
Anyone know a trick to get the AppIcon to stop caching (or whatever it's doing)?
I tried killall Dock and killall Finder, but that didn't help.
(macOS 11.2.3)
Rob
I have a TextField in my toolbar that I use for a search function. I want to trigger the search by hitting the "return" key in this field. But if I do it in onCommit, the search also gets triggered when the user un-focuses the field.
Is there a way to respond to just the "return" key?
TextField("Search", text: $searchQuery) { editing in
print("onEditingChanged \(editing)")
} onCommit: {
// Problem: this is triggered by both
// 1. Return key
// 2. Losing focus
searchModel.startSearch(query: searchQuery)
}
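If the deployment target allows it, the newer onSubmit modifier (macOS 12 / iOS 15) might be the cleaner route. This is a sketch, and I haven't verified how it behaves when the field loses focus:

TextField("Search", text: $searchQuery)
    .onSubmit {
        // Documented to fire when the user submits, e.g. presses Return.
        searchModel.startSearch(query: searchQuery)
    }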