Post

Replies

Boosts

Views

Activity

In SwiftUI, how to manipulate sampling rate of a gesture in time?
If increasing the sampling rate isn't possible, is the only option for "continuous" drawing to repeatedly add a bezier curve from n-2 to n-1 that takes n-3 and n into account? Are there other (easy) options to interpolate between locations n-2 and n-1 not only visually, but while knowing and storing all intermediate points? Think more bitmap, less vector. Below are my code and the current "uneven" result:

struct ContentView: View {
    @State var points: [CGPoint] = []

    var body: some View {
        Canvas { context, size in
            for point in points {
                context.draw(Image(systemName: "circle"), at: point)
            }
        }
        .gesture(
            DragGesture().onChanged { value in
                points.append(value.location)
            }
        )
    }
}
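One way to get a denser point trail without changing the gesture's reporting rate is to interpolate linearly between each pair of reported locations before storing them. A minimal, framework-free sketch; the `Point` type and the `densify` name are mine, standing in for `CGPoint`:

```swift
// A minimal stand-in for CGPoint, so the sketch is self-contained.
struct Point: Equatable {
    var x: Double
    var y: Double
}

// Returns `next` plus evenly spaced points between `last` and `next`,
// so that no two stored points are farther apart than `step`.
func densify(from last: Point, to next: Point, step: Double) -> [Point] {
    let dx = next.x - last.x
    let dy = next.y - last.y
    let distance = (dx * dx + dy * dy).squareRoot()
    guard distance > step else { return [next] }
    let count = Int((distance / step).rounded(.up))
    return (1...count).map { i in
        let t = Double(i) / Double(count)
        return Point(x: last.x + dx * t, y: last.y + dy * t)
    }
}
```

In `onChanged`, instead of appending `value.location` directly, one could append the result of `densify(from: points.last ?? value.location, to: value.location, step: 2)`. This fills gaps with known, stored intermediate points, which fits the "more bitmap, less vector" goal.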
0
0
1k
Dec ’22
Bringing visual elements from imperative to declarative paradigm
UIViews can be wrapped and used in SwiftUI. But guess which other visual elements can't be wrapped? Sprites! Do you think wrapping SKNode in a similar way would even make sense, also with performance considerations? For a start, it could bring back the physics introduced in UIKit and lacking in SwiftUI, with even more options... Have you ever thought about it, and what were your thoughts?
0
0
872
Dec ’22
Can Xcode facilitate non-AWS lambda function testing and deployment?
Since WWDC'20 it has been easy to test and deploy lambda functions to AWS. But can Xcode similarly facilitate the workflow for cloud providers other than Amazon? There are open source solutions, like https://openwhisk.apache.org, that also support Swift. If not directly, then with some format-converting scripts used for deployment, can we adapt/extend the lambda testing built into Xcode?
0
0
872
Dec ’22
Can an app running in the Simulator exchange data with a USB device connected to a Mac?
In the Simulator, I can select Send Keyboard Input to Device, or get input from a microphone. Can I have I/O from other USB devices connected to a Mac? I am interested in receiving, in the Simulator, events from a MIDI keyboard connected to the Mac (and supported by both macOS and iOS). There are many more possible use cases, since with DriverKit iPads can now support many more external devices.
0
0
873
Dec ’22
Why is repeating protocol conformance in a subclass considered redundant?
Repeating protocol conformance in a subclass of a conforming class results in a redundancy error. My problem is that, because of how protocol witness tables work, repeating protocol conformance in a subclass seems not to be redundant at all. Intuitively, in Example3 below, repeating the parent class's conformance in the child class could restore the route from the protocol extension's implementation back to the child's implementation. Where am I wrong?

protocol Prot {
    func f()
    func g()
}

extension Prot {
    func f() { print("protocol extension's implementation") }
    func g() { f() }
}

class Parent: Prot { }

// Directly implementing the protocol routes to the child's implementation.
class Example1: Prot {
    func f() { print("child's implementation") }
}

// Indirectly implementing the protocol routes to the protocol extension's implementation.
class Example2: Parent {
    func f() { print("child's implementation") }
}

// "Redundant conformance of 'Child' to protocol 'Prot'" error, instead of restoring the route to the child's implementation.
class Example3: Parent, Prot {
    func f() { print("child's implementation") }
}

Example1().g()
Example2().g()
5
0
2.0k
Dec ’22
After updating my legal name, can I expect my original and separate developer name to stay intact?
When an organisation adds an app to an account for the first time, it may choose a developer name different from its legal name. Can I expect my previously chosen developer name to stay exactly as it was after I update my legal name? I've heard that for developers who decide to choose a different name, when/if the legal entity name has to be changed at a later stage, the App Store Connect company name is also changed to the legal entity name accordingly. But are the terms "App Store Connect company name" and "Developer Name" interchangeable? Do they refer to the same thing? If any change to a legal name means irreversibly losing an originally accepted developer name, doesn't that contradict Apple's statement that a developer name cannot be edited or updated later?
0
0
1k
Dec ’22
Background uploads with short-lived authorisation tokens.
Let’s say I have a hundred big files to upload to a server, using requests with a short-lived authorisation token. The NSURLSession background upload documentation describes a problem with adding too many requests, so the first question is: should we manage uploads manually, by which I mean queuing, postponing, and retrying? Will we then fall into the same traps, only in a different way? What to do if tasks picked up by the system have an outdated token and so fail? How to update a token: is there a delegate method (preferably pre-iOS 13 compatible) in which I can get a fresh token and modify a request header? Is there any iOS-specific design pattern or contract (Apple's list of server requirements) that would allow uploads to be resumable?
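The queuing half of the question can be sketched without NSURLSession at all. The following is only an illustration of the cap-and-retry bookkeeping, not an Apple API; all names are mine:

```swift
// A framework-free sketch of manual upload queuing: cap the number of
// in-flight uploads and re-enqueue failures (e.g. expired-token 401s).
final class UploadQueue {
    private var pending: [String]              // file identifiers waiting
    private(set) var inFlight: Set<String> = []
    let maxConcurrent: Int

    init(files: [String], maxConcurrent: Int) {
        self.pending = files
        self.maxConcurrent = maxConcurrent
    }

    // Move as many pending items as the cap allows into `inFlight`,
    // returning the identifiers whose upload tasks should be created now
    // (with a freshly obtained token).
    func fill() -> [String] {
        var started: [String] = []
        while inFlight.count < maxConcurrent, !pending.isEmpty {
            let next = pending.removeFirst()
            inFlight.insert(next)
            started.append(next)
        }
        return started
    }

    // A finished task frees a slot; a failed one goes to the back of the
    // queue, to be retried later as a new task with a fresh token.
    func complete(_ id: String, success: Bool) {
        inFlight.remove(id)
        if !success { pending.append(id) }
    }

    var isDone: Bool { pending.isEmpty && inFlight.isEmpty }
}
```

The design choice this models: since a running task's headers cannot be changed, a failed upload is not resumed in place but re-created as a fresh request, which is when the new token gets attached.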
1
0
1.4k
Dec ’22
For OSStatus, how to get a string explaining its meaning?
SecCopyErrorMessageString returns a string explaining the meaning of a security result code, and its declaration is

func SecCopyErrorMessageString(
    _ status: OSStatus,
    _ reserved: UnsafeMutableRawPointer?
) -> CFString?

with

typealias OSStatus = Int32

Given an arbitrary OSStatus (for example, for kAudioFormatUnsupportedDataFormatError it is 1718449215), is there something to get the description as a string? The idea would be analogous to:

let x: Int32 = 1718449215
if let errMsg = SecCopyErrorMessageString(x, nil) as? String {
    print(errMsg)
}

This is not a security result code, and the output is just "OSStatus 1718449215"; what I expect is a "string explaining the meaning".
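One thing worth knowing here: many Audio Toolbox result codes are four printable ASCII bytes packed into the Int32, so decoding the bytes often reveals the mnemonic; 1718449215 unpacks to "fmt?", which is how kAudioFormatUnsupportedDataFormatError is defined in the headers. A self-contained sketch; the helper name is mine:

```swift
// Interpret an OSStatus as a packed four-character code when all four
// bytes are printable ASCII; otherwise fall back to the decimal value.
func fourCharCode(_ status: Int32) -> String {
    let value = UInt32(bitPattern: status)
    let bytes = [UInt8(truncatingIfNeeded: value >> 24),
                 UInt8(truncatingIfNeeded: value >> 16),
                 UInt8(truncatingIfNeeded: value >> 8),
                 UInt8(truncatingIfNeeded: value)]
    if bytes.allSatisfy({ (0x20...0x7E).contains($0) }) {
        return String(decoding: bytes, as: UTF8.self)   // e.g. "fmt?"
    }
    return String(status)
}
```

This is not a replacement for a proper message lookup, but a code like "fmt?" is usually enough to search the SDK headers for the matching constant.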
1
0
4.3k
Dec ’22
Optimising initialisation of big arrays with random data
I will be filling audio and video buffers with randomly distributed data for each frame, in real time. Initializing these arrays of Floats inside a basic for loop somehow seems naive. Are there any optimised methods for this task in iOS libraries? I was looking for a data-science-oriented framework from Apple and did not find one, but maybe Accelerate, Metal, or Core ML are good candidates to research? Is my thinking correct, and if so, can you guide me?
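As a framework-free baseline before reaching for Accelerate or Metal, the standard library's `Array(unsafeUninitializedCapacity:initializingWith:)` at least avoids touching every element twice (once for the zero-fill of `Array(repeating:count:)`, once for the random value). A sketch; the function name is mine:

```swift
// Fill a Float buffer in one pass, skipping the intermediate
// zero-initialization. Pure stdlib; on Apple platforms Accelerate
// (vDSP) can vectorize this kind of fill further.
func randomBuffer(count: Int, in range: ClosedRange<Float>) -> [Float] {
    var generator = SystemRandomNumberGenerator()
    return [Float](unsafeUninitializedCapacity: count) { buffer, initializedCount in
        for i in 0..<count {
            buffer[i] = Float.random(in: range, using: &generator)
        }
        initializedCount = count
    }
}
```

For real-time audio use, the remaining concern is that `SystemRandomNumberGenerator` may be slower than a seeded pseudo-random generator; swapping one in is a one-line change here.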
1
0
1.4k
Dec ’22
Recording and Saving MIDI Sequences
Below I have included the steps for Java from https://docs.oracle.com/javase/8/docs/technotes/guides/sound/programmer_guide/chapter11.html as an example. Can you help me with a similar instruction for Swift and iOS? How do I feed MusicSequenceFileCreate(_:_:_:_:_:) with the MIDIEventList from MIDIInputPortCreateWithProtocol(_:_:_:_:_:), without parsing incoming MIDI events to recreate messages one by one from scraped raw data? I feel really embarrassed asking this question, because IMHO this should be documented, easy to find, and easy to do.
0
0
969
Dec ’22
Wouldn't it be beneficial to have identifiers for SKScenes, like we have for UITableViewCells?
With SKView having presentScene(withIdentifier identifier: String) -> SKScene?, we wouldn't be forced to reinstantiate the same scenes for every SKView presentScene(_:) call, or else keep a collection of strong references outside. And scene presentation could be optimised by the framework. Is it now, in any way? Let's imagine a SpriteKit scene with hundreds of SKNodes, recreated all over again. Or better, let's imagine dealing with UITableViewCells for UITableViews manually, if there were no auto-recycling mechanism. Can you see the resemblance, and what is your opinion about it?
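The proposed mechanism can be sketched, framework-free, as a generic identifier-keyed cache. This only illustrates the idea; it is not an existing SpriteKit API, and all names are mine:

```swift
// Build-once, reuse-thereafter lookup by identifier: the same shape as
// the proposed presentScene(withIdentifier:), with the scene type
// abstracted away so the sketch is self-contained.
final class InstanceCache<Value> {
    private var storage: [String: Value] = [:]
    private let build: (String) -> Value

    init(build: @escaping (String) -> Value) {
        self.build = build
    }

    // Returns the cached instance for `id`, building it on first use.
    func instance(withIdentifier id: String) -> Value {
        if let cached = storage[id] { return cached }
        let fresh = build(id)
        storage[id] = fresh
        return fresh
    }
}
```

With `Value == SKScene` and `build` doing the expensive node setup, repeated presentation of the same identifier would hit the cache instead of recreating hundreds of SKNodes.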
0
0
832
Dec ’22
WWDC example of signal generator outputs files with 0 duration
Using the example from WWDC, with the following command

./SignalGenerator -signal square -duration 3 -output ./uncompressed.wav

I can generate an audio file which is not playable, and its Get Info shows:

Duration: 00:00
Audio channels: Mono
Sample rate: 44,1 kHz
Bits per sample: 32

Although for macOS Preview the duration is zero, the file is playable in VLC and convertible to other formats, so its content is OK. To get more information about the format of the output file, I changed the example to print outputFormatSettings:

["AVLinearPCMBitDepthKey": 32, "AVLinearPCMIsBigEndianKey": 0, "AVSampleRateKey": 44100, "AVFormatIDKey": 1819304813, "AVLinearPCMIsFloatKey": 1, "AVNumberOfChannelsKey": 1, "AVLinearPCMIsNonInterleaved": 1]

I don't know how to interpret the number for "AVFormatIDKey": 1819304813. The documentation says: let AVFormatIDKey: String - For information about the possible values for this key, see Audio Format Identifiers. Having read this, I still don't know the relation between AVFormatIDKey and the listed Audio Format Identifiers. If I knew which file format to expect, it might help to guess why the duration of the generated files is always 0. Can you help me with both questions? Thanks.
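On the AVFormatIDKey question: the value is an AudioFormatID, a four-character ASCII code packed into a UInt32, and 1819304813 unpacks to "lpcm", i.e. kAudioFormatLinearPCM, which matches the other AVLinearPCM* keys in the dictionary. A self-contained check; the helper name is mine:

```swift
// Unpack an audio format ID (a packed four-character code) into its
// ASCII mnemonic, e.g. 1819304813 -> "lpcm".
func formatName(_ id: UInt32) -> String {
    let bytes = [UInt8(truncatingIfNeeded: id >> 24),
                 UInt8(truncatingIfNeeded: id >> 16),
                 UInt8(truncatingIfNeeded: id >> 8),
                 UInt8(truncatingIfNeeded: id)]
    return String(decoding: bytes, as: UTF8.self)
}
```

So the output is uncompressed linear PCM in a WAV container, as expected; the zero duration would then point at the file's header (e.g. an unfinalized length field) rather than at the format choice.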
2
0
1.3k
Dec ’22
If not AVMIDIRecorder, then what?
For AVAudioPlayer, there is a corresponding AVAudioRecorder. For AVMIDIPlayer, I found nothing for recording from the system's active MIDI input device. Can I record MIDI events from the system's active MIDI input device without resorting to low-level CoreMIDI? After configuring AVAudioUnitSampler with just a few lines of code,

import AVFoundation

var engine = AVAudioEngine()
let unit = AVAudioUnitSampler()
engine.attach(unit)
engine.connect(unit, to: engine.outputNode, format: engine.outputNode.outputFormat(forBus: 0))
try! unit.loadInstrument(at: sndurl) // url to .sf2 file
try! engine.start()

I could send MIDI events programmatically:

// feeding AVAudioUnitMIDIInstrument with MIDI data
let range = (0..<100)
let midiStart = range.map { _ in UInt8.random(in: 70...90) }
let midiStop = [0] + midiStart
let times = range.map { _ in TimeInterval.random(in: 0...100) * 0.3 }
for i in range {
    DispatchQueue.main.asyncAfter(deadline: .now() + times[i]) {
        unit.stopNote(midiStop[i], onChannel: 1)
        unit.startNote(midiStart[i], withVelocity: 127, onChannel: 1)
    }
}

But instead, I need to send MIDI events from a MIDI instrument, and tap into them for recording.
1
0
1.4k
Dec ’22
Spreading incoming events evenly in time, using Combine
How to spread incoming messages (A, B, ...) evenly, one every second, marking them with the most recent timestamps (1, 2, ...)? In the diagram, output is on the gray stripe, input above it. It requires buffering messages (e.g. B) when necessary, and omitting timer ticks (e.g. 4) when there are no messages to consume, current or buffered.
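The pacing arithmetic underneath such an operator can be sketched without Combine: each message is emitted at the earliest whole-second tick at or after its arrival that no earlier message has already used, and ticks with nothing to consume are simply skipped. A framework-free sketch; the function name is mine, and in Combine this logic would sit inside a Timer.publish-driven custom operator:

```swift
// Given message arrival times in seconds (ascending) and timer ticks at
// 1, 2, 3, ..., compute the tick at which each message is emitted.
// Back-to-back arrivals get buffered onto consecutive ticks; idle ticks
// are skipped rather than "saved up".
func emissionTimes(arrivals: [Double]) -> [Int] {
    var nextFreeTick = 1
    return arrivals.map { arrival in
        let tick = max(nextFreeTick, Int(arrival.rounded(.up)))
        nextFreeTick = tick + 1
        return tick
    }
}
```

Note that a plain `zip` of the timer and message publishers would not behave this way: zip buffers idle ticks too, so a burst after a quiet period would flush immediately instead of being spread out.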
0
0
998
Dec ’22
New KeyPath syntax for old CAKeyframeAnimation(keyPath: #keyPath(CALayer.position))
The new syntax would be \CALayer.position. Of course, simply cutting and pasting it into CAKeyframeAnimation(keyPath: #keyPath(CALayer.position)) won't work. What will?
0
0
1k
Dec ’22