Reply to I've been fooling around with this and I lose "stuff"
The area on the left in grey is called the "navigator". You've selected the Find navigator, and you expect to see files containing the text "readNumber".

You're searching in an area called "macOS" (selected using the popup underneath the search field). I don't know what that is; it isn't one of the pre-defined search scopes. I usually use "in Project" or "in Project & SDK". In addition, if you look at the bottom of the Find navigator, there's a file name filter. Your search doesn't find "readNumber" at all, so this doesn't make any difference here, but I have no idea where it is searching.

Clear the file filter at the bottom, change the search scope to "project", and see if it works then.

Also, select the Project navigator, expand the project view, and post the screenshot. It should look something like this; if it doesn't, maybe we'll be able to help you out.

Incidentally, Xcode is clearly finding something, because it built and ran your program. Also, if you're new to this, why are you building for both macOS and iOS? It would be easier to start with just one.
May ’25
Reply to Multiple Commands Error
Hi. In future, please post error messages as text, not screenshots; they're a lot easier to read that way.

Somehow you have almost every file in your target twice. If you're new to Xcode, this could be quite difficult to fix. To fix it by hand, open the target in Xcode, select the Build Phases tab, and check the Compile Sources, Link Binary With Libraries, and Copy Bundle Resources phases. Remove any duplicate entries.

You may find it helpful to create a new, minimal project in Xcode from its iOS app template, so that you can compare a working target with your own.
May ’25
Reply to Create Ios app using Xojo
I suspect most folks who develop for their personal use are using Xcode; that's why you're not seeing much of a response.

You can develop an app using a personal developer account and deploy it to your own phone. In Xcode, you select "automatic" signing under your own personal team, and Xcode will put the app on your phone. It will work for a week, I think, before you need to build and deploy it again.

I don't know how you handle this if the app is built and signed in some other workflow. But there is an Xcode menu item under its Window menu called "Devices and Simulators". Your phone should show up there if you've ever used it for development. There's a list of installed apps there, with a little "+" button underneath; you should be able to select your app built with third-party tools and send it to your phone.

If your phone doesn't show up in the list, build one of the template apps for iOS in Xcode, then plug your phone in and select it as a run destination. Xcode should prompt you to put your phone in developer mode. After you've done this, your phone should appear in the list of available devices (even if it is only available over WiFi).
May ’25
Reply to Compiler stuck::considering giving up on SwiftUI
Yes to splitting up views into smaller chunks; they can be easier to test that way. I've often pasted non-working code into Copilot and asked "why doesn't this compile?". LLMs are quite good at pointing out errant commas and the like, where a compiler will often give you an entirely logical, but useless, error message (unless you write compilers for a living).
Topic: UI Frameworks SubTopic: SwiftUI Tags:
May ’25
Reply to Is it possible for iOS to continue BLE scanning even when the app goes into the background?
I'd advise against this, but it depends on exactly what your requirements are.

BLE connections are low power because the radio can turn off when there is no data to transfer. A "connection" implies that the receiver and transmitter are tightly synchronized, but this synchronization falls apart if you push the connection interval out to 5 minutes. Usually, connection intervals are in the tens or hundreds of milliseconds range. Most types of connections aren't going to be maintained in the background; your app is competing with others for antenna time and energy from the battery.

Radio connections are unreliable. You can't expect every transmission to arrive at the receiver intact, so using the phone to log data is likely to be disappointing, unless either:

a. you don't care about the occasional missing sample, or
b. you buffer samples on the logging device and upload historic data in a batch.

Lastly, advertising isn't a great way to log data with any temporal precision. The advertising device is usually something with a real-time OS and nothing better to do, so it can send advertising packets at regular intervals. The scanning device, however, doesn't know when those advertising packets are being broadcast, so it will generally turn on its receiver at irregular intervals. Even if it were to keep its receiver on permanently, it doesn't know which channel to listen to (advertisements cycle through three different advertising channels). On top of that, you can't fit much data in a regular advertising packet. Extended advertising in BLE 5 allows for more efficient advertising of larger packets, but I don't know how much support for extended advertising is built into iOS and the various phone models.

If I were trying to make a data logger, I'd log on the device and have an explicit, user-driven data download procedure from a foreground app. If you're more interested in status than logging, advertising might be an option. If your data requirements are low and timing isn't critical (e.g. monitoring temperature every 10 minutes), you might be able to leverage iBeacon technology.
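To put rough numbers on "you can't fit much data": a legacy advertising packet carries at most 31 bytes of AD structures, and each structure spends two bytes on its own length and type header. A back-of-the-envelope sketch (assuming the usual three-byte Flags structure plus a single manufacturer-data structure; your exact budget depends on what else you advertise):

```swift
// Rough budget for a legacy BLE advertising payload.
// Each AD structure costs 1 length byte + 1 type byte of overhead.

let advertisingPayload = 31   // legacy advertising PDU payload, in bytes
let adHeader = 2              // length + type byte per AD structure
let flagsStructure = 3        // the common Flags AD structure (header + 1 data byte)

// Space left for your own data in one manufacturer-specific-data structure:
let dataBudget = advertisingPayload - flagsStructure - adHeader
print(dataBudget)  // 26 bytes, and 2 of those go to the company identifier
```

So you're looking at roughly two dozen usable bytes per advertisement, which is fine for a status beacon but not for a sample stream.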
Topic: App & System Services SubTopic: Core OS Tags:
Apr ’25
Reply to Cannot login on macOS 15.5 beta 2
Can you log in in Safe Mode? (https://support.apple.com/en-gb/guide/mac-help/mh21245/mac) If so, you might be able to access some crash reports and submit them with your bug report (they should be included with sysdiagnose). Inspecting those reports may give you a clue as to what is causing your Mac to crash on login.
Topic: Community SubTopic: Apple Developers Tags:
Apr ’25
Reply to how to achieve "concave in" glass view look?
Does this work for you? A stroke with a gradient:

```swift
struct ContentView: View {
    var body: some View {
        let backgroundColor = Color(hue: 0.0, saturation: 0.0, brightness: 0.44)
        let foregroundColor = Color(hue: 0.0, saturation: 0.0, brightness: 0.33)
        let rimTop = Color(hue: 0.0, saturation: 0.0, brightness: 0.26)
        let rimBottom = Color(hue: 0.0, saturation: 0.0, brightness: 0.5)
        let gradient = LinearGradient(stops: [
            Gradient.Stop(color: rimTop, location: 0.0),
            Gradient.Stop(color: rimTop, location: 0.40),
            Gradient.Stop(color: rimBottom, location: 0.6),
            Gradient.Stop(color: rimBottom, location: 1.0)
        ], startPoint: .top, endPoint: .bottom)

        ZStack {
            RoundedRectangle(cornerRadius: 40)
                .fill(backgroundColor)
                .frame(width: 400, height: 300)
            Rectangle()
                .fill(.clear)
                .frame(width: 300, height: 50)
                .overlay {
                    RoundedRectangle(cornerRadius: 20, style: .circular)
                        .fill(foregroundColor)
                        .stroke(gradient, lineWidth: 2)
                }
        }
    }
}
```
Topic: Spatial Computing SubTopic: General Tags:
Apr ’25
Reply to Does USB Audio in macOS support 12 channels with 16bit PCM samples?
Thanks for posting the descriptor. Here's the descriptor of alternate interface 1 of interface 3 (the audio streaming interface), when set to 12 channels of 8-bit-per-sample audio:

    Endpoint 0x03 - Isochronous Output
        Address: 0x03 (OUT)
        Attributes: 0x09 (Isochronous adaptive data endpoint)
        Max Packet Size: 0x024c (588 x 1 transactions opportunities per microframe)
        Polling Interval: 4 (8 microframes (1 msec))

Your sampling rate is 48kHz, so 48 samples every millisecond. The polling interval is 1 millisecond (the actual number in the descriptor is 4; we're on USB 2.0 high speed here, so that means every 2^(4-1) = 8 microframes). Each sample is 1 byte, and there are 12 channels, so in each interval there are 12 x 48 = 576 bytes to transfer (nominally). Up to one additional sample frame (12 bytes) may be transferred in each interval to synchronize the device rate with the host rate, which is where the 588 bytes comes from.

In the 16-bits-per-channel case, the polling interval is the same, 1 ms, but you need to transfer up to 1152 + 24 = 1176 bytes per interval. However, the maxPacketSize here is 1024, the maximum allowed by the USB 2.0 Audio specification.

One way to fix this is to change the descriptor to reduce the polling interval from once every millisecond to twice every millisecond (the number '3' in the bInterval field) and change the wMaxPacketSize to 588. Another possibility would be to ask for up to two transactions per microframe; this is squeezed into bits 12 and 11 of wMaxPacketSize. However, the (now ancient) tech note TN2274 says "At this time, AppleUSBAudio supports at most one transaction per microframe".

See Table 9-13 and section 5.6.3 of the USB 2.0 Specification, and Table 4-33 of the USB Audio Class Specification v2.0.

You might be asking yourself why it "works on Linux", but does it really?
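The arithmetic above can be sketched out like this (assuming, as in the descriptor, 48 kHz, 12 channels, and one extra sample frame of headroom per interval for adaptive rate matching; the names are mine, not from any spec):

```swift
// Isochronous packet-size arithmetic for the descriptor above.
let sampleRateHz = 48_000
let channels = 12
let classMaxPacket = 1024  // largest packet the USB 2.0 Audio class allows

/// Bytes needed per polling interval: the nominal sample frames, plus one
/// extra frame of headroom for host/device clock-rate adjustment.
func packetSize(bytesPerSample: Int, framesPerInterval: Int) -> Int {
    let frameSize = channels * bytesPerSample
    return (framesPerInterval + 1) * frameSize
}

// 8-bit samples, 1 ms interval: (48 + 1) * 12 bytes
let eightBit = packetSize(bytesPerSample: 1, framesPerInterval: 48)
print(eightBit, eightBit <= classMaxPacket)      // 588 true  -> fits

// 16-bit samples, 1 ms interval: (48 + 1) * 24 bytes
let sixteenBit = packetSize(bytesPerSample: 2, framesPerInterval: 48)
print(sixteenBit, sixteenBit <= classMaxPacket)  // 1176 false -> too big
```

This is why the 8-bit configuration enumerates fine while the 16-bit one runs into the 1024-byte class limit, and why halving the polling interval (so only 24 nominal frames per packet) brings the 16-bit case back under it.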
Topic: App & System Services SubTopic: Drivers Tags:
Apr ’25