
Detecting Sleep End Events and Sleep Data Sync Timing from Apple Watch to HealthKit on iPhone
Hello, I’m developing an iOS app that works with sleep data from Apple Watch via HealthKit, and I would like to clarify the following:

1. How can an iPhone app detect when a sleep session ends on the Apple Watch?
2. When is sleep data typically written to the HealthKit store on iPhone after sleep ends? Is it immediately after wake-up, or does it depend on certain conditions (e.g., watch charging, connectivity)?

Understanding the timing and mechanism of sleep data synchronization is crucial for our app to process accurate and timely health information. Thank you for your assistance.
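For reference, a common approach is not to detect the wake-up moment itself, but to register an HKObserverQuery (optionally with background delivery) so the app is notified whenever new sleep samples land in the iPhone's HealthKit store. A minimal sketch, assuming read authorization for .sleepAnalysis has already been requested elsewhere:

import HealthKit

let healthStore = HKHealthStore()
let sleepType = HKCategoryType.categoryType(forIdentifier: .sleepAnalysis)!

// Fires whenever new sleep samples are written to (or synced into) the
// local HealthKit store; HealthKit does not expose a direct
// "sleep session ended on the Watch" event to iPhone apps.
let observer = HKObserverQuery(sampleType: sleepType, predicate: nil) { _, completionHandler, error in
    if error == nil {
        // Run a follow-up query here to fetch the new samples.
    }
    completionHandler() // always call, or background delivery will retry
}
healthStore.execute(observer)

// Optional: wake the app in the background when new data arrives.
healthStore.enableBackgroundDelivery(for: sleepType, frequency: .immediate) { success, error in
    print("background delivery enabled: \(success), error: \(String(describing: error))")
}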
Replies: 1 · Boosts: 0 · Views: 36 · Apr ’25
Synchronization Timing Between Apple Watch HealthKit Store and iPhone HealthKit Store
Hi, I’m currently working on an app that uses sleep data from HealthKit to provide users with meaningful insights about their sleep. To ensure a smooth user experience, I’d like to understand when sleep data collected by the Apple Watch is saved to the HealthKit store and when it gets synced to the iPhone.

Ideally, I want to fetch sleep data right after the user wakes up and opens our app. However, to do this reliably, I need to know how and when this data becomes available in the iPhone’s HealthKit store.

I’ve looked through the official documentation and relevant WWDC sessions but couldn’t find clear information on this topic. If anyone has insights or experience with how and when the Apple Watch syncs HealthKit data—especially sleep records—to the iPhone, I’d greatly appreciate your input. Thanks!
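A minimal sketch of one way to fetch whatever sleep data has arrived by the time the user opens the app, using an anchored query so only samples not seen before are returned (the anchor handling here is illustrative; persist it however suits the app):

import HealthKit

let healthStore = HKHealthStore()
let sleepType = HKCategoryType.categoryType(forIdentifier: .sleepAnalysis)!
var anchor: HKQueryAnchor? // persist between launches, e.g. in UserDefaults

func fetchNewSleepSamples() {
    let query = HKAnchoredObjectQuery(type: sleepType,
                                      predicate: nil,
                                      anchor: anchor,
                                      limit: HKObjectQueryNoLimit) { _, samples, _, newAnchor, error in
        anchor = newAnchor
        let sleepSamples = (samples as? [HKCategorySample]) ?? []
        // Samples synced from the Watch appear here once they reach the
        // iPhone's store; if the list is empty, the data may simply not
        // have synced yet, so try again later.
        print("fetched \(sleepSamples.count) new sleep samples")
    }
    healthStore.execute(query)
}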
Replies: 1 · Boosts: 0 · Views: 60 · Apr ’25
Accuracy of IBI Values Measured by Apple Watch
I am currently developing an app that measures HRV to estimate stress levels. To align the values more closely with those from Galaxy devices, I decided not to use the heartRateVariabilitySDNN value provided by HealthKit. Instead, I extracted individual interbeat intervals (IBI) using HKHeartbeatSeries data. Can I obtain accurate IBI data using this method? If not, I would like to know how I can retrieve more precise data. Any insights or suggestions would be greatly appreciated. Here is the sample code I tried:

import HealthKit

@Observable
class HealthKitManager {
    let healthStore = HKHealthStore()
    var ibiValues: [Double] = []
    var isAuthorized = false

    func requestAuthorization() {
        let types: Set<HKObjectType> = [
            HKSeriesType.heartbeat(),
            HKQuantityType.quantityType(forIdentifier: .heartRateVariabilitySDNN)!,
        ]
        healthStore.requestAuthorization(toShare: nil, read: types) { success, error in
            DispatchQueue.main.async {
                self.isAuthorized = success
                if success {
                    self.fetchIBIData()
                }
            }
        }
    }

    func fetchIBIData() {
        var timePoints: [TimeInterval] = []
        var absoluteStartTime: Date?

        let dateFormatter = DateFormatter()
        dateFormatter.timeZone = TimeZone(identifier: "Asia/Seoul")
        dateFormatter.dateFormat = "yyyy-MM-dd HH:mm:ss.SSS"

        var calendar = Calendar.current
        calendar.timeZone = TimeZone(identifier: "Asia/Seoul") ?? .current

        var components = DateComponents()
        components.year = 2025
        components.month = 4
        components.day = 3
        components.hour = 15
        components.minute = 52
        components.second = 0
        let startTime = calendar.date(from: components)!
        components.hour = 16
        components.minute = 0
        let endTime = calendar.date(from: components)!

        let predicate = HKQuery.predicateForSamples(withStart: startTime, end: endTime, options: .strictStartDate)
        let sortDescriptor = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: false)

        let query = HKSampleQuery(sampleType: HKSeriesType.heartbeat(),
                                  predicate: predicate,
                                  limit: HKObjectQueryNoLimit,
                                  sortDescriptors: [sortDescriptor]) { _, samples, _ in
            guard let sample = samples?.first as? HKHeartbeatSeriesSample else {
                print("No samples found for the specified time range")
                return
            }
            absoluteStartTime = sample.startDate
            let startDateKST = dateFormatter.string(from: sample.startDate)
            let endDateKST = dateFormatter.string(from: sample.endDate)
            print("series start(KST):\(startDateKST)\tend(KST):\(endDateKST)")

            // Walk the series beat by beat; each callback delivers one
            // heartbeat's offset from the start of the series.
            let seriesQuery = HKHeartbeatSeriesQuery(heartbeatSeries: sample) { _, timeSinceSeriesStart, precededByGap, done, _ in
                if !precededByGap {
                    timePoints.append(timeSinceSeriesStart)
                }
                if done {
                    for i in 1..<timePoints.count {
                        let ibi = (timePoints[i] - timePoints[i - 1]) * 1000 // convert to milliseconds
                        // Calculate the absolute time of the current beat.
                        if let startTime = absoluteStartTime {
                            let beatTime = startTime.addingTimeInterval(timePoints[i])
                            let beatTimeString = dateFormatter.string(from: beatTime)
                            print("IBI: \(String(format: "%.2f", ibi)) ms at \(beatTimeString)")
                        }
                        self.ibiValues.append(ibi)
                    }
                }
            }
            self.healthStore.execute(seriesQuery)
        }
        self.healthStore.execute(query)
    }
}
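As an aside, once ibiValues is populated, time-domain HRV metrics can be derived from it directly. A small illustrative helper (RMSSD; this is plain arithmetic, not part of any HealthKit API):

func rmssd(from ibisMs: [Double]) -> Double? {
    guard ibisMs.count > 1 else { return nil }
    // Root mean square of successive differences between inter-beat intervals.
    let diffs = zip(ibisMs.dropFirst(), ibisMs).map { $0 - $1 }
    let meanSquare = diffs.map { $0 * $0 }.reduce(0, +) / Double(diffs.count)
    return meanSquare.squareRoot()
}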
Replies: 1 · Boosts: 0 · Views: 53 · Apr ’25
Clarification on Where Application Code and Static Libraries Are Stored in Memory
Hello, I’m seeking some clarity regarding where application code and static libraries are stored in memory. I understand the basic memory layout in terms of the code (text) segment, data segment, heap, and stack:

• Code (Text) Segment: Stores the compiled program code.
• Data Segment: Stores global and static variables.
• Heap: Holds memory allocated dynamically at runtime.
• Stack: Stores local variables and function call information.

However, I’ve come across some conflicting information:

1. Official documentation: An illustration in Apple’s documentation archive appeared to show application code stored in the heap. This seemed unusual given my understanding that compiled code is generally stored in the code segment.
2. Blog posts: Several blogs mention that the source code for static libraries is stored in the heap. This also contradicts my understanding, since static libraries, once linked, should be part of the application’s executable code and thus reside in the code segment.

Given these points, my understanding is that:

• Application code: After compilation, the executable code should be stored in the code segment.
• Static libraries: Once linked, the code from static libraries should also be part of the code segment.

Could you please clarify:

• Where exactly is the application code stored in memory?
• Is the claim that static libraries’ source code is stored in the heap correct, or is it a misunderstanding?

Thank you!
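For illustration, a small Swift snippet that prints a representative address from each region; the exact values vary per run (ASLR), but the grouping makes the segment separation visible:

import Foundation

var globalCounter = 42                 // data segment (global/static storage)

func inspectMemoryLayout() {
    var local = 1                      // stack
    let object = NSObject()            // heap

    // #dsohandle points at the Mach-O header of this binary image,
    // which is mapped as part of the __TEXT (code) segment.
    print("text  (code):   \(#dsohandle)")
    withUnsafePointer(to: &globalCounter) { print("data  (global): \($0)") }
    withUnsafePointer(to: &local)         { print("stack (local):  \($0)") }
    print("heap  (object): \(Unmanaged.passUnretained(object).toOpaque())")
}

inspectMemoryLayout()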
Replies: 1 · Boosts: 0 · Views: 514 · Jun ’24
About reducing size of AVAudioPCMBuffer
Hi, I'm trying to send audio data via UDP. I am using Network.framework for networking, so data passed to NWConnection's send method must be of type Data or conform to DataProtocol. To satisfy those conditions, I implemented a method to convert from AVAudioPCMBuffer to Data:

func makeDataFromPCMBuffer(buffer: AVAudioPCMBuffer, time: AVAudioTime) -> Data {
    let audioBuffer = buffer.audioBufferList.pointee.mBuffers
    return Data(bytes: audioBuffer.mData!, count: Int(audioBuffer.mDataByteSize))
}

The implementation above is referenced from this post. The problem is that the converted data is too big to fit in a UDP datagram, and an error occurs when I try to send it. I have found that the initial size of the buffer already exceeds maximumDatagramSize. Below is the code that installs the tap:

let tapNode: AVAudioNode = mixerNode
let format = tapNode.outputFormat(forBus: 0)
tapNode.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, time in
    // the size of this AVAudioPCMBuffer is already 19200 bytes
    let bufferData = self.makeDataFromPCMBuffer(buffer: buffer, time: time)
    sharedConnection?.sendRecordedBuffer(buffer: bufferData)
}

I need to reduce the size of the AVAudioPCMBuffer to fit in a UDP datagram, but I can't find the right way to do it. What would be the best way to make the data fit? I thought of dividing the data in half, but this is UDP, so I'm not sure how to handle the pieces if one of them is lost. That's why I'm trying to make the AVAudioPCMBuffer itself fit in a datagram. Any help would be very appreciated!
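If shrinking the buffer itself proves difficult, one workaround is to split the converted Data into datagram-sized slices before sending. A minimal sketch (the function name is invented here, and loss handling remains an open question on UDP):

import Foundation
import Network

func sendInDatagramSizedChunks(_ data: Data, over connection: NWConnection) {
    // maximumDatagramSize reflects what the current path can carry.
    let maxSize = connection.maximumDatagramSize
    var offset = 0
    while offset < data.count {
        let end = min(offset + maxSize, data.count)
        connection.send(content: data.subdata(in: offset..<end),
                        completion: .contentProcessed { error in
            if let error = error {
                print("send error: \(error)")
            }
        })
        offset = end
    }
}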
Replies: 0 · Boosts: 0 · Views: 933 · May ’22
About dividing Data into small pieces to fit in a datagram
I am trying to send and receive audio data (saved as a .caf file on the device) using Network.framework. This is my plan:

1. Convert the file into Data using the code below:

guard let data = try? Data(contentsOf: recordedDocumentURL) else {
    print("recorded file to data conversion failed in touchUpCallButton method")
    return
}

2. Send the Data using NWConnection.send over UDP.
3. Receive the Data using NWConnection.receiveMessage.
4. Convert the received Data into an AVAudioFile and play it using AVAudioEngine.

Right now my problem is that the data converted from the audio file is too big to fit in maximumDatagramSize. So, as I understand it, I need to split the Data into many small pieces and send them one by one. But in that case, the receiving device needs to collect the pieces and reassemble the complete audio file before it can play it, and I'm stuck at this step. I can't find the right way to divide Data into small pieces to send as datagrams over UDP. What I have in mind is to use subdata(in: Range<Data.Index>) to divide the data and append(Data) to sum it back up. Is this the right approach to solve my problem? A little advice would be very appreciated!
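A minimal sketch of that approach: subdata(in:) slices the payload, and a small hypothetical header (a sequence number plus a total chunk count, both invented here) lets the receiver order the pieces and detect loss before reassembling them with append(_:):

import Foundation

// Hypothetical framing: 4-byte big-endian sequence number + 4-byte total
// chunk count, prepended to each payload slice.
func makeChunks(from data: Data, payloadSize: Int) -> [Data] {
    let total = UInt32((data.count + payloadSize - 1) / payloadSize)
    var chunks: [Data] = []
    var seq: UInt32 = 0
    var offset = 0
    while offset < data.count {
        let end = min(offset + payloadSize, data.count)
        var chunk = Data()
        withUnsafeBytes(of: seq.bigEndian)   { chunk.append(contentsOf: $0) }
        withUnsafeBytes(of: total.bigEndian) { chunk.append(contentsOf: $0) }
        chunk.append(data.subdata(in: offset..<end))
        chunks.append(chunk)
        seq += 1
        offset = end
    }
    return chunks
}

// Receiver side: bucket payloads by sequence number, then reassemble with
// append(_:) once all `total` chunks have arrived (or give up on loss).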
Replies: 3 · Boosts: 0 · Views: 1.2k · May ’22
About flow between send and receive in Network.framework
Hi, I'm trying to build a walkie-talkie app using Swift. My idea is:

1. Record the user's voice with AVAudioEngine on device1.
2. Convert the recorded file into Data.
3. Send the data from device1 to device2 using NWConnection.send.
4. Receive the data using NWConnection.receiveMessage.
5. Play the received data on device2.

I am implementing this app using the P2P option in Network.framework, so each device has both a browser and a listener. Each device has to keep receiving incoming data while also sending recorded voice. At first I thought that when the receiveMessage method was executed, it would wait for the other device's send method to deliver data and then receive it. But while debugging, the program didn't stop at the receiveMessage method; it just went through and executed the next line. I must be missing something, but I'm not sure what it is. Below is the send and receive part of the code I tried:

func sendRecordedAudio(data: Data) {
    guard let connection = connection else {
        print("connection optional unwrap failed: sendRecordedAudio")
        return
    }
    connection.send(content: data, completion: .contentProcessed { error in
        if let error = error {
            print("Send error: \(error)")
        }
    })
}

func receiveRecordedAudio() {
    guard let connection = connection else {
        print("connection optional unwrap failed: receiveRecordedAudio")
        return
    }
    connection.receiveMessage { data, context, isComplete, error in
        if let error = error {
            print("\(error) occurred in receiveRecordedAudio")
        }
        if let data = data {
            self.delegate?.receivedAudio(data: data)
        }
    }
}

The app calls sendRecordedAudio when recording ends, and receiveRecordedAudio when the user presses the receive button. Any help would be greatly appreciated!
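For reference, receiveMessage registers an asynchronous handler and returns immediately rather than blocking; the usual pattern for continuous receiving is to re-issue the call from inside its completion handler, sketched here:

import Network

func startReceiveLoop(on connection: NWConnection) {
    connection.receiveMessage { data, _, _, error in
        if let data = data {
            // Hand the datagram to the playback side here.
            print("received \(data.count) bytes")
        }
        // Each receiveMessage delivers at most one message, so re-arm
        // to keep listening unless the connection has failed.
        if error == nil {
            startReceiveLoop(on: connection)
        }
    }
}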
Replies: 1 · Boosts: 0 · Views: 993 · May ’22