I am really confused by some of the new data returned in WeatherKit for iOS 18.
The visibility (of an object) was already being returned in HourWeather as a Measurement.
iOS 18 added max/min visibility (of terrain) to DayWeather. BUT instead of a Measurement, it's just a Double.
HourWeather:
/// The distance at which an object can be clearly seen.
///
/// The amount of light and weather conditions like fog, mist, and smog affect visibility.
public var visibility: Measurement<UnitLength>
DayWeather's comment:
/// The maximum distance at which terrain is visible for the day.
///
/// The amount of light, and weather conditions like fog, mist, and smog affect visibility.
@available(iOS 18.0, macOS 15.0, tvOS 18.0, watchOS 11.0, visionOS 2.0, *)
public var maximumVisibility: Double
This makes it sound like the new items are also a distance and not a percentage.
Why wasn't Measurement used so the unit would be clear? The documentation doesn't explain this either. I'm hoping this isn't being returned in the unit of the current locale, because my app lets you specify which unit to use for temperature, length, etc. regardless of locale. Since all of the temperature, length, etc. data returned so far has used Measurement, that was possible.
The iOS weather app refers to the lowest/highest visibility in my preferred unit, which is miles.
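For what it's worth, here's a minimal sketch of why the Measurement-based properties work so well for that. preferredUnit is just a stand-in for my app's length setting, not a WeatherKit API:
import Foundation

// Sketch only: with Measurement<UnitLength>, converting to the user's chosen unit
// is explicit and independent of locale. `preferredUnit` stands in for my app's
// length-unit setting.
func visibilityText(_ visibility: Measurement<UnitLength>,
                    preferredUnit: UnitLength) -> String {
    let converted = visibility.converted(to: preferredUnit)
    let formatter = MeasurementFormatter()
    formatter.unitOptions = .providedUnit   // format in the unit we converted to, not the locale's
    return formatter.string(from: converted)
}

// With the new Double there's no unit attached, so there's nothing to convert from:
// let maxVisibility: Double = dayWeather.maximumVisibility   // meters? miles? unclear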
I just noticed something really odd with WeatherKit.
If the temperature at midnight of the FOLLOWING day is colder than every temperature in the day you want weather for, WeatherKit will report it as the low temperature for that day, even though it belongs to a different day.
Here it's reporting that the lowest temp for Feb 6 is 22°F, but that's a temp from Feb 7. I'm displaying lowTemperature and lowTemperatureTime from DayWeather, as well as the HourWeather data, in these examples.
I wasn't sure if this is working as designed or a bug. I can provide raw data from this example; the formatted output is below, with a sketch of my hourly cross-check after it.
----daily formatted start
weather for Feb 6
High 41°
high at Feb 6 at 2 PM
Low 22°
low at Feb 7 at 12 AM
----daily formatted end
Feb 6 at 12 AM 34°
Feb 6 at 1 AM 35°
Feb 6 at 2 AM 36°
Feb 6 at 3 AM 36°
Feb 6 at 4 AM 36°
Feb 6 at 5 AM 34°
Feb 6 at 6 AM 33°
Feb 6 at 7 AM 33°
Feb 6 at 8 AM 33°
Feb 6 at 9 AM 33°
Feb 6 at 10 AM 35°
Feb 6 at 11 AM 36°
Feb 6 at 12 PM 38°
Feb 6 at 1 PM 40°
Feb 6 at 2 PM 41°
Feb 6 at 3 PM 40°
Feb 6 at 4 PM 39°
Feb 6 at 5 PM 37°
Feb 6 at 6 PM 36°
Feb 6 at 7 PM 33°
Feb 6 at 8 PM 31°
Feb 6 at 9 PM 29°
Feb 6 at 10 PM 27°
Feb 6 at 11 PM 24°
Feb 7 at 12 AM 22°
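For reference, this is roughly how I'm deriving the hourly low for the calendar day in the output above. It's a sketch only; hours and day stand in for values my app already has:
import WeatherKit
import Foundation

// Sketch of the cross-check: keep only the HourWeather entries that fall on the
// requested calendar day, then take the coldest one. `hours` and `day` are
// placeholders for values my app already has.
func lowForCalendarDay(_ day: Date,
                       hours: [HourWeather],
                       calendar: Calendar = .current) -> Measurement<UnitTemperature>? {
    hours
        .filter { calendar.isDate($0.date, inSameDayAs: day) }
        .map(\.temperature)
        .min { $0.converted(to: .celsius).value < $1.converted(to: .celsius).value }
}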
I added gesture support to my app, which supports iOS 16 and 17, and I've never had issues with it.
However, when I compiled the app with Xcode 16, I immediately noticed a problem when I ran it in the simulator: I couldn't scroll up or down. I figured out it's because of my gesture support.
My gesture support is pretty simple.
let myDragGesture = DragGesture()
    .onChanged { gesture in
        self.offset = gesture.translation
    }
    .onEnded { _ in
        if self.offset.width > threshold {
            // ...some logic
        } else if self.offset.width < -threshold {
            // ...some other logic
        }
        logitUI.debug("drag gesture width was \(self.offset.width)")
        self.offset = .zero
    }
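The gesture is attached to content that lives inside a ScrollView, roughly like this (a simplified, self-contained sketch; the view name, threshold value, and row content are placeholders rather than my real code):
import SwiftUI

// Simplified stand-in for my real view; only the structure matters here.
struct SwipeDemoView: View {
    @State private var offset: CGSize = .zero
    private let threshold: CGFloat = 50

    private var myDragGesture: some Gesture {
        DragGesture()
            .onChanged { gesture in
                offset = gesture.translation
            }
            .onEnded { _ in
                if offset.width > threshold {
                    // ...some logic
                } else if offset.width < -threshold {
                    // ...some other logic
                }
                offset = .zero
            }
    }

    var body: some View {
        ScrollView {
            LazyVStack {
                ForEach(0..<50) { index in
                    Text("Row \(index)")
                }
            }
        }
        // Built with Xcode 16, scrolling no longer works with this attached.
        .gesture(myDragGesture)
    }
}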
If I pass nil to .gesture instead of myDragGesture, scrolling starts working again.
Here’s some example output when I’m trying to scroll down. These messages do NOT appear when I run my app on an iOS 16/17 simulator with Xcode 15.
drag gesture width was 5.333328
drag gesture width was -15.333344
drag gesture width was -3.000000
drag gesture width was -24.333328
drag gesture width was -30.666656
I opened FB14205678 about this.