For direction and distance, it needs GPS data in this case, and GPS accuracy is largely outside Apple's control. The system looks for two points to form a direction vector; if the second point lands within three meters of the first, it effectively only sees one point.

So while it might not work for a single flight, as long as you gather enough data to establish that the user is climbing or descending, you can extrapolate what the step data means. The API uses this to determine whether the user is taking the stairs or riding an elevator (see the first sketch below). You could also take the start and end GPS fixes directly and apply your own analysis of what happened between those points, but the 3 meter accuracy will still be an issue.

The device is capable of building a much more accurate picture of its surroundings, but that isn't really practical for your case: you probably don't want the user to start their exercise with a room scan. It's the old speed/accuracy/convenience trade-off.

I'd try to determine how sensitive it really is by looking at the raw GPS data. You might get by with a trick like having the user press start with the device on the floor, or simply explain that they need to move further, or show them when they hit a new qualifying GPS point (last sketch below).
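If you want to see what the system itself reports, CMPedometer exposes floor counts alongside steps. A minimal sketch, assuming a device that supports floor counting; the floor properties come back nil otherwise:

```swift
import CoreMotion

let pedometer = CMPedometer()

func startFloorTracking() {
    // Floor counting is only available on hardware that supports it.
    guard CMPedometer.isFloorCountingAvailable() else { return }
    pedometer.startUpdates(from: Date()) { data, _ in
        guard let data = data else { return }
        // floorsAscended/floorsDescended only increment when the system
        // concludes the user actually walked the stairs, which is how
        // elevator rides get filtered out.
        print("steps: \(data.numberOfSteps), up: \(data.floorsAscended ?? 0), down: \(data.floorsDescended ?? 0)")
    }
}
```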
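If you'd rather establish climbing or descending yourself without leaning on GPS at all, the barometric altimeter is one route. A rough sketch; the 2.5 m per-storey cutoff is my assumption and would need tuning:

```swift
import CoreMotion

final class ClimbDetector {
    private let altimeter = CMAltimeter()

    func start() {
        guard CMAltimeter.isRelativeAltitudeAvailable() else { return }
        altimeter.startRelativeAltitudeUpdates(to: .main) { data, _ in
            guard let data = data else { return }
            // Meters of altitude change since start(), read from the
            // barometer rather than GPS, so indoor noise is much lower.
            let delta = data.relativeAltitude.doubleValue
            if abs(delta) > 2.5 { // hypothetical one-storey threshold
                print(delta > 0 ? "climbing +\(delta) m" : "descending \(delta) m")
            }
        }
    }

    func stop() { altimeter.stopRelativeAltitudeUpdates() }
}
```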
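And for gauging how sensitive the GPS data really is, something along these lines logs the raw accuracy figures and flags when a fix finally clears the separation threshold, which you could surface to the user as a "new point qualified" indicator. The 3 m constant just mirrors the spacing described above:

```swift
import CoreLocation

final class RawFixLogger: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private var lastQualifiedFix: CLLocation?
    // Mirrors the ~3 m spacing needed before a second point counts.
    private let minimumSeparation: CLLocationDistance = 3.0

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        for fix in locations {
            // Raw accuracy in meters; negative values mean an invalid reading.
            print("h ±\(fix.horizontalAccuracy) m, v ±\(fix.verticalAccuracy) m, alt \(fix.altitude) m")
            guard fix.horizontalAccuracy >= 0 else { continue }
            if let last = lastQualifiedFix, fix.distance(from: last) < minimumSeparation {
                continue // still within ~3 m of the last point: no new vector yet
            }
            lastQualifiedFix = fix
            // A new qualifying point; show this to the user so they know
            // when they've moved far enough.
        }
    }
}
```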