An article in the New York Times' travel section breathlessly reported that many flights cancelled because of the recent spate of bad weather in the Northeast "won't be counted as late or cancelled in the government's on-time statistics."
My response: "So what?"
Reading further into the article, the focus turns to a proposed federal rule that would make the counting of delayed flights more accurate. Today's statistics don't include international flights or flights by regional carriers, among others.
Fine; count them. Count every carrier, and count them all the same way. I'm completely in support of that. But that brings me to the crux of my position. What I want to get out of these statistics is a picture of how well a given airline does, absent any outside factors like bad weather.
Why? Stuff happens. Exigent circumstances like bad weather can and do affect everyone, from the best to the worst. So don't lower an airline's on-time rating when Mother Nature, not the airline, was responsible.
A far more meaningful metric would be to chart the percentage of on-time departures by airport. Every experienced flier knows that some airports are worse than others for reasons that vary. Either they're incredibly busy, like Atlanta's Hartsfield-Jackson (ATL), they're in an area prone to bad weather like Chicago's O'Hare (ORD), or some other factor applies. Measuring the airports would quantify what many of us already know through experience: If you're traveling through XYZ airport, anticipate not being on time.
As for the airlines, on-time statistics should count the things over which the airline has control.
Visit my main page at TheTravelPro.us for more news, reviews, and personal observations on the world of upmarket travel.
Follow @TheTravelProUS
Photo by Carl Dombek
Click on photo to view larger image