There are thousands of airports
connecting cities across countries
and continents. Yet, with just 3 letters,
from 'AAC' and 'BBI' to 'YYZ' and 'ZZU',
you and I and our bags
route around the world
as unambiguously as practically possible.
Airport Codes!
If you fly, you know them as part of
the planning on your tickets,
trackers, and tags, and even as part of
the port itself as big big branding.
It's impossible not to wonder, bored on a
long haul with only in-flight
entertainment, about potential patterns
peeking through,
like all of the Canadian "Y" airports.
Why Canada?
And, why everyone? How do all of these
codes code?
Well, neighbor, to find the answer, we
need to divert this flight to 'YUL',
the Canadian city that's the capital of codes,
Montreal,
where IATA, the International Air
Transport Association, is headquartered.
It's not a governmental organization,
more an independent aviation agency
for airlines, where they work to make
airports and airplanes increasingly
interoperable using humanity's
most exciting and powerful, yet
oft-maligned-as-dull, tool: Standards!
One of which is the IATA Airport Code -
three letters to identify every airport
in the world, from the most connected
to the least.
All are coded so companies can communicate
clearly and concisely complicated
connections to carry their customers and
their bags.
And, actually, the code IATA created isn't
only for airports, rather, technically
it's a location code for all kinds of
transportation interchanges,
like plane stations that connect to train
stations such as
Amsterdam Schiphol, which is just so
intermodally epic!
Okay, let's try not to get distracted by
efficient infrastructure (easier said
than done). Here's how the IATA code
is supposed to work.
One airport, One code, which is unique
because airport names are not.
Booking passage to Portland? Cool, that
could be Oregon or Maine or Victoria.
(Australia.)
Ambiguity is the enemy. International
flying creates communication connections
between every language on Earth. So, the
IATA code helps when you don't speak
Greenlandic or Odia but still need to
book a flight to
"Kangerlussuaq" via
"Bhubaneswar"
(I'm so sorry Greenland and India.)
Instead of mangling pronunciation, it's
just "SFJ via BBI." Much clearer! Not just
for you, but also for the ground crew
getting the bags through.
Ideally, the IATA code comes from the
first three letters of the location, like
with Gibraltar where Gibraltar Airport
is given "GIB." GIB, Gibraltar.
So, going to Cork, it'll be "COR" -
COR, Ireland.
Oh, that didn't work.
Seems Cordoba, Argentina built their
airport first and got "COR" ahead of Cork
so, uh, "ORK" for Cork!
Tough noogies, Ork, Germany, that's an
adorable town name you've got there, but
you're going to need to pick something
else for your code. Thus, a single code
collision kicks off a consistency cascade
as airports compete for clearer codes.
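To picture that first-come-first-served scramble, here's a toy sketch - not IATA's actual procedure, which involves airport preferences and plenty of negotiation, just the sliding-window idea:

```python
# Toy sketch only: prefer the first three letters of the name, then fall back
# to later three-letter windows if someone already claimed your first choice.
def pick_code(city, taken):
    name = "".join(ch for ch in city.upper() if ch.isalpha())
    for code in (name[i:i + 3] for i in range(len(name) - 2)):
        if code not in taken:
            taken.add(code)
            return code
    raise ValueError("every window is taken - time to get creative")

taken = set()
print(pick_code("Gibraltar", taken))  # GIB - no conflict
print(pick_code("Cordoba", taken))    # COR - Cordoba got there first
print(pick_code("Cork", taken))       # ORK - COR is gone, so slide along
```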
So, if your local airport has an
odd 3 letters, there's probably a rival
port that picked previously. This is one
of the major things IATA does - coordinate
everyone's code preferences, which means
dealing with not just individual airports,
but all of the aviation agencies in
different countries, some with their own
desires for inter-country code
consistency, such as Canada, who clearly
claimed all of the 'Y's. Thus, picking a
'Y' code at random, at least you know
roughly where you're going to go.
Oops! No! That didn't work!
YKM brought us to Washington, USA,
and since we're here, we might as well
talk about the FAA. In America, the
Federal Aviation Administration,
daughter of the Department
of Transportation, is given the job of
assigning all American airports an
American airport code. Yes, the FAA
actually has her own set of three letter
codes, but we're not going to talk about
it because it means in America
there's "One airport, Two codes," and,
for simplicity, I'm sticking to this
story, "One airport, One code,"
right? Right. Now, FAA has letters
she'd really rather American airports
noooooooot use.
Please, no: N, Q, W, K, Z, or Y.
'N' is reserved for the Navy for,
"OMG, is it for aircraft carriers?"
No, they use an unrelated and additional
system that we're also not going to talk
about. The Navy 'N' is given to Navy bases
with airports. So, American airports like
Nashville that seem like they should start
with the letter "N" were encouraged to
pick something else, like 'B',
for B-nashville. There is also 'A' for the
Army and the Air Force, although not
all of the 'A's, so there's a bunch of 'A'
airports like
Albuquerque, Aberdeen, Anchorage,
Amarillo, and Augusta.
Next, 'Q', FAA wants avoided because of
(checks notes)
Morse code? Wow, really?
There's a set of three-letter international
Morse code abbreviations that begin with
'Q', for quick communications, that are
still used, I guess? So because of 1800s
telegraph slang, American airports
shouldn't start with the letter 'Q'.
Next, 'K' and 'W', the FAA advises against
because the FCC, the Federal Communications
Commission, daughter of no one (she's an
independent agency), assigns 'K' and 'W'
to US civilian broadcast stations, so
that thing on the radio they say,
"KMAD Action News!" or
"WDUL Public Airwaves," yea, they all
start with a 'K' or a 'W' which is
actually location information. 'K's are
in the west and 'W's in the east, except
for the middle where it's both?
FCC, why did you do it this way?
Well, since you coded those codes first,
FAA discourages airports from starting
with those letters. Even though broadcast
codes are 4 letters, not 3, and they're,
you know, radio stations, not airports,
and definitely not weather stations.
"Of course they're not weather stations."
"Why would you even say that?"
No reason. It won't come up later.
Don't worry. Moving on.
'Z' is reserved for air route traffic
control centers themselves, and
why no 'Y'? Because Canada, of course!
Yes, I understand that's not an
explanation. We'll get to it later.
That's America's preferences for airport
codes, but other countries exist, and
their aviation agencies don't care at all
which letters the United States avoids,
so while 'B-nashville' was building her
big big branding,
Nassau grabbed the 'N' to get 'NAS'
for the Bahamian capital.
There's no shortage of airport codes
outside the US that start with America's
reluctant letters, and also, because the
FAA's precedents aren't laws, you can
find American exceptions like
'KEK', 'WAH', 'YAK', 'QWG', 'NEW', and 'ZZV.'
Boy, that was fun to say! Let's end the
video with more of that, shall we?
And that 'NEW' must particularly burn
Newark, New Jersey, who had to go with
'Ewark, Ew Jersey' instead. Right,
finishing this thought, every country
and its agencies have their own wacky
preferences for letters and want to
ignore every other country's preferences,
and IATA's job is to coordinate
between them. The result of which is:
IATA airport codes have no satisfying
system at all, which is so sad for a
standard, and the story of
"One airport, One code" also falls
apart even within IATA because of
megacodes for megacities.
Example: London, which has 6
international airports - Heathrow,
Gatwick, City, Luton, Stansted, Southend.
'LHR', 'LGW', 'LCY', 'LTN'.
"Oh, they all start with L?"
No, 'STN' and 'SEN', but there's a
megacode for them all - 'LON',
which you can use while searching for
flights landing in London but don't care
where, even though these airports are
ages apart. LON is the international city
mega-est megacode, but there's also
Moscow 'MOW' and Stockholm 'STO' with
4 airports each, and more with 2 or 3,
like 'NYC' and 'BUE.'
And then, code-wise, is the most
exceptional airport:
EuroAirport Basel Mulhouse Freiburg, an
airport so nice they coded it thrice.
'MLH', 'BSL', 'EAP.' How this happened is
France and Switzerland both wanted an
airport here-ish near the German border
and teamed up.
France provided the land, Switzerland the
capital, Germany has nothing to do with
this, and the pair co-built the port,
constructing duplicate and separate
everythings. So it was, effectively, 2
airports run by 2 countries with
2 runways and 2 sets of rules, and thus
needed 2 airport codes, depending on
which side passengers could connect
through and one megacode if it
didn't matter. But, all of this doesn't
mega-matter now because the two airports
mostly act as one anyway. Thus, one
airport, three codes. And there are plans
to run a railway through for epic
inter-modalness, so it could become
one airport, four codes, or five codes.
I mean, why not at this point?
So yeah, an airport isn't uniquely
identified by one code and there's no
location information coded in this
location code, not even a
checksum letter? What is this, a social
security number? Without a checksum,
if you're planning a flight to 'CGP'
in Bangladesh but typo it as
'CPG', you'll end up in Argentina instead.
Again. But, at least the chance of a
switcheroo like that must be pretty small.
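Back to that missing checksum for a second: here's the kind of transposition even a bare-bones check letter would catch - a made-up scheme, purely illustrative, not part of any real airport-code standard:

```python
# Hypothetical check letter: weight each position so that swapping two letters
# changes the total. Invented for this sketch; real IATA codes have nothing like it.
def check_letter(code):
    total = sum((i + 1) * (ord(ch) - ord("A")) for i, ch in enumerate(code))
    return chr(ord("A") + total % 26)

print("CGP ->", check_letter("CGP"))  # the Bangladesh booking gets one letter...
print("CPG ->", check_letter("CPG"))  # ...the Argentine typo gets a different one.
```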
After all, a 3 letter code means
17,576 possible combinations - way more than
the actual number of airports, which is only...
40,000 airports worldwide?! How can that
possibly be true? Well, it's time to
introduce you to ICAO, the International
Civil Aviation Organization, daughter of
the United Nations, who also lives in
Montreal with IATA. And, it might seem
like they're the same, but IATA actually
only covers what we might call the
standard commercial airports you'd find
searching for flights normally.
While ICAO covers what she calls
"aerodromes," which is everything from
the world's busiest passenger airport in
the always unlikely seeming
Atlanta, Georgia, down to rarely used
runways on ranches in Texas, of which
there are an absolutely absurd number.
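A quick back-of-the-envelope, just the arithmetic, on why three letters were never going to stretch that far:

```python
# 26 letters per position, so the code space grows exponentially with length.
print(26 ** 3)  # 17,576 three-letter codes - short of ~40,000 aerodromes
print(26 ** 4)  # 456,976 four-letter codes - plenty of headroom
```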
So with all those aerodromes to account
for, ICAO uses 4 letters, which gives,
wow, a lot more options
(thanks exponentials), and she also uses
the extra space to add location
information. Finally! In the ICAO system,
the first letter of an airport code is
roughly where on Earth it is. 'P' is for
airports in the Pacific, one letter to
cover flying over the most terrifyingly
empty half of the Earth (try not to think
about it as you look down into the endless
abyss) before arriving at South America,
'S'. Then 'M' for Middle America, and 'K'
for 'Kontinental America'?
'C', sensibly, is Canadian America, and
flying over the pole we get to 'U' for
'Used to be USSR.' Yes, that's actually
the name. Look, what makes standards
standards is their stubbornness.
Just because a gigantic country collapsed
is no reason to change what millions of
flight computers know in their code and
pilots in their brains. After ICAO's first
letter there are also a bunch of second
sub-letters, well except for America and
Canada, who skip that, but don't worry.
Moving on, as an example, if your airport
starts with an 'F',
it's in 'Southern aFrica,' and if the next
letter is 'A,' that's South Africa, and
the last 2 letters are for the airport,
so Cape Town gets CT for a 'FACT'.
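If you like seeing that decoding spelled out, here's a minimal sketch - the prefix tables hold only the handful of examples from this video, nowhere near the real ICAO registry, and remember that the US and Canada skip the country letter:

```python
# Illustrative-only prefix tables built from the examples above.
REGION = {
    "P": "Pacific", "S": "South America", "M": "Middle America",
    "K": "Continental United States", "C": "Canada",
    "U": "former USSR", "F": "Southern Africa",
}
COUNTRY = {"FA": "South Africa", "EN": "Norway"}  # second letter narrows it down

def describe(icao):
    country = COUNTRY.get(icao[:2])
    if country:  # region letter + country letter + two-letter airport part
        return f"{icao}: '{icao[2:]}' in {country}"
    region = REGION.get(icao[0], "somewhere this sketch doesn't cover")
    return f"{icao}: '{icao[1:]}' in {region}"  # US/Canada skip the country letter

print(describe("FACT"))  # Cape Town: 'CT' in South Africa
print(describe("CYUL"))  # Montreal: 'YUL' in Canada
```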
Of course, there are some exceptions like
Antarctica, the continent no one owns but
all of the cool kids want to claim.
Aerodromes here are supposed to use the
code of whichever country's claim they're in,
such as Williams Field, which is American
run but uses 'NZWD' because it's in the
Kiwi claim. But, also lots of
Antarctic aerodromes use pseudo codes
(no we're not going to talk about what
that means) which start with 'AT' and end
with a number, like 27 for Troll Airfield
serving Troll Research Station which runs
on Troll Time. Norway, is that you? I knew
it was, but you really should be using
'EN' for 'Europe, Norway' and 'TR' is
free. 'ENTR Trolls', it's so perfect! And
yes, the 27 means there are at least
26 other runways in Antarctica (I was
surprised too) but this, along with all
of the ranches, is how you get to crazy
numbers of aerodromes. And yes, ICAO
has more exceptions to this system that
we're going to skip, but I can't resist
just one more which is Region J. Looking
at the map, you won't be able to find it
because 'J' is Mars. When the rover
arrived at Jezero Crater, ICAO gave the
historic landing location the code 'JZRO'.
Okay, but that's it for exceptions. So,
to sum up, the story of one airport,
one code, was just that - a story.
Tons of airports have at least 2, and
when they do, the ICAO code is what
computers and pilots know to plan where
the plane needs to go, and IATA is what
passengers say to get on their way, but if
ICAO exists with a more comprehensive
code, why does IATA exist at all?
So IATA isn't about you, it's about
your bags.
At an airport, you, as a human, walk to
your connecting flight, but your bags
below need a lot of logistical assistance.
Before IATA, there was just like a
handwritten tag saying, "Please get me
where my owner is going," written in
potentially every language on Earth, so
you can't imagine how often that went
wrong. So, IATA used codes to make life
better for bags with bag tags with big
clear codes to get those bags cleanly
through connections across countries
and companies. And, the original plan
was that train stations with IATA codes
would also let you check in your bag
there, and it would be part of the
automatic connection too, but that mostly
doesn't happen now because of logistical
difficulties, which is the same reason
that the IATA code is a club that excludes
all of the little aerodromes too annoying
to attend to. So, if your bag's final
destination after connecting at Austin is
one of the many random ranch airstrips,
the ground crew is not going to swap your
bags onto the tiny crop duster for you.
Ditto if you're connecting through
Argentina to anywhere in Antarctica. Those
tiny airports? No IATA code for you, and
without an IATA code, your bag depends
on you to get it all the way through. And,
that's what IATA is actually for. That
big big branding you see is for your bags,
and because of the tag, it became what
customers know, which brings us back to
the start. And, oh sorry Canada, I know
I've been avoiding answering the whys, but
it's just so much more complicated than
expected. There's a tale that the 'Y's
are an old system for marking whether Canadian airports
had a weather station. 'Y' for 'Yes,
weather station' and
'W', 'without.' And, since pilots want to
know the weather, that explains all the
'Y's but also the few oddball Canadian
'W's. But, investigating the truth of that
story took 8 months of my life which I
will now give to you as an extremely
compressed executive summary.
Working backwards, the American and
Canadian IATA codes created in the 1950s
come from the last 3 letters of ICAO codes
created in the 1940s. The first letter of
ICAO codes come from the ITU, the
International Telecommunication Union's
codes created in the 1910s for radio
stations, which used 'K' for America and
'CY' for Canada. So, 'K' and 'CY' into
4 letters and back down to 3 leaves
'Y' for Canada. Here is where you
reasonably ask, "Why CY for Canada?"
but that goes all the way back to
telegraphs and beyond, so is a story for
another time.
But, for now, for this video, why 'Y' for
Canada? Because of radio callsigns
(because of a lot of other things),
because the US and Canada coordinated so
that, for flights within North America, it
really would be 'Y' for 'Yes, Canada'
(mostly).
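To make that last-three-letters shortcut concrete, here's a minimal sketch with a few hand-picked examples (the real registries have their exceptions):

```python
# For US ('K...') and Canadian ('C...', really 'CY...') ICAO codes, the IATA
# code is just the ICAO code minus its first letter - hence all the Canadian 'Y's.
def north_american_iata(icao):
    if len(icao) == 4 and icao[0] in "KC":
        return icao[1:]
    raise ValueError("not a 4-letter US/Canadian ICAO code")

for icao in ("KJFK", "KSEA", "CYUL", "CYYZ"):
    print(icao, "->", north_american_iata(icao))  # JFK, SEA, YUL, YYZ
```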
Well, that was a lot of bureaucratic
history, so let's finish with the final
fun IATA codes promised from before.
Starting with Sioux City with the
sensible looking 'SUX' until you say it
out loud, but to her credit, she totally owns
that branding for airport merch. Good for
you, Sioux! And there's also
Beaches International Airport, summer
break central, their top 2 picks for codes
were already taken, so to help the confused
collegiates find their connections, the
agencies agreed on 'ECP' to stand for
'Everyone Can Party' which is awesome
branding, but you'd never know because
Beaches doesn't bother.
Geez, 'ECP', you can learn a thing or two
from Sioux.
But now everyone can party on this 'round
the world flight of IATA codes
entertaining to say out loud. Ready?
'FAB', 'BOO', 'EEK', 'COW', 'WOW',
'POO', 'GAG', 'BRO', 'BUT',
'GOT', 'HOT', 'PIE', 'YUM', 'UMM',
'MOM', 'DAD', 'MAD', 'RUN', 'FUN',
'IOU', 'FAQ', 'OMG', 'LOL'.