WEBVTT 00:00:00.000 --> 00:00:03.090 There are thousands of airports connecting cities across countries 00:00:03.090 --> 00:00:08.657 and continents. Yet, with just 3 letters, from 'AAC' and 'BBI' to 'YYZ' and 'ZZU', 00:00:08.657 --> 00:00:10.537 both you and me and our bags 00:00:10.537 --> 00:00:14.006 route around the world as unambiguously as practically possible. 00:00:14.006 --> 00:00:15.153 Airport Codes! 00:00:15.153 --> 00:00:18.256 If you fly, you know them as part of the planning on your tickets, 00:00:18.256 --> 00:00:22.077 trackers, and tags, and even as part of the port itself as big big branding. 00:00:22.077 --> 00:00:25.965 It's impossible not to wonder, bored on a long haul with only in-flight 00:00:25.965 --> 00:00:28.582 entertainment, about potential patterns peeking through, 00:00:28.582 --> 00:00:31.926 like all of the Canadian "Y" airports. Why Canada? 00:00:31.926 --> 00:00:35.095 And, why everyone? How do all of these codes code? 00:00:35.095 --> 00:00:39.879 Well, neighbor, to find the answer, we need to divert this flight to 'YUL', 00:00:39.879 --> 00:00:42.697 the Canadian city that's capital of codes, Montreal, 00:00:42.697 --> 00:00:46.617 where is headquartered IATA, the International Air Transport Association. 00:00:46.617 --> 00:00:52.176 It's not a governmental organization, more an independent aviation agency 00:00:52.176 --> 00:00:55.722 for airlines, where they work to make airports and airplanes increasingly 00:00:55.722 --> 00:00:59.018 interoperable using humanity's most exciting and powerful, yet 00:00:59.018 --> 00:01:01.542 oft-maligned-as-dull tool, Standards! 00:01:01.542 --> 00:01:06.120 One of which is the IATA Airport Code - three letters to identify every airport 00:01:06.120 --> 00:01:09.426 in the world, from the most connected to the least. 
00:01:09.426 --> 00:01:13.802 All are coded so companies can communicate clearly and concisely complicated 00:01:13.802 --> 00:01:16.577 connections to carry their customers and their bags. 00:01:16.577 --> 00:01:20.822 And, actually, the code IATA created isn't only for airports, rather, technically 00:01:20.822 --> 00:01:24.959 it's a location code for all kinds of transportation interchanges, 00:01:24.959 --> 00:01:27.774 like plane stations that connect to train stations such as 00:01:27.774 --> 00:01:31.415 Amsterdam Schiphol, which is just so intermodally epic! 00:01:31.415 --> 00:01:35.120 Okay, let's try not to get distracted by efficient infrastructure (easier said 00:01:35.120 --> 00:01:37.787 than done). Here's how the IATA code is supposed to work. 00:01:37.787 --> 00:01:42.707 One airport, One code, which is unique because airport names are not. 00:01:42.707 --> 00:01:47.319 Booking passage to Portland? Cool, that could be Oregon or Maine or Victoria. 00:01:47.319 --> 00:01:48.305 (Australia.) 00:01:48.305 --> 00:01:52.359 Ambiguity is the enemy. International flying creates communication connections 00:01:52.359 --> 00:01:56.486 between every language on Earth. So, the IATA code helps when you don't speak 00:01:56.486 --> 00:01:59.189 Greenlandic or Odia but still need to book a flight to 00:01:59.189 --> 00:02:03.229 "Kangerlussuaq" via "Bhubaneswar" 00:02:03.229 --> 00:02:05.038 (I'm so sorry Greenland and India.) 00:02:05.038 --> 00:02:09.491 Instead of mangling pronunciation, it's just "SFJ via BBI." Much clearer! Not just 00:02:09.491 --> 00:02:12.798 for you, but also for the ground crew getting the bags through. 00:02:12.798 --> 00:02:16.810 Ideally, the IATA code comes from the first three letters of the location, like 00:02:16.810 --> 00:02:21.340 with Gibraltar where Gibraltar Airport is given "GIB." GIB, Gibraltar. 00:02:21.340 --> 00:02:24.700 So, going to Cork, it'll be "COR" - COR, Ireland. 
00:02:24.700 --> 00:02:28.358 Oh, that didn't work. Seems Cordoba, Argentina built their 00:02:28.358 --> 00:02:32.353 airport first and got "COR" ahead of Cork so, uh, "ORK" for Cork! 00:02:32.353 --> 00:02:36.129 Tough noogies, Ork, Germany, that's an adorable town name you've got there, but 00:02:36.129 --> 00:02:39.902 you're going to need to pick something else for your code. Thus, a single code 00:02:39.902 --> 00:02:43.779 collision kicks off a consistency cascade as airports compete for clearer codes. 00:02:43.779 --> 00:02:47.476 So, if your local airport has an odd 3 letters, there's probably a rival 00:02:47.476 --> 00:02:51.721 port that picked previously. This is one of the major things IATA does - coordinate 00:02:51.721 --> 00:02:55.926 everyone's code preferences, which means dealing with not just individual airports, 00:02:55.926 --> 00:02:59.937 but all of the aviation agencies in different countries, some with their own 00:02:59.937 --> 00:03:04.433 desires for inter-country code consistency, such as Canada, who clearly 00:03:04.433 --> 00:03:08.144 claimed all of the 'Y's. Thus, picking a 'Y' one at random, at least you know 00:03:08.144 --> 00:03:10.720 roughly where you're going to go. 00:03:10.720 --> 00:03:14.402 Oops! No! That didn't work! YKM brought us to Washington, USA, 00:03:14.402 --> 00:03:18.770 and since we're here, we might as well talk about the FAA. In America, the 00:03:18.770 --> 00:03:21.636 Federal Aviation Administration, daughter of the Department 00:03:21.636 --> 00:03:25.401 of Transportation, is given the job of assigning all American airports an 00:03:25.401 --> 00:03:30.902 American airport code. 
Yes, the FAA actually has her own set of three letter 00:03:30.902 --> 00:03:34.334 codes, but we're not going to talk about it because it means in America 00:03:34.334 --> 00:03:37.962 there's "One airport, Two codes," and, for simplicity, I'm sticking to this 00:03:37.962 --> 00:03:43.141 story, "One airport, One code," right? Right. Now, FAA has letters 00:03:43.141 --> 00:03:46.758 she'd really rather American airports noooooooot. 00:03:46.762 --> 00:03:51.979 Please, no: N, Q, W, K, Z, or Y. 'N' is reserved for the Navy for, 00:03:51.979 --> 00:03:55.790 "OMG, is it for aircraft carriers?" No, they use an unrelated and additional 00:03:55.790 --> 00:03:59.910 system that we're also not going to talk about. The Navy 'N' is given to Navy bases 00:03:59.910 --> 00:04:03.946 with airports. So, American airports like Nashville that seem like they should start 00:04:03.946 --> 00:04:07.466 with the letter "N" were encouraged to pick something else, like 'B', 00:04:07.466 --> 00:04:11.924 for B-nashville. There is also 'A' for the Army and the Air Force, although not 00:04:11.924 --> 00:04:14.537 all of the 'A's, so there's a bunch of 'A' airports like 00:04:14.537 --> 00:04:17.292 Albuquerque, Aberdeen, Anchorage, Amarillo, and Augusta. 00:04:17.292 --> 00:04:20.698 Next, 'Q' FAA wants avoided because of (checks notes) 00:04:20.698 --> 00:04:24.987 Morse code? Wow, really? There's a set of three-letter 00:04:24.987 --> 00:04:29.431 international Morse codes that begin with 'Q' for 'quickie communications' that are 00:04:29.431 --> 00:04:35.355 still used, I guess? So because of 1800s telegraph slang, American airports 00:04:35.355 --> 00:04:40.065 shouldn't start with the letter 'Q'. 
Next, 'K' and 'W' the FAA advises against 00:04:40.065 --> 00:04:43.904 because FCC, the Federal Communications Commission, daughter of no one, she's an 00:04:43.904 --> 00:04:48.616 independent agency, assigns 'K' and 'W' for US civilian broadcast stations, so 00:04:48.616 --> 00:04:52.509 that thing on the radio they say, "KMAD Action News!" or 00:04:52.509 --> 00:04:58.251 "WDUL Public Airwaves," yeah, they all start with a 'K' or a 'W', which is 00:04:58.251 --> 00:05:04.256 actually location information. 'K's are in the west and 'W's in the east, except 00:05:04.256 --> 00:05:08.296 for the middle where it's both? FCC, why did you do it this way? 00:05:08.296 --> 00:05:12.826 Well, since you coded those codes first, FAA discourages airports from starting 00:05:12.826 --> 00:05:17.644 with those letters. Even though broadcast codes are 4 letters, not 3, and they're, 00:05:17.644 --> 00:05:21.959 you know, radio stations, not airports, and definitely not weather stations. 00:05:21.959 --> 00:05:25.371 "Of course they're not weather stations." "Why would you even say that?" 00:05:25.371 --> 00:05:28.084 No reason. It won't come up later. Don't worry. Moving on. 00:05:28.084 --> 00:05:31.394 'Z' is reserved for air route traffic control centers themselves, and 00:05:31.394 --> 00:05:33.337 why no 'Y'? Because Canada, of course! 00:05:33.337 --> 00:05:36.576 Yes, I understand that's not an explanation. We'll get to it later. 00:05:36.576 --> 00:05:40.621 That's America's preferences for airport codes, but other countries exist, and 00:05:40.621 --> 00:05:44.712 their aviation agencies don't care at all which letters the United States avoids, 00:05:44.712 --> 00:05:48.111 so while 'B-nashville' was building her big big branding, 00:05:48.111 --> 00:05:51.295 Nassau grabbed the 'N' to get 'NAS' for the Bahamian capital. 
00:05:51.295 --> 00:05:55.238 There's no shortage of airport codes outside the US that start with America's 00:05:55.238 --> 00:06:00.104 reluctant letters, and also, because FAA's precedents aren't laws, you can 00:06:00.104 --> 00:06:04.177 find American exceptions like 'KEK', 'WAH', 'YAK', 'QWG', 'ZZV', and 'NEW.' 00:06:04.177 --> 00:06:07.810 Boy, that was fun to say! Let's end the video with more of that, shall we? 00:06:07.810 --> 00:06:11.739 And that 'NEW' must particularly burn Newark, New Jersey, who had to go with 00:06:11.739 --> 00:06:15.458 'Ewark, Ew Jersey' instead. Right, finishing this thought, every country 00:06:15.458 --> 00:06:19.054 and their agencies have their own wacky preferences for letters and want to 00:06:19.054 --> 00:06:23.105 ignore every other country's preferences, and IATA's job is to coordinate 00:06:23.105 --> 00:06:28.164 between them. The result of which is: IATA airport codes have no satisfying 00:06:28.164 --> 00:06:33.558 system at all, which is so sad for a standard, and the story of 00:06:33.558 --> 00:06:38.335 "One airport, One code" also falls apart even within IATA because of 00:06:38.335 --> 00:06:42.337 megacodes for megacities. Example: London, which has 6 00:06:42.337 --> 00:06:45.926 international airports - Heathrow, Gatwick, City, Luton, Stansted, Southend. 00:06:45.926 --> 00:06:49.405 'LHR', 'LGW', 'LCY', 'LTN'. "Oh, they all start with L?" 00:06:49.405 --> 00:06:54.585 No, 'STN' and 'SEN', but there's a megacode for them all - 'LON', 00:06:54.585 --> 00:06:58.676 which you can use while searching for flights landing in London but don't care 00:06:58.676 --> 00:07:03.578 where, even though these airports are ages apart. LON is the international city 00:07:03.578 --> 00:07:07.910 mega-est megacode, but there's also Moscow 'MOW' and Stockholm 'STO' with 00:07:07.910 --> 00:07:12.170 4 airports each, and more with 2 or 3, like 'NYC' and 'BUE.' 
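In practice a megacode works like an alias that fans out to its member airports before a flight search runs. A minimal sketch of that lookup - the table below is an illustrative subset written for this example, not IATA's official dataset:

```python
# Hypothetical megacode table: metropolitan-area code -> member airports.
# (Illustrative subset only; the real IATA directory is far larger.)
MEGACODES = {
    "LON": ["LHR", "LGW", "LCY", "LTN", "STN", "SEN"],
    "NYC": ["JFK", "LGA", "EWR"],
    "MOW": ["SVO", "DME", "VKO", "ZIA"],
}

def expand(code: str) -> list[str]:
    """Expand a megacode into member airports; a plain code maps to itself."""
    return MEGACODES.get(code, [code])

print(expand("LON"))  # all six London airports
print(expand("GIB"))  # ['GIB'] - not a megacode, passes through unchanged
```

A search engine can then query every expanded code and merge the results, which is why booking "to LON" works even though no runway is actually coded LON.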
00:07:12.170 --> 00:07:15.523 And then, code-wise, is the most exceptional airport: 00:07:15.523 --> 00:07:19.803 EuroAirport Basel Mulhouse Freiburg, an airport so nice they coded it thrice. 00:07:19.803 --> 00:07:24.334 'MLH', 'BSL', 'EAP.' How this happened is France and Switzerland both wanted an 00:07:24.334 --> 00:07:27.686 airport here-ish near the German border and teamed up. 00:07:27.686 --> 00:07:31.493 France provided the land, Switzerland the capital, Germany has nothing to do with 00:07:31.493 --> 00:07:34.929 this, and the pair co-built the port, constructing duplicate and separate 00:07:34.929 --> 00:07:39.329 everythings. So it was, effectively, 2 airports run by 2 countries with 00:07:39.329 --> 00:07:43.861 2 runways and 2 sets of rules, and thus needed 2 airport codes, depending on 00:07:43.861 --> 00:07:47.999 which side passengers could connect through and one megacode if it 00:07:47.999 --> 00:07:51.752 didn't matter. But, all of this doesn't mega-matter now because the two airports 00:07:51.752 --> 00:07:56.572 mostly act as one anyway. Thus, one airport, three codes. And there are plans 00:07:56.572 --> 00:08:00.102 to run a railway through for epic inter-modalness, so it could become 00:08:00.102 --> 00:08:03.986 one airport, four codes, or five codes. I mean, why not at this point? 00:08:03.986 --> 00:08:09.000 So yea, an airport isn't uniquely identified by one code and there's no 00:08:09.000 --> 00:08:13.631 location information coded in this location code, not even a 00:08:13.631 --> 00:08:17.603 checksum letter? What is this, a social security number? Without a checksum, 00:08:17.603 --> 00:08:22.219 if you're planning a flight to 'CGP' in Bangladesh but typo the incorrect 00:08:22.219 --> 00:08:27.428 'CPG', you'll end up in Argentina instead. Again. But, at least the chance of a 00:08:27.428 --> 00:08:31.915 switcheroo like that must be pretty small. 
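For contrast with the checksum-free IATA code, here is what a check letter could look like: a positionally weighted sum mod 26, a toy scheme invented purely for illustration (IATA uses nothing of the sort). Because each position gets a different weight, swapping two adjacent letters changes the sum, so a CGP/CPG switcheroo would be caught:

```python
def check_letter(code: str) -> str:
    """Hypothetical check letter: weight each letter by its position (1, 2, 3)
    so that swapping adjacent letters changes the weighted sum mod 26."""
    total = sum((i + 1) * (ord(c) - ord('A')) for i, c in enumerate(code))
    return chr(ord('A') + total % 26)

# The transposed codes get different check letters, exposing the typo:
print(check_letter("CGP"))
print(check_letter("CPG"))
```

With a scheme like this, a ticket printed "CGPH" typed as "CPGH" would fail validation instead of silently routing a bag to Argentina.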
After all, a 3 letter code means 00:08:31.915 --> 00:08:37.665 17,576 permutations - way more than the actual number of airports, which is only 00:08:37.665 --> 00:08:45.180 40,000 airports worldwide?! How can that possibly be true? Well, it's time to 00:08:45.180 --> 00:08:50.050 introduce you to ICAO, the International Civil Aviation Organization, daughter of 00:08:50.050 --> 00:08:54.354 the United Nations, who also lives in Montreal with IATA. And, it might seem 00:08:54.354 --> 00:08:58.195 like they're the same, but IATA actually only covers what we might call the 00:08:58.195 --> 00:09:01.944 standard commercial airports you'd find searching for flights normally. 00:09:01.944 --> 00:09:06.056 While ICAO covers what she calls "aerodromes," which is everything from 00:09:06.056 --> 00:09:09.950 the world's busiest passenger airport in the always unlikely-seeming 00:09:09.950 --> 00:09:14.097 Atlanta, Georgia, down to rarely used runways on ranches in Texas, of which 00:09:14.097 --> 00:09:18.859 there are an absolutely absurd number. So with all those aerodromes to account 00:09:18.859 --> 00:09:22.692 for, ICAO uses 4 letters, which gives, wow, a lot more options 00:09:22.692 --> 00:09:27.190 (thanks exponentials), and she also uses the extra space to add location 00:09:27.190 --> 00:09:32.012 information. Finally! In the ICAO system, the first letter of an airport code is 00:09:32.012 --> 00:09:36.489 roughly where on Earth it is. 'P' is for airports in the Pacific, one letter to 00:09:36.489 --> 00:09:40.298 cover flying over the most terrifyingly empty half of the Earth (try not to think 00:09:40.298 --> 00:09:44.207 about it as you look down into the endless abyss) before arriving at South America, 00:09:44.207 --> 00:09:49.911 'S'. Then 'M' for Middle America, and 'K' for 'Kontinental America'? 00:09:49.911 --> 00:09:55.260 'C', sensibly, is Canadian America, and flying over the pole we get to 'U' for 00:09:55.260 --> 00:09:59.557 'Used to be USSR.' 
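The "thanks exponentials" arithmetic is easy to check: each extra letter multiplies the code space by 26, which is exactly why three letters run short of 40,000 aerodromes and four letters do not.

```python
LETTERS = 26

# Each position can be any of 26 letters, so the space grows exponentially.
iata_space = LETTERS ** 3  # three-letter IATA-style codes
icao_space = LETTERS ** 4  # four-letter ICAO-style codes

print(iata_space)  # 17576  - fewer than ~40,000 aerodromes worldwide
print(icao_space)  # 456976 - room for every ranch runway, with plenty spare
```

One added letter turns a shortage into more than a tenfold surplus.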
Yes, that's actually the name. Look, what makes standards 00:09:59.557 --> 00:10:03.357 standards is their stubbornness. Just because a gigantic country collapsed 00:10:03.357 --> 00:10:07.525 is no reason to change what millions of flight computers know in their code and 00:10:07.525 --> 00:10:11.641 pilots in their brains. After ICAO's first letter there are also a bunch of second 00:10:11.641 --> 00:10:15.614 sub-letters, well except for America and Canada, who skip that, but don't worry. 00:10:15.614 --> 00:10:18.626 Moving on, as an example, if your airport starts with an 'F', 00:10:18.626 --> 00:10:23.530 it's in 'Southern aFrica,' and if the next letter is 'A,' that's South Africa, and 00:10:23.530 --> 00:10:27.905 the last 2 letters are for the airport, so Cape Town gets CT for a 'FACT'. 00:10:27.905 --> 00:10:32.277 Of course, there are some exceptions like Antarctica, the continent no one owns but 00:10:32.277 --> 00:10:36.075 all of the cool kids want to claim. Aerodromes here are supposed to use the 00:10:36.075 --> 00:10:40.201 code for the country's claim they're in, such as Williams Field, which is American 00:10:40.201 --> 00:10:45.222 run but uses 'NZWD' because it's in the Kiwi claim. But, also lots of 00:10:45.222 --> 00:10:49.068 Antarctic aerodromes use pseudo codes (no we're not going to talk about what 00:10:49.068 --> 00:10:53.519 that means) which start with 'AT' and end with a number, like 27 for Troll Airfield 00:10:53.519 --> 00:10:58.498 serving Troll Research Station which runs on Troll Time. Norway, is that you? I knew 00:10:58.498 --> 00:11:03.461 it was, but you really should be using 'EN' for 'Europe, Norway' and 'TR' is 00:11:03.461 --> 00:11:08.046 free. 'ENTR Trolls', it's so perfect! And yes, the 27 means there are at least 00:11:08.046 --> 00:11:11.998 26 other runways in Antarctica (I was surprised too) but this, along with all 00:11:11.998 --> 00:11:15.716 of the ranches, is how you get to crazy numbers of aerodromes. 
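The region-then-country-then-airport layout described above can be sketched as a toy decoder. The prefix tables here are a tiny hand-picked subset written for this example (the real ICAO allocation has many more prefixes and plenty of exceptions), but the split logic - including the US and Canada skipping the country sub-letter - follows the transcript:

```python
# Illustrative subset of ICAO first-letter regions and two-letter country prefixes.
REGIONS = {
    "F": "Southern Africa",
    "K": "Contiguous United States",
    "C": "Canada",
    "E": "Northern Europe",
}
COUNTRIES = {"FA": "South Africa", "EN": "Norway"}

def decode(icao: str) -> dict:
    """Split a 4-letter ICAO code into its structural parts.
    US ('K') and Canadian ('C') codes have no country sub-letter,
    so their remaining three letters all name the airport."""
    if icao[0] in ("K", "C"):
        return {"region": REGIONS[icao[0]], "airport": icao[1:]}
    return {
        "region": REGIONS.get(icao[0], "unknown"),
        "country": COUNTRIES.get(icao[:2], "unknown"),
        "airport": icao[2:],
    }

print(decode("FACT"))  # Cape Town: Southern Africa / South Africa / CT
print(decode("KJFK"))  # US code: region plus a three-letter airport part
print(decode("ENTR"))  # the 'ENTR Trolls' code the video wishes Norway used
```
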
And yes, ICAO 00:11:15.716 --> 00:11:20.457 has more exceptions to this system that we're going to skip, but I can't resist 00:11:20.457 --> 00:11:26.199 just one more which is Region J. Looking at the map, you won't be able to find it 00:11:26.199 --> 00:11:32.155 because 'J' is Mars. When the rover arrived at Jezero Crater, ICAO gave the 00:11:32.155 --> 00:11:35.880 historic landing location the code 'JZRO'. 00:11:35.880 --> 00:11:39.842 Okay, but that's it for exceptions. So, to sum up, the story of one airport, 00:11:39.842 --> 00:11:44.118 one code, was just that - a story. Tons of airports have at least 2, and 00:11:44.118 --> 00:11:48.656 when they do, the ICAO code is what computers and pilots know to plan where 00:11:48.656 --> 00:11:53.274 the plane needs to go, and IATA is what passengers say to get on their way, but if 00:11:53.274 --> 00:11:58.466 ICAO exists with a more comprehensive code, why is IATA at all? 00:11:58.466 --> 00:12:02.882 So IATA isn't about you, it's about your bags. 00:12:02.882 --> 00:12:07.418 At an airport, you, as a human, walk to your connecting flight, but your bags 00:12:07.418 --> 00:12:12.362 below need a lot of logistical assistance. Before IATA, there was just like a 00:12:12.362 --> 00:12:17.430 handwritten tag saying, "Please get me where my owner is going," written in 00:12:17.430 --> 00:12:22.476 potentially every language on Earth, so you can't imagine how often that went 00:12:22.476 --> 00:12:27.837 wrong. So, IATA used codes to make life better for bags with bag tags with big 00:12:27.837 --> 00:12:31.919 clear codes to get those bags cleanly through connections across countries 00:12:31.919 --> 00:12:36.077 and companies. 
And, the original plan was that train stations with IATA codes 00:12:36.077 --> 00:12:39.785 would also let you check in your bag there, and it would be part of the 00:12:39.785 --> 00:12:44.081 automatic connection too, but that mostly doesn't happen now because of logistical 00:12:44.081 --> 00:12:49.489 difficulties, which is the same reason that the IATA code is a club that excludes 00:12:49.489 --> 00:12:54.146 all of the little aerodromes too annoying to attend to. So, if your bag's final 00:12:54.146 --> 00:12:58.202 destination after connecting at Austin is one of the many random ranch airstrips, 00:12:58.202 --> 00:13:03.226 the ground crew is not going to swap your bags onto the tiny crop duster for you. 00:13:03.226 --> 00:13:07.755 Ditto if you're connecting through Argentina to Antarctica anywhere. Those 00:13:07.755 --> 00:13:12.907 tiny airports? No IATA code for you, and without an IATA code, your bag depends 00:13:12.907 --> 00:13:18.375 on you to get it all the way through. And, that's what IATA is actually for. That 00:13:18.375 --> 00:13:24.019 big big branding you see is for your bags, and because of the tag, it became what 00:13:24.019 --> 00:13:28.701 customers know, which brings us back to the start. And, oh sorry Canada, I know 00:13:28.701 --> 00:13:33.411 I've been avoiding answering the whys, but it's just so much more complicated than 00:13:33.411 --> 00:13:37.821 expected. There's a tale that the 'Y's come from an old system marking if Canadian airports 00:13:37.821 --> 00:13:41.203 had a weather station. 'Y' for 'Yes, weather station' and 00:13:41.203 --> 00:13:45.942 'W', 'without.' And, since pilots want to know the weather, that explains all the 00:13:45.942 --> 00:13:52.157 'Y's but also the few oddball Canadian 'W's. But, investigating the truth of that 00:13:52.157 --> 00:13:58.338 story took 8 months of my life which I will now give to you as an extremely 00:13:58.338 --> 00:14:02.461 compressed executive summary. 
Working backwards, the American and 00:14:02.461 --> 00:14:07.285 Canadian IATA codes created in the 1950s come from the last 3 letters of ICAO codes 00:14:07.285 --> 00:14:12.590 created in the 1940s. The first letter of ICAO codes comes from the ITU, the 00:14:12.590 --> 00:14:17.582 International Telecommunication Union's codes created in the 1910s for radio 00:14:17.582 --> 00:14:23.458 stations, which used 'K' for America and 'CY' for Canada. So, 'K' and 'CY' into 00:14:23.458 --> 00:14:28.351 4 letters and back down to 3 leaves 'Y' for Canada. Here is where you 00:14:28.351 --> 00:14:33.258 reasonably ask, "Why CY for Canada?" but that goes all the way back to 00:14:33.258 --> 00:14:38.208 telegraphs and beyond, so it's a story for another time. 00:14:38.208 --> 00:14:43.770 But, for now, for this video, why 'Y' for Canada? Because of radio callsigns 00:14:43.770 --> 00:14:47.540 (because of a lot of other things), and because the US and Canada coordinated, 00:14:47.540 --> 00:14:53.097 for flights within North America, it really would be 'Y' for 'Yes Canada' 00:14:53.097 --> 00:14:53.816 (mostly) 00:14:53.816 --> 00:14:57.662 Well, that was a lot of bureaucratic history, so let's finish with the final 00:14:57.662 --> 00:15:00.948 fun IATA codes promised from before. Starting with Sioux City with the 00:15:00.948 --> 00:15:04.867 sensible-looking 'SUX' until you say it out loud, but, to her credit, she totally owns 00:15:04.867 --> 00:15:08.496 that branding for airport merch. Good for you, Sioux! And there's also 00:15:08.496 --> 00:15:12.340 Beaches International Airport, summer break central, their top 2 picks for codes 00:15:12.340 --> 00:15:16.785 were already picked, so to help the confused collegiates find their connections, the 00:15:16.785 --> 00:15:22.334 agencies agreed on 'ECP' to stand for 'Everyone Can Party' which is awesome 00:15:22.334 --> 00:15:26.299 branding, but you'd never know because Beaches doesn't bother. 
00:15:26.299 --> 00:15:28.872 Geez 'ECP', you can learn a thing or two from Sioux. 00:15:28.872 --> 00:15:32.340 But now everyone can party on this 'round the world flight of IATA codes 00:15:32.340 --> 00:15:34.675 entertaining to say out loud. Ready? 00:15:34.675 --> 00:15:36.304 'FAB', 'BOO', 'EEK', 'COW', 'WOW', 00:15:36.304 --> 00:15:37.591 'POO', 'GAG', 'BRO', 'BUT', 00:15:37.591 --> 00:15:39.227 'GOT', 'HOT', 'PIE', 'YUM', 'UMM', 00:15:39.227 --> 00:15:40.869 'MOM', 'DAD', 'MAD', 'RUN', 'FUN', 00:15:40.869 --> 00:15:42.754 'IOU', 'FAQ', 'OMG', 'LOL'.