You checked in at the airport, received your boarding pass, gathered your belongings, and prepared for your flight. Glancing at your boarding pass, you noticed your seat number, gate number, boarding time, and the departure and arrival airports, each represented by a three-letter code.
Instinctively, you may have understood the meaning of these codes from the context alone. But have you ever wondered about the story behind these codes, how they are decided, and why they exist?
This article will deepen your understanding of airport codes with a brief history of how they came about and an explanation of how they are ultimately assigned.
History of IATA Codes
A system of assigning codes to certain airports began in the United States in the 1930s as a convenient way for pilots to identify airports. Originally, aviators used the two-letter city codes provided by the US National Weather Service.
As air travel boomed and the number of airports increased, a new system for airport identification became necessary. The solution was to add a third letter, greatly increasing the number of possible codes. By the late 1940s, airports had started using these three-letter codes.
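If you’re curious about the math behind that extra letter, here is a small Python sketch (purely illustrative, not part of any IATA system) comparing how many unique codes a two-letter and a three-letter scheme can produce:

```python
# Count the unique identifiers possible with two-letter and
# three-letter codes drawn from the 26 letters A-Z.
ALPHABET_SIZE = 26

two_letter_codes = ALPHABET_SIZE ** 2    # 26 x 26 = 676
three_letter_codes = ALPHABET_SIZE ** 3  # 26 x 26 x 26 = 17,576

print(f"Two-letter codes:   {two_letter_codes:,}")
print(f"Three-letter codes: {three_letter_codes:,}")
```

Jumping from 676 possible two-letter codes to 17,576 three-letter codes gave the rapidly growing network of airports plenty of headroom.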
It wasn’t until the 1960s that the International Air Transport Association (IATA), the trade association of the world’s airlines, stepped in and standardized the airport identifiers we know today as IATA codes.
IATA codes, which always consist of three letters, should not be confused with International Civil Aviation Organization (ICAO) codes, which use four letters and are most commonly used by pilots and aviation professionals. (We’ll cover ICAO codes in a separate article in the future, so stay tuned.)
Since one of the primary uses of IATA codes is to ensure passenger luggage is correctly routed to its destination, these codes are assigned not only to airports but also to ferry, train, bus, and helicopter terminals where baggage is commonly transferred from airlines.
How IATA Codes Are Determined
Most commonly, airports request three letters derived from the location name for familiarity. For example, in the Philippines, we have Manila (MNL), Cebu (CEB), and Davao (DVO).
However, not all IATA codes explicitly reflect the airport’s name. For instance, YYZ is the code for Toronto, Canada, and OTP is for Bucharest, Romania. The Philippines also has some mysterious airport codes, such as WNP for Naga, EUQ for Antique, and SWL for San Vicente.
IATA codes cannot be shared between airports. So, when the New Manila International Airport (NMIA) in Bulacan opens and operates alongside the Ninoy Aquino International Airport (NAIA) in Pasay City, the two airports cannot both use the MNL code.
But if the government decides to make NMIA the new capital airport, the most likely scenario is that the MNL code will be transferred to NMIA while NAIA is assigned a brand-new code. However, it could also be the other way around, especially if NAIA continues to operate once the new airport opens.
A recent example of an IATA code transfer in the Philippines came when the Bohol-Panglao International Airport (BPIA) opened to the public. The old Tagbilaran Airport was stripped of both its IATA and ICAO codes, but only the IATA code (TAG) was carried over to the new airport; BPIA adopted a brand-new ICAO code, RPSP.
Now that you’ve learned all about IATA codes, let’s put your knowledge to the test: CYZ, BXU, MPH. Can you guess which airports these codes represent? Write down your answers in the comments section below!