Coordinates

Peter Mooney edited this page Feb 6, 2024 · 5 revisions

Coordinates are used extensively throughout the Sector File to pinpoint an exact location on the earth's surface, such as that of a VRP, a NAVAID or a point on the edge of an airspace boundary. Each coordinate consists of two elements: a latitude and a longitude, both measured in degrees. Each degree can be split into 60 minutes, and each minute can be split into either 60 seconds (60ths) or decimals (100ths). A publication listing latitudes and longitudes should state somewhere which unit is being used. In general the UK NATS AIP uses Degrees Minutes Seconds (and decimal seconds in places); for example, if the Gatwick (EGKK) Runway 26L Threshold latitude is given in the AIP as 510902.42N, that point lies at 51 degrees, 9 minutes, 2.42 seconds North.
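The degrees/minutes/seconds arithmetic above can be sketched in a few lines of Python (a hypothetical helper for illustration only; the function name is our own and is not part of any Sector File tooling):

```python
# Illustration: converting a DMS latitude such as the AIP's 510902.42N
# into decimal degrees. 1 degree = 60 minutes; 1 minute = 60 seconds.
def dms_to_decimal(degrees: int, minutes: int, seconds: float) -> float:
    return degrees + minutes / 60 + seconds / 3600

# 51 degrees, 9 minutes, 2.42 seconds North
print(round(dms_to_decimal(51, 9, 2.42), 6))  # 51.150672
```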

  • Latitude is the location's position either NORTH or SOUTH of the equator (a line around the earth exactly halfway between the poles). Latitude is 0 degrees at the equator and 90 degrees at the poles, so the degrees component of a latitude can never exceed 90. Note, however, that the Sector File still requires this degrees component to be written as a three-figure number. Therefore, all latitudes will begin with a 0 which will probably not be present in the source reference (adding this leading 0 does not, of course, change the actual value since it comes before any other digits).

  • Longitude is the location's position either EAST or WEST of the Prime Meridian. Unfortunately there is no natural physical reference from which to measure longitude, so one had to be chosen. In the nineteenth century, the line (more formally known as a meridian) of 0 degrees longitude was designated as passing through Greenwich in London, and to this day the 0 degrees meridian of longitude is known as the 'Greenwich' or 'Prime' meridian. Meridians of longitude extend to 180 degrees East and 180 degrees West, making a total of 360 degrees around the earth.
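The ranges described above give a simple sanity check for any coordinate pair (a minimal sketch; the function name is our own, not from any Sector File tool):

```python
# Sanity-check a coordinate pair in signed decimal degrees:
# latitude magnitude may never exceed 90, longitude may never exceed 180.
def in_range(lat_deg: float, lon_deg: float) -> bool:
    return abs(lat_deg) <= 90 and abs(lon_deg) <= 180

print(in_range(51.150672, -0.171944))  # True  (the EGKK 26L threshold)
print(in_range(91.0, 0.0))             # False (no such latitude)
```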

Coordinates are used throughout the various sub-files used to create the Sector File. The way coordinates are written in the Sector File differs slightly from how they are written in the source documents (usually airport specific textual data); e.g. the EGKK 26L Threshold of 510902.42N 0001019.00W from the AIP is not in a format which is compatible with the Sector File. In order to use this coordinate in the Sector File, a few things need to be swapped around and added. This can be done either automatically or manually, as described below:

Automatic Conversion Tools

The simple NATS to ESCO tool hosted by VATSIM UK converts coordinates from the 'NATS' format (as in the textual data) to 'EuroScope' format, e.g. N051.09.02.420 W000.10.19.000.

There is also a more sophisticated coordinate conversion tool which allows for the conversion of coordinates from AIP format into various formats including SCT and ESE, with a map to check the positioning prior to importing into EuroScope. This tool also supports the parsing of arc segments.

Manual Conversion Method

Below are the steps to convert a textual data coordinate into Sector File format:

  • Bring the N or S at the end of the latitude to the front: 510902.42N -> N510902.42.
  • Split up the coordinates with .s into DD.MM.SS.SS format: N510902.42 -> N51.09.02.42.
  • From what was described earlier, even latitudes need a three-figure number for the degrees section. Therefore, change N51.09.02.42 into N051.09.02.42.
  • The Sector File also reads the digits after the seconds or decimals. This last part also needs to be made of three figures. If the source is not defined to that precision, 0s should be added to the end, so in our example N051.09.02.42 becomes N051.09.02.420. If the AIP defines a point more precisely, it should be rounded to 3 decimal places (e.g. N051.09.02.4255 would become N051.09.02.426).

In summary 510902.42N becomes N051.09.02.420.

The same process is followed for the longitude, noting that longitude degrees are already defined with three digits to begin with, e.g. 0001019.00W -> W000.10.19.000.
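The manual steps above can be sketched as a single Python function (a hypothetical helper for illustration; the name and interface are our own and not part of any VATSIM UK tooling):

```python
import re
from decimal import Decimal, ROUND_HALF_UP

# Convert an AIP-style coordinate such as 510902.42N or 0001019.00W
# into EuroScope format, following the manual steps: move the hemisphere
# letter to the front, split into degrees/minutes/seconds with dots,
# pad degrees to three figures, and pad or round the decimal seconds
# to exactly three places.
def aip_to_euroscope(coord: str) -> str:
    match = re.fullmatch(r"(\d+)\.(\d+)([NSEW])", coord)
    if not match:
        raise ValueError(f"Unrecognised coordinate: {coord}")
    whole, frac, hemi = match.groups()
    seconds = whole[-2:]             # two digits before the decimal point
    minutes = whole[-4:-2]
    degrees = whole[:-4].zfill(3)    # pad degrees to three figures
    # Round (half up) or zero-pad the decimal seconds to three places.
    decimals = str(Decimal("0." + frac).quantize(
        Decimal("0.001"), rounding=ROUND_HALF_UP))[2:]
    return f"{hemi}{degrees}.{minutes}.{seconds}.{decimals}"

print(aip_to_euroscope("510902.42N"))    # N051.09.02.420
print(aip_to_euroscope("0001019.00W"))   # W000.10.19.000
print(aip_to_euroscope("510902.4255N"))  # N051.09.02.426
```

`Decimal` is used for the rounding step so that a value like .4255 rounds up to .426 as in the example above, rather than being subject to binary floating-point rounding.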

Coordinates are always written in pairs of corresponding latitude and longitude (in that order). In order to separate the two pieces of information, we use either a colon (:) or a space, depending on context. In the case of a Threshold Coordinate it is a space, so to use the converted coordinate we would insert N051.09.02.420 W000.10.19.000 into the relevant location in the Runway.txt file.
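Joining the pair with the separator the target sub-file expects can be sketched as follows (a trivial hypothetical helper; the name is our own):

```python
# Join a converted latitude/longitude pair with the separator the target
# sub-file expects: a space for Runway.txt, a colon in other contexts.
def format_pair(lat: str, lon: str, sep: str = " ") -> str:
    return f"{lat}{sep}{lon}"

print(format_pair("N051.09.02.420", "W000.10.19.000"))
# N051.09.02.420 W000.10.19.000
print(format_pair("N051.09.02.420", "W000.10.19.000", sep=":"))
# N051.09.02.420:W000.10.19.000
```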