The future of GPS
The improvement in the performance of the Global Positioning System (GPS) has made it a popular tool for road navigation. Yet how reliable is it really, and how can it be developed into a system robust enough to undertake critical tasks such as air and rail navigation? David Bartlett, a freelance consultant on GPS systems, looks at the history of GPS and how it can be combined with other technologies and satellite navigation systems to reach optimal performance.
The improvement in the performance of GPS over the last decade has been phenomenal. Developed originally for the military in the 1970s, and owned and operated by the US Department of Defense, GPS gradually gained a foothold in sea and air navigation. More recently, with improved receivers and auxiliary technologies like map-matching, it has become the navigation technology of choice for road travel. Not only is GPS now a standard feature of luxury cars, it is also widely available as a separate unit for vehicles or walkers, and is fast becoming an integral feature of many mobile phones.
Disparate uses
GPS is now being used in many serious professional applications including land survey, synchronisation of base stations in mobile cellular networks, and pay-as-you-go (PAYG) motor insurance as pioneered by Norwich Union. In this case, a small black box containing a GPS receiver and a radio modem is fitted into the insured car. The GPS continually monitors the position of the car and, each day, records of places and times of travel are uploaded to Norwich Union computers which calculate the insurance fee due. The system allows the user to make context-sensitive decisions about driving – for example, it is typically much more expensive to drive late at night or in the early hours of the morning.
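As a rough illustration of how such a scheme might price journeys by time of day, the Python sketch below applies per-mile rates that vary with the hour a trip starts. The rates, hour bands and record format are invented for illustration and are not Norwich Union's actual tariff.

```python
# Illustrative only: rates, hour bands and record format are invented and are
# not Norwich Union's actual tariff.
from datetime import datetime

NIGHT_HOURS = set(range(0, 6)) | {23}              # 11pm-6am: priciest band
PEAK_HOURS = set(range(7, 10)) | set(range(16, 19))

def trip_charge(start: datetime, miles: float) -> float:
    """Charge in pence for one journey, weighted by the hour it began."""
    if start.hour in NIGHT_HOURS:
        rate = 8.0                                 # driving late at night costs far more
    elif start.hour in PEAK_HOURS:
        rate = 3.0
    else:
        rate = 1.5
    return rate * miles

# Daily upload: sum the charges over the recorded journeys
journeys = [(datetime(2008, 5, 1, 23, 40), 12.3), (datetime(2008, 5, 2, 9, 15), 5.0)]
print(f"{sum(trip_charge(t, d) for t, d in journeys):.1f} pence")
```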
Every week the media publicises new and novel uses for GPS: prisoner tracking, locating farm animals, location-based gaming, lone-worker protection and many others. However, GPS does sometimes make mistakes, and often doesn’t work well indoors, if at all. For a non-critical task like driving from A to B we can live with the odd error, but what if a failure led to the cellular telecommunications system going down, to charging the wrong amount under a PAYG insurance scheme, or worse, to an air or rail crash?
So this is the rub – reliability. A system that satisfies all our navigation needs has to be reliable, yielding quantifiable performance levels for a very high proportion of the time. GPS simply has not yet reached this level of performance.
As good as it gets?
GPS was the first Global Navigation Satellite System (GNSS) and is the only system generally available for commercial use today. In order to compute a position, the receiver needs to be able to receive and accurately measure at least four (three for a height-constrained solution) satellite signals and it needs to know precisely where the satellites are in the sky.
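To see why at least four satellites are needed: each pseudorange measurement gives one equation in four unknowns, namely the receiver's three position coordinates plus its clock offset. The Python sketch below solves such a system by iterative least squares; the satellite positions and measurements are invented for illustration and this is not production receiver code.

```python
# A minimal sketch of the position solution (not production receiver code).
# Each pseudorange supplies one equation in four unknowns: receiver x, y, z
# and clock bias. Satellite positions and measurements below are invented.
import numpy as np

C = 299_792_458.0                                  # speed of light, m/s

def solve_position(sat_pos, pseudoranges, iters=10):
    """Iterative least-squares fix: returns (x, y, z, clock_bias_seconds)."""
    x = np.zeros(4)                                # start at Earth's centre, zero bias
    for _ in range(iters):
        ranges = np.linalg.norm(sat_pos - x[:3], axis=1)
        predicted = ranges + C * x[3]              # geometric range plus clock-bias term
        residual = pseudoranges - predicted
        # Jacobian: unit vectors towards the receiver, plus the clock column
        H = np.hstack([-(sat_pos - x[:3]) / ranges[:, None],
                       np.full((len(ranges), 1), C)])
        x += np.linalg.lstsq(H, residual, rcond=None)[0]
    return x

# Synthetic check: four satellites, a receiver on the surface, 1 ms clock error
true = np.array([6_371_000.0, 0.0, 0.0])
sats = np.array([[26_600_000.0, 0.0, 0.0], [0.0, 26_600_000.0, 0.0],
                 [0.0, 0.0, 26_600_000.0], [15_000_000.0, 15_000_000.0, 15_000_000.0]])
rho = np.linalg.norm(sats - true, axis=1) + C * 1e-3
print(solve_position(sats, rho))                   # ~ [6371000, 0, 0, 0.001]
```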
However, the satellite signals are very low power. To place this in context, the weakest signal at which a mobile phone works is about 1,000 times stronger than the strongest signal from a GPS satellite; the weakest signal at which a digital TV can receive a picture is about 100,000 times stronger than the strongest GPS signal. When an obstruction obscures direct visibility of the satellite, the signal becomes even weaker and highly sensitive receivers with long integration times are needed in order to measure it.
In addition to coping with signal attenuation, the GPS receiver may also need to contend with interference. Usually this is unintentional, arising from the electrical and electronic systems all around us; many of these emit low-level spurious signals in the GPS band (around 1.5 GHz). However, interference may also be intentional, introduced into the system by the US Department of Defense or by a hostile interferer. In either case, it can leave the user unable to obtain a GPS position fix.
Dealing with limitations
Most of the performance improvements achieved by GPS have come from better receivers and their ability to ‘dig out’ very weak signals from the background noise. This is particularly noticeable in ‘tracking mode’. Once the receiver has acquired the satellite signals, it is possible for it to continue tracking them even when they are as weak as one thousandth of the normal direct signal strength.
However, it is acquiring weak signals in the first place that is difficult. The problem is that each satellite signal carries a data message containing all the information the receiver needs to compute a position (see How GNSS Works section opposite). This data is transmitted at 50 bits per second and modulates the ranging signal being measured. During acquisition the receiver does not know what data is being transmitted, and this uncertainty makes it harder to measure very weak satellite signals accurately; furthermore, the receiver has to decode the data before it can compute a position.
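The sketch below illustrates, in a highly simplified form, the kind of code-phase and Doppler search a receiver performs during acquisition. The function name and parameters are illustrative rather than taken from any real receiver, and the unknown 50 bit/s data is the reason coherent integration in a real receiver is limited to short blocks.

```python
# Simplified acquisition search, for illustration only (not a real GPS receiver).
# 'samples' are complex baseband samples; 'prn_code' is the locally generated
# spreading code sampled at the same rate. Because the unknown 50 bit/s data can
# flip the signal every 20 ms, coherent integration here is kept to one code
# period; real receivers stack many such blocks non-coherently to find weak signals.
import numpy as np

def acquire(samples, prn_code, fs, doppler_bins):
    """Return (best_doppler_Hz, best_code_phase_samples, peak_metric)."""
    n = len(prn_code)
    t = np.arange(n) / fs
    code_fft = np.conj(np.fft.fft(prn_code))
    best = (0.0, 0, 0.0)
    for fd in doppler_bins:                                      # Doppler hypothesis
        wiped = samples[:n] * np.exp(-2j * np.pi * fd * t)       # remove hypothesised carrier
        corr = np.abs(np.fft.ifft(np.fft.fft(wiped) * code_fft)) # all code phases at once
        k = int(np.argmax(corr))
        if corr[k] > best[2]:
            best = (float(fd), k, float(corr[k]))
    return best
```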
Dealing with interference is even trickier. GPS uses two radio frequencies: one for commercial services and the other for military. Commercial GPS receivers are based on a single frequency in the L1 (1.5 GHz) band and any interference in this band could stop them from working. Therefore, traditional evasion techniques such as frequency-hopping or use of frequency (and time and space) diversity are not available to them. Military systems have the benefit of both bands and are therefore less susceptible to jamming.
The next-generation GPS system is presently in development and is expected to embrace additional diversity techniques that will enable more robust receivers to be built. These techniques will include additional frequency bands and separation of data bearers from navigation signals to allow for quicker and easier signal acquisition. ‘Modernised GPS’, or ‘GPS III’ as it is sometimes known, is expected to come on-line around 2012.
Alternative systems
While GPS III addresses some of these technical concerns, political factors are also driving the desire for additional GNSS. Since GPS is managed and controlled by the US Department of Defense, its use for commercial services is at the Department's discretion and could be turned off at will. In addition, GPS does not offer any commercial service-level guarantees.
The first alternative GNSS was Glonass, which the USSR began constructing in 1976. However, with the collapse of the Soviet Union, Glonass quickly fell into disuse after its completion in 1995. The Russian government has said that it intends to restore it to full operational status.
Then there is Galileo, the European GNSS, which is being constructed to address commercial applications directly and will carry a service-level agreement. It has been designed to exploit multi-band operation and other technical innovations to ease signal acquisition and increase robustness against interferers. Its satellites orbit at a slightly greater inclination, which is claimed to give better coverage than GPS at high European latitudes. The system also integrates with the international Search and Rescue Satellite-Aided Tracking (SARSAT) system and can receive and pinpoint distress signals. Galileo has been plagued by delays and financial problems, but finally seems on track for operational status by about 2012.
Having more than one GNSS may seem unnecessary, but it does provide additional diversity and robustness by having several complete sets of satellites, managed independently through different ground infrastructure and using different frequency bands. It has been shown using simulations that combining Galileo and GPS measurements can improve overall navigation performance, especially in tough urban environments.
Transport for London (TfL) has conducted a series of detailed and comprehensive trials of 17 different GPS systems for use as road-pricing technology. It concluded that GPS, even when supported by map-matching, did not provide a sufficiently reliable basis for a road-pricing enforcement system. However, it is anticipated that the combination of GPS and Galileo, coupled with map-matching, would probably be reliable enough to support a road-pricing application.
Improving GPS
Returning to GPS itself, there are a few ways in which external information can be supplied to the receiver to improve its performance. These methods are often referred to as assisted GPS, or GPS augmentation techniques.
The most common use of external data is to improve the satellite acquisition process, to enable weaker signals to be detected, or to make the initial acquisition much quicker – perhaps 5-15 seconds rather than the minutes it may take without assistance data. This is particularly important for applications in which position fixes are required only infrequently. The information provided typically consists of the basic satellite data (identity, position in the sky), which helps the receiver narrow down the signal search, provided that it has a rough idea of its location.
If the receiver position is known a little more accurately, and it has a good timing source available – such as from a cellular telecommunications network – then the assistance data can be used to narrow the time and Doppler searches even more. This is the technique pioneered by CSR, a designer of wireless devices, which has shown that in cellular networks reasonable tracking can be maintained continuously, even though GPS measurements are made infrequently (minutes apart). This dramatically reduces power consumption and improves navigation indoors, where GPS traditionally does not work at all.
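As a sketch of the effect, building on the hypothetical acquire() function shown earlier: assistance data (rough position, time and satellite ephemeris) lets the receiver predict each satellite's Doppler, collapsing the search from the full cold-start window to a handful of hypotheses. The numbers below are typical illustrative values, not drawn from the article.

```python
# Continuing the hypothetical acquire() sketch above: assistance data lets the
# receiver predict each satellite's Doppler, so the search collapses from the
# full cold-start window to a few hypotheses. Values are illustrative only.
import numpy as np

cold_bins = np.arange(-5000, 5001, 500)        # no assistance: +/-5 kHz, 21 hypotheses
predicted_fd = 1800.0                          # Hz, predicted from assistance data (invented)
assisted_bins = np.arange(predicted_fd - 100, predicted_fd + 101, 50)   # 5 hypotheses

print(len(cold_bins), "vs", len(assisted_bins), "Doppler hypotheses per satellite")
```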
Signalling for land and sea
There are also many independent systems and technologies which are important to GNSS, and are likely to continue to be so. Besides map-matching – without which most commercial GPS navigation systems used today would fail to live up to our needs – auxiliary motion data from wheel-rotation counters, accelerometers, magnetometers, rate gyroscopes and other devices may be used (see ‘Map-Matching explained’ section). High-performance dynamic GPS receivers used for precision and aeronautical work are usually too costly and bulky for mass-market commercial systems, which consequently rely on much cheaper partial integration with other sensor data.
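As a minimal illustration of the map-matching idea, the sketch below snaps a noisy fix onto the nearest point of the nearest road segment. Real systems also use heading, route connectivity and the auxiliary sensors listed above; the road coordinates here are invented.

```python
# Minimal map-matching sketch: snap a (possibly noisy) GPS fix onto the nearest
# point of the nearest road segment. Real systems also use heading, route
# topology and auxiliary sensors; coordinates here are invented.
import numpy as np

def snap_to_road(fix, segments):
    """fix: (x, y); segments: list of ((x1, y1), (x2, y2)). Returns snapped point."""
    p = np.asarray(fix, float)
    best, best_d = None, np.inf
    for a, b in segments:
        a, b = np.asarray(a, float), np.asarray(b, float)
        ab = b - a
        t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)  # project onto segment
        q = a + t * ab
        d = np.linalg.norm(p - q)
        if d < best_d:
            best, best_d = q, d
    return best

roads = [((0, 0), (100, 0)), ((100, 0), (100, 100))]     # two perpendicular streets
print(snap_to_road((42.0, 3.5), roads))                   # -> snapped to the first street, [42, 0]
```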
With recent GPS improvements, the old ground-based, wide-area radio location systems such as Decca and Loran-C have largely fallen into disuse, displaced by the better-performing satellite system. However, the shipping industry is actively promoting the development of eLoran. This is a modernised version of Loran – a terrestrial radio navigation system that uses a network of low-frequency ground transmitters. The receiver picks up and decodes the signals from at least two or three different transmitters and, using these measurements, computes its position. Accuracy of 10 m or better is claimed. Since the system uses completely different radio technology in a very different part of the radio spectrum, eLoran is seen as a viable parallel, or backup, system to GPS for shipping and for high-reliability road transport applications.
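For illustration, the sketch below computes a two-dimensional fix from the time differences of arrival between pairs of ground transmitters, which is the hyperbolic style of fix historically associated with Loran. The transmitter positions and starting guess are invented, and a real eLoran receiver's processing is considerably more sophisticated.

```python
# Illustrative hyperbolic (time-difference) fix, not a real eLoran implementation.
# Three ground transmitters at known positions; the receiver measures the
# difference in arrival time between each secondary and the master station.
import numpy as np

C = 299_792_458.0                                   # propagation speed, m/s
tx = np.array([[0.0, 0.0], [200_000.0, 0.0], [0.0, 150_000.0]])   # transmitters (m)
true_pos = np.array([80_000.0, 60_000.0])           # used only to simulate measurements

toa = np.linalg.norm(tx - true_pos, axis=1) / C     # simulated times of arrival
tdoa = toa[1:] - toa[0]                             # measured time differences

def tdoa_fix(p0, iters=20):
    """Gauss-Newton solve of the two TDOA equations for the 2D receiver position."""
    p = np.array(p0, float)
    for _ in range(iters):
        diff = tx - p
        r = np.linalg.norm(diff, axis=1)
        res = tdoa - (r[1:] - r[0]) / C             # measurement minus prediction
        u = -diff / r[:, None]                      # d r_i / d p (unit vectors)
        J = (u[1:] - u[0]) / C                      # Jacobian of predicted TDOAs
        p += np.linalg.lstsq(J, res, rcond=None)[0]
    return p

print(tdoa_fix([50_000.0, 50_000.0]))               # ~ (80000, 60000)
```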
For specialised local area and high-precision, real-time tracking and positioning systems, it is likely that a number of other technologies will continue to find niche markets. These include positioning within wireless LANs (local area networks) using the WLAN signals, optical target tracking, UWB (ultra-wideband) and other bespoke radio location systems, not to mention the role that radar has in positioning objects in airspace.
Future take-up
It is likely that GPS will remain the hub around which navigation applications are centred for some time to come, although gradually this role will be taken on by a few different GNSS, including Galileo and Glonass, which will work alongside GPS. However, the role of other techniques, such as map-matching and inertial systems, as well as GNSS assistance and augmentation data, is likely to be critical in the future success of GNSS.
For many tasks, especially those that are not critical, GPS alone is adequate. For the rest, the perceived success of GNSS (and GPS) will be underpinned by many other positioning techniques and technologies. It will be the way these disparate technologies are fused together that leads to wide-scale, robust and reliable positioning for navigation applications.
Biography – David Bartlett
David Bartlett is a Chartered Engineer and member of the IET. He runs his own business at www.pyxidium.co.uk providing innovation, system design and technical services in the field of location and positioning to organisations that develop, use or rely on the technology. He is also founder and co-champion of the Location Special Interest Group at Cambridge Wireless, see www.cambridgewireless.co.uk