Site Calibrations, Bases, and Correction Services
This is blog 5 in the series “Real-Time GNSS Corrections: Productivity, Profitability, and Practical Considerations,” written by Jason Evans, Portfolio Manager, Trimble Positioning Services and NYS Registered & Licensed Land Surveyor. The blog series discusses important considerations for your GNSS surveying workflows: When should you use a base station? Are you missing the opportunity to gain productivity by including correction services in your workflows? Can differential base-rover RTK benefit from correction services? Reading the full series will not only help you answer these questions, but will also enable you to calculate the cost savings of a real-time GNSS correction subscription.
Site calibrations are an essential element of many surveying projects, and there are many options for how to approach them.
In previous blogs, we looked at various aspects of base-rover workflows, contrasting the value proposition of bases vs. correction services, and contrasting real-time vs. postprocessed methods for project control. All of these subjects can come into play when performing site calibrations. There is an old saying among surveyors: “Calibration is not your friend, except when it is.”
For surveyors, a site calibration (also referred to as “localization” in some field and office software) is performed when more than one point is observed, compared, and adjusted to corresponding “known” points. And although site calibration can be essential, it also gets a bad rap.
Some surveyors insist that they will never calibrate because it “adds error.” But is it even possible to never calibrate when the requirements of a job or project cannot be achieved by any other means? After all, setting up a base (or total station) and inputting coordinates is essentially a radial calibration. GPS/GNSS in its raw form outputs WGS84 values, so when you work in a specific datum, the transformation applied in the rover (along with any further projection) is also a form of calibration.
Fitting the Site
A properly executed site calibration gets the observed points to “fit” the known points. With this type of calibration applied, any subsequent observed points are adjusted to have high fidelity to the created site reference framework. When surveyors perform a site calibration, they are transforming the geocentric baseline vector components measured by the GNSS receiver into local coordinates that provide a Northing, Easting, and Height. This is commonly referred to as “translate, rotate, and scale,” so that the points in the plan match the points on the ground.
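To make the “translate, rotate, and scale” idea concrete, here is a minimal sketch of the horizontal part of such a fit: a least-squares 2D similarity transformation estimated from paired observed and known points. This is an illustration only, not Trimble Access’s calibration algorithm (which also handles the vertical adjustment), and the coordinates below are hypothetical.

```python
import numpy as np

# Hypothetical paired points: GNSS-derived coordinates (E, N) and the
# corresponding local "known" control values (E, N). Units are meters.
observed = np.array([[1000.00, 2000.00],
                     [1250.30, 2010.15],
                     [1245.80, 2310.40],
                     [ 995.20, 2305.90]])
known    = np.array([[ 500.00,  800.00],
                     [ 750.32,  810.10],
                     [ 745.85, 1110.38],
                     [ 495.18, 1105.92]])

# Solve known = s*R*observed + t in linear form:
#   E' = a*E - b*N + tE,  N' = b*E + a*N + tN,
# where a = s*cos(theta) and b = s*sin(theta).
A = np.zeros((2 * len(observed), 4))
L = known.reshape(-1)
A[0::2, 0] = observed[:, 0]   # a * E
A[0::2, 1] = -observed[:, 1]  # -b * N
A[0::2, 2] = 1.0              # tE
A[1::2, 0] = observed[:, 1]   # a * N
A[1::2, 1] = observed[:, 0]   # b * E
A[1::2, 3] = 1.0              # tN

(a, b, tE, tN), *_ = np.linalg.lstsq(A, L, rcond=None)
scale = np.hypot(a, b)
rotation_deg = np.degrees(np.arctan2(b, a))

# Residuals at each control point show how well the observations fit the plan.
fitted = (A @ np.array([a, b, tE, tN])).reshape(-1, 2)
residuals = known - fitted
print(f"scale = {scale:.8f}, rotation = {rotation_deg:.6f} deg")
print("residuals (m):\n", residuals)
```

The residuals printed at the end are the same kind of quality indicators the field software reports: small, randomly distributed residuals suggest the control fits together; one large residual flags a suspect point.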
Using best practices for site calibration, error can be minimized to be close to the full capabilities of the instruments used and within the error budget for the job or project. Therefore, if a calibration is a project requirement, the job of a surveyor is to use best practices to yield the highest accuracy that is practical within the error budget. Surveying field software like Trimble Access has calibration routines that step you through the process. You can view quality data at each step. For instance, you can see how well the published control points fit each other (residuals) after observing them with GNSS, total station, or digital level, and, depending on project requirements, you may wish to use combinations of these. Additionally, specific to project requirements, it is not uncommon for surveyors to create a site calibration in office software like Trimble Business Center.
GNSS static observations, real-time vectors exported from rovers, total station vectors, and leveling data can be combined to create a site calibration that can be exported for application in field instruments.
It is optimal when the published control that you are using to create the site calibration comes from one source. Yet it is not uncommon (often to save time or costs, or because the client does not understand the ramifications of doing so) for mixed control to be provided. For example, you might be given values for, say, two state highway monuments, two more from a local municipal control database, a benchmark from a local utility, and so on. The hazard of mixing control is that the values may have been established by different entities, in different surveys years apart, with different methods, and with different accuracy outcomes. Mixed, they may not have the relative integrity that would make them good candidates for creating a site calibration. More likely, a project general contractor will provide points set by a surveyor and tied together with a single survey.
While a surveyor does not need to be completely versed in spatial data analysis and the nature of measurement error sources to step through calibration routines, it is highly recommended that you undertake at least some self-study in the subject. The software is great at guiding you, but you need to understand what it is telling you. Take in a seminar, take a formal course, or read through a treatise like this.¹
By the time you create the site calibration, you should have a very good idea of the expected accuracy for any work done inside the region bounded by the constraining control points. And remember, the calibration should be viewed as invalid outside that region. Doing additional checks should be viewed as an essential step, even if not otherwise specified as a project requirement.
A site calibration should provide adequate accuracy for the duration of most projects, noting that the actual position of any point on Earth is constantly changing due to tectonic plate drift. Some field and office software deals with changing positions by using time-dependent transformations that allow you to transform your measured coordinates to the reference epoch of the datum. Caution is advised when using a site calibration years after it was created, especially in areas with high tectonic velocity.
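As an illustration of what a time-dependent transformation does at its core, here is a sketch that propagates a coordinate from its observation epoch back to a reference epoch using a site velocity. The coordinates and velocity are hypothetical, and real tools (for example, NGS HTDP or the transformations built into field and office software) model much more than this simple linear shift.

```python
import numpy as np

def propagate_to_epoch(xyz_obs, velocity_m_per_yr, epoch_obs, epoch_ref):
    """Shift an ECEF coordinate from the observation epoch to a reference epoch.

    Only the kinematic core of a time-dependent transformation:
    position(ref) = position(obs) - velocity * (epoch_obs - epoch_ref).
    Production implementations also model plate rotation and datum parameters.
    """
    dt_years = epoch_obs - epoch_ref
    return np.asarray(xyz_obs) - np.asarray(velocity_m_per_yr) * dt_years

# Hypothetical example: a point with a plate velocity of a few cm/yr,
# observed at epoch 2024.5 and reduced to a reference epoch of 2010.0.
xyz_2024 = [-2694045.123, -4293642.456, 3857878.789]  # meters, ECEF
vel = [0.0125, -0.0180, 0.0090]                        # m/yr, hypothetical
xyz_2010 = propagate_to_epoch(xyz_2024, vel, 2024.5, 2010.0)
print(xyz_2010)
```

Over the 14.5 years in this example, the shift is several tens of centimeters, which is exactly why an old calibration in a high-velocity area deserves a fresh check.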
Calibration and Site Bases
Surveying field software such as Trimble Access™ has calibration routines. Trimble Access software continues to be the leading field software used by professional surveyors around the world.
The subject of site calibration would take volumes to cover comprehensively, so let’s focus on a common scenario: setting up your GNSS receiver as a base and performing a site calibration.
Performing a site calibration to create a local datum is a common workflow for construction job sites but can also apply to pre-design topographic surveys. In the construction example, you may have a set of plans that has a scale, orientation, and basis of bearing that correspond to points that already exist on the site and match points in your job file.
When performing a site calibration, you would set up your site base by the methods detailed in the previous four blog entries and then measure three or more (emphasis on more) points on the ground. You can perform additional checks to see how closely the resultant RTK values match the predicted values shown in the rover software.
Five points are recommended, distributed outside the bounds of the project site to form strong geometry. Putting a long line of control points along a narrow corridor often does not provide good geometry. If control points wide of the site have not been provided, consider adding more high-quality control. If vertical precision is critical for the project, consider running a level loop through all bounding control points to check the relative integrity of the provided values.
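If you do run that level loop, the pass/fail check is simple arithmetic. The sketch below sums the rise-and-fall values around a hypothetical loop and compares the misclosure to an allowable value of k·√(distance in km); the 12 mm constant is only a placeholder, so substitute the tolerance from your project specification or governing standard.

```python
import math

# Hypothetical differential-level loop through the bounding control points.
# Each entry is (backsight - foresight) in meters for one setup; the loop
# closes back on the starting benchmark, so the sum should be near zero.
rise_fall_m = [+0.412, -1.035, +0.618, +0.287, -0.279]
loop_length_km = 2.4

misclosure_m = sum(rise_fall_m)

# Allowable misclosure is typically k * sqrt(distance in km); k = 0.012 m
# (12 mm) is used here only as a placeholder value.
k = 0.012
allowable_m = k * math.sqrt(loop_length_km)

print(f"misclosure = {misclosure_m*1000:.1f} mm, allowable = {allowable_m*1000:.1f} mm")
print("PASS" if abs(misclosure_m) <= allowable_m else "CHECK CONTROL / RE-RUN LOOP")
```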
Here again we note that you could use a correction service to perform the site calibration. With a correction service such as Trimble VRS Now or CenterPoint RTX, you can forgo setting up a base, saving time, eliminating the need to locate a site for your base and to guard it from theft or disturbance by machines or workers, and reducing the complexity of your workflow.
Using correction services such as VRS Now or CenterPoint RTX in conjunction with a site calibration may also be useful at various stages of the job. For instance, if a site reference station will later be added to support the use of machine control systems, one could use a correction service while the area for the permanent base station is cleared and grubbed, instead of setting up a base somewhere off the site and having to broadcast corrections over a great distance. If there is no safe and practical place within the site to put a base, some surveyors use an NTRIP client, for instance on a PC in the job shack, to access a VRS solution for the center of the site and connect via a serial port to a base radio for rebroadcast.
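For readers curious about what that job-shack setup involves, here is a minimal sketch of an NTRIP (rev. 1) client that pipes the correction stream out a serial port to a base radio. The caster address, mountpoint, credentials, and COM port are hypothetical, and a production tool would also need to send periodic NMEA GGA position updates (which VRS mountpoints require) and handle reconnects.

```python
import base64
import socket
import serial  # pyserial, for the radio's serial port

# Hypothetical caster, mountpoint, credentials, and radio port -- replace with
# your correction service's details and the job-site hardware.
CASTER, PORT, MOUNTPOINT = "caster.example.com", 2101, "VRS_RTCM3"
USER, PASSWORD = "user", "password"
RADIO_PORT, BAUD = "COM3", 38400

# NTRIP (rev. 1) request: an HTTP-style GET with Basic authentication.
auth = base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()
request = (f"GET /{MOUNTPOINT} HTTP/1.0\r\n"
           f"User-Agent: NTRIP SimpleClient/1.0\r\n"
           f"Authorization: Basic {auth}\r\n\r\n")

sock = socket.create_connection((CASTER, PORT), timeout=10)
sock.sendall(request.encode())
header = sock.recv(1024)  # expect "ICY 200 OK" (or an HTTP 200) from the caster
if b"200" not in header:
    raise RuntimeError(f"Caster refused connection: {header!r}")

# Pipe the incoming RTCM stream straight out the serial port to the base radio.
with serial.Serial(RADIO_PORT, BAUD, timeout=1) as radio:
    while True:
        data = sock.recv(4096)
        if not data:
            break
        radio.write(data)
```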
If your error budget is tighter, doing your own postprocessing or using postprocessing services to evaluate the quality of provided control or to establish new control can be the best choice. There are trade-offs: postprocessing can often yield higher precision than real-time, but this comes at the premium of substantially more time needed for field observations and processing. While postprocessing services like the NGS (National Geodetic Survey in the U.S.) Online Positioning User Service (OPUS) and CenterPoint RTX are free, the time and cost of the longer observations make the cost of a real-time service subscription seem a bargain by contrast.
The decision to use any or all of the above is a balancing act you must perform, considering the specific error budget of the job or project and the time and cost constraints involved. Some considerations may outweigh others; for instance, a megabuck construction project may not wish to shut down certain operations (at thousands of dollars per day or hour) while you employ time-consuming surveying methods to achieve higher precisions than are warranted.
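To tie this back to the theme of the series, a back-of-envelope comparison can put numbers on the trade-off. Every figure in the sketch below is hypothetical; substitute your own crew rate, observation times, and subscription pricing.

```python
# Back-of-envelope comparison of occupation time vs. subscription cost.
# Every number below is hypothetical -- substitute your own values.
crew_rate_per_hr = 180.0          # loaded cost of a field crew, $/hr
points_per_year = 60              # control points established per year

static_hours_per_point = 2.5      # long-static occupation plus processing
rtk_hours_per_point = 0.25        # real-time observation with a correction service
subscription_per_year = 2000.0    # annual correction-service subscription

static_cost = points_per_year * static_hours_per_point * crew_rate_per_hr
rtk_cost = points_per_year * rtk_hours_per_point * crew_rate_per_hr + subscription_per_year

print(f"Static/postprocessed: ${static_cost:,.0f} per year")
print(f"Real-time service:    ${rtk_cost:,.0f} per year")
print(f"Difference:           ${static_cost - rtk_cost:,.0f} per year")
```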
A sample data report from an automated online postprocessing service (NGS OPUS), from submitted long-static observations. Note that you may be able to meet the same accuracy estimates (shown) with much shorter observation times using real-time correction services.
Footnote
1. Adjustment Computations: Spatial Data Analysis