Transcat Webinars and Online Learning

How to Calibrate a Pressure Gauge Using a Pressure Comparator or Pressure Calibrator

Tim Francis | Product Marketing Manager at Fluke Calibration

In this webinar, Tim Francis, Product Marketing Manager at Fluke Calibration, details the process for calibrating a pressure gauge using a deadweight tester or pressure comparator. Tim prefaces his presentation with a brief discussion of the key physics-based pressure measurement considerations that can impact the calibration process. He then provides a step-by-step guide to the pre-test, testing, and post-test process for pressure gauge calibration. The presentation ends with an overview of the types of calibration equipment needed for various measurement applications.


Nicole: Good afternoon, and thank you for joining us on our call today. My name is Nicole VanWert from Transcat, and I’ll be your moderator this afternoon. Our webinar topic is how to calibrate a pressure gauge using a pressure comparator or pressure calibrator. Our presenter today is Tim Francis from Fluke Calibration.

Tim is a product marketing manager for pressure and flow at Fluke Calibration. He began his career in pressure metrology in 2002 at Ruska, where he served in a number of different engineering roles, including Applications Engineer. He joined Fluke Calibration in 2010, following the Ruska acquisition. He holds a bachelor’s degree in Computer Science from The University of Texas at Austin, and an MBA from Arizona State University. Tim works in the Fluke Calibration facility in Phoenix, Arizona.

We expect today’s presentation to last roughly 30 minutes, and then we’ll answer any questions that have been submitted during the presentation. At any time during the presentation, you can send questions through the question box on the right in your webinar control panel. We’ll review all the questions at the end.

I also want to remind you that this webinar is being recorded. Each of you will receive a follow-up email with links to the recorded webinar and the slides from today’s presentation within one or two days. At this time, I’m going to turn the presentation over to Tim.

The Physics Behind Pressure Measurement

Tim: Thank you, Nicole. Welcome everyone, thank you for your attendance today. And, as Nicole said, we’re going to be talking about how to calibrate a pressure gauge using a pressure comparator or pressure calibrator.

The basic agenda we’ll cover is first, we’ll lay a foundation with the basics. We’ll look at some of the physics behind pressure measurement, how pressure is defined and how it’s affected by the laws of physics. If it’s been a few years since you’ve been through a high school physics class, don’t worry, it’s not too complicated. We’ll be just looking at the physics and how it affects our ability to measure the pressure. We’ll then talk about the pressure calibration process.

When you’re calibrating a gauge, what exactly should you, and should you not do? So, we’ll look at the steps that should be included in the process, or the procedure. And we’ll look at certain techniques that can be used to improve the efficiency and the repeatability of the process.

And then, we’ll finish off by looking at equipment selection: how to choose what type of equipment to use, including the media (gas, oil, or water). We’ll look very quickly at specifications, as we could spend an entire webinar on just that topic, as well as at how some of the different types of equipment differ, and we’ll look at some examples of that equipment.

So, let’s look first at the basics, at how pressure is defined. Pressure is defined as the measure of force exerted over an area: force divided by area. We see that in the pressure units that we use here in the US, pounds per square inch (pounds being force, square inch being area). In the SI system, it would be newtons per square meter, which equals pascals. Same idea: force divided by area.

Hydrostatic Pressure

Now, the pressure that we’re interested in here is what’s called hydrostatic pressure. That’s just a fancy way of saying, first, that the media is a fluid, and a fluid can be either a gas or a liquid. We’re not interested in the pressure that a solid object exerts on an area; we’re interested in pressure caused by a gas or a liquid. And static means that the pressure is stable and not changing. Now of course, the pressure will change during the course of our calibration.

I will talk a bit more about just how stable is stable. What we’re saying here is we’re not looking at measuring quick spikes in pressure, or the frequency of the change of pressure. Instead, we’re looking at a situation in a calibration where we change the pressure, we allow it to stabilize, and then we measure it.

Some important physical principles…

The Concept of Head Pressure

First, the concept of head pressure. If you’ve ever gone scuba diving, you’ll have experienced a situation where the pressure at the bottom of the ocean, or the bottom of the body of water, is greater than the pressure at the top. It’s basically that the weight of all the water on top of you is causing a pressure on you while you’re scuba diving. That phenomenon is what is referred to as a reference-head, or head pressure, and it exists in all fluids whether it be gas or liquid.

It’s calculated as the density of the fluid times the local acceleration of gravity times the height in question.

What this means for us when we do a calibration is that, if our reference and our test gauge are at two different heights, even if they’re opened up to the same test system (the same media), they’ll be seeing two different pressures.

In the example here, where we have the dial gauge off to the right, above our pressure controller, the test pressure (the pressure that the dial gauge is seeing) will be slightly less than the pressure that the reference controller is seeing. This is caused by, as we said, this ‘head height’.

Now, the density is a key aspect of this, and the density of the fluid will be different, depending upon if it’s a gas or a liquid. So, does the head pressure matter? How big of a change are we really talking about?

When you’re using liquid, one inch of height difference is approximately 0.036 psi with water, and slightly less with oil. So, if you’re calibrating, say, a 100 psi gauge, and that gauge has a specification of 0.25% of full scale, that means it has a specification of plus or minus 0.25 psi. In that case, if we’re using liquid as that media, and our two gauges differed in height by eight or ten inches, then our readings, even if both gauges were perfectly accurate, would differ by roughly 0.3 psi. So, if we don’t account for that head pressure, we would be out of specification.

Conversely, if we’re calibrating a 10,000 psi gauge, with relatively high pressure, with that same 0.25% specification, well, now, 0.3 psi is 1/100th of the specification. So it’s not going to necessarily impact our measurement. So there’ll be situations where we need to account for it, and there are situations where an approximation is sufficient, where we can look at the two gauges and say, “They’re close enough, in height.”

With head pressure, we can calculate it and correct for it relatively easily; as I said, it’s height times density times gravity. We could also handle it by simply making sure that our device under test and our reference gauge are at the same height.
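The head correction Tim describes (height times density times gravity) can be sketched in a few lines. The density value and unit conversions below are my own illustrative assumptions, not figures from the webinar:

```python
# Sketch of the head-pressure correction: P_head = rho * g * h,
# converted from pascals to psi so it can be compared against gauge specs.

PSI_PER_PASCAL = 1.0 / 6894.757  # unit conversion, 1 psi = 6894.757 Pa

def head_correction_psi(height_in, density_kg_m3, gravity=9.80665):
    """Pressure offset (psi) between two gauges separated vertically by height_in inches."""
    height_m = height_in * 0.0254  # inches to meters
    return density_kg_m3 * gravity * height_m * PSI_PER_PASCAL

# Example: one inch of water head (density ~1000 kg/m^3) is roughly 0.036 psi,
# which matters against a +/-0.25 psi gauge specification.
water = head_correction_psi(1.0, 1000.0)
```

Run against the 100 psi gauge example from the talk, a few inches of liquid head quickly becomes a meaningful fraction of a 0.25 psi tolerance.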

When using gas, the head correction, physically, is still there —there’s going to be a difference in reading, but, the density of gas is much smaller. The other aspect of gas is that the density of gas is not thought of as a constant. With a liquid, we think of it as being non-compressible, so that the density is the same, no matter what the pressure is. With gas, that is not true. It compresses, so as we go higher in pressure, we have a higher density.

So, a good approximation for most pressures, and most gas media is about 3 PPM or 0.0003% of a reading. So, if the calibration that we’re doing is once again on that 0.25%, or a quarter percent gauge, one inch of head height has more or less no bearing on our calibration.

If we’re trying to calibrate something much more accurate, or if there’s a really large head-height difference… let’s say that you’re calibrating a gauge in situ, that is, you’re calibrating it in its place of usage (onsite calibration), and the gauge is at the top of a boiler, or the top of a smoke stack, and you’re standing 20 feet below it. Well, it’s possible that that change in elevation could result in a head-height issue.

So, the key takeaway when it comes to head pressure is this: for many applications, it’s not going to have a major impact if you’re working with relatively low-accuracy devices, and the easiest way to handle it is to make sure that your reference and your test device are at roughly the same vertical height. But at low pressure, working with a liquid media like oil or water, it’s going to have much more of an impact.

So it helps if you’re using gas for very low pressures and oil or water for higher pressures, because then your head heights have less of an impact.

The Ideal Gas Law

Finally, let’s look at what’s called The Ideal Gas Law. If any of you are football fans, you may have heard this mentioned in the news over the last few months, as it’s the scientific explanation on how the air could have been let out of the footballs without it actually being let out by the New England Patriots.

The Ideal Gas Law basically says: pV = nRT,
where each of those terms refers to…

  • p is pressure,
  • V is the volume of the system,
  • n is the number of molecules,
  • R is the gas constant,
  • and T is temperature.

The key takeaway here, is that the pressure of the system is highly related to the temperature of the system, the volume of the system, and the number of molecules in the system.

And what the Ideal Gas Law means is, if we want to change the pressure in the system, then we’ll need to change either the volume of the system, the number of molecules in the system, or the temperature. Also, it means that if the pressure in the system is changing, whether we like it or not, it’s one of those three things that’s changing as well.

So, if our football started out at 14 psi and went down to 12 psi, either somebody removed molecules from the system (i.e., there’s a leak), the volume of the system has changed, or the temperature changed, and the pressure changed with the change of temperature.
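The football scenario can be checked with a quick sketch of the ideal gas law. Treating n and V as fixed so that p/T stays constant, and treating the quoted psi values as absolute pressures for simplicity (an assumption on my part, not something from the talk):

```python
# With n, V, and R fixed in pV = nRT, pressure scales directly with
# absolute temperature: p2 = p1 * (T2 / T1).

def pressure_after_temp_change(p1, t1_kelvin, t2_kelvin):
    """New pressure after a temperature change at constant n and V."""
    return p1 * t2_kelvin / t1_kelvin

# A ball pressurized to 14 psi at 294 K (about 70 F) and then cooled to
# 283 K (about 50 F) loses roughly half a psi with no leak at all:
p2 = pressure_after_temp_change(14.0, 294.0, 283.0)
```

The same arithmetic is why a freshly pressurized system appears to “leak” while its temperature settles back to ambient.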

That relationship between pressure and temperature is very important. We’ll refer to these as adiabatic effects. As the pressure increases, temperature also increases. When the pressure change stops, the temperature will slowly return to ambient. So there’s a direct relationship between pressure and temperature: pressure increases, temperature increases; temperature decreases, then the pressure also decreases.

So what ends up happening, is if you have an analog / dial gauge, say it’s hooked up to a screw press, and you increase the pressure, and you increase it quickly, by changing the volume in the system, the pressure will go up. And as you stop at the point that you want to go to (say you go to 5,000 psi) the temperature effects will cause that pressure to start decaying. And you’ll look, and then go, “Man, I’ve got a leak in my system. I better find that leak.”

The reality is you may not have a leak. It may just be the temperature effects causing it to look like a leak. So all the molecules are more or less staying in the system, it’s just the temperature decreasing is causing the pressure to decrease. The result is you’ll spend hours chasing a leak that doesn’t exist.

Pressure Gauge Calibration Process

Now that we’ve got that foundation in the fundamentals of the physics, let’s look at the actual process of calibrating a pressure gauge. We can split it into three groups: the actions we do prior to the test, the actual test, and then the post-test actions we do afterwards.

1. Clean the Device Under Test (DUT)

First, one must think about, “Do I need to clean the device under test?” Why do we need to clean it? Well, the inside of the device under test, the wetted parts, may be contaminated. Perhaps the gauge was used with oil, or with water, perhaps it’s got solid particles in it, and so forth.

Why does that matter? Well, we’ll contaminate the medium that we’re doing our calibration in. It will contaminate the gas or the liquid.

Why is that bad? It will contaminate the reference, which could cause errors: if you’re expecting everything to be gas but there’s oil in it, then you’ll have some fluid head errors, or you could cause a restriction in the system. You may cause damage to your reference or to some aspect of your system, and you could possibly contaminate future devices.

So, perhaps the first thing you calibrate that day is a gauge full of oil. You pressurize it on your system, then depressurize it, and the oil from your gauge goes into your calibration system. Then let’s say that the next gauge you test needs to be hydrocarbon-free; it can’t have any oil in it. Well, you now stand a very good chance of contaminating that gauge with the oil from the previous device. That could cause a problem.

So what do we do about it? Well, we clean the device under test. Modern pressure sensors will often have a relatively small internal cavity, so they can be relatively simple to clean. A more traditional dial gauge, like a Bourdon-tube-style dial gauge, can be more difficult because it has a larger internal volume that’s much harder to get fluid through. So perhaps you’ll have to disassemble it (not always practical), or perhaps purge the system. What we mean there is to run some sort of cleaning fluid through it.

And then there’s the third item here, where we fill and drain using a solvent. We can fill it full of, say, alcohol, and then drain that out of it, perhaps pull a vacuum on it. There are lots of methods to purge the system.

At the very minimum, you should probably do a visual inspection of the device under test before connecting it and make sure you’re not going to contaminate your reference standard.

2. Leak Test

We talked first about the adiabatic effects: you generate pressure, and then the changes in temperature make it appear that you have a leak even though you may not. But what happens if you do have a leak? Leaks can potentially result in measurement errors. Remember that Pascal’s principle relies upon a closed system. With a leak, we no longer have a closed system, and the physics kind of go out the window.

With an open system with a leak, what can happen is you’ll have a flow through the system. And if there’s then a flow restriction somewhere, you’ll have a differential pressure across that restriction. This is how we measure the flow in a pipeline, but here it causes an error between our standard and our device under test. So whenever possible, we want to eliminate leaks. One, they can cause errors, and two, they’ll slow down our process. It’ll take longer to generate the pressure, especially when we’re generating it manually, and it’ll take longer to stabilize the pressure at a set point.

So how do we check for leaks, and what do we do about them? First, you’ll generate pressure to the device under test, up to full scale. You want to wait there for a period of time, and the reason you’re waiting is to nullify those adiabatic effects. Remember, we don’t want to chase a leak that doesn’t exist.

So the first thing we’ll do is wait for the temperature effects to dissipate. We’ll then measure the pressure drop over a given period of time. Perhaps we’ll get ourselves a stopwatch, measure the pressure at the beginning, measure it again a minute later, and look at how much it has dropped.

Now the next obvious question is, “Well Tim, how much can it drop? At what point is it a problem?” And that’s not an easy question to answer because it’s going to be dependent upon the volume of the system and the accuracy requirements. And also things like, do you have a flow restriction of the system? Because remember, we’re trying to eliminate that pressure differential. So if you don’t have a flow restriction, you won’t have a pressure differential.

But the reality is that a leak-free system is always better to do the calibration with. Now, a 100% leak-free system, as I said, is like a unicorn: it’s very difficult to make a system 100% leak-free. If you’re dealing with liquids, it becomes a little bit easier; you can spot the puddle of oil on the floor. But especially in lower-pressure gas systems, it can be very difficult to make it 100% leak-free. So the key thing is to connect everything up, pressurize, allow for stabilization, and then watch the pressure drop over a given period of time. With a permanent setup or a calibration system, you’ll very quickly, after doing a couple of tests with a couple of different devices under test, get an idea of what a normal leak rate is with that system, and with that, you can develop a specification for yourself on what is acceptable and what is not.

As I said, this is a step that is often times skipped but in my opinion is one of the most important when it comes to calibrating a device because yes, you spend some time at the beginning checking for a leak, but it’s going to keep you from questioning it later, it’ll eliminate errors, and it’ll make the overall process go faster.
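The leak-check procedure above can be reduced to a small calculation. The acceptable leak-rate limit here is a made-up example; as Tim says, you have to develop your own specification for your own system:

```python
# Sketch of the timed leak check: after waiting out adiabatic effects,
# measure the pressure drop over an interval and compare the resulting
# leak rate against a system-specific limit (illustrative numbers only).

def leak_rate_psi_per_min(p_start, p_end, interval_s):
    """Observed pressure drop expressed as psi per minute."""
    return (p_start - p_end) / (interval_s / 60.0)

def passes_leak_check(p_start, p_end, interval_s, limit_psi_per_min):
    """True if the observed leak rate is within the acceptable limit."""
    return leak_rate_psi_per_min(p_start, p_end, interval_s) <= limit_psi_per_min

# 100 psi at the start, 99.9 psi one minute later: 0.1 psi/min,
# within a hypothetical 0.2 psi/min limit.
ok = passes_leak_check(100.0, 99.9, 60.0, limit_psi_per_min=0.2)
```

The point of the stopwatch step in the talk is exactly this: once you know your system’s normal rate, a single timed reading tells you whether something is wrong before you start taking points.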

3. Exercise the DUT

Finally, before we do our actual calibration, we’ll look at exercising the device under test and exercising the reference. What does this mean? It does not mean that we put them in jogging suits and have them go jog a mile. We do not have them do push-ups and sit-ups. Instead, we’re exercising the elastic elements in the gauge. What that means is that we’ll try to mimic the actual usage of the device: we’ll cycle the pressure, causing the elastic element, if there is one, to stretch and return to zero. The common practice is to do this about three times, just to make sure that the element has been stretched, that the gauge will now operate like it does in its normal usage, and that we’ll get the best performance out of our reference device. So it’s pretty simple: go to full scale, vent the system, repeat two more times.

So now that we’ve cleaned the device under test, we’ve connected it all up, we’ve checked for a leak, we’ve exercised the system. Now we’re ready to actually start taking points, to actually start recording data and performing our test.

4. Stabilize Pressure for Dwell Time

As we mentioned before, the goal here is to have a stable pressure. And so when we change the pressure, we’ll want to wait to make sure it’s stabilized. But for how long? Once again, it’s not a simple answer where I can say there’s one value that works for all devices. Some respond faster than others, but usually you want to wait at least 30 seconds to make sure that everything is stabilized.

And then you say, “Okay, well if I’m waiting for everything to stabilize, how do I know it’s stable? How stable is stable?” As I mentioned, the pressure is impacted by changes of volume, temperature, and leaks. It must be sufficiently stable that the operator can determine the pressure reading. So, if you’re looking at your device under test and the needle is rapidly swinging, and you try to time your reading exactly when it passes the cardinal point, you may lose some repeatability in your pressure calibration.

Instead, you want to stabilize the pressure. Now, an important thing to consider here is that sometimes we make this more difficult than it needs to be by looking at too great of a resolution. We get very proud of our pressure standards and how accurate they are, and we try to show every digit that they can possibly show. The end result is that we’ll often be looking at a digit on the far right that is rapidly changing, and we say, “Okay, it’s obviously not stable,” but that digit is well past the resolution of our device under test, and the fact that it’s changing doesn’t really impact our ability to determine the pressure sufficiently for the calibration.

So those extra digits can kill you. A common approach is to make sure that you have at least one more digit on your reference than you have on your device under test; it’s a good rule of thumb. With more than that, you might think, “Okay, I’m getting more resolution, more sensitivity in my standard. That’s a good thing.” But you may be making it more difficult on yourself than you need to.
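One way to turn “how stable is stable” into a concrete rule is to compare the spread of recent reference readings against the resolution of the device under test. This is a sketch of that idea, not a formal procedure from the webinar; the window of readings and the thresholds are assumptions:

```python
# Sketch: treat the pressure as stable once the spread (max - min) of the
# most recent reference readings fits within one count of the DUT's
# resolution -- changes beyond that resolution can't affect the reading.

def is_stable(readings, dut_resolution):
    """True if the max-min spread of the readings fits within one DUT count."""
    return (max(readings) - min(readings)) <= dut_resolution

# Reference shows 100.0012, 100.0015, 100.0011 psi while the DUT only
# resolves 0.01 psi: the jitter in the last digits is irrelevant.
stable = is_stable([100.0012, 100.0015, 100.0011], dut_resolution=0.01)
```

This mirrors the “one extra digit” rule of thumb: jitter below the device-under-test’s resolution shouldn’t hold up the calibration.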

5. Set a Cardinal Point for Calibration

Finally, another aspect of taking the measurement is what we refer to as a cardinal point calibration. What this boils down to is that when the pointer on a dial gauge is in between two points, it can be very difficult to resolve which point it’s at. Do we estimate it and say, “Okay, it’s between 1 and 2, and it’s halfway in between, so is it 1.5, or is it 1.4?” It can become very difficult to get good repeatability between operators.

One approach here is to use what’s called a cardinal point calibration. That is, instead of setting your reference to be at exactly 10 psi, you’ll set your pressure so that your device under test is reading exactly 10 psi, and now your reference is reading something different, 9.993 or 10.005. By recording that data, you can be much more confident in the reading of the device under test, while the digital output on your reference is very simple to read and can be recorded easily. This gives you, in essence, better resolution on your pressure calibration.

6. Dithering to Remove System Friction

A unique thing to look at when calibrating dial gauges, or basically any gauge that has a mechanical linkage, is what we call dithering. Dithering is basically the process of tapping on the mechanical gauge to make sure that there’s no friction in the system keeping the mechanical components from reading properly. We want to remove that friction.

So we’ll tap on the face of the gauge before taking our point. I show a hammer here; it’s normally not a good idea to hit your device hard with a hammer, but a simple tap will normally work.

When should you do it and when should you not? It comes down to how the gauge is used in its normal operation. If it’s used behind a thick glass pane or a block wall, where the normal operator isn’t going to be tapping on it during usage, then you probably shouldn’t dither during calibration either. But if it’s used in a normal application where the user could be tapping on the gauge during operation, then you should do the same when you calibrate it.

7. Clean the DUT Post-Test

When we look at what’s done after the test: now that we’ve taken our points, adjusted if necessary, collected our as-left data, and disconnected the device, what should we do at the end? One recommendation is cleaning the device under test. Why do we need to do that? In case we introduced any contaminants during the calibration process, we want to make sure those don’t go out into the actual usage of the gauge. By the same token, you need to clean it in such a way that you do not impact the actual measurement of the gauge, because you have provided as-left data. Any actions you take after that cannot impact that data, or it’s an invalid calibration.

Pressure Calibration Equipment Selection

So what equipment should we use when we perform pressure calibrations? First, we need to look at what media we should choose: should we choose gas, or should we choose a liquid media?

Gas Media for Pneumatic (Low) Pressure Calibration

Gas media, or what are also referred to as pneumatic systems, are normally used for lower pressures. Lower is a relative term; where is that cut-off? It may be as low as 300 psi. Why 300? It’s rather simple to generate by hand with a pressure pump of some sort, be it a hand pump or a benchtop pump, although many of those go to higher pressures now. You’ll often see gas used up to perhaps 1,000 psi or 2,500 psi, perhaps even 3,000 psi, as that sort of pressure range is available through a gas cylinder, a nitrogen bottle, for example.

It can also be used for higher pressures. The highest you’ll normally see is maybe up to 15,000 psi, but those are, for the most part, special situations where oil cannot be used, perhaps an oxygen-clean device or similar.

So there you use gas because you have to, not because it’s easier. It becomes much more difficult to use gas at pressures above, say, 3,000 psi, and extra safety precautions are required, as that’s a lot of compressed gas, a lot of stored energy.

With gas, you normally generate the pressure by increasing the number of molecules in the system. You might compress ambient air with a hand pump, or you might put more gas in the system by opening a valve to a regulator on a supply bottle. Gas is also often used because it’s clean. Now, if you’re compressing ambient air and your ambient air is extremely dirty, then perhaps it’s not that clean, but contamination is normally not nearly the issue it is with oil or water, as far as contaminating your reference.

Liquid Media for Hydraulic (High) Pressure Calibration

Oil is often used for higher pressures: above 300 psi, above 1,000 or 3,000 psi, and it can be used all the way up to 60,000, 75,000, even 100,000 psi. If you’re using it at very low pressures, like the 100 psi example I gave, then head height errors become an issue, but at higher pressures that’s not an issue. It’s perfectly viable.

Oil compared to water helps lubricate the system and extends the life of the equipment, and it’s relatively easy to generate higher pressures. Oil or liquid doesn’t compress, so when you change the volume in the system, it changes the pressure quite quickly. The key challenge on using a hydraulic system is making sure that all the gas is out of the system, because if there is gas in the system, when you change the volume, you’ll compress that gas, and so even though you change the volume dramatically, the pressure doesn’t increase. Once you have the gas out of the system, then any change in the volume causes a rapid change in the pressure as well.

So it’s very key to make sure that you purge the gas and prime the system, getting it completely full of the liquid. You can do that with a purging or priming type pump, or you can vacuum-fill the system. Those are two common approaches.

Now, most of the benefits of oil also apply to water. So why use water instead of oil? First, you use water if the pressure is too high to use gas and oil contamination of the device under test is not allowable. Many people look at oil as being messy, and so they want water because it’s considered easier to get and cleaner. But the reality is that water has limitations.

First off, it’s not a very good lubricant; in fact, it’s horrible at that, and it can corrode the system if the system is not properly designed to work with water. The other aspect is that not all water is created equal. Even devices that are designed to work with water are designed to work with clean water, usually distilled water. If you just fill the system with tap water, or you start with distilled water but introduce a lot of contaminants from your device under test, you could end up damaging your system.

So, you’ll need to make sure that the water is clean.

My recommendation is: for lower pressures, use gas whenever possible. For higher pressures, use oil, and only use water if you absolutely have to. If you can’t use oil, then use water, but for the most part use a mineral oil or a silicone oil whenever possible.

Choosing a Pressure Calibration Reference Device

Next, when we’re setting up our system, we need to look at our pressure reference device and that conversation normally revolves around “is this accurate enough to do the job that we’re doing?”

That’s not always an easy question to answer. We normally look at it as a ratio between the accuracy of the standard and the accuracy of the device under test, and a rule of thumb is four to one. But that’s just a guideline; there are some applications where a smaller ratio is acceptable, or if you use a guard-banding approach you can use a smaller ratio, and there are other times where the application or the customer requires a larger ratio. But four to one is a rule of thumb that many people use. The key thing when you’re making that comparison is to compare apples to apples.

Say your device under test is 0.1%, a relatively high-accuracy dial gauge, and the standard is 0.02%. 0.1 divided by 0.02 gives us 5; pretty simple math. Hey, it’s five to one. But if the reference has a full scale of 1,000 psi, and the device under test has a full scale of 100 psi, then instead of five to one in favor of the reference being more accurate, we’re actually at 0.5 to one. That is, our device under test is twice as accurate as our standard.

So, it’s kind of like the world of personal finance: when people start talking to you in percentages, get worried, and make sure they’re not just confusing you. Instead, it’s always best to compare actual pressure units, psi to psi. The specification of the device under test is 0.1 psi, and the specification of the standard would ideally be 0.02 psi.

Pressure Generation & Control

So, to generate the pressure, as we said, we go back to the ideal gas law. We can change the volume in the system using a screw press. We can change the number of molecules in the system by metering gas in with one valve and letting gas out through an exhaust valve. Or we can change the temperature in the system; normally, though, we’re not changing the temperature in order to control or generate the pressure. The temperature change, instead, is what works against us in getting a stable pressure. So we need to make sure, when we develop our system, that we have the ability to generate the pressure and the ability to measure the pressure.

Deadweight Testers

A traditional approach to do this is with what’s called a deadweight tester, and I’ve got a Fluke Calibration deadweight tester pictured here on the bottom right; it’s one of our P3100s. The pressure is measured by a floating piston, with masses used to apply the force. It generates a very stable pressure, because the piston will naturally sink while you’re doing the calibration, and the sinking of the piston offsets the adiabatic effects. The output of the device can also be very stable over time, in that the measurement components are mechanical in nature; if they’re taken care of, they don’t drift over time. It’s often much more accurate than a digital gauge or similar, but it’s impacted by many influences, including gravity, ambient temperature, ambient air density, and so forth.

So, to get the best accuracy you have to take those into consideration. You pretty much always have to take gravity into consideration. It does require the use of heavy weights, so it’s not necessarily the most portable device, and because you change the pressure by stacking more weights on it, doing a cardinal point calibration becomes difficult.
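To see why gravity has to be considered, here is a rough sketch. The piston area and the local gravity value for Phoenix are approximate assumptions of mine: the pressure a deadweight tester generates is mass times local gravity divided by piston area, so using standard gravity where local gravity differs introduces a proportional error.

```python
# Deadweight tester: pressure = mass * local_gravity / piston_area.
# The piston area and local gravity below are rough, illustrative values.
def dwt_pressure_psi(mass_kg, gravity_m_s2, piston_area_m2):
    pascals = mass_kg * gravity_m_s2 / piston_area_m2
    return pascals / 6894.757  # Pa -> psi

AREA = 8.4e-6  # m^2, a typical small piston (assumption)
p_std = dwt_pressure_psi(10.0, 9.80665, AREA)   # standard gravity
p_local = dwt_pressure_psi(10.0, 9.7935, AREA)  # approx. local gravity, Phoenix
error_pct = 100 * (p_std - p_local) / p_std
print(round(error_pct, 3))  # 0.134 -- percent error from ignoring local gravity
```

That error is far larger than the 0.02% class specs discussed later, which is why the correction is mandatory for best accuracy.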

Need help finding the right DWT?
Check out Transcat’s Fluke Calibration Deadweight Tester Selection Guide!

Pressure Comparators

A solution that’s becoming more and more accepted and popular is a pressure comparator. Now you no longer have a piston with a stack of weights regulating the pressure. Instead, your reference standard is a digital reference gauge or similar that’s measuring the pressure. It’s easy to use: you just read the display, and you don’t have to worry about gravity corrections or temperature corrections and so forth. Simply read the display and you know what the pressure is.

Its accuracy is sufficient for many applications, although, as I said, the deadweight tester is more accurate and can give you a better measurement. The comparator is not impacted by gravity, and you don’t have to carry weights around, so it can be used in more portable applications and is easier to use from a physical labor standpoint.

But the pressure will not be as stable. With the deadweight tester, the sinking piston offsets the temperature effects and keeps everything stable. You don’t have that phenomenon in play here, so you are impacted by adiabatic effects. You will have to go up to the pressure, perhaps go past it, and then come back down to stabilize it, and perhaps wait a little longer at a set point. You can get a stable pressure, obviously; it can be used in that fashion, but you don’t have the laws of physics on your side as you do with the deadweight tester.

And it does require more routine calibration, as a digital reference gauge is more likely to drift with usage or time than a deadweight tester. Some examples here: the one on the left, the P5510 pneumatic pressure comparator, has a pump so you can generate up to 300 psi, or you can generate a vacuum. It has two test ports on it: one to connect your reference gauge and one to connect the device under test. The one on the right is the P5515 hydraulic pressure comparator.

Once again, you’ve got two test ports: one for your reference, one for your device under test. This one works on liquid, so you’ve got a reservoir there in the middle, a priming pump to get the air out of the system and get the pressure generation started, and then the screw press in the front to change the volume, so you can both generate pressure and finely tune it to get the reading you want. An example reference gauge is shown there at the bottom: the Fluke Calibration 2700G reference pressure gauge. With a 0.02% of full scale specification, it’s quite accurate and capable of handling many jobs, and it’s available in ranges up to 10,000 psi.

Pressure Gauge Calibration Summary

So, in conclusion, dial gauges can be properly and efficiently calibrated by following the common techniques and using the right equipment for the pressure range and application. I would like to mention that, with Transcat, we are running a special limited-time offer on the purchase of a gas or hydraulic pressure calibrator that combines one of the P551x comparators (5515, 5514, 5513, or 5510) with the 2700G. With this limited-time offer you can get, depending on the purchase, one or two free gauges. For more information you can check the Transcat website or the Fluke Calibration website. So, at this time, I will turn it back over to Nicole, if we have any questions to answer.



Questions and Answers

1. Is it okay to use oil or water to calibrate low pressure gauges? ➩

2. Should we be charting the pressure drop in the gauge to be able to differentiate between temperature changes and actual leaks? ➩

3. On the subject of head correction: in the calibration lab I worked in, with the Air Force program at an Air Force base in Phoenix, we mostly used a PPC2 to calibrate our pressure gauges. We corrected for head height by measuring the difference in height between the test port of the PPC2 and the center of the gauge mounted on top. In your experience, what point would you reference on a digital pressure gauge? ➩

4. The majority of the gauges we have are pneumatic ones. When our calibration vendor comes to perform the calibration on the gauges, they bring pressure gauges as the standard, attached to a system where the pressure is increased by a hand pump. The system uses distilled water. Most of our gauges are under 160 psi. How harmful is this to our gauges and the systems they’re attached to? These are mainly used on our autoclaves or fermenters. ➩

5. Is there a rule for using certain size charts for certain pressures? For example, can we use a 2,500-pound chart to pressure test 1,000 psi? ➩


1. Is it okay to use oil or water to calibrate low pressure gauges?
Tim: Okay is a relative term. For low pressure, say 300 psi or below, you’ll need to be very careful of your head height and make sure you don’t have any additional air that you’re not accounting for, but it can be used. You’ll also want to make sure you get all of the gas out of your system, so that you can easily generate the lower pressures. If you have gas in your system and you go to generate a small pressure, you’ll turn the screw press and nothing will happen; then, once that gas is compressed, you turn the screw press just a little bit and all of a sudden you’re already at 1,000 psi and you’ve overpressured it. I would recommend using gas, but if you have to use oil or water, you can.

Nicole: Okay. And, as a reminder, if you’d like to submit questions you can do so through the question box on the right side of your webinar control. [Inaudible 00:46:58] I have a couple of long ones. Let me see here.

2. Should we be charting the pressure drop in the gauge to be able to differentiate between temperature changes and actual leaks?
Tim: Yes, that’s actually bringing up a very good point. What they’re getting at there is that a pressure drop caused by temperature will slow down over time. So, if you were to chart it, you will see a curve to it, where it starts out with a big change in pressure and, towards the end, ends up being a smaller and smaller change. Whereas with an actual leak, it will be much more linear with time. So, it’ll start out big and stay big, or it’ll start out small and stay small.
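Tim’s distinction can be sketched numerically. In this invented example (all values and the 60-second time constant are my own assumptions), the thermal drop flattens out while the leak keeps falling at the same rate:

```python
import math

# Illustrative models: a temperature transient decays, a leak is linear.
def thermal_settle(p0_psi, total_drop_psi, t_s, tau_s=60.0):
    """Pressure while the adiabatic effect settles: the drop levels off."""
    return p0_psi - total_drop_psi * (1 - math.exp(-t_s / tau_s))

def leaking(p0_psi, rate_psi_per_s, t_s):
    """Pressure with a real leak: the drop keeps growing linearly."""
    return p0_psi - rate_psi_per_s * t_s

for t in (0, 60, 120, 180):  # chart both readings every minute
    print(t, round(thermal_settle(1000, 2.0, t), 3),
             round(leaking(1000, 0.01, t), 3))
```

Charting the readings makes the difference visible: the first column of drops shrinks minute over minute, while the second falls by the same amount each minute.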

Nicole: Alright.

3. On the subject of head correction: in the calibration lab I worked in, with the Air Force program at an Air Force base in Phoenix, we mostly used a PPC2 to calibrate our pressure gauges. We corrected for head height by measuring the difference in height between the test port of the PPC2 and the center of the gauge mounted on top. In your experience, what point would you reference on a digital pressure gauge?
Tim: Okay, what the question is getting at is this: when I said there is a difference in height between the two devices, well, where on the device are we talking about? Say you have a 10-inch diameter dial gauge. Do I line up the bottom of the gauge, or the top of the gauge, or the middle of the gauge? Where do I need to line it up to say, okay, that’s the same height? What you have is a reference plane, some vertical point on the gauge where the pressure’s being measured and where it matters. The reality is, is that always the center, or is that always the bottom of the test port? There is no 100% answer there. The key thing is that on your calibration, you should specify what reference plane you used. So, whether it’s the bottom of the test port or the center of the gauge, you need to specify which one you used, so the user of the gauge can consider that in their application. If all else fails, if you don’t know which one to use, try to find the original calibration report from the manufacturer; hopefully they’ve specified it. It’s very common now, if it’s a vertically mounted gauge, to use the bottom of the test port as the reference plane. Or perhaps they’ll have even put a line on the gauge somewhere saying this is the reference plane, but the bottom of the test port is what you’ll see quite often nowadays.
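For reference, the head correction itself is just rho times g times h. A minimal sketch, where the oil density and the 0.25 m height difference are invented values of mine:

```python
# Head correction: the pressure offset between two reference planes
# is rho * g * h. Density and height below are illustrative assumptions.
def head_correction_psi(fluid_density_kg_m3, height_m, g_m_s2=9.80665):
    pascals = fluid_density_kg_m3 * g_m_s2 * height_m
    return pascals / 6894.757  # Pa -> psi

# Oil (~860 kg/m^3), gauge reference plane 0.25 m above the test port:
print(round(head_correction_psi(860.0, 0.25), 4))  # 0.3058 psi offset
```

With gas the density, and therefore the correction, is far smaller, which is why head height matters most on liquid systems at low pressure.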

Nicole: Okay, I’ve got another long one.

4. The majority of the gauges we have are pneumatic ones. When our calibration vendor comes to perform the calibration on the gauges, they bring pressure gauges as the standard, attached to a system where the pressure is increased by a hand pump. The system uses distilled water. Most of our gauges are under 160 psi. How harmful is this to our gauges and the systems they’re attached to? These are mainly used on our autoclaves or fermenters.
Tim: Without knowing the exact gauges and usage, it’d be impossible for me to say you’re 100% good or you 100% definitely have a problem. If you’re calibrating gauges like that, using distilled water is good; that’s going to lead to less contamination than, say, using oil, so that is a common approach. If I were calibrating a hundred 160 psi gauges in an application that requires them to stay clean, I would probably use gas myself, but overall, what they’re doing is not wrong. You do need to make sure they’re not overpressuring the gauge and that the media they’re using is clean. It may have been distilled water when they first put it in, but you need to make sure it’s remained clean, so they’re not contaminating your devices.

Nicole: Okay.

5. Is there a rule for using certain size charts for certain pressures?
For example, can we use a 2,500-pound chart to pressure test 1,000 psi?

Tim: Can you repeat the question?

Nicole: Yeah, and I’m trying to read this. It is 1,000 psi. So, it says, is there a rule for using certain size charts for certain pressures? For example, can we use a 2,500, and they have the pound symbol, so.

Tim: Okay.

Nicole: Okay, 2,500-pound chart to pressure test 1,000 psi?

Tim: Okay, so in the example I gave earlier, about making sure your reference is okay, I showed that using a 1,000 psi gauge to calibrate a 100 psi gauge is not good. Is there a point where it is good? The answer is you’ll have to do a little bit of math. If your standard has a full scale of 2,500 psi and a specification of, say, 0.02%, then you’ll need to look at the specification of the device under test, convert both of those to pressure values, and then compare them. If you’re within the ratio that you’re aiming for, say four to one, then you’re okay. So, it can oftentimes be fully viable to calibrate a 1,000 psi device using a 2,500 psi device, but it depends on the accuracy that you need. In that example, if you need four to one, your 2,500 psi standard at 0.02% is plus or minus 0.5 psi, so four times that: as long as your device under test is plus or minus 2 psi, then you’re okay.
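The arithmetic in that answer, written out as a short sketch of the same numbers:

```python
# From the answer above: 0.02% of a 2,500 psi full scale is +/- 0.5 psi,
# so a 4:1 ratio needs the device under test to be +/- 2 psi or wider.
std_full_scale_psi = 2500.0
std_spec_psi = std_full_scale_psi * 0.0002   # 0.02% of full scale
required_ratio = 4.0
min_dut_spec_psi = required_ratio * std_spec_psi
print(round(std_spec_psi, 3), round(min_dut_spec_psi, 3))  # 0.5 2.0
```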

Nicole: Thank you. Okay, that concludes our time for today. If anyone has further questions, or would like to find out more about Transcat products and service offerings please contact us at 800-800-5001, on the web at transcat.com or you can email me directly at nvanwert@transcat.com and I’ll make sure we get this on to answer your question. Thank you very much for joining us today and thank you, Tim. We hope you got something out of the presentation and that you continue to join us for future Transcat e-learning webinars. Thanks everyone.


Have more questions? Contact Transcat today!