
Field vs. Laboratory Pressure Calibration: How to Decide

Tim Francis | Fluke Calibration

The question of whether to perform a pressure calibration in the field or in a controlled laboratory setting is one that challenges many technicians. In this 30-minute webinar, Tim Francis of Fluke Calibration outlines the key considerations that guide that decision-making process and details some common challenges in pressure calibration.

Overcoming the Challenges in Pressure Calibration

Thank you for joining us on our call today. My name is Nicole VanWert from Transcat, and I'm going to be your moderator this afternoon. Our webinar topic is In the Field or in the Shop: Overcoming the Challenges in Pressure Calibration. This topic is being presented by Tim Francis from Fluke Calibration. Tim is the Product Marketing Manager for Pressure and Flow at Fluke Calibration. He began his career in pressure metrology in 2002 at Ruska, where he served in a number of different engineering roles, including Applications Engineer. He joined Fluke Calibration in 2010 following the Ruska acquisition. He holds a Bachelor's degree in Computer Science from the University of Texas at Austin, and he has an MBA from Arizona State University. Tim works in Fluke Calibration's Phoenix, Arizona facility. We expect today's presentation to last roughly 30 minutes, and then we'll open it up for Q&A. At any time during the presentation you can send questions through the question box to the right in your webinar controls. I also want to mention that this webinar is being recorded. Each of you will receive a follow-up email with a link to the recorded webinar and the slides of today's presentation. At this time, I'm going to turn the presentation over to Tim.

Agenda: Overcoming the Challenges in Pressure Calibration

Thank you, Nicole. Welcome everyone, thank you for joining us today. As Nicole said, my name is Tim Francis from Fluke Calibration, and I have the honor and privilege of talking to you today about pressure calibration. A brief agenda of what we're going to be going over:

  1. We'll cover the vocabulary and make sure that we're all using the same terms to mean the same things, especially in terms of this presentation, but also overall. There's a lot of vocabulary that's unique to pressure calibration, and in order to increase our overall knowledge base, we need to make sure we're all using the same vocabulary to mean the same things.
  2. We'll then talk about some of the different challenges that those doing pressure calibration face, especially those that affect the decision between doing calibration work in the field and in a laboratory type of environment.
  3. And as Nicole said, we'll leave some time at the end for questions and answers.


Pressure Calibration Vocabulary

So let's move first into the vocabulary. And the first subject that we're going to cover is what's called reference modes. You'll often see pressure values followed by certain words like gauge or absolute. You can also see it truncated or shortened to PSIG or PSIA. And the question is, what exactly does this gauge or absolute mean? What is a PSIG, what's a PSIA, are they different from each other, and how does that affect our calibration process? What do we have to do differently if the pressures in question are gauge versus absolute?

Reference Modes: Gauge, Absolute, & Differential Pressure

The first thing we need to consider is that measurements are relative. It kind of goes back to Einstein and the theory of relativity: everything is relative, and a pressure measurement is definitely one of those as well. What I mean by that is you don't say, I live five miles. That just doesn't make sense. You have to say, I live five miles from here, or I live five miles from my work. Everything has a starting point. If you're looking for directions on your phone, it's going to tell you how to get there from where you are. It just doesn't make sense to ask for directions to a place without saying what the starting point is. Well, pressure is the same way; you've got to have a starting point. Saying that the pressure is 5 PSI, and only saying 5 PSI, doesn't tell the whole story. It doesn't give you all the information that you need. The long way of saying it, the more proper way, would be to say something like: the pressure is 5 PSI above atmosphere. But that's a long-winded statement, and it will not be very effective when you have a lot of different pressure values that you're looking at. So the terms gauge, absolute, and differential are simply shorthand for this.

Gauge Pressure Mode

So let's look at gauge mode. Here everything is referenced to atmospheric pressure. So if you have your dial gauge or pressure transmitter, your pressure measuring device, and you have its test port opened up to atmosphere, just opened up to the air in the room, then it's reading zero, and that means your test pressure is the same as atmosphere. This is probably the most common reference mode, just because it has a lot of usefulness.

When looking at this mode you can see values that are both positive and negative. Simply put, if it's positive, that means you have more pressure in your test vessel, in your system, than what you have in the atmosphere in the room. If it's negative, then that means you have less pressure, and the number is how much less pressure than what's in the room, what's in the atmosphere. And gauge mode is usually the easiest to measure.

Slide 1

There's a lot of different devices that can measure gauge mode pressures. They can be easily operated and, we'll say, easily zeroed. And so they're abundant and useful. Some examples of gauge mode measurements are things like the pressure in your tires. So when you're filling air in your tires and you've got your little tire gauge, the pressure that it's displaying is a gauge mode measurement. What you're interested in is how much more pressure is in my car tires than in the atmosphere around me. It's used quite frequently in many different process measurements. You could also, for example, use it to measure the pressure in a football, like perhaps if you're a New England Patriot.

Absolute Pressure Mode

Absolute mode. Now you're referenced to a perfect vacuum, and we see that in the diagram to the right. In this diagram there's a blue line that's moving all around; that is our barometric pressure, or atmospheric pressure. Gauge is simply the difference between that blue line and our test pressure, represented here in red. And it can be a plus or a minus, plus being above atmosphere, minus being below. Absolute pressure is referenced to a perfect vacuum. So that's where one end of the line is down at zero. And so this blue line up at the top is an absolute pressure, shown in this case to be a stable absolute pressure. So if you have a reading in absolute mode and it says zero, that means it is a perfect vacuum. If it's 0.0000, that's a perfect vacuum, no pressure in the system, no pressure whatsoever. That's kind of a theoretical point, because it's extremely difficult to get every single molecule out of the system, if not impossible.

Slide 2

But what this also tells us is absolute pressure can only be positive numbers. You can't have a negative number of molecules in a vessel. Makes sense. You can have molecules in the vessel, you can have zero if you try really hard, and theoretically you can't have negative. So if somebody were to say the pressure was -1 PSI absolute, that wouldn't make any sense; that would not be a viable pressure measurement. Absolute pressure can be a more difficult measurement, but an extremely useful one. And what makes it a more difficult measurement is re-zeroing the measuring device, re-zeroing the reference. What we mean by that is just about every measurement device will suffer from what we call zero drift. That is, over time the reading of the device will drift, where it will have an offset throughout its entire range. This is what we call zero drift. With a gauge mode device, it's very simple to resolve this. Open the device up to atmosphere, and if it's open to atmosphere, then you know it's seeing zero gauge pressure. So you simply take whatever the reading is and subtract that value off the entire range, so that you get a zero reading. Most pressure calibrators have a function to do exactly that. With absolute mode it gets much more difficult, because the only place where it should specifically read zero is at a perfect vacuum, which we said is pretty much impossible. So to re-zero an absolute device, you really need a device that's more accurate than it to zero against. So you turn your device on, you open it up to atmosphere, and it's reading something like 14.7 PSI absolute. You then have a very accurate barometer that says, no, actually, the barometric pressure in the room is 14.685, and so you know that instead of 14.7, it should be reading 14.685. That means you need that other device. And so a lot of times, instead of zeroing it as a routine process, you zero it as part of the pressure calibration process once every six months or a year, whatever the calibration interval is. And so you have to include that zero drift in the overall accuracy of the instrument. It's not an impossible situation, but it does make it more difficult than your normal gauge mode calibration.
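
To make the two re-zeroing procedures concrete, here's a minimal Python sketch (an editorial addition, not from the webinar); the readings and function names are hypothetical:

```python
# A minimal sketch of the two re-zeroing cases, in PSI.

def rezero_gauge(reading, reading_open_to_atmosphere):
    # Gauge mode: open to atmosphere the true pressure is 0 psig,
    # so whatever the device reads there is the zero offset.
    return reading - reading_open_to_atmosphere

def rezero_absolute(reading, device_at_atmosphere, barometer_at_atmosphere):
    # Absolute mode: a more accurate barometer tells us the true room
    # pressure; the disagreement is the zero offset.
    offset = device_at_atmosphere - barometer_at_atmosphere
    return reading - offset

# Example from the talk: the device reads 14.700 psia open to atmosphere,
# while a reference barometer says the room is really at 14.685 psia.
print(rezero_absolute(14.700, 14.700, 14.685))  # 14.685
```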

One example of absolute mode is a barometric pressure measurement. As we said, the pressure in the room, by definition in gauge mode, is zero. So that's not very useful to us if we want to know what the actual barometric pressure is, and that's useful information to have for figuring out air density and other factors. If we want to know what that barometric pressure is, then we have to look at what the pressure in the room is in comparison to a perfect vacuum. And standard barometric pressure, which is defined at sea level, is approximately 14.7 PSI absolute. That barometric pressure will change depending upon weather and other conditions.

Another example of an absolute mode measurement is the altitude of an airplane. 14.7 PSI is the barometric pressure at sea level; as you go up in altitude, the barometric pressure gets lower. An airplane's altimeter makes use of that fact to determine the altitude of the airplane. It's basically measuring the pressure outside of the aircraft in absolute mode, and it can then calculate the actual altitude of the aircraft based upon that pressure reading. Another example, in the oil and gas industry, is downhole tools. If you're measuring the pressure at the bottom of an oil well, you can't really reference that to atmospheric pressure. You can't open one side of your device up to atmospheric pressure, because you're a few thousand feet below the ground. So the pressure measurement from a downhole tool is oftentimes an absolute measurement.
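
As an aside (an editorial sketch, not from the webinar), the altimeter relationship can be approximated with the standard-atmosphere barometric formula; a rough Python version, assuming dry-air ISA constants:

```python
def pressure_altitude_m(p_abs_pa):
    # ISA barometric formula for the troposphere
    P0, T0, L = 101325.0, 288.15, 0.0065   # sea-level Pa, K, lapse rate K/m
    exponent = 0.190263                    # R*L / (g*M) for dry air
    return (T0 / L) * (1.0 - (p_abs_pa / P0) ** exponent)

PSI_TO_PA = 6894.757
print(pressure_altitude_m(14.7 * PSI_TO_PA))  # ~0 m (sea level)
print(pressure_altitude_m(12.1 * PSI_TO_PA))  # ~1600 m (about a mile up)
```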

Differential Pressure Mode

The third main reference mode is what we call differential mode. And that's where your pressure is referenced to another pressure. In many ways you can think of all pressures as being differential mode, with gauge and absolute just being special types of it, where the other pressure that you're referenced to is either atmosphere, in the case of gauge mode, or a perfect vacuum, in the case of absolute mode. Differential mode just says it's referenced to another pressure. And one of the key examples of that is a line pressure, where it's referenced to a much higher pressure. That's what we see up here where we have these two yellow lines. The bottom line could be thought of as the line pressure, and the differential measurement is just how far we are from that line pressure. So zero means there's no difference in the two pressures, whereas a positive pressure means that your test pressure is slightly higher, and a negative would mean that your test pressure is lower than your line pressure.

As I said, gauge mode can be thought of as a special case of differential mode, where the line pressure is atmosphere. And gauge mode is pretty easy, but differential mode can be much more difficult, especially in the case of a high line pressure, where the pressure that you're comparing against is much higher than your differential pressure. An example of that situation is a flow measurement in a pipeline. The overall line pressure of the pipeline may be a couple of thousand PSI. To determine the flow of the fluid through that pipeline, you may put an orifice in place. As the fluid flows through that orifice, it'll create a pressure differential across the orifice, where the pressure on one side is higher than the pressure on the other side. And if you know what that pressure difference is, then you can calculate the flow of the fluid. That pressure drop is much lower than the overall line pressure. So if the line pressure is a couple of thousand PSI, that pressure drop will most likely be lower than, say, thirty PSI. So it can make the overall measurement pretty difficult.
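
For context (an editorial sketch, not the webinar's math), the flow-from-differential relationship is roughly a square root; all parameter values below are hypothetical, and real orifice calculations involve additional correction factors:

```python
import math

def orifice_flow(dp_pa, rho_kg_m3, discharge_coeff, bore_area_m2):
    # Idealized orifice relation: flow scales with sqrt(dP)
    return discharge_coeff * bore_area_m2 * math.sqrt(2.0 * dp_pa / rho_kg_m3)

# Doubling the differential raises flow by sqrt(2), not 2:
q1 = orifice_flow(10_000, 800.0, 0.6, 0.005)
q2 = orifice_flow(20_000, 800.0, 0.6, 0.005)
print(q2 / q1)  # ~1.414
```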

Slide 3

See Transcat’s Webinar on Differential Meter Calibration for more details on this topic

Another example of differential mode is what we call a draft range measurement. That's where you're looking at the difference in pressure between two rooms. A key example would be rooms that you keep at a positive pressure, such as an operating room at a hospital. They're specifically designed so that when you open the door, none of the germs from outside the operating room get sucked in. Instead, the opposite happens: the air from inside the operating room moves out toward the outside. And you do that by having a higher pressure inside the operating room than in the outside room. So you're interested in seeing what the difference is between those two rooms, and that in essence is a differential pressure measurement. We refer to it as draft range because it's a pressure that can be generated just by a draft of air.

High and Low Pressure Measurement

That leads us into our next discussion, which is: how high is high pressure, and how low is low pressure? There isn't an official definition, so when somebody says this is high pressure, that may mean something different to different people. What I've provided here is a general outline, which is what I'll use during the course of this presentation.

Vacuum Pressure Calibration

The term vacuum refers to anything less than atmosphere. You can then get into low vacuum and high vacuum, and high vacuum is actually the lowest number of molecules. Draft range refers to those low differential or gauge mode pressures, such as the difference in pressure between two rooms. A very small amount. Oftentimes it will be measured here in the US in a unit like inches of water, or internationally in units like pascals or millibars.

Low Pressure Calibration

Low pressures are pressures where we'll normally use a pneumatic device to do the calibration, oftentimes three hundred PSI or less. That's basically because that's a pressure that's easy to generate by hand with a pump. You are seeing that pressure range increase over time, but traditionally, what we would call low pressure, or pneumatic pressures, is in that 300 PSI or so range.

Slide 4

Medium Pressure Calibration

Medium pressure is a range where you can see either pneumatic or hydraulic being used, generally between three hundred and two thousand PSI. The cutoff is there because pneumatic pressure is still readily available in most places up to two thousand PSI, since you can use a nitrogen bottle as your gas supply.

High Pressure Calibration

High pressure is almost always hydraulic, unless you're concerned with contamination, in which case you can't use hydraulic. That's two thousand to about twenty thousand PSI. These are pressures where you use hydraulics because it's easier to generate higher pressures and it's safer.

Very High Pressure Calibration

And then very high pressure is anything greater than twenty thousand PSI. Fluke Calibration products include measurement devices up to sixty thousand or even seventy-five thousand PSI. Sometimes there's a need for higher, but it's a very rare case. As I said, there are no official rules on this; this is just a general statement to give a frame of reference. Some people who specialize in one particular range may consider everything else as being high pressure. For example, if all you do every day is work in the draft range, then even three hundred PSI is really high pressure for you.

Pressure Calibration Challenges

Now, there are some challenges created by these different topics that we're looking at here, specifically reference mode. We'll look at each of these questions individually. If you only have a gauge standard, how do you calibrate an absolute device? Or, you see things like -15 to 30 PSI, but I can't get down to -15. We'll talk about some of the challenges with differential devices, and also some of the unique terms, like sealed gauge. What does that mean? How is that different from gauge mode? Let's look first at a situation where you need to calibrate an absolute mode device, but all you have is a gauge standard.

How to Calibrate an Absolute Device with a Gauge Standard?

The first thing you need to consider is that there will be an offset equal to barometric pressure. So if you have an absolute device as your test device and a gauge mode standard sitting next to each other, with everything opened up to atmosphere, one of them will be reading about 14.7 PSI and the other one will be reading zero. And the amount of that offset will change with the barometric pressure. So when you first turn them on, they may be reading 0 and 14.7 PSI; as you sit there and the barometric pressure in your room changes, one of them will continue reading zero and the other will read, say, 14.3 PSI. So you've got an error that's more or less an offset, but the amount of that offset will change with barometric conditions.

If you're working with high pressures, say ten thousand, twenty thousand PSI, or higher, these problems may not be an issue for you. You can just estimate what the barometric pressure is and add it to your standard reading. So if your test device is reading 14.7 at atmosphere and your standard is reading zero, you just add 14.7 to your standard as an estimate of barometric pressure, and make the assumption that it's not going to change enough to matter at ten thousand or twenty thousand PSI. Because there you may only be looking at the pressure plus or minus a PSI, plus or minus two PSI, plus or minus ten PSI. If that's the case, then if your barometric pressure changes by 0.1 PSI, it doesn't really affect your overall measurement.

With lower pressures, that's not necessarily valid. In low pressure situations, you'll need to measure barometric pressure and continually adjust your standard reading to compensate. So at atmosphere, where your standard is reading zero, your barometer is reading 14.68. So you know, ok, my actual pressure is 14.68. As you go up in pressure, your standard may be reading a thousand and your barometer 14.7. So now you know that the actual pressure is 1014.7. That allows you to continuously adjust and take into account the changes in barometric pressure when you're calibrating an absolute device using a gauge standard.
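
A minimal sketch of that running compensation (an editorial addition, using the readings from the example above):

```python
def true_absolute_psi(gauge_standard_psig, barometer_psia):
    # Add the barometric pressure measured *at the same moment*,
    # not a fixed 14.7 estimate, so barometric drift during the
    # run doesn't turn into error at low pressures.
    return gauge_standard_psig + barometer_psia

print(true_absolute_psi(0.0, 14.68))     # 14.68 psia at "zero"
print(true_absolute_psi(1000.0, 14.70))  # 1014.70 psia at 1000 psig
```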

Why Can’t My Vacuum Pump Measure to -15 PSI?

A common question we get is: ok, your data sheet says -15, but I can't actually get to -15. The thing's broken, what's wrong with it? Well, let's step back and look at it a bit. We've defined that barometric pressure is zero PSI gauge. It's also approximately 14.7 PSI absolute. So we're now saying that I want 15 PSI below barometric pressure. That's -15 PSI gauge, which is -0.3 PSI absolute. That's not physically possible. As we said, absolute pressures cannot be negative; absolute pressure is always positive. It doesn't matter how good a vacuum pump you have or how leak-tight your system is, you can't go any lower than -1 times your barometric pressure. And that's going to be different at different locations throughout the world, and the biggest determiner is altitude. In Denver, Colorado, at a mile high, you're at approximately 12.1 PSI absolute. So the lowest you can possibly go, in a perfectly leak-tight system with the most powerful vacuum pump, is -12.1. In Honolulu, Hawaii, right there at the beach, you're at sea level and you can get to -14.7. But if you climb to the top of that volcano, where you're more than a mile up, now your atmospheric pressure is only 8.7 PSI absolute, so the lowest you can possibly go is -8.7. So why do we say -15?

Slide 5

Generally speaking, it's rare for atmospheric pressure to be greater than 15 PSI, and in reality it's more of a rounding situation. You have to be cognizant of your location and your barometric pressure. Even at sea level it may not be possible to get to -14.7, but at the same time, sea level is your best case. So if an auditor or your manager is complaining, saying, hey, this device has a range from -14 to zero, or -14 to 30, but you only went down to -12, tell him: well, we need to do this calibration at the beach. Let me go to the beach and then we can get it down to -14 for you.
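
In code form (an editorial sketch using the altitudes from the talk), the vacuum floor at a site is just the negative of its local barometric pressure:

```python
def lowest_reachable_psig(local_barometric_psia):
    # A perfect vacuum is the floor: -1 x local barometric pressure.
    return -local_barometric_psia

for site, baro_psia in [("sea level", 14.7), ("Denver", 12.1), ("Hawaii volcano summit", 8.7)]:
    print(site, lowest_reachable_psig(baro_psia))
```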

Calibrating at a High Line Differential

A very big challenge is calibrating something at a high line differential. Once again, what we mean by that is where the line pressure, the starting point, is a very high pressure, and we're measuring a test pressure that's slightly different from that line pressure. The example here is a pressure transmitter used on a pipeline. That pipeline has a pressure of about 1500 PSI absolute, and the transmitter is measuring the drop in pressure across an orifice. That drop in pressure is about 30 PSI; that would be 30 PSI differential. Generally speaking, it's always best to calibrate a device as similarly to how it's used in the real world as possible, so that your calibration accurately covers the potential errors the device will see in its normal application. If we're going to do that in this case, then we need to apply 1500 PSI absolute to one side of the transmitter and 1530 PSI absolute to the other side.

Slide 6

That's difficult to do and to measure correctly. We could do it with two absolute mode standards of, say, 2000 PSI absolute full scale, apply those pressures, and read them off of there. But our overall error in terms of the differential would be quite high. So because that's difficult to do, it's commonplace to leave one side open to atmosphere and apply 30 PSI to the other side. There, you're still measuring the same differential. A differential style device will always have two test ports, a high and a low. The pressure on the high side is 30 PSI greater than on the low side. The difference is that when the device is actually used, it's still 30 PSI different, but the low side is at 1500 and the high side at 1530, whereas here it's atmosphere on one side and 30 PSI higher on the other. Is it ok to do that? It depends on who you ask, but in most applications it is acceptable.
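
To see why two high-range absolute standards struggle here, consider a quick error budget (an editorial sketch; the 0.01%-of-full-scale accuracy is an assumed figure for illustration):

```python
import math

full_scale_psi = 2000.0                    # each absolute standard
per_standard_psi = full_scale_psi * 0.0001  # assumed 0.01% FS -> 0.2 psi each
combined_psi = math.hypot(per_standard_psi, per_standard_psi)  # add in quadrature
print(combined_psi)                                  # ~0.28 psi
print(100 * combined_psi / 30.0, "% of a 30 psi dP")  # ~0.94%
```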

Sealed Gauge Calibrations

There's also a term called sealed gauge. What does that mean? It's kind of a hybrid of gauge mode and absolute mode. In this case, if the test port is open to atmosphere, it reads zero, just like a gauge mode device. But if the barometric pressure changes, the measurement will change with it, like an absolute mode device. Basically it gives a gauge-like reading, but the back side of the measuring device, of the sensor, is sealed off and is not actually open to atmosphere, so it doesn't stay at zero as atmosphere changes. It's similar to using an absolute mode device and subtracting a fixed barometric pressure value from it. And this is useful for high pressure applications where the variations in barometric pressure don't affect your overall measurement.

So we've covered the agenda and looked at some of the challenges associated with the vocabulary. Let's look at some more challenges that affect calibrations. The first thing to always consider is safety.

Pressure Calibration Safety Considerations

A pressurized vessel has some inherent safety risks, so make sure that all equipment is rated to the proper pressures for the work being performed. There are different ways to do this. Some shops will only stock pressure lines and fittings rated for the highest pressure they could possibly work at. That works well if the highest pressure is relatively low or they're only working in one region of pressure. It doesn't necessarily work if you do both draft range and 20,000 PSI, because the lines that will work for 20,000 PSI don't work very well for one inch of water. You need to make sure you use the proper PPE, or personal protective equipment, such as safety glasses. And when working in the field, be cognizant of the dangers around you. The safety of the calibration technician should be a key concern when determining whether to do a calibration in the field or back in the laboratory. If the device is installed in a location where it's not safe to do the calibration, then it should be taken back to the laboratory or the shop. You also need to make sure that you have equipment that's right for that area. If it's a hazardous environment, a potentially explosive environment, then you need to make sure that you have intrinsically safe devices.

Environmental Impact on Pressure Calibration

Temperature

We also need to look at the impacts of the environment on our calibration. For example, temperature. Almost everything that measures pressure is also impacted by temperature. So a pressure calibration manufacturer, such as Fluke, will characterize and compensate pressure measurement devices and calibrators so that they can work at different temperatures. But they'll still perform best over a given temperature range. Depending upon the device, that may be zero to 50 degrees Celsius, or perhaps wider, or it may be as limited as, say, 18 to 29 degrees Celsius.

Gravity

Some pressure measurement devices, specifically deadweight testers, are impacted by gravity. The acceleration of gravity will alter the pressure value that the deadweight tester is generating, and that can be by a very large amount, without going into too much detail. So one of the ways that the environment impacts our calibration is local gravity. The acceleration of gravity is different at different locations, and we need to take it into consideration at the location where we're doing the calibration. If our deadweight tester has been built for a specific location, for our shop, and we move it a hundred miles or so down the road, then the local gravity may be different, and we need to take that into account.
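
Here's a rough sketch of why (an editorial addition, with hypothetical gravity values and piston geometry): a deadweight tester generates pressure as mass times local gravity divided by piston area, so a gravity shift passes straight through to the pressure:

```python
def dwt_pressure_pa(mass_kg, g_m_s2, piston_area_m2):
    # Deadweight tester: pressure = force / area, force = m * g_local
    return mass_kg * g_m_s2 / piston_area_m2

g_shop, g_site = 9.79343, 9.79125   # hypothetical local gravities, m/s^2
p_shop = dwt_pressure_pa(10.0, g_shop, 1.0e-4)
p_site = dwt_pressure_pa(10.0, g_site, 1.0e-4)
print(100 * (p_site - p_shop) / p_shop, "%")  # about -0.022%, large for a DWT
```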

Head Height

Another thing we need to consider is what we call head height. That is the difference in vertical elevation between the reference and the device under test, and it will create an error. Basically, the weight of the fluid generates its own pressure. You can think about it as being in a swimming pool: the pressure at the top of the pool is much less than the pressure at the bottom of the pool. The same thing applies here, and the phenomenon occurs whether the medium is gas or oil. Even a column of gas creates a pressure; even though we don't think of it as weighing anything, it really does. Three feet of gas might create an error of less than 0.01 percent of reading, while three feet of oil is about 1 PSI. What this means is that at lower pressures, gas is a preferable medium, because at low pressures 1 PSI can make a very big difference, but when you're measuring 10,000 PSI, 1 PSI might not matter nearly as much. Large drafts, temperature swings, or changes in barometric pressure can also make it difficult to get pressure to stabilize. That's a huge thing we need to consider when we're looking at the environment we're doing our calibration in.
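
For a concrete check (an editorial addition, not from the webinar), the head correction is P = ρ·g·h; here's the arithmetic for three feet of gas versus oil, with typical round-number densities assumed:

```python
PSI_PER_PA = 1 / 6894.757
G = 9.80665                 # standard gravity, m/s^2
THREE_FEET_M = 3 * 0.3048

def head_pressure_psi(fluid_density_kg_m3, height_m):
    # Head correction: P = rho * g * h, converted to PSI
    return fluid_density_kg_m3 * G * height_m * PSI_PER_PA

print(head_pressure_psi(1.2, THREE_FEET_M))    # gas (air): ~0.0016 psi
print(head_pressure_psi(850.0, THREE_FEET_M))  # typical oil: ~1.1 psi
```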

Contamination

Another thing to consider is contamination. Dirt, debris, and liquids from the device that you're testing can cause problems. They can damage your calibration equipment, they can cross-contaminate other devices under test, and they can generally play havoc with your overall system. How do we combat it? Clean the devices before connecting to them; make use of filters, traps, or separators; and keep separate pressure lines and fittings for different types of fluids, so that you have one set that's used with oil and another set that's used only with air. That way you don't get oil into your air systems. Shown here at the top is an example of a separator. On the bottom is where you connect your reference, at the top is where you connect your device under test, and there's a diaphragm in the middle.

Slide 7

So when you change the pressure below, it pushes up on the diaphragm and changes the pressure above, but the two media never actually come in contact with each other. This works great on a higher pressure oil system. A trap is like the device below with the clear acrylic walls. Your device under test is connected to the top, and your reference is connected below. For debris to get from the device under test to the main unit, it has to travel down into the trap and back up again, and that acts as a gravity trap that keeps the debris from reaching the standard.

Field Calibration vs. In Lab Calibration: How to Decide

So finally that brings us to the question, how do we decide if we do our work in the field or in the lab? We need to consider a few questions:

  1. Can it be done safely in the field?
  2. Does the environment allow for the calibration to be done properly?
    Obviously, one would hope that the environment the device under test is installed in is appropriate for that device. But we also have to look at what environment is appropriate for our reference standard. It may only work over a much smaller temperature range, or we may need to know local gravity for it. So is the environment appropriate for our calibration overall? Are there air currents, drafts, or other things that would keep us from getting a stable enough pressure to get a good reading from our reference? And can you easily connect to the device under test? Is it a situation where you're going to have to run a thousand feet of hose up to the device under test in order to apply pressure to it? Well, that's a thousand feet of head height offset. Or is it a situation where it's better to calibrate a similar device? If it's a pressure transmitter and you've got two of them that are very similar to one another, or the same model and range and everything, calibrate one in the shop, rotate it into the field, and bring the other one back for calibration. These things need to be considered when making that choice.
  3. And then finally we need to make sure that we have the right equipment.
    So make sure you have devices that cover the proper range and mode. If your reference is only a gauge mode device, but you routinely need to do absolute mode calibrations, then look at whether there are pressure modules or other options, or maybe you need another device that can do absolute mode, so you can make sure you're not inducing additional error that doesn't need to be there. It's also not always appropriate to calibrate low pressure with a higher range instrument. The errors and inaccuracies of a measuring device are oftentimes a function of its range. So if you've got a gauge that's a thousand PSI and you're trying to measure 1 PSI with it, it may not work very well for you. But once again, that's something you can get around by adding appropriately ranged pressure modules. An example is the Fluke 3130 shown here with a Fluke 750P pressure module. The 3130 has an internal range of 300 PSI gauge. If I wanted to calibrate absolute mode devices, I could simply connect an absolute mode 750P module, or if I wanted to calibrate very low ranges, like draft range, or high pressures, I could do the same by adding 750P modules of those ranges. Another thing to consider is portable versus bench. Portable instruments like the Fluke 718 can be used on the bench and work well there, but that's not necessarily what they're designed for. They're designed for the field and will work much better in the field, so they aren't always ergonomic for the bench. But depending upon your use case and your situation, one could work just fine for you.

So in conclusion, pressure calibrations can be challenging, but these challenges can be overcome with the right processes and the right equipment. So I will turn it over to Nicole now to go over any questions.

Ok, before we go into the questions, I just wanted to make everyone aware of a special limited-time offer that's going on right now with Fluke and Fluke Calibration. It's a gift-with-purchase offer: for purchases over $250, there's a tiered gift with purchase. There are seven tiers, starting at $250, and I think the top level might be for purchases over $10,000. You can choose a gift valued at up to $1200. If you'd like details on that, you can go to Transcat.com/deals.

Please Note: promotions detailed in this webinar may be expired



Questions and Answers



1. What is the difference between gauge and bi-directional gauge?
Ok, you'll also see the term bi-directional gauge, and that simply means the device can read both negative, subatmospheric pressures and positive pressures. Gauge mode oftentimes can mean just zero and above; that's how some companies consider it. Bi-directional means that it reads from, say, -5 to +5 PSI, or -14.7 up to 15 PSI.

2. Can relative humidity affect pressure calibration?
Ok, so the question is: can relative humidity affect pressure calibration? Well, at some level just about anything can affect pressure calibration, so the short answer is yes. How it can depends on the instrumentation you're using. There are some measurement devices, some pressure sensors, that are susceptible to humidity, where in moister conditions they may give a different reading. A very dry climate, for example, could possibly affect the electronics being used inside the instrument. Another example is that a deadweight tester is impacted by the density of the air, which is impacted somewhat by the humidity. Just about every instrument will have a humidity specification saying the humidity must be within a given range, and normally non-condensing, for the instrument to work correctly. So you should pay attention to those specifications.

3. When a device under test doesn't define the inches of water temperature, what is the industry standard temp?
So first off, there's a measurement unit called inches of water, and it's based on pretty much the same phenomenon as head height, in that pressure can be defined as the density of the fluid, times the height, times gravity. The pressure unit inches of water is the pressure associated with a column of water one inch high, times standard gravity, times the density of water. Well, the density of water is not constant; it's different depending on the temperature of the water. So while you have this unit of measurement called inches of water, in reality you have three or four different versions of it, inches of water at different reference temperatures. The most common ones are 4 degrees Celsius, because that's where the density of water is best known and at its highest, and 20 degrees Celsius, which is a very common reference temperature, especially for dimensional measurements.

So the question is, if the device just says inches of water and doesn't say what the reference temperature is, which one do you choose, and is there an industry standard? To the best of my knowledge, there is not an industry standard across the multiple different vertical industries. In the natural gas industry, I think it's more of a 23 degrees C, but I'm not a hundred percent positive on that, and other industries will pick others. I would say always contact the manufacturer of the device and find out. If they don't know, it may mean that the accuracy of the device is, for lack of a better word, loose enough that the differences in density won't matter. But some people are also just not aware of the situation.

I would suggest that your laboratory set up a policy that you always use the inches of water defined by the manufacturer. So if the device says what it is, make sure you match that on your reference. If it's not defined, then pick one as your definition; personally, I'm a big fan of 4 degrees C. And on your calibration reports where you're using inches of water, define what conversion factor you used. While the history and definition of each of these units is based on a column of water, it's not very often that pressure is actually measured with a column of water anymore. Instruments use some sort of electronic sensor and are just converting from, say, pascals to inches of water. So if you include the conversion factor, then everybody knows how to get back to a real measurement.
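As an illustration of how the reference temperature enters the conversion (an editorial sketch; the density values are rounded approximations, not official conversion factors):

```python
G = 9.80665   # standard gravity, m/s^2
INCH_M = 0.0254

# Approximate water densities at common reference temperatures (kg/m^3);
# rounded values, check your own references before relying on them.
WATER_DENSITY = {"4C": 999.97, "20C": 998.21, "60F": 999.01}

def inh2o_to_pa(inches, ref="4C"):
    # P = rho(T_ref) * g * h -- the reference temperature picks the density
    return inches * WATER_DENSITY[ref] * G * INCH_M

print(inh2o_to_pa(1.0, "4C"))   # ~249.08 Pa
print(inh2o_to_pa(1.0, "20C"))  # ~248.64 Pa
```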

4. If gauges are going to be used at ambient temperatures, should they be calibrated at that temperature or under temperature control?
So the more general question is, do you need to calibrate a device at the same temperature where it's going to be used?

Generally speaking, you can calibrate the device at ambient temperature and be ok, not necessarily at a controlled temperature. When we manufacture a new device, we characterize it at multiple temperatures. We'll put it in an environmental chamber, run it at low temperatures and high temperatures, look at the pressure output at those different temperatures, and install a set of coefficients into the device to correct for that. The calibration is then put on top of that as the final characterization, the final calibration, of the device. For most devices, the behavior with temperature doesn't change with time. The overall behavior of the device will change, it will drift, but because the temperature behavior stays stable, you can calibrate it at one temperature and know it will work at all the temperatures within its range. There are situations, depending upon the design of the device, where the calibration instructions will specifically say it needs to be calibrated at a specific temperature, plus or minus some number of degrees. They do that because the final calibration run assumes the device is used at that particular temperature, so that you don't end up folding temperature effects into that final calibration run; those stay part of the compensation process. So generally speaking, pressure calibrations are oftentimes done at just ambient temperature, but always check the user manual of the device you're calibrating and make sure that's ok. It's rare that you'll need to put the device in an environmental chamber or something like that to keep it at a constant temperature as part of the calibration process.

5. When calibrating to 87,250 PSI, is a dead weight tester applicable?
There are deadweight testers that will go up to that high a pressure. I can't speak to their usability, as the highest we manufacture is approximately 75,000 PSI. But you will see a particular design of deadweight tester that can go to that high a pressure. It's not very common, though.

6. You mentioned gravity as a consideration when calibrating. With a distance of only a hundred miles, that was your example. How different is the gravity between two places?
Yeah, so how different is gravity between two places? It really depends, because it's a function of altitude, distance from the equator, and geological conditions. So it's possible to find two locations very far apart that have similar gravities. But at the same time, if you're going from south to north, or you're going up a mountain, or there's a difference in geological formations, then you can see a very large difference in the acceleration of gravity. There are a couple of websites available, one specifically for the United States, by the National Geodetic Survey, I believe, and one done by PTB in Germany for the entire world, where you can put in the latitude, longitude, and elevation of a location, and it will provide you with an estimate of the local gravity at that location. So in your particular case, you can put in the two different locations that you're interested in, be it your shop versus a hundred miles down the road, and see how different those two are. To find those tools, you can obviously search the internet, but if you go to FlukeCal.com, we have an application note that talks about this and provides links to those two tools.
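
For a rough feel (an editorial sketch, not a substitute for those lookup tools), latitude and elevation alone already move gravity noticeably; here's the 1967 international gravity formula with a simple free-air correction, with both example sites hypothetical:

```python
import math

def local_gravity_estimate(lat_deg, elevation_m):
    # 1967 international gravity formula plus a free-air correction;
    # fine for comparing two sites roughly, not for real calibration work.
    s1 = math.sin(math.radians(lat_deg)) ** 2
    s2 = math.sin(math.radians(2.0 * lat_deg)) ** 2
    g_sea = 9.780318 * (1 + 0.0053024 * s1 - 0.0000058 * s2)
    return g_sea - 3.086e-6 * elevation_m

# Hypothetical shop vs. a site 100 miles away at higher elevation
print(local_gravity_estimate(33.4, 340))   # ~9.795 m/s^2
print(local_gravity_estimate(34.8, 2100))  # noticeably lower, ~0.04% shift
```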

7. What method would you recommend for sourcing low differential pressure? For example, .25 inches of water column.
So, low differential pressure, 0.25 inches of water column. To give everybody a reference, that's a quarter inch of water; that's definitely a draft range pressure measurement, one where if somebody sneezes, you can possibly see it in your pressure measurement. The question is, how do you source that kind of pressure? There are different tools out there to do that. You can use a small variable volume. There are also pressure controllers capable of doing that; an example is one from a company called Setra, a portable device that's capable of providing a stable pressure. We also manufacture laboratory devices that can provide very stable pressures in that pressure range. There's a whole set of challenges associated with holding a stable, low, draft range pressure like that, because, like I said, if somebody sneezes, that will cause your pressure to move around. So you need to make sure that your reference is isolated from atmosphere, so that when somebody opens the door, it doesn't change your pressure. And look at ways to increase the volume on the reference side and on the test side, but also make sure those volumes are more or less the same, so that as the temperature changes, it affects both sides equally. For tools, you can use a manual device, as I said, a variable volume, which just changes the volume in the system a slight bit, or there are automated pressure controllers available that make it easy to do that.

8. What should the accuracy of the reference instrument be in comparison to the test instrument?
Ok, so that is a very wide-open question as well. What should the accuracy of the reference be in comparison to the test instrument? In a normal calibration, you've got a device that you're testing, and you're going to have some reference device that's more accurate than it. The question is, how much more accurate does it need to be? There's not a proper answer I can give in thirty seconds or less. There are some rules of thumb that have been used historically. At one time a rule of thumb was something like 10:1, but it's much more common now to see 4:1. So if your device under test is plus or minus 0.1, then your reference would need to be something like plus or minus 0.025. That's 4:1. The longer, more correct answer is that you need to do an analysis of the uncertainty associated with the two devices and consider that in your overall determination. 4:1 happens to be a ratio where, statistically, the uncertainty of your standard has minimal impact upon the calibration of your device under test. There are situations where lower than 4:1, say 3.5:1, 3:1, or 2.5:1, is actually acceptable and very common. So that shouldn't just be immediately discounted as unacceptable; it should be evaluated for the actual application. If you have a mission-critical measurement, one where an out-of-tolerance device would result in a large recall or a potential safety hazard, then you may want to err on the side of caution and have a higher test ratio.
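
In code form (an editorial sketch of the arithmetic, using the numbers from this answer):

```python
def test_uncertainty_ratio(dut_tolerance, reference_uncertainty):
    # TUR: DUT tolerance divided by reference uncertainty, same units
    return dut_tolerance / reference_uncertainty

# The example from the answer: +/-0.1 DUT spec against a +/-0.025 reference
print(test_uncertainty_ratio(0.1, 0.025))  # 4.0 -> a 4:1 ratio
```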

