Error of % + digits?...

On 6/20/2020 9:28 PM, Commander Kinsey wrote:
On Sat, 20 Jun 2020 16:24:41 +0100, Ralph Mowery <rmowery28146@earthlink.net> wrote:

In article <op.0miefkhkwdg98l@glass>, CFKinsey@military.org.jp says...

I'd need to contract OCD to understand that. There's only one thing in question here: how close is the reading to the correct value. You can't split that into two. 3.1416 is better than 3.14, and that's it. All you can state with a reading is that it's correct to within a certain percentage.

Try this.

A doctor does a very complicated operation on your left arm like a joint
replacement. It all goes very well. Very precise.

However he should have done the operation on the right arm that was
causing trouble. Not accurate.

Nope, because the first one is 100% useless. I wouldn't call that precise at all, as he was out by half a metre.

That is why a voltmeter can show 3 digits and be accurate with only the
last digit in question by one number either way, but a 5-digit
voltmeter can show many numbers; if it is not calibrated correctly,
the 2nd digit to the 5th digit could be way off and the meter not
accurate at all.

Showing those extra two numbers is pointless if they're wrong. All that matters is how many volts difference between the actual voltage and what is shown.

You keep saying that it's only the accuracy that matters. That's
true to some - and only some - extent.

Now let's compare two different hypothetical meters, both 100%
accurate. Let's say that meter A has 3.5 digits (max count 1999)
and meter B is 4.5 digits (19999). Use them to measure a battery
cell of exactly 1.612345V.

Meter A will display 1.612V whereas meter B will show 1.6123V.
Meter B allows you to evaluate the result to a higher degree of
precision.

Further suppose that both meters are not perfectly accurate and
read 1% low. A will show 1.596V while B will read 1.5962V. B is
still more precise in showing you what it thinks the voltage is.
An order of magnitude more precise, in fact, even though both
meters are off by -1%.
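The two displays can be modeled with a quick throwaway sketch (the helper name is mine, not anything from a real meter):

```python
def display(volts, decimals):
    # Quantize a voltage to what an ideal meter showing this many
    # decimal places would read out.
    return round(volts, decimals)

true_v = 1.612345
print(display(true_v, 3))   # meter A (3.5 digits): 1.612
print(display(true_v, 4))   # meter B (4.5 digits): 1.6123

low_v = true_v * 0.99       # both meters reading 1% low
print(display(low_v, 3))    # meter A: 1.596
print(display(low_v, 4))    # meter B: 1.5962
```

Either way, B reports its result to one more decimal place than A, i.e. an order of magnitude more finely.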

That's how the term 'precision' is used in engineering. Perhaps
what's confusing you is the fact that the term is more loosely
applied in everyday language.

As to the +/- 3 count (or 1 or whatever) possible error, it's an
*uncertainty*, not a fixed inaccuracy, in digitizing an analog
quantity. It would take too long to explain in detail here. Let me
put it this way: if you measure the example voltage above
multiple times with a meter with +/-3 count uncertainty, you may
get a reading that varies from measurement to measurement by up
to 6 counts in the last digit. That's not a percentage inaccuracy.
 
In article <U6sHG.61028$Nj4.43851@fx24.ams1>, nobody@nowhere.com says...
You keep saying that it's only the accuracy that matters. That's
true to some - and only some - extent.

Sometimes it is precision.

I worked at a company making polyester from raw materials. In a room
was a panel with about 10 temperature gauges. At a certain time all
gauges were marked and a sample of the material was sent to the lab. If
it came back good, then the object was to keep all the gauges on the
mark. It did not matter how far off the gauges were from the actual
temperature. No matter how well we calibrated the gauges, there were
several other factors that we had no control over, such as the
thermocouples they were connected to. The specifications were +/- 3 deg
C on the thermocouples from the factory. If the temperature varied
more than 1 deg C at 300 deg C it could mess up the material.

So the object was precision and not accuracy.
 
On 2020-06-20 17:58, Commander Kinsey wrote:
[...]
Showing those extra two numbers is pointless if they're wrong. All
that matters is how many volts difference between the actual voltage
and what is shown.

Engineers distinguish between accuracy, a measure of how close
an observed value is to the true value, and resolution, which
is a measure of the device's ability to resolve small changes.
Either specification is useful in its own right, and professional
instrumentation will always have both specs. So even if the
last digit or two of a measuring device are not accurate, they
may still be useful.

You may want to check audio ADCs and DACs for example, which
have atrocious accuracy, but excellent resolution. An example
of the opposite might be a voltage reference, which has excellent
accuracy, but no resolution at all.

Of course in general, there is a tendency of accurate instruments
to have a better resolution too.

Jeroen Belleman
 
On 20/06/20 19:15, Ralph Mowery wrote:
In article <U6sHG.61028$Nj4.43851@fx24.ams1>, nobody@nowhere.com says...

You keep saying that it's only the accuracy that matters. That's
true to some - and only some - extent.




Sometimes it is precision.

I worked at a company making polyester from raw materials. In a room
was a panel with about 10 temperature gauges. At a certain time all
gauges were marked and a sample of the material was sent to the lab. If
it came back good, then the object was to keep all the gauges on the
mark. It did not matter how far off the gauges were from the actual
temperature. No matter how well we calibrated the gauges, there were
several other factors that we had no control over, such as the
thermocouples they were connected to. The specifications were +/- 3 deg
C on the thermocouples from the factory. If the temperature varied
more than 1 deg C at 300 deg C it could mess up the material.

So the object was precision and not accuracy.

I once worked for a company that made an instrument that
measured cable attenuation to 0.001dB +- 0.1dB. The customers
didn't care about the 0.1dB, since all they were interested
in was the /stability/ of the 0.001dB and the ability to
measure small changes.

Why? Because the instrument measured the attenuation change
as a function of temperature, and each temperature cycle
test took 7 days. Yes, it was a /large/ drum of undersea cable.
 
Hi,

I just bought an amp clamp meter, and it states the error is "+/- 1.9% +
3 digits".  What does the "3 digits" part mean?

I'll try to explain that with a simplified model of
a digital meter (please everybody correct me if it's
oversimplified and wrong):

The typical digital meter consists of some kind of
processing of the signal to be measured and an
A/D converter that converts its analog input
signal to a number that is displayed.

The input processing serves to transform the quantity
to be measured into an analog signal that is properly
adapted to cover the range of possible input signals
of the A/D-converter.

Let's say we want to measure an AC current of 10A with
a clamp meter like yours. Let the display of the meter
have 3 1/2 digits, so the range of displayable numbers
goes from 0000 to 1999 with an additional decimal point
somewhere.

The A/D converter will not be able to directly convert
a 10A current, so we pick up the current to be measured
with a transformer, the wire carrying your 10A current
being the primary and a coil internal to the clamp
assembly being the secondary winding.

An AC current flowing through the wire will induce an AC voltage
in the secondary winding. Since the A/D converter may not
directly accept AC voltages, further processing may be
required, such as amplification or voltage division and
e.g. true-RMS detection of the AC voltage. All this
processing ends up in a voltage that is suitable for
the A/D converter - say, 1V DC for 10A of AC current.

All the (analog) signal processing described here will
not be free of unwanted influences and processing errors.
The transformer at the input could e.g. pick up unwanted
magnetic fields, the amplifier could exhibit noise and
nonlinearities, the TRMS detection could exhibit some errors.

All these error sources or influences may be described in
the meter's specification as a percentage - e.g. the +/- 1.9%
you mentioned.

Now, the A/D converter converts the analog input voltage
into a number. One method to do this, when speed is not
a critical factor, is (dual) slope integration.

Let's assume for a moment that the input voltage is static,
i.e. the 1V DC mentioned before.

Basically, the conversion works by comparing the input
voltage (to be measured) to a linearly rising voltage
(ramp). Similar to a stopwatch, a counter starts when
the reference voltage begins to rise and a comparator
stops it when the ramp voltage is equal to the input voltage.

In our example with 1V input, the counter may stop at a
count of 1000. With the knowledge that, by means of the
input processing and the calibration of the meter, this
corresponds to 10A of AC current, the meter would probably
display 10.00 (A).

But: at some time in the process, the counter will switch
from 999 to 1000 in a very short (almost zero) time.
That means that the input voltage may be just a tiny little
bit less and the counter is stopped at 999, not at 1000.

That means that for any input signal, you always have +/-1 digit
of display uncertainty, because you cannot know whether the
counter was just about to switch to the next count.

With a specification of +/-3 digits, the A/D converter has
a greater uncertainty when counting. For example, even at
a constant input of 1V, the internal counter may be less
precise and stop at 997, 998, 999, 1000, 1001, 1002 or 1003,
even if the input signal doesn't change. You can think of
this as a stopwatch that may be off by a few counts each time
you make a measurement.

This type of error is not related to the input signal
processing, so it is not very meaningful to express the
error as a percentage of the measured value. It is usually
expressed as a number of digits, because the error is mainly
caused by the process of converting input signals to numbers.

Of course, I know that this very simple single-slope integration
is not used in meters; dual slope is the least you can do.
Also, the A/D conversion may contribute to the percentage error spec.
The (over)simplification is just a means to explain why there
are two numbers in the specification.
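The counting uncertainty described above can be made concrete with a toy simulation (my own sketch, not part of the model above):

```python
import random

def convert(true_counts, count_uncertainty=3):
    # Simulated A/D conversion: the counter may stop up to
    # count_uncertainty counts before or after the true value.
    return true_counts + random.randint(-count_uncertainty, count_uncertainty)

# A rock-steady 1V input that should count to exactly 1000:
readings = [convert(1000) for _ in range(20)]

# Every reading lies in 997..1003 - a spread of up to 6 counts
# in the last digit, independent of the input's magnitude.
print(min(readings), max(readings))
```

Note that the spread stays at +/-3 counts whether the input counts to 100 or to 1900, which is exactly why it cannot be folded into a percentage figure.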

Just my two cents,

Dieter
 
On Sat, 20 Jun 2020 19:18:24 +0100, Jeroen Belleman <jeroen@nospam.please> wrote:

On 2020-06-20 17:58, Commander Kinsey wrote:
[...]

Showing those extra two numbers is pointless if they're wrong. All
that matters is how many volts difference between the actual voltage
and what is shown.

Engineers distinguish between accuracy, a measure of how close
an observed value is to the true value, and resolution, which
is a measure of the device's ability to resolve small changes.
Either specification is useful in its own right, and professional
instrumentation will always have both specs. So even if the
last digit or two of a measuring device are not accurate, they
may still be useful.

I can see that, although when I've had an instrument with more digits than its accuracy justifies, it usually has a fluctuation of its own (perhaps through interference from inadequate shielding), so I can't actually tell if the real value has changed.

You may want to check audio ADCs and DACs for example, which
have atrocious accuracy, but excellent resolution. An example
of the opposite might be a voltage reference, which has excellent
accuracy, but no resolution at all.

Of course in general, there is a tendency of accurate instruments
to have a better resolution too.
 
In sci.electronics.equipment Commander Kinsey <CFKinsey@military.org.jp> wrote:
I just bought an amp clamp meter, and it states the error is "+/-
1.9% + 3 digits". What does the "3 digits" part mean?

The "3 digits" part is a measure of absolute error (i.e., the amount of
error that does not depend upon the magnitude of the value being
measured). The percentage part is a measure of relative error (i.e.,
the amount of error that does depend upon the magnitude of the value
being measured).

So if you clamp around a wire and get a reading of 1.234A on the
display, then the actual current in the wire could be anywhere within
this range:

1.234 +/- (1.234 * 0.019) +/- 0.003
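Spelled out numerically (a small sketch; the helper name is mine):

```python
def reading_bounds(reading, pct_error, digit_counts, last_digit_value):
    # Worst-case bounds for a "+/- pct% + N digits" spec.
    # last_digit_value is the weight of one count in the final
    # displayed digit (0.001 A for a 1.234 A reading).
    err = reading * pct_error / 100.0 + digit_counts * last_digit_value
    return reading - err, reading + err

low, high = reading_bounds(1.234, 1.9, 3, 0.001)
print(round(low, 6), round(high, 6))   # about 1.207554 to 1.260446
```

So the percentage term contributes about 23 mA here and the "3 digits" term a further fixed 3 mA, whatever the reading.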
 
In sci.electronics.equipment Commander Kinsey <CFKinsey@military.org.jp> wrote:
On Thu, 18 Jun 2020 15:38:46 +0100, Pimpom <nobody@nowhere.com> wrote:

On 6/18/2020 6:33 PM, Commander Kinsey wrote:
I just bought an amp clamp meter, and it states the error is "+/-
1.9% + 3 digits". What does the "3 digits" part mean?


If your meter should read, say 1.875 A, the correct reading could
be anywhere from 1.872 to 1.878. This is a possible error in the
display presented to you in the analog-digital display conversion
process. The +/-1.9% possible error is about the measurement
taken including - but not only - any error made by the sensor.

Thanks, I wonder why all my other meters only list a % error. Is it
included within it somehow, or are they just lying, or do some meters
not have this error?

One generally finds the percentage plus digits error measures on more
expensive equipment. Less expensive equipment more often than not only
lists a percentage and nothing more.
 
In sci.electronics.equipment Pimpom <nobody@nowhere.com> wrote:
On 6/20/2020 4:29 AM, Commander Kinsey wrote:
On Fri, 19 Jun 2020 23:55:42 +0100, Ralph Mowery <rmowery28146@earthlink.net> wrote:

In article <op.0mg7zmz6wdg98l@glass>, CFKinsey@military.org.jp says...

But what I'm surprised at is a £5 multimeter (not clamp) not giving a digits error. Maybe precision on a simple voltmeter is cheap as chips nowadays?



You have to be careful how you throw precision and accuracy around.

A meter that shows 4 digits is more precise than one that shows only 3
digits; however, the 4-digit one may only be 1% accurate and the
3-digit one may be 0.5% accurate.

It is easy to get precision, but difficult to be accurate. Think of it
as shooting a gun. Precision is how close the bullets land to each
other, wherever they land on the target, but to be accurate the bullets
have to land on the center of the target. For instance, all the bullets
could land very close to each other, but not even hit the target.

As I mentioned, a good meter will not have a digits error outside the +-
one digit due to rounding.

That didn't help. I interchange the two. I just want to know how close to the correct reading the readout is. Adding another digit doesn't improve anything if it's incorrect. And shooting all the bullets in one place doesn't help if they all miss.


Take pi as an example. It can be said that 3.14 is accurate as a
three-digit value, but 3.1416 is more precise because it has a
higher resolution.

OTOH, deriving it from 22/7 or 3.1429 has the same 5-digit
resolution and is just as precise as far as the number it
represents is concerned but is less accurate.

In this particular case, 3.1416 is both more precise and more
accurate than 3.14, but that's not always the case with measurements.
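The accuracy difference between those approximations is easy to check numerically (a throwaway sketch):

```python
import math

# Three approximations of pi with different precision/accuracy:
approximations = {
    "3.14":   3.14,
    "3.1416": 3.1416,
    "22/7":   round(22 / 7, 4),   # 3.1429, same 5-digit precision as 3.1416
}
for label, value in approximations.items():
    print(label, abs(value - math.pi))

# 3.1416 and 3.1429 have the same resolution, but 3.1416 is
# roughly 180x closer to the true value.
```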

My mechanical slide caliper has a resolution of 0.001 inch. This
means that it can display measurements with a precision of 1 mil,

What if your caliper had a resolution of 1 mil +/- 3 counts on the last
digit? That's the issue with multimeters that have completely bogus digits
at the end. Those numbers are just noise and serve no purpose at all. They
don't even compare to all bullets missing the target but landing in the
same wrong spot.

but that doesn\'t guarantee that a measurement taken with it will
be accurate to 1 mil. I may not always press the jaws snugly
enough and the scale may not be perfectly accurate.
 
In sci.electronics.equipment Commander Kinsey <CFKinsey@military.org.jp> wrote:
On Sat, 20 Jun 2020 16:24:41 +0100, Ralph Mowery <rmowery28146@earthlink.net> wrote:

In article <op.0miefkhkwdg98l@glass>, CFKinsey@military.org.jp says...

I'd need to contract OCD to understand that. There's only one thing in question here: how close is the reading to the correct value. You can't split that into two. 3.1416 is better than 3.14, and that's it. All you can state with a reading is that it's correct to within a certain percentage.

Try this.

A doctor does a very complicated operation on your left arm like a joint
replacement. It all goes very well. Very precise.

However he should have done the operation on the right arm that was
causing trouble. Not accurate.

Nope, because the first one is 100% useless. I wouldn't call that precise at all, as he was out by half a metre.

That is why a voltmeter can show 3 digits and be accurate with only the
last digit in question by one number either way, but a 5-digit
voltmeter can show many numbers; if it is not calibrated correctly,
the 2nd digit to the 5th digit could be way off and the meter not
accurate at all.

Showing those extra two numbers is pointless if they're wrong. All that matters is how many volts difference between the actual voltage and what is shown.

Agreed. The problem with the bullets-and-target story is that when it's
explained, we somehow perfectly know where the bullets are - be it on
target or in a small grouping somewhere else. Cheapo meters won't give
CONSISTENT or REPEATABLE results, no matter how "precise" they pretend
to be, or how accurate the spec sheet claims, especially considering the
last digit(s) may be totally wrong, and random. It's like having crappy or
dirty test leads or a component. You'll get all the digits in the world,
but they keep changing. You won't even be able to pick a reading.

Keep in mind that "calibrated" equipment doesn't even have to be precise
or accurate. An example would be an adjustable power supply with a digital
readout. Say it always reads high by 0.7 volts. It's not precise or
accurate, but by knowing the offset it can be used with success and may even
have great regulation.

On the other hand, say you have an Alibaba special power supply that's
"accurate" to +/- 0.35 volts, with terrible regulation that oscillates.

Which power supply is better?

So the point is cheapo equipment can have lots of bogus digits and
readings that flop up and down, while better equipment can be more
consistently wrong, which can be compensated for. Precision and accuracy
mean little by themselves if you need multiple readings.
 
In sci.electronics.equipment Ralph Mowery <rmowery28146@earthlink.net> wrote:
In article <U6sHG.61028$Nj4.43851@fx24.ams1>, nobody@nowhere.com says...

You keep saying that it's only the accuracy that matters. That's
true to some - and only some - extent.




Sometimes it is precision.

I worked at a company making polyester from raw materials. In a room
was a panel with about 10 temperature gauges. At a certain time all
gauges were marked and a sample of the material was sent to the lab. If
it came back good, then the object was to keep all the gauges on the
mark. It did not matter how far off the gauges were from the actual
temperature. No matter how well we calibrated the gauges, there were
several other factors that we had no control over, such as the
thermocouples they were connected to. The specifications were +/- 3 deg
C on the thermocouples from the factory. If the temperature varied
more than 1 deg C at 300 deg C it could mess up the material.

So the object was precision and not accuracy.

If the goal was to keep the needles on their marks, it doesn't have to mean
anything was precise. Maybe your gauges had no faces, or read mA instead
of degrees, and had bent needles. As long as your +/- 3 degree thermocouples
and controllers did not jump up and down +3 and then -3 degrees all the
time, you were good.

It's like the zener diode or voltage standard that came up in this thread.
Those have no precision. They may not even be accurate. They might be
consistent, though. Accuracy and precision by themselves can be useless
where time or multiple readings are needed.
 
On 6/26/2020 9:53 AM, Cydrome Leader wrote:
In sci.electronics.equipment Pimpom <nobody@nowhere.com> wrote:

My mechanical slide caliper has a resolution of 0.001 inch. This
means that it can display measurements with a precision of 1 mil,

What if your caliper had a resolution of 1 mil +/- 3 counts on the last
digit? That's the issue with multimeters that have completely bogus digits
at the end. Those numbers are just noise and serve no purpose at all. They
don't even compare to all bullets missing the target but landing in the
same wrong spot.
You seem intent on picking an argument by inserting a statement
that agrees with the following sentences. BTW, my caliper is not
digital, so the matter of +/- count is irrelevant.

but that doesn\'t guarantee that a measurement taken with it will
be accurate to 1 mil. I may not always press the jaws snugly
enough and the scale may not be perfectly accurate.
 
In article <rd400g$egj$2@reader1.panix.com>, presence@MUNGEpanix.com
says...
So the object was precision and not accuracy.

If the goal was to keep the needles on their marks, it doesn't have to mean
anything was precise. Maybe your gauges had no faces, or read mA instead
of degrees, and had bent needles. As long as your +/- 3 degree thermocouples
and controllers did not jump up and down +3 and then -3 degrees all the
time, you were good.

One good example of what we had is this.

In a vat of material is a test hole. In that hole is a rod about 3/8
inch in diameter and a foot long. At the end there are two
thermocouples and two RTDs. The thermocouple wires go about 100 feet
to a PLC (similar to a computer) card that converts the millivolts to
digital, which is then displayed on a computer screen. The RTDs go about
10 feet to a converter that converts the change in resistance to a 4 to
20 milliamp signal. That goes to a card on the PLC and then to the
computer display.

While the computer will display to 3 decimal places, at 300 deg C the
difference from the lowest to the highest temperature shown on the
displays can be around 3 deg and all still be within the limits of the
equipment.

At a certain time a sample is sent to the lab and one of the computer
displays is set as a standard, and the object of the PLC is to keep the
actual temperature, whatever it actually is, at that 'standard'. Not
too accurate as to temperature, but very precise. The operators only
need to keep that one computer display as close to that 'mark' as they
can if for some reason the PLC messes up and they have to adjust the
control manually.
 
In sci.electronics.basics Ralph Mowery <rmowery28146@earthlink.net> wrote:
In article <rd400g$egj$2@reader1.panix.com>, presence@MUNGEpanix.com
says...

So the object was precision and not accuracy.

If the goal was to keep the needles on their marks, it doesn't have to mean
anything was precise. Maybe your gauges had no faces, or read mA instead
of degrees, and had bent needles. As long as your +/- 3 degree thermocouples
and controllers did not jump up and down +3 and then -3 degrees all the
time, you were good.



One good example of what we had is this.

In a vat of material is a test hole. In that hole is a rod about 3/8
inch in diameter and a foot long. At the end there are two
thermocouples and two RTDs. The thermocouple wires go about 100 feet
to a PLC (similar to a computer) card that converts the millivolts to
digital, which is then displayed on a computer screen. The RTDs go about
10 feet to a converter that converts the change in resistance to a 4 to
20 milliamp signal. That goes to a card on the PLC and then to the
computer display.

While the computer will display to 3 decimal places, at 300 deg C the
difference from the lowest to the highest temperature shown on the
displays can be around 3 deg and all still be within the limits of the
equipment.

At a certain time a sample is sent to the lab and one of the computer
displays is set as a standard, and the object of the PLC is to keep the
actual temperature, whatever it actually is, at that 'standard'. Not
too accurate as to temperature, but very precise. The operators only
need to keep that one computer display as close to that 'mark' as they
can if for some reason the PLC messes up and they have to adjust the
control manually.

What's the control loop if the PLC dies? How do people control temperatures
manually? Is there a foot pedal to stomp on to switch the heaters on and
off?


There are a couple of machines I fuss with that use platinum junction RTDs,
and we have alarm limits set. If the machine drifts into an alarm state
outside of a warmup period, that's pretty much the end of the day and
everything stops until it can be fixed. The loops on these machines are
tuned to maintain and hold a set point to less than 1 degree F. The
displays are all wrong, show fake levels of precision, and read in C, but
are wrong by several degrees, even if you do the math. We gave up trying to
calibrate the displays against the real temperature with the offset
features when the probes were last changed. It just isn't worth the time.
Those machines are not accurate, they're not precise (as measured with
their own instrumentation), but they will absolutely hold a stable
temperature if you can determine the set points yourself.
 
