11) List out the Various Static Characteristics.
Accuracy, Precision, Sensitivity, Linearity, Reproducibility, Repeatability,
Resolution, Threshold
12) Define Error.
Error is defined as the difference between the standard (true) value and the measured value. It is normally expressed as a percentage:
% Error = (Absolute Error / Standard Value) × 100
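As a quick illustration of this formula, here is a minimal sketch; the function name and the example values are assumptions for illustration only.

```python
def percent_error(measured, standard):
    """% Error = (Absolute Error / Standard Value) x 100."""
    absolute_error = abs(standard - measured)
    return absolute_error / standard * 100

# Assumed example: standard value 50.0 V, measured value 49.5 V -> 1.0 % error
print(percent_error(49.5, 50.0))  # 1.0
```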
13) How are the Measurement Data Evaluated?
Measurement data are evaluated using statistical methods, in two ways:
(i) Using standard formulas
(ii) Comparing with previous results
14) What is the importance of standard?
All instruments are calibrated at the time of manufacture against measurement standards. A standard of measurement is a physical representation of a unit of measurement; in other words, a standard is a known, accurate measure of a physical quantity.
15) List out the Classification of standard.
Primary standard, Secondary standard, International standards
16) What is the Difference between Accuracy & Precision?
Accuracy is the degree of closeness with which a reading approaches the true value of the quantity being measured. It can be expressed as point accuracy, accuracy as a percentage of scale span, or accuracy as a percentage of true value. Precision is a measure of the reproducibility of the measurements, i.e. the degree of agreement among repeated readings of the same quantity; an instrument can be precise without being accurate.
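The distinction can be seen numerically. The sketch below uses made-up readings from two hypothetical instruments: A reads close to the true value on average (accurate), while B gives very consistent but offset readings (precise but not accurate).

```python
from statistics import mean, stdev

true_value = 100.0
instrument_a = [99.8, 100.3, 99.7, 100.2, 100.0]    # scattered, but mean is 100.0 -> accurate
instrument_b = [101.1, 101.2, 101.1, 101.2, 101.1]  # tightly grouped, but offset -> precise, not accurate

for name, readings in (("A", instrument_a), ("B", instrument_b)):
    # Accuracy: closeness of the average reading to the true value.
    # Precision: agreement among repeated readings (small spread).
    print(name,
          "mean error =", round(abs(mean(readings) - true_value), 2),
          "spread =", round(stdev(readings), 2))
```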
17) Define Resolution.
If the input is slowly increased from some arbitrary non-zero value, the output does not change at all until a certain increment of the input is exceeded. This smallest increment is called the resolution.
18) What is Sensitivity?
The sensitivity denotes the smallest change in the measured variable to which the instrument responds. It is defined as the ratio of the change in the output of an instrument to the change in the value of the quantity being measured. Mathematically, it is expressed as:
Sensitivity = Change in output / Change in input (ΔO / ΔI)
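A rough sketch of this ratio, with assumed example values (a sensor whose output changes by 0.4 mV for a 10 °C change in input):

```python
def sensitivity(delta_output, delta_input):
    """Sensitivity = change in output / change in input."""
    return delta_output / delta_input

# Assumed example: output changes by 0.4 mV when the input changes by 10 degC
print(sensitivity(0.4, 10.0), "mV/degC")  # 0.04 mV/degC
```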
19) Define Instrument Efficiency.
Instrument efficiency is defined as the ratio of the measured quantity to the power taken by the instrument at full-scale deflection.
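A worked example under assumed values: for a voltmeter with a full-scale reading of 100 V and an internal resistance of 10 kΩ, the power taken at full-scale deflection is V²/R = 1 W, so the efficiency is 100 V per watt (equivalently R/V = 100 Ω per volt).

```python
full_scale_voltage = 100.0      # V (assumed)
internal_resistance = 10_000.0  # ohm (assumed)

power_at_fsd = full_scale_voltage ** 2 / internal_resistance  # P = V^2 / R = 1.0 W
efficiency = full_scale_voltage / power_at_fsd                # measured quantity / power taken

print(power_at_fsd, "W,", efficiency, "V/W")  # 1.0 W, 100.0 V per watt (= 100 ohm/V)
```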
20) What is Calibration?
Calibration is the process of making an adjustment or marking a scale so that the readings of an instrument agree with the accepted & the certified standard.
21) How does the Secondary Standard Differ from the Sub-Standard?
Secondary standards are closer to the true value and are kept protected in the national laboratory. Sub-standards are almost as close to the true value; they are compared against the primary standard and are considered accurate.
22) List out the Types of Errors.
The types of errors are as follows:
i) Gross errors
ii) Systematic errors
iii) Random errors
23) What are the Ways used for Minimizing the Error?
i) Selecting a proper instrument and planning a proper procedure for the measurement
ii) Applying the proper correction factors
iii) Calibrating the instrument carefully against a standard
24) What is the use of Swamping Resistance?
To minimize the error due to temperature variation, a swamping resistance made of manganin or constantan (materials with a near-zero temperature coefficient of resistance) is connected in series with the coil, as sketched below.
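A short sketch (all values assumed) of why this works: the copper coil's resistance rises with temperature, while manganin's stays nearly constant, so a large series swamping resistance makes the change a much smaller fraction of the total circuit resistance.

```python
coil_resistance = 10.0      # ohm, copper coil (assumed)
swamping_resistance = 90.0  # ohm, manganin in series (assumed)
alpha_copper = 0.004        # per degC, approximate temperature coefficient of copper
temperature_rise = 10.0     # degC (assumed)

# The resistance change comes almost entirely from the copper coil.
delta_r = coil_resistance * alpha_copper * temperature_rise           # 0.4 ohm

change_without = delta_r / coil_resistance * 100                       # 4.0 % of circuit resistance
change_with = delta_r / (coil_resistance + swamping_resistance) * 100  # 0.4 % of circuit resistance

print(change_without, "% ->", change_with, "%")
```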
25) What are the Systems of Units used for Measurement?
i) Metric system
ii) SI system