
MINISTRY OF EDUCATION AND SCIENCE OF THE RF

FEDERAL STATE BUDGET EDUCATIONAL INSTITUTION OF HIGHER PROFESSIONAL EDUCATION

"East Siberian State University technology and management"

Department: IPIB

"Physical Basics of Measurements and Standards"

Completed by: 3rd year student

Eliseeva Yu.G.

Checked by: Matuev A.A.

Introduction

1. Physical basis of measurements

2. Measurement. Basic Concepts

3. Uncertainty and measurement error

4. Basic principles of creating a system of units and quantities

5. International System of Units (SI)

6. Implementation of the basic quantities of the system (SI)

7. Metrological characteristics of SI

8. Principles, methods and techniques of measurements

Conclusion

Bibliography

Introduction

Technical progress and the modern development of industry, energy and other sectors are impossible without improving traditional methods and measuring instruments (MI) and creating new ones. The work program "Physical Measurements and Standards" includes consideration of the fundamental physical concepts, phenomena and laws used in metrology and measurement technology.

With the development of science, technology and new technologies, measurements cover new physical quantities, and measurement ranges expand significantly toward both extremely small and very large values. Requirements for measurement accuracy are constantly increasing. For example, the development of nanotechnologies (non-contact lapping, electron lithography, etc.) makes it possible to obtain part dimensions with an accuracy of several nanometers, which imposes corresponding requirements on the quality of measurement information. The quality of measurement information is determined by the metrological support of technological processes at the nano level, which gave impetus to the creation of nanometrology, i.e. metrology in the field of nanotechnology.

In accordance with the basic measurement equation, the measurement procedure reduces to comparing an unknown size with a known one, namely the size of the corresponding unit of the International System of Units. In order to put legalized units into practical use in various fields, they must be physically realized. Reproduction of a unit is a set of operations for its materialization using a standard. This may be a physical measure, a measuring instrument, a reference material or a measuring system. The standard that ensures reproduction of a unit with the highest accuracy in the country (compared with other standards of the same unit) is called the primary standard. The size of the unit is transferred "from the top down", from more accurate measuring instruments to less accurate ones, "along the chain": primary standard - secondary standard - working standard of the 0th rank ... - working measuring instrument (WMI). The subordination of the measuring instruments involved in transferring the size of the unit from the standard to the WMI is established in the verification schemes for measuring instruments.

Standards and reference measurement results in the field of physical measurements provide established benchmarks to which analytical laboratories can relate their measurement results. The traceability of measurement results to internationally accepted and established reference values, together with the established uncertainties of the measurement results, described in the international document ISO/IEC 17025, forms the basis for comparison and recognition of results at the international level.

This essay on "Physical Foundations of Measurements", intended for 1st-3rd year students of engineering specialties (direction "Machine-Building Technologies and Equipment"), emphasizes that the basis of any measurements (physical, technical, etc.) is formed by physical laws, concepts and definitions. Technical and natural processes are characterized by quantitative data describing the properties and states of objects and bodies. To obtain such data, it became necessary to develop measurement methods and a system of units. Increasingly complex relationships in technology and economic activity have led to the need to introduce a unified system of units of measurement.
This manifested itself in the legislative introduction of new units for measured quantities or the abolition of old ones (for example, replacing the horsepower with the watt or kilowatt as the unit of power). As a rule, new definitions of units are introduced after the natural sciences have indicated a way of determining the units, and of calibrating scales, clocks and everything else, with greater accuracy, which then finds application in technology and everyday life.

Leonhard Euler (mathematician and physicist) gave a definition of a physical quantity that is still acceptable today. In his "Algebra" he wrote: "First of all, everything that is capable of increasing or decreasing, or to which something can be added or from which something can be taken away, is called a quantity. However, it is impossible to define or measure one quantity except by taking as known another quantity of the same kind and indicating the ratio in which it stands to it. When measuring quantities of any kind, we come, therefore, to the point that, first of all, a certain known quantity of the same kind is established, called the unit of measurement and depending solely on our choice. Then it is determined in what ratio the given quantity stands to this measure, which is always expressed in terms of numbers, so that a number is nothing more than the ratio in which one quantity stands to another taken as the unit."

Thus, to measure any physical (technical or other) quantity means to compare it with another homogeneous physical quantity taken as the unit of measurement (with a standard). The number of physical quantities changes over time. A large number of definitions of quantities and corresponding specific units can be cited, and this set is constantly growing with the growing needs of society. For example, with the development of the theory of electricity, magnetism, atomic and nuclear physics, quantities characteristic of these branches of physics were introduced. Sometimes the formulation of the question about the quantity being measured must first be slightly changed. For example, it is impossible to say: this is "blue" and that is "half blue", because it is impossible to indicate a unit with which both shades of colour could be compared. However, one can instead ask about the spectral density of radiation in the wavelength range λ from 400 to 500 nm (1 nanometer = 10⁻⁷ cm = 10⁻⁹ m) and find that the new formulation of the question allows a definition corresponding not to "half blue" but to "half the intensity".

The concepts of quantities and their units of measurement also change over time in the conceptual respect. An example is the radioactivity of a substance. The initially introduced unit of radioactivity, the curie (1 Ci), associated with the name of Curie and allowed for use until 1980, is related to the amount of substance measured in grams. Currently, the activity A of a radioactive substance refers to the number of decays per second and is measured in becquerels; in the SI system 1 Bq = 2.7×10⁻¹¹ Ci, and the dimension is [A] = becquerel = s⁻¹. Sometimes, although a physical effect is definable and a unit can be set for it, the quantitative characterization of the effect turns out to be very difficult.
For example, if a fast particle (say, an alpha particle produced during radioactive decay) gives up all its kinetic energy while being slowed down in living tissue, this process can be described using the concept of radiation dose, i.e. energy loss per unit mass. However, taking into account the biological impact of such a particle is still a subject of debate. Emotional concepts have so far not been quantifiable; it has not been possible to define units corresponding to them. A patient cannot quantify the degree of his discomfort, yet measurements of temperature and pulse rate, as well as laboratory tests characterized by quantitative data, can be of great assistance to the doctor in establishing a diagnosis. One of the goals of an experiment is to search for parameters that describe physical phenomena and can be measured to obtain numerical values; a functional relationship can then be established between these measured values. A comprehensive experimental study of the physical properties of various objects is usually carried out using the results of measurements of a number of basic and derived quantities. In this regard, the example of acoustic measurements, included in this manual as a separate section, is very typical.

1. Physical basis of measurements

Physical quantity and its numerical value

Physical quantities are properties (characteristics) of material objects and processes (objects, states) that can be measured directly or indirectly. The laws connecting these quantities with each other have the form of mathematical equations. Each physical quantity G is the product of a numerical value and a unit of measurement:

Physical quantity = Numerical value × Unit of measurement.

The resulting number is called the numerical value of the physical quantity. Thus, the expression t = 5 s (1.1) means that the measured time is five repetitions of one second. However, a numerical value alone is not enough to characterize a physical quantity; the corresponding unit of measurement must therefore never be omitted. All physical quantities are divided into base and derived quantities. The base quantities used are: length, time, mass, temperature, electric current, amount of substance and luminous intensity. Derived quantities are obtained from the base quantities either by using expressions of the laws of nature or by expedient definition through multiplication or division of the base quantities.

For example,

Speed = Path / Time; v = S/t; (1.2)

Charge = Current × Time; q = I·t. (1.3)
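To make the idea of "numerical value × unit" concrete, here is a minimal Python sketch (the Quantity class and function names are hypothetical, not from the text) that computes the derived quantities of equations (1.2) and (1.3):

from dataclasses import dataclass

@dataclass
class Quantity:
    value: float   # numerical value {Q}
    unit: str      # unit of measurement [Q]

    def __str__(self) -> str:
        return f"{self.value} {self.unit}"

def speed(path_m: float, time_s: float) -> Quantity:
    # v = S / t  (eq. 1.2): meters divided by seconds give meters per second
    return Quantity(path_m / time_s, "m/s")

def charge(current_a: float, time_s: float) -> Quantity:
    # q = I * t  (eq. 1.3): amperes multiplied by seconds give coulombs
    return Quantity(current_a * time_s, "C")

print(speed(100.0, 9.58))   # about 10.44 m/s
print(charge(0.5, 60.0))    # 30.0 C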

To represent physical quantities, especially in formulas, tables or graphs, special symbols are used - quantity designations. In accordance with international agreements, appropriate standards have been introduced for the designation of physical and technical quantities. It is customary to type designations of physical quantities in italics. Subscripts are also denoted in italics if they are symbols, i.e. symbols of physical quantities, not abbreviations.

Square brackets containing a quantity designation indicate the unit of measurement of that quantity; for example, the expression [U] = V reads: "the unit of voltage is the volt." It is incorrect to enclose a unit of measurement itself in square brackets (for example, [V]). Curly brackets { } containing a quantity designation mean "the numerical value of the quantity"; for example, the expression {U} = 220 reads: "the numerical value of the voltage is 220." Since each value of a quantity is the product of a numerical value and a unit of measurement, for the above example we obtain: U = {U}·[U] = 220 V. (1.4)

When writing, a space must be left between the numerical value and the unit of measurement of a physical quantity, for example: I = 10 A. (1.5) The exceptions are the designations of degrees (°), minutes (′) and seconds (″). Numerical values of very large or very small orders (relative to 10) are abbreviated by introducing new units, named like the old ones but with the addition of a prefix. New units are formed in this way, for example 1 mm = 1·10⁻³ m. The physical quantity itself does not change: when the unit is decreased by a factor F, the numerical value increases, correspondingly, by the same factor F. Such invariance of a physical quantity occurs not only when the unit is changed by powers of ten, but also with any other change of the unit. Table 1.1 shows the officially accepted prefixes and their designations.

Table 1.1 - Prefixes for SI units (international symbol, power of ten)

tera (T) - 10¹²
giga (G) - 10⁹
mega (M) - 10⁶
kilo (k) - 10³
hecto (h) - 10²
deca (da) - 10¹
deci (d) - 10⁻¹
centi (c) - 10⁻²
milli (m) - 10⁻³
micro (µ) - 10⁻⁶
nano (n) - 10⁻⁹
pico (p) - 10⁻¹²
femto (f) - 10⁻¹⁵
atto (a) - 10⁻¹⁸
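The prefixes in Table 1.1 are simply powers of ten, so converting a prefixed value to the base unit is a single multiplication. A small illustrative Python sketch (the dictionary and function names are hypothetical):

PREFIX_POWER = {
    "T": 12, "G": 9, "M": 6, "k": 3, "h": 2, "da": 1,
    "d": -1, "c": -2, "m": -3, "u": -6,   # "u" stands in for micro (µ)
    "n": -9, "p": -12, "f": -15, "a": -18,
}

def to_base_unit(value: float, prefix: str = "") -> float:
    # Return the numerical value expressed in the unprefixed (base) unit.
    return value * 10 ** PREFIX_POWER.get(prefix, 0)

print(to_base_unit(1.0, "m"))    # 1 mm  -> 0.001 (m)
print(to_base_unit(2.5, "k"))    # 2.5 km -> 2500.0 (m)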

2. Measurement. Basic Concepts

Measurement concept

Measurement is one of the most ancient operations in the process of human cognition of the surrounding material world. The entire history of civilization is a continuous process of formation and development of measurements, improvement of means of methods and measurements, increasing their accuracy and uniformity of measures.

In the course of its development, humanity has passed from measurements based on the senses and on parts of the human body to measurements on a scientific foundation, using for these purposes highly complex physical processes and technical devices. Currently, measurements cover practically all physical properties of matter, regardless of the range over which these properties vary.

With the development of mankind, measurements have become increasingly important in economics, science, technology and production. Many sciences began to be called exact because they can establish quantitative relationships between natural phenomena by means of measurements. In essence, all progress in science and technology is inextricably linked with the increasing role and improving art of measurement. D.I. Mendeleev said that "science begins as soon as one begins to measure. Exact science is unthinkable without measure."

Of no less importance are measurements in technology, production activities, when taking into account material assets, when ensuring safe working conditions and human health, and in preserving the environment. Modern scientific and technological progress is impossible without the widespread use of measuring instruments and numerous measurements.

In our country, more than tens of billions of measurements are carried out per day, over 4 million people consider measurement as their profession. The share of measurement costs is (10-15)% of all social labor costs, reaching (50-70)% in electronics and precision engineering. About a billion measuring instruments are used in the country. When creating modern electronic systems (computers, integrated circuits, etc.), up to (60-80)% of the costs fall on measuring the parameters of materials, components and finished products.

All this suggests that it is impossible to overestimate the role of measurements in the life of modern society.

Although man has been making measurements since time immemorial and the term seems intuitively clear, it is not easy to define it accurately and correctly. This is evidenced, for example, by the discussion of the concept and definition of measurement which took place not long ago in the pages of the journal "Measuring Technology". As an example, various definitions of the concept of "measurement", taken from the literature and from regulatory documents of different years, are given below.

Measurement is a cognitive process consisting in the comparison, by means of a physical experiment, of a given quantity with a certain value of it taken as a unit of comparison (M.F. Malikov, Fundamentals of Metrology, 1949).

Finding the value of a physical quantity experimentally using special technical means (GOST 16263-70 on terms and definitions of metrology, no longer in force).

A set of operations for the use of a technical means that stores a unit of physical quantity, ensuring that the relationship (explicitly or implicitly) of the measured quantity with its unit is found and the value of this quantity is obtained (Recommendations on interstate standardization RMG 29-99 Metrology. Basic terms and definitions, 1999 ).

A set of operations aimed at determining the value of a quantity (International Dictionary of Terms in Metrology, 1994).

Measurement -- a set of operations for determining the ratio of one (measured) quantity to another homogeneous quantity taken as a unit and stored in a technical device (measuring instrument). The resulting value is called the numerical value of the measured quantity; the numerical value together with the designation of the unit used is called the value of the physical quantity. The measurement of a physical quantity is carried out experimentally with the help of various measuring instruments: measures, measuring devices, measuring transducers, systems, installations, etc. The measurement of a physical quantity includes several stages: 1) comparison of the measured quantity with the unit; 2) transformation into a form convenient for use (various methods of indication).

· The measurement principle is a physical phenomenon or effect underlying measurements.

· Method of measurement - a method or set of methods for comparing a measured physical quantity with its unit in accordance with the implemented measurement principle. The measurement method is usually determined by the design of the measuring instruments.

A characteristic of measurement accuracy is its error or uncertainty. Measurement examples:

1. In the simplest case, by applying a ruler with divisions to a part, one essentially compares its size with the unit stored by the ruler and, after making a reading, obtains the value of the quantity (length, height, thickness or other parameters of the part).

2. Using a measuring device, the size of the quantity converted into the movement of the pointer is compared with the unit stored by the scale of this device, and a count is made.

In cases where it is impossible to carry out a measurement (a quantity is not identified as a physical quantity, or the unit of measurement of this quantity is not defined), it is practiced to estimate such quantities on conventional scales, for example, the Richter Scale of earthquake intensity, the Mohs Scale - a scale of mineral hardness.

The science that studies all aspects of measurement is called metrology.

Classification of measurements

By type of measurement

Main article: Types of measurements

According to RMG 29-99 “Metrology. Basic terms and definitions" identifies the following types of measurements:

· Direct measurement is a measurement in which the desired value of a physical quantity is obtained directly.

· Indirect measurement - determination of the desired value of a physical quantity based on the results of direct measurements of other physical quantities that are functionally related to the desired quantity.

· Joint measurements—simultaneous measurements of two or more different quantities to determine the relationship between them.

· Cumulative measurements are simultaneous measurements of several quantities of the same name, in which the desired values ​​of the quantities are determined by solving a system of equations obtained by measuring these quantities in various combinations.

· Equal-precision measurements - a series of measurements of any quantity, performed with measuring instruments of equal accuracy under the same conditions with the same care.

· Unequal-precision measurements - a series of measurements of a quantity performed by measuring instruments differing in accuracy and (or) under different conditions.

· Single measurement - a measurement performed once.

· Multiple measurement - a measurement of a physical quantity of the same size, the result of which is obtained from several consecutive measurements, that is, consisting of a number of single measurements

· Static measurement is a measurement of a physical quantity that is taken, in accordance with a specific measurement task, to be unchanged throughout the measurement time.

· Dynamic measurement - measurement of a physical quantity that changes in size.

· Relative measurement - measurement of the ratio of a quantity to a quantity of the same name, which plays the role of a unit, or measurement of a change in a quantity in relation to a quantity of the same name, taken as the initial one.

It is also worth noting that various sources additionally distinguish these types of measurements: metrological and technical, necessary and redundant, etc.

By measurement methods

· The direct assessment method is a measurement method in which the value of a quantity is determined directly from the indicating measuring instrument.

· The method of comparison with a measure is a measurement method in which the measured value is compared with the value reproduced by the measure.

· Zero measurement method - a method of comparison with a measure, in which the resulting effect of the influence of the measured quantity and measure on the comparison device is brought to zero.

· The method of measurement by substitution is a method of comparison with a measure, in which the measured quantity is replaced by a measure with a known value of the quantity.

· The addition measurement method is a method of comparison with a measure, in which the value of the measured quantity is supplemented with a measure of the same quantity in such a way that the comparison device is affected by their sum equal to a predetermined value.

· Differential measurement method - a measurement method in which the measured quantity is compared with a homogeneous quantity having a known value only slightly different from the value of the measured quantity, and in which the difference between these two quantities is measured.

According to the conditions determining the accuracy of the result

· Metrological measurements

· Measurements of the highest possible accuracy achievable with the existing level of technology. This class includes all high-precision measurements and, first of all, reference measurements associated with the highest possible accuracy of reproducing the established units of physical quantities. It also includes measurements of physical constants, primarily universal ones, for example the measurement of the absolute value of the free-fall acceleration.

· Control and verification measurements, the error of which, with a certain probability, should not exceed a certain specified value. This class includes measurements performed by state control (supervision) laboratories for compliance with the requirements of technical regulations, as well as the state of measuring equipment and factory measuring laboratories. These measurements guarantee the error of the result with a certain probability not exceeding a certain predetermined value.

· Technical measurements, in which the error of the result is determined by the characteristics of the measuring instruments. Examples of technical measurements are measurements performed during the production process in industrial enterprises, in the service sector, etc.

In relation to the change in the measured quantity

Dynamic and static.

Based on measurement results

· Absolute measurement - a measurement based on direct measurements of one or more basic quantities and (or) the use of the values ​​of physical constants.

· Relative measurement - measurement of the ratio of a quantity to a quantity of the same name, which plays the role of a unit, or measurement of a change in a quantity in relation to the quantity of the same name, taken as the initial one.

Classification of measurement series

By accuracy

· Equal-precision measurements - results of the same type obtained when measuring with the same instrument or a device similar in accuracy, by the same (or similar) method and under the same conditions.

· Unequal measurements - measurements made when these conditions are violated.

3. Uncertainty and measurement error

Similar to errors, measurement uncertainties can be classified according to various criteria.

According to the method of expression, they are divided into absolute and relative.

Absolute measurement uncertainty -- measurement uncertainty expressed in units of the measured quantity.

Relative uncertainty of the measurement result -- the ratio of the absolute uncertainty to the measurement result.

1. Based on the source of measurement uncertainty, like errors, it can be divided into instrumental, methodological and subjective.

2. Based on the nature of their manifestation, errors are divided into systematic, random and gross. In the "Guide to the Expression of Uncertainty in Measurement" there is no classification of uncertainties on this basis. At the very beginning of that document it is stated that, before statistical processing of measurement series, all known systematic errors must be excluded from them. Therefore, the division of uncertainties into systematic and random was not introduced. Instead, uncertainties are divided into two types according to the method of their estimation:

* uncertainty assessed by Type A (Type A uncertainty) -- uncertainty that is assessed by statistical methods;

* uncertainty assessed by Type B (Type B uncertainty) -- uncertainty that is not assessed by statistical methods.

Accordingly, two assessment methods are proposed:

1. assessment by type A - obtaining statistical estimates based on the results of a number of measurements,

2. Type B assessment - obtaining estimates based on a priori non-statistical information.
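As a rough illustration of the two evaluation methods (a sketch with hypothetical numbers, not taken from the Guide itself), the following Python fragment evaluates a Type A standard uncertainty from repeated readings and a Type B standard uncertainty from an instrument's error limit, assuming a rectangular distribution within that limit:

import math
import statistics

# Type A: statistical evaluation from n repeated readings (hypothetical data, volts)
readings = [10.02, 10.05, 9.98, 10.01, 10.03]
mean = statistics.mean(readings)
u_a = statistics.stdev(readings) / math.sqrt(len(readings))   # standard uncertainty of the mean

# Type B: a priori evaluation from the instrument's maximum permissible error +/- a,
# assuming a rectangular (uniform) distribution of the error within these limits
a = 0.05                     # hypothetical error limit of the voltmeter, volts
u_b = a / math.sqrt(3)

print(f"result = {mean:.3f} V, u_A = {u_a:.4f} V, u_B = {u_b:.4f} V")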

At first glance, it seems that this innovation consists only in replacing the existing terms for known concepts with new ones. Indeed, only a random error can be estimated by statistical methods, so Type A uncertainty is what was previously called random error. Similarly, the non-excluded systematic error (NSP) can only be estimated on the basis of a priori information, and therefore there is also a one-to-one correspondence between Type B uncertainty and the NSP.

However, the introduction of these concepts is quite reasonable. The fact is that when measurements are made using complex procedures involving a large number of sequentially performed operations, a large number of sources of uncertainty in the final result have to be evaluated and taken into account. In this situation, dividing them into NSP and random components may prove misleading. Let us give two examples.

Example 1. A significant part of the uncertainty of an analytical measurement may be the uncertainty of determining the calibration dependence of the instrument, which at the time of measurement is an NSP. It must therefore be estimated from a priori information by non-statistical methods. However, in many analytical measurements the main source of this uncertainty is the random weighing error made when preparing the calibration mixture. To increase the accuracy of measurements, this reference sample can be weighed repeatedly and the weighing error estimated by statistical methods. This example shows that in some measurement technologies, in order to improve the accuracy of the measurement result, a number of systematic components of measurement uncertainty can be estimated by statistical methods, i.e. they can be Type A uncertainties.

Example 2. For a number of reasons, for example in order to save production costs, the measurement procedure provides for no more than three single measurements of one quantity. In this case, the measurement result can be determined as the arithmetic mean, the mode or the median of the values obtained, but statistical estimates of uncertainty for such a small sample will be very rough. It is more reasonable to calculate the measurement uncertainty a priori, from the standardized accuracy characteristics of the measuring instrument, i.e. to evaluate it by Type B. Consequently, in this example, unlike the previous one, the uncertainty of the measurement result, a significant part of which is due to factors of a random nature, is a Type B uncertainty.

At the same time, the traditional division of errors into systematic, NSP and random also does not lose its significance, since it more accurately reflects other characteristics: the nature of the manifestation as a result of measurement and the causal relationship with the effects that are sources of errors.

Thus, the classifications of uncertainties and measurement errors are not alternative and complement each other.
There are also some other terminological innovations in the Guide. Below is a summary table of terminological differences between the concept of uncertainty and the classical theory of accuracy.

Approximate analogues between the terms of the uncertainty concept and those of the classical theory of accuracy

Classical theory | Uncertainty concept
Error of the measurement result | Uncertainty of the measurement result
Random error | Uncertainty assessed by Type A
Non-excluded systematic error (NSP) | Uncertainty assessed by Type B
RMS deviation (standard deviation) of the measurement result error | Standard uncertainty of the measurement result
Confidence limits of the measurement result | Expanded uncertainty of the measurement result
Confidence probability | Coverage probability
Quantile (coefficient) of the error distribution | Coverage factor

The new terms listed in this table have the following definitions.

1. Standard uncertainty -- uncertainty expressed as a standard deviation.

2. Expanded uncertainty -- a quantity defining an interval around the measurement result within which the greater part of the distribution of values that could reasonably be attributed to the measured quantity is expected to lie.

Notes

1. Each value of expanded uncertainty is associated with the value of its coverage probability P.

2. An analogue of expanded uncertainty is the confidence limits of measurement error.

3. Coverage probability -- the probability which, in the opinion of the experimenter, corresponds to the expanded uncertainty of the measurement result.

Notes

1. An analogue of this term is the confidence probability corresponding to the confidence limits of error.

2. The coverage probability is selected taking into account information about the type of uncertainty distribution law.
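To tie these terms together, here is a short continuation of the earlier hypothetical voltmeter sketch: the Type A and Type B standard uncertainties are combined into a combined standard uncertainty, which is then multiplied by a coverage factor k = 2 (for an approximately normal distribution this corresponds to a coverage probability of roughly 95 %):

import math

u_a = 0.012    # hypothetical Type A standard uncertainty, volts
u_b = 0.029    # hypothetical Type B standard uncertainty, volts

# Combined standard uncertainty of uncorrelated components (root sum of squares)
u_c = math.sqrt(u_a**2 + u_b**2)

# Expanded uncertainty U = k * u_c
k = 2
U = k * u_c

print(f"u_c = {u_c:.3f} V, U = {U:.3f} V (k = {k})")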

4. Fundamentals of constructing systems of units of physical quantities

Systems of units of physical quantities

The basic principle of constructing a system of units is convenience of use. To ensure this, some units are chosen arbitrarily. The arbitrariness lies both in the choice of the units themselves (the base units of physical quantities) and in the choice of their size. For this reason, by defining the base quantities and their units differently, very different systems of units of physical quantities can be constructed. It should be added that derived units of physical quantities can also be defined in different ways. This means that many systems of units can be built. Let us dwell on the features common to all systems.

The main common feature is a clear definition of the essence and physical meaning of the base physical quantities and units of the system. It is desirable, although, as stated in the previous section, not necessary, that the base physical quantity can be reproduced with high accuracy and that its size can be transferred to measuring instruments with minimal loss of accuracy.

The next important step in building a system is to establish the size of the main units, that is, agree and legislate the procedure for reproducing the main unit.

Since all physical phenomena are interconnected by laws written in the form of equations expressing relationships between physical quantities, when establishing derived units it is necessary to select a defining relation for the derived quantity. The coefficient of proportionality entering the defining relation is then set equal to some constant number. In this way a derived unit is formed, which can be given the following definition: "A derived unit of a physical quantity is a unit whose size is related to the sizes of the base units by relations expressing physical laws or definitions of the corresponding quantities."

When constructing a system of units consisting of basic and derived units, two most important points should be emphasized:

First, the division of units of physical quantities into base and derived units does not mean that the former have any advantage over, or are more important than, the latter. In different systems the base units can be different, and the number of base units in a system can also differ.

Secondly, one should distinguish between equations of connection between quantities and equations of connection between their numerical values. The equations of connection between quantities are relations in general form, independent of the choice of units. The equations of connection between numerical values can take different forms depending on the units chosen for each of the quantities. For example, if the meter, the kilogram (mass) and the second are chosen as base units, the relations between mechanical derived units such as force, work, energy, speed, etc. will differ from those obtained if the centimeter, gram and second, or the meter, tonne and second, are chosen as base units.
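A simple numerical illustration of this point (a sketch, not from the original text): the same force has different numerical values in the SI (meter, kilogram, second) and the CGS (centimeter, gram, second) systems, because the derived unit of force differs between the two systems.

# The same physical force expressed in two systems of units
m_kg, a_ms2 = 2.0, 3.0                 # mass in kg, acceleration in m/s^2
F_newtons = m_kg * a_ms2               # F = m*a in SI units -> newtons (kg*m/s^2)

m_g, a_cms2 = m_kg * 1e3, a_ms2 * 1e2  # the same mass and acceleration in CGS units
F_dynes = m_g * a_cms2                 # F = m*a in CGS units -> dynes (g*cm/s^2)

print(F_newtons, "N")                  # 6.0 N
print(F_dynes, "dyn")                  # 600000.0 dyn  (1 N = 1e5 dyn)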

When characterizing various systems of units of physical quantities, it should be remembered that the first step in building such systems was an attempt to relate the base units to quantities found in nature. Thus, during the era of the Great French Revolution, in 1790-1791, it was proposed to take one forty-millionth of the Earth's meridian as the unit of length. In 1799 this unit was legalized in the form of a prototype meter, a special platinum ruler with divisions. At the same time, the kilogram was defined as the weight of one cubic decimeter of water at 4 °C. To store the kilogram, a model weight was made: the prototype of the kilogram. As the unit of time, 1/86400 of the mean solar day was legalized.

Subsequently, the natural reproduction of these values ​​had to be abandoned, since the reproduction process is associated with large errors. These units were established by law according to the characteristics of their prototypes, namely:

· the unit of length was defined as the distance between the axes of the lines on the platinum-iridium prototype of the meter at 0 °C;

· mass unit - mass of the platinum-iridium prototype kilogram;

· unit of force - the weight of the same weight at the place of its storage at the International Bureau of Weights and Measures (BIPM) in Sevres (Paris area);

· unit of time - the sidereal second, equal to 1/86400 of a sidereal day. Since, owing to the revolution of the Earth around the Sun, a year contains one more sidereal day than solar days, a sidereal second equals 0.99726957 of a solar second.

This basis of all modern systems of units of physical quantities has been preserved to this day. Thermal (the kelvin), electrical (the ampere), optical (the candela) and chemical (the mole) units were added to the mechanical base units. It should be added that the development of measuring technology, and in particular the discovery of lasers and their introduction into measurements, made it possible to find and legalize new, very accurate ways of reproducing the base units of physical quantities. We shall dwell on such points in the following sections devoted to individual types of measurements.

Here we will briefly list the most commonly used systems of units in the natural sciences of the 20th century, some of which still exist in the form of non-systemic or slang units.

In Europe, three systems of units have been widely used over recent decades: CGS (centimeter, gram, second), MKGSS (meter, kilogram-force, second) and the SI system, which is the principal international system and is preferred on the territory of the former USSR "in all fields of science, technology and the national economy, as well as in teaching."

The last quotation is from the USSR state standard GOST 9867-61 "International System of Units", which came into force on 1 January 1963. We shall consider this system in more detail in the next paragraph. Here we merely note that the basic mechanical units in the SI system are the meter, the kilogram (mass) and the second.

The CGS system has existed for over a hundred years and is very convenient in some scientific and engineering fields. The main advantage of the CGS system is the logic and consistency of its construction: when describing electromagnetic phenomena it contains only one constant, the speed of light. The system was developed between 1861 and 1870 by the British Committee on Electrical Standards. The CGS system was based on the system of units of the German mathematician Gauss, who proposed a method of constructing a system on the basis of three base units: length, mass and time. In his system Gauss used the millimeter, the milligram and the second.

For electrical and magnetic quantities, two different versions of the CGS system were proposed: the absolute electrostatic system CGSE and the absolute electromagnetic system CGSM. In total, in the course of development of the CGS system, there were seven different systems having the centimeter, gram and second as their base units.

At the end of the 19th century the MKGSS system appeared, whose base units were the meter, the kilogram-force and the second. This system became widespread in applied mechanics, heat engineering and related fields. It has many shortcomings, beginning with the confusion in the name of the base unit, the kilogram, which here meant kilogram-force as opposed to the widely used kilogram-mass. There was not even a name for the unit of mass in the MKGSS system; it was simply designated as a technical unit of mass (t.e.m.). Nevertheless, the MKGSS system is still partially used, at least in specifying engine power in horsepower. Horsepower, a power equal to 75 kgf·m/s, is still used in technology as an informal unit.

In 1919, the MTS system was adopted in France - meter, ton, second. This system was also the first Soviet standard for mechanical units, adopted in 1929.

In 1901 the Italian physicist G. Giorgi proposed a system of mechanical units built on three base mechanical units: the meter, the kilogram (mass) and the second. The advantage of this system was that it was easy to relate to the absolute practical system of electrical and magnetic units, since the units of work (the joule) and of power (the watt) were the same in both. Thus an opportunity was found to combine the advantages of the comprehensive and convenient CGS system with the desire to "stitch together" electrical and magnetic units with the mechanical ones.

This was achieved by introducing two constants: the electric constant (permittivity of vacuum, ε0) and the magnetic constant (permeability of vacuum, μ0). There is some inconvenience in writing the formulas that describe the forces of interaction between stationary and moving electric charges and, accordingly, in interpreting the physical meaning of these constants. However, these shortcomings are largely compensated by such conveniences as the unified expression of energy in describing both mechanical and electromagnetic phenomena, because

1 joule = 1 newton·meter = 1 volt·coulomb = 1 ampere·weber.

As a result of the search for an optimal version of the international system of units, in 1948 the IX General Conference on Weights and Measures, based on a survey of the member countries of the Metre Convention, adopted a proposal to take the meter, the kilogram (mass) and the second as the base units and to exclude the kilogram-force and the derived units related to it. The final decision, based on the results of a survey of 21 countries, was formulated at the X General Conference on Weights and Measures in 1954.

The resolution read:

“As the basic units of a practical system for international relations, accept:

unit of length - meter

unit of mass - kilogram

unit of time - second

unit of current - Ampere

unit of thermodynamic temperature - degree Kelvin

unit of luminous intensity - a candle."

Later, at the insistence of chemists, the international system was supplemented by the seventh basic unit of quantity of a substance - the mole.

Later the international system SI (Système International) was refined somewhat; for example, the unit of temperature was renamed the kelvin instead of the "degree Kelvin", and the system of electrical unit standards was reoriented from the ampere to the volt, since a standard of potential difference had been created on the basis of a quantum effect, the Josephson effect, which reduced the error of reproducing the unit of potential difference, the volt, by more than an order of magnitude. In 1983, the XVII General Conference on Weights and Measures adopted a new definition of the meter: the meter is the distance travelled by light in 1/299 792 458 of a second. This definition, or rather redefinition, was needed in connection with the introduction of lasers into standards technology. It should be noted at once that the size of the unit, in this case the meter, does not change; only the methods and means of its reproduction change, characterized by a smaller error (greater accuracy).
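Because this definition fixes the speed of light exactly, realizing the meter reduces to a time measurement. A small illustrative Python sketch (not from the original text):

# The meter via the fixed speed of light: length follows from a time-of-flight measurement
C = 299_792_458                      # speed of light in vacuum, m/s (exact by definition)

def distance_m(time_of_flight_s: float) -> float:
    # Distance travelled by light in vacuum during the given time
    return C * time_of_flight_s

print(distance_m(1 / 299_792_458))   # 1.0 m (the definition reproduced numerically)
print(distance_m(1e-9))              # about 0.2998 m per nanosecond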

5. International System of Units (SI)

The development of science and technology increasingly demanded the unification of units of measurement. A unified system of units was required, convenient for practical use and covering the various fields of measurement. In addition, it had to be coherent. Since the metric system of measures had been widely used in Europe since the beginning of the 19th century, it was taken as the basis for the transition to a unified international system of units.

In 1960, the XI General Conference on Weights and Measures approved the International System of Units of physical quantities (Russian designation СИ, international SI), based on six base units. It was decided:

Give the system based on six basic units the name “International System of Units”;

Set an international abbreviation for the name of the SI system;

Enter a table of prefixes for the formation of multiples and submultiples;

Create 27 derived units, indicating that other derived units can be added.

In 1971, a seventh base unit of quantity of matter (the mole) was added to the SI.

When constructing the SI, we proceeded from the following basic principles:

The system is based on basic units that are independent of each other;

Derived units are formed using the simplest coupling equations, and only one SI unit is established for each kind of quantity;

The system is coherent;

Along with SI units, non-system units widely used in practice are allowed;

The system includes decimal multiples and submultiples.

Advantages of the SI:

- versatility, because it covers all measurement areas;

- unification of units for all types of measurements: one unit is used for a given physical quantity, for example for pressure, work, energy;

- SI units are of a size convenient for practical use;

- the transition to the SI raises the level of measurement accuracy, because the base units of this system can be reproduced more accurately than those of other systems;

- it is a single international system and its units are in common use.

In the USSR, the International System (SI) was introduced by GOST 8.417-81. As SI continued to develop, the class of supplementary units was removed from it, a new definition of the meter was introduced, and a number of other changes were introduced. Currently, the Russian Federation has an interstate standard GOST 8.417-2002, which establishes the units of physical quantities used in the country. The standard states that SI units, as well as decimal multiples and submultiples of these units, are subject to mandatory use.

In addition, it is allowed to use some non-SI units and their submultiples and multiples. The standard also specifies non-systemic units and units of relative quantities.

The base SI units are presented in the table below.

Quantity | Dimension symbol | Unit name | Unit designation (international)
Length | L | meter | m
Mass | M | kilogram | kg
Time | T | second | s
Electric current | I | ampere | A
Thermodynamic temperature | Θ | kelvin | K
Amount of substance | N | mole | mol
Luminous intensity | J | candela | cd

Derived SI units are formed according to the rules for forming coherent derived units (see the example above). Examples of such units, and of derived units having special names and designations, are given below. 21 derived units have been given names and designations derived from the names of scientists, for example the hertz, newton, pascal and becquerel.

A separate section of the standard provides units not included in the SI. These include:

1. Non-system units, allowed for use on a par with SI due to their practical importance. They are divided into areas of application. For example, in all areas the units used are ton, hour, minute, day, liter; in optics diopter, in physics electron-volt, etc.

2. Some relative and logarithmic quantities and their units, for example the percent, the ppm and the bel.

3. Non-systemic units, temporarily allowed for use. For example, nautical mile, carat (0.2 g), knot, bar.

A separate section gives rules for writing unit designations, for using unit designations in the headings of table columns, etc.

The appendices to the standard contain rules for forming coherent derived SI units, a table of relations between certain non-systemic units and SI units, and recommendations on the choice of decimal multiples and submultiples.

The following are examples of some derived SI units.

Units whose names include the names of base units. Examples: the unit of area is the square meter, dimension L², unit designation m²; the unit of flux of ionizing particles is the second to the minus first power, dimension T⁻¹, unit designation s⁻¹.

Units having special names. Examples:

force, weight - the newton, dimension LMT⁻², unit designation N (international N); energy, work, amount of heat - the joule, dimension L²MT⁻², designation J (international J).

Units whose names are formed using special names. Examples:

moment of force - the newton meter, dimension L²MT⁻², designation N·m (international N·m); specific energy - the joule per kilogram, dimension L²T⁻², designation J/kg (international J/kg).

Decimal multiples and submultiples are formed using multiplying factors and prefixes, from 10²⁴ (yotta) to 10⁻²⁴ (yocto).

Attaching two or more prefixes in a row to the name of a unit is not allowed: for example, one writes not "kilokilogram" but tonne, a non-systemic unit permitted for use alongside the SI. Because the name of the base unit of mass already contains the prefix "kilo", submultiple and multiple units of mass are formed from the submultiple unit gram, and prefixes are attached to the word "gram": milligram, microgram.

The choice of a decimal multiple or submultiple of an SI unit is dictated primarily by convenience of use; moreover, the numerical values obtained should be acceptable in practice. It is considered that numerical values are perceived most easily in the range from 0.1 to 1000.

In some areas of activity, the same submultiple or multiple unit is always used, for example, in mechanical engineering drawings, dimensions are always expressed in millimeters.

To reduce the likelihood of errors in calculations, it is recommended to substitute decimal multiples and submultiples only in the final result; during the calculation itself, all quantities should be expressed in SI units, with the prefixes replaced by powers of 10.
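A small sketch of this recommendation (with hypothetical numbers): the calculation is carried out entirely in SI units, and a convenient prefix is attached only to the final result.

# Compute a stress in base SI units, then report it with a convenient prefix
force_kN = 12.5                  # given in kilonewtons
area_mm2 = 250.0                 # given in square millimetres

force_N = force_kN * 10**3       # kN   -> N
area_m2 = area_mm2 * 10**-6      # mm^2 -> m^2

stress_Pa = force_N / area_m2    # calculation done in coherent SI units (pascals)
print(f"{stress_Pa:.3e} Pa = {stress_Pa / 10**6:.1f} MPa")   # prefix only in the final result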

GOST 8.417-2002 gives rules for writing unit designations, the main ones being as follows.

Letters or signs are used as unit designations, and two types of letter designations are established: international and Russian. International designations are used in relations with foreign countries (contracts, supply of products and documentation). On the territory of the Russian Federation, Russian designations are used. At the same time, only international designations are used on the plates, scales and dials of measuring instruments.

The names of units are written with a small letter unless they appear at the beginning of a sentence. The exception is degrees Celsius.

A dot is not used in unit designations as a sign of abbreviation; designations are printed in upright (roman) type. Exceptions are abbreviations of words that form part of the name of a unit but are not themselves names of units, for example mm Hg.

Unit designations are placed after the numerical values, on the same line as them (without carrying over to the next line). A space is left between the last digit and the designation, except for designations in the form of a sign raised above the line.

When the value of a quantity is given with maximum deviations, the numerical values should be enclosed in brackets and the unit designation placed after the brackets, or the unit designation should be given both after the numerical value of the quantity and after its maximum deviation.

Letter designations of units entering into a product should be separated by dots on the mid-line, as multiplication signs. It is allowed to separate letter designations by spaces if this cannot cause misunderstanding. Geometric dimensions are indicated with the sign "×".

In letter designations of ratios of units, only one stroke may be used as the division sign: either oblique or horizontal. It is permitted to use unit designations in the form of a product of unit designations raised to powers.

When an oblique stroke is used, the unit designations in the numerator and the denominator are placed on one line, and a product of designations in the denominator is enclosed in brackets.

When specifying a derived unit consisting of two or more units, it is not allowed to combine letter designations and names of units, i.e. to give designations for some and names for others.

The designations of units whose names are derived from the names of scientists are written with a capital letter.

It is allowed to use unit designations in explanations of quantity designations for formulas. Placing unit designations on the same line with formulas expressing relationships between quantities and their numerical values ​​presented in letter form is not allowed.

The standard highlights units by areas of knowledge in physics and the recommended multiples and submultiples are indicated. There are 9 areas of use of units:

1. space and time;

2. periodic and related phenomena;


Test

Discipline: "Electrical measurements"


Introduction
1. Measurement of electrical circuit resistance and insulation
2. Measurement of active and reactive power
3. Measurement of magnetic quantities
References
Introduction

Problems of magnetic measurements. The field of electrical measuring technology that deals with measurements of magnetic quantities is usually called magnetic measurements. With the help of the methods and equipment of magnetic measurements, a wide variety of problems are currently being solved. The main ones include the following: measurement of magnetic quantities (magnetic induction, magnetic flux, magnetic moment, etc.); determination of the characteristics of magnetic materials; study of electromagnetic mechanisms; measurement of the magnetic field of the Earth and other planets; study of the physical and chemical properties of materials (magnetic analysis); study of the magnetic properties of the atom and the atomic nucleus; detection of defects in materials and products (magnetic flaw detection), etc. Despite the variety of problems solved by means of magnetic measurements, usually only a few basic magnetic quantities are determined. Moreover, in many methods of measuring magnetic quantities it is actually not the magnetic quantity itself that is measured, but an electrical quantity into which the magnetic quantity is converted during the measurement process; the magnetic quantity of interest is then determined by calculation from the known relationships between magnetic and electrical quantities. The theoretical basis of such methods is Maxwell's second equation, which relates the magnetic field to the electric field; these fields are two manifestations of a special kind of matter known as the electromagnetic field. Other (not only electrical) manifestations of the magnetic field, for example mechanical and optical ones, are also used in magnetic measurements. This chapter introduces the reader only to some of the ways of determining the basic magnetic quantities and the characteristics of magnetic materials.
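As one illustration of converting a magnetic quantity into an electrical one (a sketch with assumed numbers, not part of the original text): the change of flux linked with a search coil can be obtained by integrating the EMF induced in the coil while the flux changes, following Faraday's law e = -N dΦ/dt.

# Estimate a change of magnetic flux from the EMF induced in a search coil
N = 200                        # number of turns of the search coil
dt = 1e-4                      # sampling interval, s
emf = [0.0, 0.8, 1.5, 1.2, 0.6, 0.1, 0.0]   # sampled induced EMF, volts (hypothetical)

# Faraday's law: e = -N * dPhi/dt  =>  delta_Phi = -(1/N) * integral of e dt
integral = sum(0.5 * (emf[i] + emf[i + 1]) * dt for i in range(len(emf) - 1))  # trapezoidal rule
delta_phi = -integral / N      # change of flux, webers

print(f"flux change = {delta_phi:.3e} Wb")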

1. Measurement of electrical circuit resistance and insulation

Measuring instruments

Insulation measuring instruments include megohmmeters: ESO 202, F4100, M4100/1-M4100/5, M4107/1, M4107/2, F4101, F4102/1, F4102/2, BM200/G and others, produced by domestic and foreign companies. Insulation resistance is measured with megohmmeters (rated 100-2500 V), with the measured values expressed in ohms, kilohms and megohms.

1. Insulation resistance measurements may be performed by trained electrical personnel who hold a certificate of knowledge testing and an electrical safety qualification group of at least III when performing measurements in installations up to 1000 V, and of at least IV when measuring in installations above 1000 V.

2. Persons from electrical engineering personnel with secondary or higher specialized education may be allowed to process measurement results.

3. Analysis of measurement results should be carried out by personnel involved in the insulation of electrical equipment, cables and wires.

Safety requirements

1. When performing insulation resistance measurements, safety requirements must be met in accordance with GOST 12.3.019-80, GOST 12.2.007-75, the Rules for the Operation of Consumer Electrical Installations and the Safety Rules for the Operation of Consumer Electrical Installations.

2. The premises used for insulation measurements must meet the explosion and fire safety requirements in accordance with GOST 12.1.004-91.

3. Measuring instruments must meet the safety requirements in accordance with GOST 22261-82.

4. Megohmmeter measurements may be carried out only by trained electrical personnel. In installations with voltages above 1000 V, measurements are carried out by two persons, one of whom must have an electrical safety group of at least IV; carrying out measurements during installation or repair is specified in the work order in the line "Entrusted". In installations with voltages up to 1000 V, measurements are carried out by order by two persons, one of whom must have a group of at least III. An exception is the tests specified in clause BZ.7.20.

5. Measuring the insulation of a line that can receive voltage from both sides is permitted only after a message has been received by telephone, messenger, etc. (with a reverse check) from the person responsible for the electrical installation connected to the other end of the line, confirming that the line disconnectors and circuit breaker are switched off and that a "Do not switch on. People are working" sign has been posted.

6. Before starting the tests, make sure that no one is working on the part of the electrical installation to which the test device is connected, prohibit persons in its vicinity from touching live parts and, if necessary, post a guard.

7. Megohmmeter measurements to monitor the insulation condition of electrical machines, carried out in accordance with methodological instructions or measurement programs on a stopped machine or on a rotating but unexcited machine, may be performed by operating personnel or, on their instructions, as part of routine operation by employees of the electrical laboratory. Under the supervision of operating personnel these measurements may also be performed by maintenance personnel. Insulation tests of rotors, armatures and excitation circuits may be carried out by one person with an electrical safety group of at least III; stator insulation tests must be carried out by at least two persons, one of whom must have a group of at least IV and the other of at least III.

8. When working with a megohmmeter, touching the live parts to which it is connected is prohibited. After completion of the work, the residual charge must be removed from the equipment under test by briefly grounding it. The person removing the residual charge must wear dielectric gloves and stand on an insulated base.

9. Taking measurements with a megohmmeter is prohibited: on one circuit of a double-circuit line with a voltage above 1000 V while the other circuit is energized; on a single-circuit line if it runs parallel to a working line with a voltage above 1000 V; during a thunderstorm or when one is approaching.

10. Insulation resistance is measured with a megohmmeter on disconnected current-carrying parts from which the residual charge has been removed by preliminary grounding. The grounding should be removed from the current-carrying parts only after the megohmmeter has been connected. When removing the grounding, dielectric gloves must be used.

Measurement conditions

1. Insulation measurements must be carried out under normal climatic conditions in accordance with GOST 15150-85 and under normal power supply conditions, or as specified in the manufacturer's data sheet and technical description for the megohmmeter.

2. The electrical insulation resistance of the connecting wires of the measuring circuit must be at least 20 times the minimum permissible electrical insulation resistance of the product under test (a simple check of this condition is sketched after this list).

3. The measurement is carried out indoors at a temperature of 25±10 °C and a relative air humidity of no more than 80%, unless other conditions are provided in the standards or technical specifications for cables, wires, cords and equipment.
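Condition 2 above is easy to verify before a test. The following sketch checks that the insulation resistance of the connecting wires is at least 20 times the minimum permissible insulation resistance of the product under test; the numerical values in the example are hypothetical.

def wires_adequate(wire_insulation_mohm, product_min_permissible_mohm, factor=20):
    # Condition 2: connecting-wire insulation must be at least `factor` times
    # the minimum permissible insulation resistance of the product under test.
    return wire_insulation_mohm >= factor * product_min_permissible_mohm

# Example: the product must show at least 0.5 MOhm and the wires measure 50 MOhm,
# so the measuring circuit is adequate (50 >= 20 * 0.5).
print(wires_adequate(wire_insulation_mohm=50.0, product_min_permissible_mohm=0.5))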

Preparing to take measurements

In preparation for performing insulation resistance measurements, the following operations are carried out:

1. Check the climatic conditions at the measurement site (measure the temperature and humidity) and the explosion and fire hazard classification of the room, in order to select a megohmmeter suitable for those conditions.

2. Check by external inspection the condition of the selected megohmmeter and the connecting conductors, and verify the operability of the megohmmeter in accordance with its technical description.

3. Check that the state verification of the megohmmeter is still valid.

4. Preparation of measurements of cable and wire samples is carried out in accordance with GOST 3345-76.

5. When performing periodic preventive work in electrical installations, as well as work at reconstructed facilities, the workplace is prepared by the electrical personnel of the enterprise where the work is performed, in accordance with the PTBEEEP and PEEP rules.

Taking measurements

1. The reading of the electrical insulation resistance is taken 1 minute after the measuring voltage is applied to the sample, but not later than 5 minutes after, unless other requirements are given in the standards or technical specifications for the specific cable product or other equipment being measured (the timing and decision rules of this section are summarized in the sketch after this list).

Before re-measurement, all metal elements of the cable product must be grounded for at least 2 minutes.

2. The electrical insulation resistance of individual cores of single-core cables, wires and cords must be measured:

for products without a metal sheath, screen and armor - between the conductor and the metal rod or between the conductor and grounding;

for products with a metal sheath, screen and armor - between the current-carrying conductor and the metal sheath, screen or armor;

3. The electrical insulation resistance of multi-core cables, wires and cords must be measured:

for products without a metal sheath, screen and armor - between each current-carrying conductor and the remaining conductors connected to each other, or between each current-carrying conductor and the remaining conductors connected to each other and to ground;

for products with a metal sheath, screen and armor - between each current-carrying conductor and the remaining conductors connected to each other and to the metal sheath, screen or armor.

4. If the insulation resistance of cables, wires and cords is lower than the values required by the PUE, PEEP or GOST standards, the measurement must be repeated with the cables, wires and cords disconnected from the consumer terminals and with the current-carrying conductors separated.

5. When measuring the insulation resistance of individual samples of cables, wires and cords, the samples must be taken from manufacturing lengths wound on drums or in coils, or must be at least 10 m long excluding the end cuts, unless other lengths are specified in the standards or technical specifications for the cables, wires and cords. The number of manufacturing lengths and samples to be measured must be specified in the standards or technical specifications for the cables, wires and cords.
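The timing and decision rules from items 1 and 4 above can be summarized in a short procedural sketch. All the callables (applying voltage, reading the instrument, grounding, disconnecting) are hypothetical placeholders; only the 1-minute reading delay, the 2-minute grounding before a repeat measurement and the repeat-when-below-norm rule come from the items above.

import time

READ_DELAY_S = 60       # item 1: the reading is taken 1 minute after voltage is applied
                        # (and, per item 1, not later than 5 minutes after)
REGROUND_TIME_S = 120   # metal elements are grounded for at least 2 minutes before a repeat

def measure_insulation(apply_voltage, read_mohm, ground_sample,
                       disconnect_and_separate, min_permissible_mohm):
    # Sketch of the measurement sequence; every callable is a hypothetical placeholder.
    apply_voltage()
    time.sleep(READ_DELAY_S)           # wait 1 minute before taking the reading
    r_mohm = read_mohm()
    ground_sample()                    # remove residual charge after the test
    if r_mohm < min_permissible_mohm:  # item 4: result below PUE/PEEP/GOST norms
        disconnect_and_separate()      # disconnect from consumer terminals and
                                       # separate the current-carrying conductors
        time.sleep(REGROUND_TIME_S)    # ground for at least 2 minutes before repeating
        apply_voltage()
        time.sleep(READ_DELAY_S)
        r_mohm = read_mohm()
        ground_sample()
    return r_mohm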

Minsk: BNTU, 2003. - 116 pp.
Introduction.
Classification of physical quantities.
Size of physical quantities. The true value of physical quantities.
The main postulate and axiom of measurement theory.
Theoretical models of material objects, phenomena and processes.
Physical models.
Mathematical models.
Errors of theoretical models.
General characteristics of the concept of measurement (information from metrology).
Classification of measurements.
Measurement as a physical process.
Measurement methods as methods of comparison with a measure.
Direct comparison methods.
Direct assessment method.
Direct conversion method.
Substitution method.
Scale transformation methods.
Bypass method.
Follow-up balancing method.
Bridge method.
Difference method.
Null methods.
Unfolding compensation method.
Measuring transformations of physical quantities.
Classification of measuring transducers.
Static characteristics and static errors of SI.
Characteristics of the impact (influence) of the environment and objects on SI.
Bands and uncertainty intervals of SI sensitivity.
SI with additive error (zero error).
SI with multiplicative error.
SI with additive and multiplicative errors.
Measuring large quantities.
Formulas for static errors of measuring instruments.
Full and working ranges of measuring instruments.
Dynamic errors of measuring instruments.
Dynamic error of the integrating link.
Causes of additive SI errors.
The influence of dry friction on the moving elements of the SI.
SI design.
Contact potential difference and thermoelectricity.
Contact potential difference.
Thermoelectric current.
Interference due to poor grounding.
Causes of SI multiplicative errors.
Aging and instability of SI parameters.
Nonlinearity of the transformation function.
Geometric nonlinearity.
Physical nonlinearity.
Leakage currents.
Active and passive protection measures.
Physics of random processes that determine the minimum measurement error.
Capabilities of the human visual organs.
Natural limits of measurements.
Heisenberg uncertainty relations.
Natural spectral width of emission lines.
The absolute limit on the accuracy of measuring the intensity and phase of electromagnetic signals.
Photon noise of coherent radiation.
Equivalent noise radiation temperature.
Electrical interference, fluctuations and noise.
Physics of internal nonequilibrium electrical noise.
Shot noise.
Generation-recombination noise.
1/f noise and its versatility.
Impulse noise.
Physics of internal equilibrium noise.
Statistical model of thermal fluctuations in equilibrium systems.
Mathematical model of fluctuations.
The simplest physical model of equilibrium fluctuations.
Basic formula for calculating fluctuation dispersion.
The influence of fluctuations on the sensitivity threshold of devices.
Examples of calculating thermal fluctuations of mechanical quantities.
Free body speed.
Oscillations of a mathematical pendulum.
Rotations of an elastically suspended mirror.
Displacements of spring scales.
Thermal fluctuations in an electrical oscillatory circuit.
Correlation function and noise power spectral density.
Fluctuation-dissipation theorem.
Nyquist formulas.
Spectral density of voltage and current fluctuations in an oscillatory circuit.
Equivalent temperature of non-thermal noise.
External electromagnetic noise and interference and methods for their reduction.
Capacitive coupling (capacitive interference).
Inductive coupling (inductive interference).
Shielding conductors from magnetic fields.
Features of a conductive screen without current.
Features of a conductive screen with current.
Magnetic connection between a current-carrying screen and a conductor enclosed in it.
Using a current-carrying conductive screen as a signal conductor.
Protecting space from radiation from a current-carrying conductor.
Analysis of various signal circuit protection schemes by shielding.
Comparison of coaxial cable and shielded twisted pair.
Features of the screen in the form of a braid.
Influence of current inhomogeneity in the screen.
Selective shielding.
Suppression of noise in a signal circuit by its balancing method.
Additional noise reduction methods.
Power supply decoupling.
Decoupling filters.
Protection against radiation of high-frequency noisy elements and circuits.
Digital circuit noise.
Conclusions.
Application of screens made of thin sheet metals.
Near and far electromagnetic fields.
Shielding effectiveness.
Total characteristic impedance and shield resistance.
Absorption losses.
Reflection loss.
Total absorption and reflection losses for a magnetic field.
The influence of holes on shielding efficiency.
The influence of cracks and holes.
Using a waveguide at a frequency below the cutoff frequency.
Effect of round holes.
Use of conductive spacers to reduce radiation in gaps.
Conclusions.
Noise characteristics of contacts and their protection.
Glow discharge.
Arc discharge.
Comparison of AC and DC circuits.
Contact material.
Inductive loads.
Principles of contact protection.
Transient suppression for inductive loads.
Contact protection circuits for inductive loads.
Circuit with capacitance.
Circuit with capacitance and resistor.
Circuit with capacitance, resistor and diode.
Contact protection for resistive loads.
Recommendations for choosing contact protection circuits.
Passport details for contacts.
Conclusions.
General methods for increasing measurement accuracy.
Method of matching measuring transducers.
An ideal current generator and an ideal voltage generator.
Resistance matching of generator-type sources.
Resistance matching of parametric converters.
The fundamental difference between information and energy chains.
Use of matching transformers.
Negative feedback method.
Bandwidth reduction method.
Equivalent noise transmission bandwidth.
Signal averaging (accumulation) method.
Signal and noise filtering method.
Problems of creating an optimal filter.
Method of transferring the spectrum of a useful signal.
Phase detection method.
Synchronous detection method.
Error of noise integration using RC chain.
SI conversion coefficient modulation method.
Application of signal modulation to increase its noise immunity.
Method of differential inclusion of two power supplies.
Method for correcting SI elements.
Methods to reduce the influence of the environment and changing conditions.
Organization of measurements.

UDC 389.6. BBK 30.10ya7. K59.
Kozlov M.G. Metrology and Standardization: Textbook. Moscow; St. Petersburg: Petersburg Institute of Printing Publishing House, 2001. 372 pp. Print run: 1000 copies.

Reviewers: L.A. Konopelko, Doctor of Technical Sciences, Professor; V.A. Spaev, Doctor of Technical Sciences, Professor.

The book sets out the basics of the system for ensuring the uniformity of measurements, which are currently generally accepted on the territory of the Russian Federation. Metrology and standardization are considered as sciences built on scientific and technical legislation, a system for creating and storing standards of units of physical quantities, a service of standard reference data and a service of reference materials. The book contains information about the principles of creating measuring equipment, which is considered as an object of attention of specialists involved in ensuring the uniformity of measurements. Measuring equipment is categorized according to types of measurements based on the standards of the basic units of the SI system. The main provisions of the standardization and certification service in the Russian Federation are considered.

Recommended by UMO as a textbook for the following specialties: 281400 - “Printing Production Technology”, 170800 - “Automated Printing Equipment”, 220200 - “Automated Information Processing and Management Systems”

The original layout was prepared by the publishing house "Petersburg Institute of Printing"

ISBN 5-93422-014-4

© M.G. Kozlov, 2001. © N.A. Aksinenko, design, 2001. © Petersburg Printing Institute Publishing House, 2001.

http://www.hi-edu.ru/e-books/xbook109/01/index.html?part-002.htm

Preface

Part I. METROLOGY

1. Introduction to metrology

1.1. Historical aspects of metrology

1.2. Basic concepts and categories of metrology

1.3. Principles of constructing systems of units of physical quantities

1.4. Reproduction and transmission of the size of units of physical quantities. Standards and exemplary measuring instruments

1.5. Measuring instruments and installations

1.6. Measures in metrology and measuring technology. Verification of measuring instruments

1.7. Physical constants and standard reference data

1.8. Standardization to ensure uniformity of measurements. Metrological dictionary

2. Fundamentals of constructing systems of units of physical quantities

2.1. Systems of units of physical quantities

2.2. Dimension formulas

2.3. Basic SI units

2.4. The SI unit of length is the meter

2.5. The SI unit of time is the second

2.6. The SI unit of temperature is the kelvin

2.7. The SI unit of electric current is the ampere

2.8. Implementation of the SI base unit of luminous intensity, the candela

2.9. The SI unit of mass is the kilogram

2.10. The SI unit of amount of substance is the mole

3. Estimation of errors of measurement results

3.1. Introduction

3.2. Systematic errors

3.3. Random measurement errors

Part II. MEASURING TECHNOLOGY

4. Introduction to Measurement Technology

5. Measurements of mechanical quantities

5.1. Linear measurements

5.2. Roughness measurements

5.3. Hardness measurements

5.4. Pressure measurements

5.5. Mass and force measurements

5.6. Viscosity measurements

5.7. Density measurement

6. Temperature measurements

6.1. Temperature measurement methods

6.2. Contact thermometers

6.3. Non-contact thermometers

7. Electrical and magnetic measurements

7.1. Electrical measurements

7.2. Principles underlying magnetic measurements

7.3. Magnetic transducers

7.4. Instruments for measuring magnetic field parameters

7.5. Quantum magnetometric and galvanomagnetic devices

7.6. Induction magnetometric instruments

8. Optical measurements

8.1. General provisions

8.2. Photometric instruments

8.3. Spectral measuring instruments

8.4. Filter spectral devices

8.5. Interference spectral devices

9. PHYSICAL AND CHEMICAL MEASUREMENTS

9.1. Features of measuring the composition of substances and materials

9.2. Humidity measurements of substances and materials

9.3. Analysis of the composition of gas mixtures

9.4. Composition measurements of liquids and solids

9.5. Metrological support of physical and chemical measurements

Part III. STANDARDIZATION AND CERTIFICATION

10. Organizational and methodological foundations of metrology and standardization

10.1. Introduction

10.2. Legal basis of metrology and standardization

10.3. International organizations for standardization and metrology

10.4. Structure and functions of the bodies of the State Standard of the Russian Federation

10.5. State services for metrology and standardization of the Russian Federation

10.6. Functions of metrological services of enterprises and institutions that are legal entities

11. Basic provisions of the state standardization service of the Russian Federation

11.1. Scientific base of standardization of the Russian Federation

11.2. Bodies and services of standardization systems of the Russian Federation

11.3. Characteristics of standards of different categories

11.4. Catalogs and product classifiers as an object of standardization. Standardization of services

12. Certification of measuring equipment

12.1. Main goals and objectives of certification

12.2. Terms and definitions specific to certification

12.3. Certification systems and schemes

12.4. Mandatory and voluntary certification

12.5. Rules and procedure for certification

12.6. Accreditation of certification bodies

12.7. Service certification

Conclusion

Applications

Preface

The content of the concepts of "metrology" and "standardization" is still the subject of debate, although the need for a professional approach to these problems is obvious. Thus, in recent years numerous works have appeared in which metrology and standardization are presented merely as tools for the certification of measuring equipment, goods and services. With this way of posing the question, all the concepts of metrology are belittled and reduced to a set of rules, laws and documents intended to ensure the high quality of commercial products.

In fact, metrology and standardization have been a very serious scientific pursuit since the founding of the Depot of Exemplary Measures in Russia (1842), which was later transformed into the Main Chamber of Weights and Measures of Russia, headed for many years by the great scientist D.I. Mendeleev. Our country was one of the founders of the Metric Convention, adopted 125 years ago. During the Soviet years, a standardization system of the countries of mutual economic assistance was created. All this indicates that metrology and standardization have long been fundamental to the organization of the system of weights and measures in our country. It is these aspects that are enduring and deserve state support. With the development of market relations, the reputation of manufacturing companies should become a guarantee of the quality of goods, while metrology and standardization institutions should act as state scientific and methodological centers that hold the most accurate measuring instruments and the most promising technologies and employ the most qualified specialists.

In this book, metrology is considered as a field of science, primarily physics, which must ensure the uniformity of measurements at the state level. Simply put, in science there must be a system that allows representatives of different sciences, such as physics, chemistry, biology, medicine, geology, etc., to speak the same language and understand each other. The means to achieve this result are the components of metrology: systems of units, standards, reference materials, reference data, terminology, error theory, system of standards. The first part of the book is devoted to the basics of metrology.

The second part is devoted to a description of the principles of creating measuring equipment. The sections of this part are arranged according to the types of measurements as organized in the Gosstandart system of the Russian Federation: mechanical, temperature, electrical and magnetic, optical and physicochemical. Measuring technology is considered as an area of direct use of the achievements of metrology.

The third part of the book is a brief description of the essence of certification - the area of ​​activity of modern centers of metrology and standardization in our country. Since standards vary from country to country, there is a need to check all aspects of international cooperation (products, measuring equipment, services) against the standards of the countries where they are used.

The book is intended for a wide range of specialists working with specific measuring instruments in various fields of activity from trade to quality control of technological processes and environmental measurements. The presentation omits details of some sections of physics that do not have a defining metrological character and are available in the specialized literature. Much attention is paid to the physical meaning of using the metrological approach to solving practical problems. It is assumed that the reader is familiar with the basics of physics and has at least a general understanding of modern achievements of science and technology, such as laser technology, superconductivity, etc.

The book is intended for specialists who use certain instruments and are interested in providing the measurements they need in an optimal way. These are undergraduate and graduate students of universities who specialize in sciences based on measurements. I would like to see the presented material as a link between courses in general scientific disciplines and special courses on presenting the essence of modern production technologies.

The material is written based on a course of lectures on metrology and standardization given by the author at the St. Petersburg Institute of the Moscow State University of Printing Arts and at the St. Petersburg State University. This made it possible to adjust the presentation of the material, making it understandable for students of various specialties, from applicants to senior students.

The author's confidence that the material reflects the fundamental concepts of metrology and standardization is based on nearly fifteen years of personal work in the State Standard (Gosstandart) of the USSR and of the Russian Federation.