The development of the mechanical industry is reflected in the modernization of measurement technology, in the ability to implement interchangeable production, and in other aspects related to the geometric characteristics of mechanical parts, tolerancing, and mechanical measurement.
These directly reflect the quality of products and the competitiveness of companies.
Mechanical measurement plays a significant role in mechanical manufacturing and is a crucial factor in ensuring product quality and production efficiency. The importance of measurement technology can be reflected in a number of ways, including:
Control the production process:
Measuring technology provides a control method for mechanical manufacturing, making it more accurate and improving the quality of mechanical manufacturing.
Improve product quality:
Measuring technology accurately measures the suitability of product materials and manufacturing technology, thereby improving product quality.
Increase competitiveness:
Advanced measurement technology can increase production efficiency, reduce costs and increase the competitiveness of companies.
Realize smart manufacturing:
With the continuous development of science and technology, the measurement range has expanded: measurements can now be made from the nanometer scale up to several hundred meters, providing the precise data that smart manufacturing requires.
Improve the process level:
Measuring technology can check whether the processed parts meet the design dimensions, whether the assembly accuracy meets the target value, and ensure the stability and reliability of the production process.
I. Basic Measurement Tasks
- Determine units of measurement and reference points.
- Select measuring instruments and measurement methods.
- Analyze measurement errors and measurement accuracy.
In manufacturing, to guarantee product quality, guarantee the interchangeability of components, analyze the processing technology of parts and take preventive measures to avoid the production of waste, it is necessary to measure and inspect the dimensions, angles, geometric shapes, relative positions of geometric elements, surface roughness and other technical conditions of raw parts and components.
Measurement refers to the experimental process of comparing the measured quantity with a standard unit of measurement in order to determine its value.
Inspection only needs to determine whether a part is qualified, without measuring specific numerical values. "Testing" is the general term covering both measurement and inspection.
Geometric measurement mainly refers to the measurement of dimension parameters and surface geometric shapes of various mechanical components.
Geometric parameters include length dimensions, angle parameters, coordinate (position) dimensions, surface geometric shape and position parameters, surface roughness, etc. Geometric measurement is an important measurement to ensure the quality of mechanical products and achieve interchangeable production.
Geometric measurement objects are diverse, and different measurement objects have different measured quantities.
For example, the measured quantities of holes and shafts are mainly diameters; measured quantities of box parts include length, width, height, hole spacing, etc.; complex parts have complex measured quantities, such as the helix error of a screw or of a hob (gear-cutting tool).
However, regardless of the shape, the measured parameters can be fundamentally classified into two types: length and angle, and complex quantities can be considered as combinations of length and angle.
The complete measurement process must include the following four elements:
(1) Measured object
From the point of view of the characteristics of geometric quantities, measurement objects can be divided into length, angle, shape error, surface roughness, etc.
Based on the characteristics of the measured parts, they can be divided into square parts, shaft parts, conical parts, box parts, cams, keys, threads, gears and miscellaneous tools.
(2) Unit of measurement
Length units include the meter (m), millimeter (mm), and micrometer (μm); angle units include the degree (°), minute (′), second (″), radian (rad), and microradian (μrad).
(3) Measurement method
It refers to the combination of measurement principles, tools or instruments, and measurement conditions used to complete the measurement task.
The basic measurement methods include direct and indirect measurement, absolute and relative measurement, contact and non-contact measurement, single-item and comprehensive measurement, manual and automatic measurement, process and final measurement, active and passive measurement, etc.
The corresponding measurement method should be selected in the most economical way based on the requirements of the measured object.
(4) Measurement accuracy
Measurement accuracy refers to the degree of consistency between the measurement result and the actual value of the measured object.
Higher accuracy is not always better; the most economical method that meets the accuracy requirements of the measured object should be chosen.
II. Common knowledge of measurement
Units of measurement
China adopts legal measurement units based on the International System of Units.
1. Length units
In the mechanical manufacturing industry, millimeters (mm) and microns (μm) are commonly used units. Millimeters are the most commonly used measurement units in mechanical measurements.
When using millimeters, only the dimensional values need to be marked on the mechanical drawings, and the units can be omitted.
The main units of length in the imperial (English) system are the foot (ft) and the inch (in).
- 1 ft = 12 in
- 1 in = 25.4 mm
2. Plane Angle Units
In legal metrology, the basic unit of plane angle is the radian (rad): the plane angle subtended at the center of a circle by an arc equal in length to the radius.
In mechanical manufacturing, the degree (°) is the commonly used plane angle unit.
1° = π/180 (rad)
Common non-SI units and their SI equivalents:

| Quantity | Unit | Conversion to SI |
|---|---|---|
| Length | nautical mile (n mile) | 1 n mile = 1852 m |
| | mile | 1 mile = 1609.344 m |
| | foot (ft) | 1 ft = 0.3048 m |
| | inch (in) | 1 in = 0.0254 m |
| | yard (yd) | 1 yd = 0.9144 m |
| | mil | 1 mil = 25.4×10⁻⁶ m |
| | angstrom (Å) | 1 Å = 10⁻¹⁰ m |
| | fermi | 1 fermi = 10⁻¹⁵ m |
| Plane angle | minute (′) | 1′ = (π/10800) rad |
| | second (″) | 1″ = (π/648000) rad |
| Time | minute (min) | 1 min = 60 s |
| Area | hectare (ha) | 1 ha = 10⁴ m² |
| | are (a) | 1 a = 100 m² |
| | mile² | 1 mile² = 2.58999×10⁶ m² |
| | ft² | 1 ft² = 0.0929030 m² |
| | in² | 1 in² = 6.4516×10⁻⁴ m² |
| Speed | knot (kn) | 1 kn = 0.514444 m/s |
| | km/h | 1 km/h = 0.277778 m/s |
| | m/min | 1 m/min = 0.0166667 m/s |
| | mile/h | 1 mile/h = 0.44704 m/s |
| | ft/s | 1 ft/s = 0.3048 m/s |
| | in/s | 1 in/s = 0.0254 m/s |
| Volume/capacity | liter (L) | 1 L = 10⁻³ m³ |
| | ft³ | 1 ft³ = 0.0283168 m³ |
| | in³ | 1 in³ = 1.63871×10⁻⁵ m³ |
| | UK gallon (UKgal) | 1 UKgal = 4.54609 dm³ |
| | US gallon (USgal) | 1 USgal = 3.78541 dm³ |
| Acceleration | Gal | 1 Gal = 10⁻² m/s² |
| | ft/s² | 1 ft/s² = 0.3048 m/s² |
| Angular velocity | r/min | 1 r/min = (π/30) rad/s |
Classification of Measurement Methods
1. Classification based on whether the measured parameter is obtained directly
(1) Direct measurement
The measured quantity can be read directly from the reading device of the measuring instrument.
For example, measuring a shaft or hole diameter with a caliper or micrometer, or measuring an angle with a protractor.
(2) Indirect measurement
The measured quantity is obtained indirectly, by calculation, from other directly measured quantities that have a known functional relationship with it.
For example, measuring the diameter of a circle by measuring the length of the chord S and the height of the chord H to calculate the diameter D of the circle.
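The chord example can be computed directly. For a chord of length S whose chord height (sagitta) is H, circle geometry gives D = S²/(4H) + H. A short sketch:

```python
def diameter_from_chord(S: float, H: float) -> float:
    # Indirect measurement: the diameter is calculated from the
    # directly measured chord length S and chord height H using
    # D = S^2 / (4*H) + H.
    return S * S / (4.0 * H) + H

# Check against a known circle of diameter 100 (radius 50): a chord
# at height H = 10 has half-length sqrt(50^2 - 40^2) = 30, so S = 60.
print(diameter_from_chord(60.0, 10.0))  # 100.0
```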
To reduce measurement errors, direct measurement is generally used. Indirect measurement can be used when the measured quantity is not easily measured directly.
2. Classification based on whether the displayed value represents the entire measured quantity
(1) Absolute measurement
The actual value of the measured quantity can be read directly from the measuring instrument.
When using the absolute measurement method, the measuring range of the measuring instrument must exceed the size of the measured quantity.
(2) Relative measurement (comparative measurement)
Only the deviation of the measured quantity from the standard quantity can be obtained directly. Its measuring range is very narrow.
For example, using a gauge block as a reference, measuring length dimensions on an optical measuring machine.
Generally, the accuracy of relative measurement is higher than that of absolute measurement.
3. Classification based on the contact of the measuring head with the measured surface during measurement
(1) Contact measurement
During measurement, the measuring head of the measuring instrument comes into direct contact with the measured surface, and there is a mechanical measuring force, such as measuring dimensions with a micrometer.
(2) Non-contact measurement
During measurement, the measuring head of the instrument does not touch the measured surface directly; it senses the workpiece through another medium (such as light or air), for example measuring surface roughness with an optical profilometer.
Contact measurement may cause elastic deformation of the relevant parts of the measured surface and the measuring instrument, thereby affecting the measurement accuracy, while non-contact measurement has no such effect.
4. Classification based on the number of parameters measured in a single measurement
(1) Single-item measurement
Each parameter of the measured part is measured separately.
(2) Comprehensive measurement
Measures the comprehensive index that reflects the related parameters of the part.
Comprehensive measurement generally has higher efficiency and is more reliable to ensure parts interchangeability.
It is often used for inspection of finished parts. Single-item measurement can determine the errors of each parameter separately and is generally used for process analysis, process inspection and measurement of specified parameters.
III. Error and tolerance
1. Error
Processing error
During the production of machined parts, it is difficult to achieve the ideal dimensions, shape, micro-geometry (surface roughness), and relative positions of features, owing to factors such as limited machine tool accuracy, tool grinding-angle errors, and low rigidity of the process system.
No machining method can produce absolutely precise parts; even parts machined in the same batch under identical conditions differ in their dimensions.
To meet a given accuracy requirement, errors must be controlled within a specific range. To meet interchangeability requirements and bring the geometric parameters of parts with the same specifications closer together, processing errors must also be controlled.
The manifestation of processing errors generally takes several forms:
(1) Dimensional error: The error in the surface size of the part itself (such as the diameter error of a cylindrical surface) and the error in the surface size between parts (such as the distance between holes).
(2) Shape error: The degree to which the actual surface of the part deviates from the ideal surface in terms of shape, such as the cylindricity error of a cylindrical surface, the flatness error of a plane, etc.
(3) Position error: The degree to which the actual position of a surface, axis or plane of symmetry deviates from the ideal position, such as the parallelism error and the perpendicularity error between two surfaces.
(4) Surface quality: The microscopic roughness with small gaps and small peaks and valleys left on the surface of a part after processing.
These different types of errors are present simultaneously, among which dimensional error is the most basic. The precision of a part refers to the degree of conformity between the real and ideal values of geometric parameters.
The smaller the difference between the real and ideal values of the geometric parameters, that is, the smaller the error, the greater the machining precision.
Therefore, the accuracy of a part is expressed by the size of the error. It can be seen that the concepts of “accuracy” and “error” are just different focal points in evaluating the geometric parameters of a part, but essentially the same.
Measurement error
The difference between the actual measured value and the actual value of the measured geometric quantity is called measurement error. Measurement error is expressed as absolute error or relative error.
Absolute error: The absolute error δ is the difference between the actual measured value and the true value:

δ = X − X₀

where X is the actual measured value and X₀ is the true value (or conventional true value).
Relative error:
Relative error is the ratio of the absolute value of the absolute error to the true value of the measured geometric quantity. Since the true value cannot be obtained, the measured value is often used in its place for estimation:

f = |δ| / X₀ × 100% ≈ |δ| / X × 100%
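Both error definitions can be sketched in a few lines of Python (the 25 mm gauge-block reading is a made-up example):

```python
def absolute_error(measured: float, true_value: float) -> float:
    # Absolute error: delta = X - X0.
    return measured - true_value

def relative_error(measured: float, true_value: float) -> float:
    # Relative error: f = |delta| / X0 (often quoted as a percentage).
    # In practice the measured value X replaces the unknown true value X0.
    return abs(measured - true_value) / true_value

# Hypothetical reading: a 25.000 mm gauge block measured as 25.003 mm.
delta = absolute_error(25.003, 25.000)
print(f"delta = {delta:.3f} mm, f = {relative_error(25.003, 25.000):.5%}")
```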
There are several factors that contribute to measurement error, including:
1. Error in measuring tools:
Measuring tool error refers to the error inherent in the measuring tool itself, including errors in the design, manufacture and use of the measuring tool.
2. Method error:
Method error is the error caused by imperfect measurement method (including inaccurate calculation formulas, improper selection of measurement method, inaccurate installation and positioning of the part, etc.), which may cause measurement errors.
For example, in contact measurement, the measuring force of the measuring head may cause deformation of the measured part and the measuring device, resulting in measurement errors.
3. Environmental error:
Environmental error refers to the error caused by the environment that does not meet the standard measurement conditions during measurement, which may cause measurement errors.
For example, temperature, humidity, atmospheric pressure, lighting (causing parallax), vibration, electromagnetic fields, etc. that do not meet standards may cause measurement errors, among which the influence of temperature is particularly prominent.
For example, when measuring length, the prescribed standard ambient temperature is 20°C. In actual measurement, the temperatures of the measured part and the measuring tool deviate from this standard, and because the linear expansion coefficients of the part material and the tool differ, measurement errors result.
Therefore, the ambient temperature should be reasonably controlled according to the measurement accuracy requirements to reduce the influence of temperature on measurement accuracy.
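The temperature effect can be estimated with the linear expansion relation ΔL = L·α·ΔT. A small sketch with assumed illustrative values (a 100 mm steel part, α ≈ 11.5×10⁻⁶ /°C):

```python
def thermal_length_error(length_mm: float, alpha_per_C: float, dT_C: float) -> float:
    # Linear thermal expansion: dL = L * alpha * dT.
    # Returns the length change in mm for a temperature deviation dT
    # from the 20 degC reference temperature.
    return length_mm * alpha_per_C * dT_C

# Assumed values: 100 mm steel part (alpha ~ 11.5e-6 /degC) measured
# 5 degC above the 20 degC standard temperature.
err_mm = thermal_length_error(100.0, 11.5e-6, 5.0)
print(f"{err_mm * 1000:.2f} um")  # 5.75 um
```

Even a few degrees of deviation thus produces errors of several micrometers on a 100 mm dimension, which is why precision measuring rooms are temperature-controlled.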
4. Human error:
Human error refers to errors caused by human factors, which can result in measurement errors.
For example, incorrect use of measuring instruments, inaccurate measurement alignment, reading or estimation errors by the person taking the measurement, etc., can cause measurement errors.
Measurement error classification:
1. Systematic error:
(1) Constant systematic error:
A constant systematic error is a measurement error whose absolute value and sign remain unchanged when the same quantity is measured several times under certain measurement conditions.
For example, the error of the gauge block used to adjust the instrument has the same influence on the measurement results of each measurement. This type of error can be eliminated from the measurement results using a correction method.
(2) Variable systematic error:
The absolute value and sign of the error during the measurement process change according to a certain rule.
For example, the indication error caused by eccentric installation of an indicator dial is a periodic variation following a sinusoidal law, and this measurement error can be eliminated by the compensation method.
2. Random error:
Random error is a measurement error that changes randomly, with unpredictable changes in absolute value and sign when measuring the same quantity several times under certain measurement conditions.
Random error is mainly caused by accidental or uncertain factors during the measurement process and is caused by many temporary and uncontrollable factors.
However, when repeated measurements are performed, errors follow statistical laws.
Therefore, probability theory and statistical principles are often used to deal with this.
In practical measurements, to reduce random errors, the same quantity can be measured several times, and the arithmetic mean can be taken as the measurement result.
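Averaging repeated readings can be sketched with Python's statistics module (the readings below are hypothetical):

```python
import statistics

# Hypothetical repeated readings of the same dimension (mm); the
# scatter represents random error.
readings = [25.012, 25.015, 25.011, 25.014, 25.013]

mean = statistics.mean(readings)   # arithmetic mean: best estimate of the value
s = statistics.stdev(readings)     # sample standard deviation: spread of random error
print(f"mean = {mean:.4f} mm, s = {s:.4f} mm")
```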
3. Gross error:
Gross error refers to a measurement error that exceeds the expected measurement error under certain measurement conditions, which causes significant distortion of the measurement result. The measured value that contains gross errors is called an outlier.
The causes of gross errors can be subjective or objective. Subjective reasons include reading errors caused by the negligence of the person taking the measurement, and objective reasons include measurement errors caused by sudden external vibrations.
Since gross errors significantly distort measurement results, they must be eliminated in accordance with the criteria for identifying gross errors in the processing of measurement data.
It should be noted that the division between systematic errors and random errors is not absolute and can be transformed into each other under certain conditions.
In measurement, it is necessary to carry out serious, careful and meticulous observations and remove gross errors from a series of measurement data. In error analysis, mainly systematic errors and random errors are analyzed.
Although random errors cannot be corrected or eliminated, their size and patterns can be estimated using probability theory and statistical methods, and efforts should be made to reduce their impact.
Gross errors have a relatively large value and should be avoided as much as possible in measurements.
If gross errors have already occurred, they must be eliminated in accordance with the criteria for identifying gross errors. The commonly used criterion is the “3σ criterion”, also known as the three sigma rule.
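A simple single-pass sketch of the 3σ criterion follows; real data-processing procedures often apply it iteratively, and the readings here are made up:

```python
import statistics

def reject_outliers_3sigma(values):
    # 3-sigma rule: readings farther than 3 standard deviations from
    # the mean are treated as gross errors (outliers) and removed.
    m = statistics.mean(values)
    s = statistics.stdev(values)
    return [v for v in values if abs(v - m) <= 3 * s]

# Eleven plausible readings plus one obvious blunder (12.50).
data = [10.01, 10.02, 10.00, 10.01, 10.02, 10.01,
        10.00, 10.02, 10.01, 10.00, 10.02, 12.50]
print(reject_outliers_3sigma(data))  # the 12.50 reading is removed
```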
2. Tolerance
To ensure the interchangeability of parts, tolerances are used to control errors.
The tolerance must be designed in accordance with standard regulations, and errors that inevitably occur in machining must be controlled to ensure that the finished parts are within the specified tolerance range for interchangeability.
While satisfying functional requirements, the tolerance value should be set as large as possible to obtain the best economic benefit.
Thus, errors arise during the manufacturing process, while tolerances are determined by designers. If a part's error is within the tolerance range, it is a qualified part. However, if the error exceeds the tolerance range, it is a nonconforming part.
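The qualification check is a simple limit comparison. The sketch below uses a 25 h7 shaft (deviations 0 / −0.021 mm) as an example fit:

```python
def is_qualified(measured: float, nominal: float,
                 upper_dev: float, lower_dev: float) -> bool:
    # A part is qualified when its actual size lies between the lower
    # and upper limit sizes defined by the tolerance zone.
    return nominal + lower_dev <= measured <= nominal + upper_dev

# Example shaft 25 h7: limit sizes 25.000 mm and 24.979 mm.
print(is_qualified(24.990, 25.0, 0.0, -0.021))  # True  (within limits)
print(is_qualified(25.005, 25.0, 0.0, -0.021))  # False (oversize)
```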
3. Significant Figures and Rounding Rules
Selecting the number of digits in a measured result is a common problem encountered during the measurement process.
The number of significant digits in a result should not be too large, which would falsely suggest high measurement accuracy; nor should it be too small, which would sacrifice accuracy. The number of significant digits must therefore be determined from the size of the measurement error.
For example, when measuring the length of an object with a steel ruler with a division value of 1 mm, the length is 123.4 mm, where 123 mm is read directly from the steel ruler and is accurate.
The last digit, 0.4 mm, is estimated by eye and is therefore doubtful. Measurement data should always be recorded this way: the last digit is the doubtful digit, and the measurement error lies in that digit.
When the number of significant digits is determined, the principle for determining the last significant digit is as follows:
(1) If the first significant figure after the last significant figure is greater than 5, add 1 to the last significant figure, and if it is less than 5, disregard it.
(2) When the first digit after the last significant figure is 5, the last significant figure must be adjusted to an even number (add 1 when the last significant figure is odd and keep it the same when it is even).
For example, if significant figures are reserved to the third decimal place, the significant figures are as follows:
3.14159 → 3.142
(3) In addition and subtraction operations, the number of decimal places to reserve must be the smallest number of decimal places among all numbers, for example:
60.43 + 12.317 + 5.022 = 77.769 ≈ 77.77
(4) In multiplication and division, the result keeps as many significant figures as the operand with the fewest, for example:
2352 × 0.211 = 496.272 ≈ 496
0.0222 × 34.5 × 2.01 = 1.539459 ≈ 1.54.
(5) In logarithmic operations, the number of digits kept in the mantissa of the logarithm should equal the number of significant figures in the antilogarithm (the true number).
(6) In exponentiation, the result keeps the same number of significant figures as the base.
(7) In square root operations, the result keeps the same number of significant figures as the radicand.
(8) When mathematical constants such as π and √2 are involved in the operation, to ensure the accuracy of the final result they should be taken with one or two more significant figures than the other operands.
(9) For values representing measurement accuracy, such as limiting measurement errors and standard deviations, only one or two significant figures should be considered, and the last digit should be consistent with the last digit of the corresponding measurement result.
For example,
34.0234 ± 0.00021 should be written as 34.0234 ± 0.0002.
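Rule (2) above is the round-half-to-even convention ("banker's rounding"), which Python's decimal module implements directly. A short sketch:

```python
from decimal import Decimal, ROUND_HALF_EVEN

def round_measurement(value: str, places: str) -> Decimal:
    # Round-half-to-even, matching rule (2): a trailing 5 rounds the
    # last kept digit to the nearest even digit. Values are passed as
    # strings to avoid binary floating-point representation errors.
    return Decimal(value).quantize(Decimal(places), rounding=ROUND_HALF_EVEN)

print(round_measurement("3.14159", "0.001"))  # 3.142 (digit after is 5 followed by 9)
print(round_measurement("2.5", "1"))          # 2     (half rounds to even)
print(round_measurement("3.5", "1"))          # 4     (half rounds to even)
print(round_measurement("77.769", "0.01"))    # 77.77
```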
4. Types and methods of mechanical measurement
Length measurement
Length measurement is a crucial aspect of mechanical measurement systems. There are several methods for measuring length, including:
- Calipers : These devices consist of a main scale and a sliding vernier scale. With a resolution of up to 0.02 mm, they are widely used for general-purpose linear measurements.
- Micrometers : Similar to vernier calipers, micrometers offer greater precision, typically around 0.001 mm. They are used to measure the thickness or diameter of objects.
- Coordinate Measuring Machines (CMMs) : These are advanced instruments used to make highly precise measurements with an accuracy of 0.001 mm or better. They employ a touch probe to determine the three-dimensional coordinates of points on an object's surface.
Force and Torque Measurement
Force and torque are critical parameters in mechanical systems. Some common methods for measuring them are:
- Load cells : Load cells convert the mechanical force exerted on them into electrical signals. These devices are widely used in scales and load measuring systems.
- Strain gauges : These are glued to the surface of a test sample. As the sample deforms under tension, the strain gauge changes its electrical resistance, which can be correlated with the applied force.
- Torque wrenches and transducers : Used to measure and control the torque applied during assembly or maintenance operations.
Pressure measurement
Pressure measurement is essential in fluid mechanics applications. Some standard methods for measuring pressure are:
- Bourdon tubes : These are C-shaped or spiral tubes that deform under pressure, causing a pointer to move along a calibrated scale.
- Manometers : Liquid-column manometers measure pressure by comparing the height of a column of liquid in the device to a reference level.
- Pressure Transducers : These sensors convert pressure into electrical signals and are often used in automated systems for monitoring and control.
Temperature measurement
Temperature is a fundamental parameter in mechanical systems, affecting the properties and performance of materials. Common methods for measuring temperature include:
- Thermocouples : These devices consist of two different metal wires joined at one end, forming a junction. When heated, the junction generates a small voltage proportional to the temperature.
- Resistance Temperature Detectors (RTDs) : RTDs are based on the principle that the electrical resistance of certain materials increases with temperature, allowing for highly accurate measurements.
- Infrared thermometers : These non-contact devices measure temperature by detecting infrared radiation emitted by an object.
Flow Measurement
Flow measurement is necessary for fluid mechanics applications, for example in piping systems or process control. Some techniques for measuring flow are:
- Orifice plates : These are flat plates with a hole placed in the flow, creating a pressure drop proportional to the velocity of the fluid.
- Turbine flow meters : These meters use a turbine wheel placed in the flow, rotating at a speed proportional to the flow rate.
- Ultrasonic flow meters : These devices measure the transit time of ultrasonic waves in the fluid, which varies with flow speed, allowing accurate measurements without interrupting the flow.
These methods represent a selection of common techniques used for mechanical measurement, providing a basis for understanding the complexities and importance of accurate measurements in modern engineering applications.
Common Questions
What are the 3 most common types of measurement in engineering?
- Linear measurement : This involves measuring the distance between two points such as length, width and height. Examples include tape measures and calipers.
- Angular measurement : Deals with measuring angles. Protractors and universal bevel protractors are common tools used for angular measurement.
- Temperature measurement : This is crucial for evaluating the thermal properties of materials and processes in engineering. Thermocouples and thermistors are popular instruments used for temperature measurement.
What are 20 popular mechanical measuring instruments and their uses?
- Tape measure : Used to measure long distances.
- Caliper : Measures small linear dimensions.
- Micrometer : Measures the thickness or diameter of tiny objects.
- Protractor : Measures angles between two lines.
- Dial gauge : Measures variations in height or depth.
- Engineer's Square : Checks the straightness and flatness of surfaces.
- Bevel gauge : Measures angles other than right angles.
- Feeler gauge : Checks the gap between two parts.
- Plunger dial indicator : Measures small linear displacements.
- Thermocouple : Measures temperature.
- Thermistor : Measures temperature.
- Anemometer : Measures air speed.
- Hydrometer : Measures the density or specific gravity of liquids.
- Barometer : Measures atmospheric pressure.
- U-tube pressure gauge : Measures the pressure difference in fluids.
- Level : Determines the horizontal plane.
- Clinometer : Measures angles of slope or inclination.
- Tachometer : Measures rotational speed.
- Stroboscope : Measures rotational speed using a flashing light.
- Flow meter : Measures the flow rate of fluid.
What are some essential tools for engineering measurements?
Several essential tools for engineering measurements include tape measures, calipers, micrometers, dial indicators, engineering squares, protractors, and thermocouples. Together, these instruments offer a reliable way to accurately measure various mechanical quantities.
What is the importance of measurement in the engineering field?
Measurement plays a fundamental role in engineering as it allows engineers to:
- Inspect and verify the accuracy of manufactured components.
- Ensure products meet desired specifications.
- Maintain consistency in manufacturing and assembly processes.
- Evaluate the efficiency and performance of machines.
- Conduct research and improve existing projects.
What are the standard units for measuring mechanical quantities?
The International System of Units (SI) is the most widely used system for measuring mechanical quantities. Some standard units include:
- Meter (m) for length.
- Kilogram (kg) for mass.
- Second (s) for time.
- Kelvin (K) for temperature.
- Newton (N) for force.
- Joule (J) for energy.
What types of measurement are commonly used in mechanical engineering?
In mechanical engineering, several types of measurement are commonly used, such as linear, angular, and temperature measurements. Other crucial measurement types include force, pressure, fluid flow, and vibration. These measurements are essential for designing, manufacturing and maintaining mechanical systems and components.