
Introduction to quantum metrology; quantum standards and instrumentation


DOCUMENT INFORMATION

Basic information

Title: Introduction to Quantum Metrology
Author: Waldemar Nawrocki
Institution: Poznan University of Technology
Field: Electronics and Telecommunications
Document type: thesis
Year of publication: 2007
City: Poznan
Number of pages: 287
File size: 9.57 MB

Structure

  • Preface

  • Contents

  • 1 Theoretical Background of Quantum Metrology

    • Abstract

    • 1.1 Introduction

    • 1.2 Schrödinger Equation and Pauli Exclusion Principle

    • 1.3 Heisenberg Uncertainty Principle

    • 1.4 Limits of Measurement Resolution

    • References

  • 2 Measures, Standards and Systems of Units

    • Abstract

    • 2.1 History of Systems of Measurement

    • 2.2 The International System of Units (SI)

    • 2.3 Measurements and Standards of Length

    • 2.4 Measurements and Standards of Mass

    • 2.5 Clocks and Measurements of Time

    • 2.6 Temperature Scales

    • 2.7 Standards of Electrical Quantities

    • References

  • 3 The New SI System of Units---The Quantum SI

    • Abstract

    • 3.1 Towards the New System of Units

    • 3.2 Units of Measure Based on Fundamental Physical Constants

    • 3.3 New Definitions of the Kilogram

    • 3.4 New Definitions of the Ampere, Kelvin and Mole

    • 3.5 Quantum Metrological Triangle and Pyramid

    • References

  • 4 Quantum Voltage Standards

    • Abstract

    • 4.1 Superconductivity

      • 4.1.1 Superconducting Materials

      • 4.1.2 Theories of Superconductivity

      • 4.1.3 Properties of Superconductors

    • 4.2 Josephson Effect

    • 4.3 Josephson Junctions

    • 4.4 Voltage Standards

      • 4.4.1 Voltage Standards with Weston Cells

      • 4.4.2 DC Voltage Josephson Standards

      • 4.4.3 AC Voltage Josephson Standards

        • 4.4.3.1 Voltage Sources with Binary Divided Josephson Junctions

        • 4.4.3.2 Voltage Standard with Pulse-Driven Josephson Junctions

      • 4.4.4 Voltage Standard at GUM

      • 4.4.5 Comparison GUM Standard with the BIPM Standard

      • 4.4.6 Precision Comparator Circuits

    • 4.5 Superconductor Digital Circuits

      • 4.5.1 Prospective Development of Semiconductor Digital Circuits

      • 4.5.2 Digital Circuits with Josephson Junctions

    • 4.6 Other Applications of Josephson Junctions

      • 4.6.1 Voltage-to-Frequency Converter

      • 4.6.2 Source of Terahertz Radiation

    • References

  • 5 SQUID Detectors of Magnetic Flux

    • Abstract

    • 5.1 Quantization of Magnetic Flux

    • 5.2 RF-SQUID

      • 5.2.1 RF-SQUID Equation

      • 5.2.2 Measurement System with an RF-SQUID

    • 5.3 DC-SQUID

      • 5.3.1 DC-SQUID Equation

      • 5.3.2 Energy Resolution and Noise of the DC-SQUID

      • 5.3.3 Parameters of a DC-SQUID

    • 5.4 Measurement System with a DC-SQUID

      • 5.4.1 Operation of the Measurement System

      • 5.4.2 Input Circuit

      • 5.4.3 Two-SQUID Measurement System

      • 5.4.4 SQUID Measurement System with Additional Positive Feedback

      • 5.4.5 Digital SQUID Measurement System

    • 5.5 Magnetic Measurements with SQUID Systems

      • 5.5.1 Magnetic Signals and Interference

      • 5.5.2 Biomagnetic Studies

      • 5.5.3 Nondestructive Evaluation of Materials

    • 5.6 SQUID Noise Thermometers

      • 5.6.1 R-SQUID Noise Thermometer

      • 5.6.2 DC-SQUID Noise Thermometer

      • 5.6.3 Other Applications of SQUIDs

    • References

  • 6 Quantum Hall Effect and the Resistance Standard

    • Abstract

    • 6.1 Hall Effect

    • 6.2 Quantum Hall Effect

      • 6.2.1 Electronic Devices with 2-DEG

      • 6.2.2 Physical Grounds of the Quantum Hall Effect

      • 6.2.3 QHE Samples

      • 6.2.4 Quantum Hall Effect in Graphene

    • 6.3 Measurement Setup of the Classical Electrical Resistance Standard at the GUM

    • 6.4 Quantum Standard Measurement Systems

    • 6.5 Quantum Standard of Electrical Resistance in the SI System

    • References

  • 7 Quantization of Electrical and Thermal Conductance in Nanostructures

    • Abstract

    • 7.1 Theories of Electrical Conduction

    • 7.2 Macroscopic and Nanoscale Structures

    • 7.3 Studies of Conductance Quantization in Nanostructures

      • 7.3.1 Formation of Nanostructures

      • 7.3.2 Measurements of Dynamically Formed Nanowires

    • 7.4 Quantization of Thermal Conductance in Nanostructures

    • 7.5 Scientific and Technological Impacts of Conductance Quantization in Nanostructures

    • References

  • 8 Single Electron Tunneling

    • Abstract

    • 8.1 Electron Tunneling

      • 8.1.1 Phenomenon of Tunneling

      • 8.1.2 Theory of Single Electron Tunneling

    • 8.2 Electronic Circuits with SET Junctions

      • 8.2.1 SETT Transistor

      • 8.2.2 Electron Pump and Turnstile Device

    • 8.3 Capacitance Standard Based on Counting Electrons

    • 8.4 Thermometer with the Coulomb Blockade

    • References

  • 9 Atomic Clocks and Time Scales

    • Abstract

    • 9.1 Theoretical Principles

      • 9.1.1 Introduction

      • 9.1.2 Allan Variance

      • 9.1.3 Structure and Types of Atomic Standards

    • 9.2 Caesium Atomic Frequency Standards

      • 9.2.1 Caesium-Beam Frequency Standard

      • 9.2.2 Caesium Fountain Frequency Standard

    • 9.3 Hydrogen Maser and Rubidium Frequency Standard

      • 9.3.1 Hydrogen Maser Frequency Standard

      • 9.3.2 Rubidium Frequency Standard

      • 9.3.3 Parameters of Atomic Frequency Standards

    • 9.4 Optical Radiation Frequency Standards

      • 9.4.1 Sources of Optical Radiation

      • 9.4.2 Optical Frequency Comb

    • 9.5 Time Scales

    • 9.6 National Time and Frequency Standard in Poland

    • References

  • 10 Standards and Measurements of Length

    • Abstract

    • 10.1 Introduction

    • 10.2 Realization of the Definition of the Metre

      • 10.2.1 CIPM Recommendations Concerning the Realization of the Metre

      • 10.2.2 Measurements of Length by the CIPM Recommendation

    • 10.3 Iodine-Stabilized 633 nm He-Ne Laser

    • 10.4 Satellite Positioning Systems

      • 10.4.1 Positioning Systems

      • 10.4.2 Global Positioning System

      • 10.4.3 GLONASS Positioning System

      • 10.4.4 Galileo Positioning System

      • 10.4.5 Regional Positioning Systems: BeiDou, IRNSS and QZSS

    • References

  • 11 Scanning Probe Microscopes

    • Abstract

    • 11.1 Atomic Resolution Microscopes

      • 11.1.1 Operating Principle of a Scanning Probe Microscope

      • 11.1.2 Types of Near-Field Interactions in SPM

      • 11.1.3 Basic Parameters of SPM

    • 11.2 Scanning Tunneling Microscope

    • 11.3 Atomic Force Microscope

      • 11.3.1 Atomic Forces

      • 11.3.2 Performance of Atomic Force Microscope

      • 11.3.3 Measurements of Microscope Cantilever Deflection

      • 11.3.4 AFM with Measurement of Cantilever Resonance Oscillation

    • 11.4 Electrostatic Force Microscope

    • 11.5 Scanning Thermal Microscope

    • 11.6 Scanning Near-Field Optical Microscope

    • 11.7 Opportunities of Scanning Probe Microscopy Development

    • References

  • 12 New Standards of Mass

    • Abstract

    • 12.1 Introduction

    • 12.2 Mass Standards Based on the Planck Constant

      • 12.2.1 Watt Balance Standard

      • 12.2.2 Levitation Standard and Electrostatic Standard

    • 12.3 Silicon Sphere Standard

      • 12.3.1 Reference Mass and the Avogadro Constant

      • 12.3.2 Measurement of Volume of Silicon Unit Cell

      • 12.3.3 Measurement of Volume of Silicon Sphere

    • 12.4 Ions Accumulation Standard

    • References

  • Index

Content

Introduction

Measurement consists in comparing the measured state Ax of a quantity to its reference state, which is reproduced by a standard, as illustrated in Fig. 1.1. The accuracy of a measurement is therefore inherently limited by the precision of the standard used. For years metrologists have worked on standards that rely solely on fundamental physical and atomic constants. These advances allow measurement units to be realized on the basis of quantum phenomena. One key direction of research in metrology aims to establish a new measurement system based on quantum and atomic standards, intended to supplant the traditional SI system.

This chapter focuses on the evolution and significance of measurement systems, particularly the International System of Units (SI). It delves into the history of measurement standards and outlines the base units of the SI system. Subsequent chapters explore key quantum phenomena essential for metrology, including the Josephson effect, the quantum Hall effect, and single-electron tunneling. These phenomena are crucial not only for electrical metrology but also for broader scientific applications. Notably, Brian David Josephson received the Nobel Prize in 1973 for his work on voltage quantization, and Klaus von Klitzing was awarded the Nobel Prize in 1985 for the discovery of the quantum Hall effect.

Over the past 25 years, standards based on quantum mechanical phenomena have been established for measurement systems. Quantum mechanics began with Max Planck's 1900 formula for the emission of electromagnetic radiation, which introduced the concept of energy quanta defined by the Planck constant. This approach described electromagnetic radiation far better than classical physics models. The development of quantum mechanics in the 1920s, spearheaded by Erwin Schrödinger's equation and Werner Heisenberg's uncertainty principle, along with contributions from notable physicists like Louis de Broglie and Niels Bohr, challenged conventional perceptions of physical phenomena. A striking example is the observation of electric current flowing simultaneously in both directions within a circuit. As Niels Bohr famously stated, "Anyone who is not shocked by quantum theory probably has not understood it." Quantum mechanical phenomena are now utilized in many fields of metrology.

The development of quantum standards for measuring physical quantities encompasses both electrical and non-electrical units: precise standards for electrical quantities such as voltage, electrical resistance, and electrical current, as well as non-electrical standards such as atomic clocks and laser-based length measurements. Quantum metrology also covers:

• The determination of the physical limits of measurement precision by the Heisenberg uncertainty principle;

• The construction of extremely sensitive electronic components: the magnetic flux sensor referred to as SQUID (Superconducting Quantum Interference Device) and the SET transistor based on single electron tunneling.

Fig. 1.1 Measurement: a comparison between the object and the standard

1 Theoretical Background of Quantum Metrology

Schrödinger Equation and Pauli Exclusion Principle

The foundation of quantum mechanics was laid by Max Planck's groundbreaking discovery in 1900, when he formulated a theory based on measurements of black-body radiation. Planck proposed that energy is exchanged between particles and radiation in discrete packets, or quanta, whose energy is proportional to the frequency of radiation f and to a constant now known as the Planck constant, h = 6.626 × 10⁻³⁴ J s.

Measurements of energy density in the thermal, visible, and ultraviolet ranges (200 nm to 10 μm) revealed discrepancies with classical physics, in particular with the Rayleigh-Jeans formula, which described the energy density accurately only in the far infrared range (above 5 μm). For shorter wavelengths this formula predicted an infinite energy density, a failure known as the ultraviolet catastrophe. In contrast, Planck's formula, based on the quantization of energy, agreed well with experimental data across the entire wavelength spectrum.

The Planck law describes the dependence of the spectral radiant emission of a perfect black body on the radiation frequency f and the absolute temperature T, u(f, T), or on the wavelength λ and temperature, u(λ, T):

u(f, T) = (8πhf³/c³) · 1/(exp(hf/k_BT) − 1),

where f is the radiation frequency, T is the absolute temperature, h is the Planck constant, k_B is the Boltzmann constant, and c is the speed of light in vacuum. The groundbreaking Planck lecture on this topic, delivered on December 14, 1900, marked the inception of quantum mechanics. In 1905 Albert Einstein advanced the concept further by analyzing the photoelectric effect, concluding that emitted and absorbed light energy is quantized as well. He asserted that light energy is carried in discrete packets, or quanta, of energy E = hf, proportional to the frequency f and the Planck constant. This revolutionary idea was met with skepticism, as it challenged the prevailing view of light as a continuous wave.
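The ultraviolet catastrophe can be seen numerically by comparing Planck's law with the classical Rayleigh-Jeans formula u_RJ(f, T) = 8πf²k_BT/c³. The sketch below is illustrative only and assumes the energy-density form of both laws:

```python
import math

h = 6.626e-34    # Planck constant, J s
kB = 1.381e-23   # Boltzmann constant, J/K
c = 2.998e8      # speed of light, m/s

def planck_u(f, T):
    """Planck spectral energy density of black-body radiation."""
    return (8 * math.pi * h * f**3 / c**3) / math.expm1(h * f / (kB * T))

def rayleigh_jeans_u(f, T):
    """Classical Rayleigh-Jeans spectral energy density."""
    return 8 * math.pi * f**2 * kB * T / c**3

T = 2000.0    # K, the temperature used in Fig. 1.2
f_low = 3e12  # far infrared (~100 um): the two formulas nearly coincide
f_high = 1.5e15  # ultraviolet (~200 nm): Rayleigh-Jeans overshoots enormously

print(rayleigh_jeans_u(f_low, T) / planck_u(f_low, T))   # close to 1
print(rayleigh_jeans_u(f_high, T) / planck_u(f_high, T)) # many orders of magnitude
```

The ratio of the two formulas reduces to (e^x − 1)/x with x = hf/k_BT, which makes the agreement at low frequency and the divergence at high frequency explicit.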

Einstein's hypothesis was confirmed experimentally by the American physicist Robert A. Millikan in 1915; Millikan himself was very surprised at the result. The light quanta, later named photons, possess zero rest mass, as Einstein noted. Experiments conducted by Walther Bothe and Arthur H. Compton provided further significant evidence supporting the quantum nature of light.

Albert Einstein's research on the photoelectric effect earned him the Nobel Prize in 1921 and significantly influenced theoretical physics. Although he is frequently depicted as a critic of quantum mechanics, particularly of its reliance on probability, his work on the photoelectric effect was a pivotal contribution to the field.

Beyond his 1905 papers, Einstein contributed to the development of quantum mechanics with his 1907 work on the specific heat of solids, in which he combined elements of quantum theory with the classical theory of electrical and thermal conduction in metals proposed by Paul Drude in 1900.

In 1924 Louis de Broglie proposed a groundbreaking hypothesis in his PhD thesis: matter possesses both particle and wave characteristics, just as light does. The dual nature of light was supported by experimental evidence, including the bending of light rays in the gravitational field of stars, as predicted by Einstein. De Broglie reasoned that if light can behave as both a particle and a wave, then the elementary particles that make up matter might share this dual nature. He formulated a relationship linking the motion of a material particle to a wave, defined by the equations

λ = h/p = h/(mv),  f = E/h,

where λ is the wavelength, p the momentum, m the mass, v the velocity, E the energy of the particle, and h the Planck constant.

Fig. 1.2 Spectral radiant emission of a black body at a temperature of 2,000 K: the Rayleigh-Jeans formula (dashed line) and the Planck formula, the actual dependence (solid line), in the thermal and visible spectral ranges


The movement of an electron at a speed of 10³ m/s is associated with a de Broglie wavelength of approximately 7 × 10⁻⁷ m, comparable to the wavelength of visible light. A neutron traveling at the same speed, being about 1,840 times heavier, has a de Broglie wavelength of about 4 × 10⁻¹⁰ m, in the X-ray range. Particles with much larger masses, even at low speeds such as 1 m/s, produce wavelengths so short that they cannot be measured, so their wave properties cannot be confirmed. For instance, a hypothetical particle with a mass of 10⁻³ g moving at 1 m/s would have a de Broglie wavelength of around 7 × 10⁻²⁸ m; neither waves at that scale nor elementary particles with a mass of approximately 1 mg have ever been detected, leaving their existence uncertain.
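The wavelengths quoted above follow directly from λ = h/(mv); a minimal check using rounded values of the constants:

```python
h = 6.626e-34   # Planck constant, J s
m_e = 9.11e-31  # electron mass, kg
m_n = 1.675e-27 # neutron mass, kg

def de_broglie(mass_kg, speed):
    """de Broglie wavelength: lambda = h / p, with p = m * v."""
    return h / (mass_kg * speed)

print(de_broglie(m_e, 1e3))   # electron at 10^3 m/s: ~7.3e-7 m
print(de_broglie(m_n, 1e3))   # neutron at 10^3 m/s: ~4.0e-10 m
print(de_broglie(1e-6, 1.0))  # 1 mg particle at 1 m/s: ~6.6e-28 m
```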

The electron, discovered by J.J. Thomson in 1897, exemplifies the particle-wave duality of matter; it carries an electric charge of 1.602 × 10⁻¹⁹ C and has a mass of 9.11 × 10⁻³¹ kg. Its wave nature was confirmed experimentally through the diffraction of an electron beam passing through a metal foil, observed independently by George P. Thomson (son of J.J. Thomson, the discoverer of the electron), P.S. Tartakovsky [8], and the Polish physicist Szczepan Szczeniowski.

The Schrödinger equation, formulated by the Austrian physicist Erwin Schrödinger in 1926, marks a milestone in physics; it was built on analogies between waves and particles. No experimental evidence has ever contradicted it, so it is assumed to be accurate. While the Heisenberg uncertainty principle limits the precision with which a particle's parameters can be measured, the Schrödinger equation effectively describes the state of an elementary particle. Central to the equation is the wave function, denoted Ψ (psi), a complex function of time and of the particle's position coordinates.

In its general form the Schrödinger equation reads:

−(ħ²/2m)∇²Ψ + AΨ = jħ ∂Ψ/∂t,

where Ψ is the wave function, A is a function of time and of the particle's coordinates, m is the mass of the particle, t is time, ∇² is the Laplace operator, ħ = h/2π is the reduced Planck constant, and j is the imaginary unit.

When the function A is independent of time, it expresses the potential energy of the particle. In such cases the Schrödinger equation takes the stationary form:

−(ħ²/2m)∇²Ψ + AΨ = EΨ,  (1.5)

where A is the potential energy of the particle and E denotes its total energy.


In 1926 Max Born introduced the physical interpretation of the wave function: the probability p of finding the particle in a volume element dV is proportional to the square of the magnitude of the wave function,

p = k|Ψ|² dV,

where k is the proportionality constant and V is the spatial volume accessible to the particle.
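Born's interpretation is easy to illustrate numerically. The sketch below is an assumed textbook example, not taken from the book: it uses the normalized ground state of a particle in a one-dimensional box, Ψ(x) = √(2/L)·sin(πx/L), and checks that |Ψ|² integrates to 1 over the box (so k = 1 for a normalized wave function):

```python
import math

L = 1e-9     # box width in metres (a hypothetical, illustrative value)
N = 100000   # number of integration steps

def psi(x):
    # Ground-state wave function of a particle in a 1-D box of width L
    return math.sqrt(2.0 / L) * math.sin(math.pi * x / L)

dx = L / N
# Born rule: |psi|^2 dx is the probability of finding the particle in dx.
# Summed over the whole box, the total probability must equal 1.
total = sum(psi(i * dx) ** 2 for i in range(N + 1)) * dx
# Probability of finding the particle in the first quarter of the box
p_quarter = sum(psi(i * dx) ** 2 for i in range(N // 4 + 1)) * dx

print(total)      # ~1.0
print(p_quarter)  # ~0.091
```

The quarter-box probability agrees with the analytic value 1/4 − 1/(2π) ≈ 0.0908, showing that the particle is less likely to be found near the walls than at the center.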

The Pauli exclusion principle states that within an atom no two electrons can share the same quantum state; that is, they cannot have identical sets of four quantum numbers. This principle is crucial for understanding the behavior of separate atoms and of nanostructures, including the two-dimensional electron gas and single electron tunneling.

Heisenberg Uncertainty Principle

In 1925 the German physicist Werner Heisenberg, then just 24 years old, proposed a description of elementary particles equivalent to the Schrödinger equation. Two years later, in the 1927 paper "Über den anschaulichen Inhalt der quantentheoretischen Kinematik und Mechanik", he introduced the uncertainty principle, a cornerstone of quantum mechanics. The principle expresses the limits of precision in determining a particle's state and is intrinsically linked to the particle-wave duality of matter; importantly, it is independent of the accuracy of the measurement instruments. For instance, when the position x of an electron is measured with uncertainty Δx, the electron can also be represented as a beam of waves with varying wavelengths. The wavelength λ of the electron is connected with its momentum by the de Broglie formula: λ = h/p, where p = mv, m is the electron's mass and v its speed.

Along the segment Δx corresponding to the uncertainty in the position of the particle, a wave of wavelength λ has n maxima and the same number of minima:

Δx/λ = n.  (1.8)

A wave beam with zero amplitude beyond the segment Δx must also include waves that have at least (n + 1) minima and maxima along this segment:

Δx/(λ − Δλ) ≥ n + 1.  (1.9)

From (1.8) and (1.9) it follows that:

Δx·Δλ/λ² ≥ 1.

From the de Broglie formula we get:

Δλ/λ² = Δp/h.

Finally, we obtain the formula for the uncertainty principle:

Δx·Δp ≥ ħ/2,  (1.10)

where ħ = h/2π is the reduced Planck constant.

The uncertainty principle states that the product of the uncertainties of position (Δx) and momentum (Δp) of a particle, such as an electron, cannot be less than half of the reduced Planck constant. Precise simultaneous measurements of position and momentum are therefore inherently limited: if the uncertainty in position is reduced, the uncertainty in momentum must grow, and vice versa. In three-dimensional space this relationship is expressed by a set of three inequalities, one for each coordinate axis.

The uncertainty principle plays a crucial role in nanometer-sized structures, particularly for electron dynamics. If the uncertainty in the position of an electron is Δx = 2 × 10⁻¹⁰ m, comparable to atomic dimensions, the resulting uncertainty in the electron's velocity is Δv ≥ ħ/(2mΔx) ≈ 2.9 × 10⁵ m/s. This is a wide range, about three times wider than the thermal velocity of the electron, v_th ≈ 10⁵ m/s, related to the thermal energy k_BT at room temperature.

By relinquishing the simultaneous determination of a particle's motion parameters and assuming a fixed position, we can use the analogous relation ΔE·Δt ≥ ħ/2 (1.12) to bound the energy uncertainty ΔE and the uncertainty Δt of the particle's lifetime or of the observation time.

For instance, let us calculate the time Δt necessary for a measurement of energy with the uncertainty ΔE = 10⁻³ eV = 1.6 × 10⁻²² J: from (1.12), Δt ≥ ħ/(2ΔE) ≈ 3.3 × 10⁻¹³ s.
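Both of these estimates are direct consequences of the relations Δx·Δp ≥ ħ/2 and ΔE·Δt ≥ ħ/2. A quick numerical check (illustrative, not from the book):

```python
hbar = 1.0546e-34   # reduced Planck constant, J s
m_e = 9.11e-31      # electron mass, kg

def dv_min(dx, mass=m_e):
    """Minimum velocity uncertainty from dx * (m * dv) >= hbar / 2."""
    return hbar / (2 * mass * dx)

def dt_min(dE):
    """Minimum observation time from dE * dt >= hbar / 2."""
    return hbar / (2 * dE)

print(dv_min(2e-10))    # ~2.9e5 m/s for an electron localized to 2e-10 m
print(dt_min(1.6e-22))  # ~3.3e-13 s for dE = 1e-3 eV
```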

Limits of Measurement Resolution

When low electrical signals are measured with quantum devices, an important question arises: what are the limits of energy resolution of such measurements? Energy resolution is the smallest amount of energy, or change in energy, that a measuring instrument can detect. The resolution of measurements is bounded by inherent physical constraints:

• The Heisenberg uncertainty principle in the determination of parameters of elementary particles;

• The quantum noise of the measured object, which emits and/or absorbs elec- tromagnetic radiation;

• The thermal noise of the measured object.

The thermal noise power spectral density in an object at an absolute temperature T is described by the Planck equation:

P/Δf = hf + hf/(exp(hf/k_BT) − 1),  (1.13)

where k_B is the Boltzmann constant.

The Planck equation (1.13) takes two extreme forms depending on the relationship between the thermal noise energy k_BT and the quantum hf of energy of electromagnetic radiation. For k_BT ≫ hf the Planck equation includes only the thermal noise component and takes the form of the Nyquist formula:

P/Δf ≈ k_BT.  (1.14)

For k_BT ≪ hf the Planck equation includes only the quantum noise:

P/Δf ≈ hf.  (1.15)

Thermal noise dominates the spectral power density at low frequencies, while quantum noise dominates at high frequencies. The frequency at which the thermal and quantum noise contributions are equal, given by k_BT = hf, depends on the temperature: at 300 K it is 6.2 × 10¹² Hz, whereas at 2.7 K, the temperature of deep space, it is 56 GHz.
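The two limiting regimes and the crossover frequency can be checked numerically; the sketch below assumes the power-spectral-density form of (1.13):

```python
import math

h = 6.626e-34   # Planck constant, J s
kB = 1.381e-23  # Boltzmann constant, J/K

def noise_psd(f, T):
    """Planck noise power spectral density: hf + hf / (exp(hf/kBT) - 1)."""
    return h * f + h * f / math.expm1(h * f / (kB * T))

def crossover_frequency(T):
    """Frequency at which the thermal energy kB*T equals the quantum h*f."""
    return kB * T / h

print(crossover_frequency(300))  # ~6.2e12 Hz at room temperature
print(crossover_frequency(2.7))  # ~5.6e10 Hz (56 GHz) in deep space

# Low-frequency limit: the PSD approaches the Nyquist value kB*T
print(noise_psd(1e6, 300) / (kB * 300))   # ~1
# High-frequency limit: the PSD approaches the quantum value h*f
print(noise_psd(1e15, 300) / (h * 1e15))  # ~1
```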

Energy is a quantized physical quantity. The uncertainty of its determination depends on the precision of frequency measurement, currently around 10⁻¹⁶, and on the accuracy with which the Planck constant is known, approximately 10⁻⁹.

This section considers the estimation of the lower limit for the measurement of electric current, understood as a flow of electrons in a conductor. The intensity of electric current is the derivative of the electric charge Q(t) with respect to time t, or the ratio of the electric charge Q to the time t over which the charge is transferred:

I = dQ(t)/dt ≈ Q/t.

At the microscopic level we can observe and count the flow of individual electrons and calculate the corresponding electric current. For instance, a flow of one billion (10⁹) electrons per second corresponds to a current of 160 pA, which modern ammeters and multimeters can measure. However, when the flow drops to one electron per second or less, it is more appropriate to track the movement of individual electrons than to rely on an average current. Sensors of physical quantities respond to the energy changes of the signal, and their sensitivity is constrained by the Heisenberg uncertainty principle. Currently the best energy resolution is achieved by SQUID sensors, which approach the physical limit of ħ/2. For linear displacement measurements, the best resolution achieved is 10⁻⁶ Å, i.e. 10⁻¹⁶ m.
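The 160 pA figure is simple charge bookkeeping, I = Ne/t, with e the elementary charge; a minimal sketch:

```python
e = 1.602e-19   # elementary charge, C

def current_from_electrons(n_electrons, seconds):
    """Average current I = Q/t for n electrons transferred in t seconds."""
    return n_electrons * e / seconds

print(current_from_electrons(1e9, 1.0))  # ~1.6e-10 A, i.e. 160 pA
print(current_from_electrons(1, 1.0))    # one electron per second: ~1.6e-19 A
```

The single-electron-per-second value, about 0.16 aA, is far below what any ammeter averages reliably, which is why counting individual electrons becomes the appropriate description at that scale.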

This resolution was achieved with an X-ray interferometer; it is remarkable when compared with atomic dimensions, such as the diameter of the copper atom, 3.61 Å, or of the gold atom, 4.08 Å. In measurements of displacement and geometric dimensions, the scanning tunneling microscope (STM) offers excellent linear resolution: Δa = 0.01 Å (10⁻¹² m) vertically and Δb = 0.1 Å horizontally. This precision allows detailed studies of the arrangement of atoms on solid surfaces. The STM operates on the principle of electron tunneling through a potential barrier.

References

1. R.D. Deslattes, Optical and X-ray interferometry of a silicon lattice spacing. Appl. Phys. Lett.

2. A. Einstein, Über einen die Erzeugung und Verwandlung des Lichtes betreffenden heuristischen Gesichtspunkt. Ann. Phys. 17, 132–148 (1905)

3. R.P. Feynman, R.B. Leighton, M. Sands, The Feynman Lectures on Physics, vol. 1 (Addison-Wesley, Reading, 1964)

4. J.R. Friedman, V. Patel, W. Chen, S.K. Tolpygo, J.E. Lukens, Quantum superposition of distinct macroscopic states. Nature 406, 43–45 (2000)

5. S. Hawking, A Brief History of Time (Bantam Books, New York, 1988)

6. V. Kose, F. Melchert, Quantenmaße in der elektrischen Meßtechnik (VCH Verlag, Weinheim, 1991)

7. W. Nawrocki, Introduction to Quantum Metrology (in Polish) (Publishing House of Poznan University of Technology, Poznań, 2007)

8. I.W. Savelev, Lectures on Physics, vol. 3 (Polish edition, PWN, 2002)


Measures, Standards and Systems of Units

This chapter explores the evolution of measurements and standards throughout history, focusing on the development of units of length, time, temperature, and electrical quantities. Temperature measurements and scales are discussed in detail despite not having a dedicated chapter. The narrative traces the progression from ancient Chinese measurement systems to the establishment of international standards, culminating in the International System of Units (SI), with its seven base units: meter, kilogram, second, ampere, kelvin, candela, and mole. The Meter Convention is identified as a pivotal moment in fostering international collaboration for the unification of measurement units and the maintenance of quality standards. The chapter also examines the structure of international and national measurement services, emphasizing the crucial roles of the International Bureau of Weights and Measures (BIPM) and the national metrology institutes (NMIs).

History of Systems of Measurement

Measurement is based on a comparison of the actual state of the measured quantity with another state of this quantity, the reference state, which is reproduced by a standard—see Fig.1.1.

Since the dawn of civilization humans have made measurements, with three physical quantities being essential for survival: length, mass, and time. The earliest evidence of measurement dates back 8,000 to 10,000 years. Throughout history there have been efforts to create coherent measurement systems, such as the comprehensive system introduced by Emperor Huang Ti in China around 2700 BC. That system was based on the dimensions of a bamboo rod: the unit of length was the distance between two nodes of the rod, the unit of volume was the space inside the rod between the nodes, and the unit of mass was the weight of 1,200 grains of rice, corresponding to the content of that volume.

A fourth physical quantity of importance for knowledge and production, temperature, has been measured for less than 400 years.

In modern civilization measurements are crucial not only for trade but also for the control of technological processes in manufacturing, processing, and agriculture, as well as in the experimental sciences and in everyday devices. For instance, an average car is equipped with around 100 sensors measuring temperature, pressure, liquid levels, and flow rates. Advances in physics have led to the development of quantum standards and research instruments, which in turn facilitate further scientific breakthroughs. The first international measurement system was established by the Romans, who introduced units like the Roman foot (295 mm) and the Roman mile (5,000 feet) and spread these measures across their conquered territories, including the Iberian Peninsula, North Africa, the British Isles, and the Balkans, possibly reaching present-day Poland. Interestingly, the contemporary British foot measures 304.8 mm, which indicates anthropological changes over time.

The trade in commodities in the eighteenth and nineteenth centuries prompted significant advancements in measurement standards, as increased population mobility created a demand for uniformity. Local measures like the cubit, foot, pound, and Russian pood, which had been used independently in various cities and regions, had to be replaced with internationally recognized standards to facilitate trade over larger areas.

The establishment of a coherent system of measurement for various physical quantities evolved over a long period, culminating in the International System of Units (SI). The process began on June 22, 1799, with the official deposition of the platinum standards of the meter and the kilogram in the Archives of the French Republic in Paris. The meter was defined as one ten-millionth of the segment of the Earth's meridian running from the North Pole to the equator through Paris, while the kilogram was based on the mass of one-thousandth of a cubic meter of pure water at its maximum density.

In 1832 Carl Gauss introduced a coherent measurement system based on length, mass, and time, from which magnetic and electrical units were derived. The second was defined as a fraction of the solar day, with the millimeter as the unit of length and the gram as the unit of mass. Gauss and Weber later extended this system to include electrical units. The principles of a coherent measurement system comprising base and derived units were further developed by James Clerk Maxwell and William Thomson with the support of the British Association for the Advancement of Science (BAAS).

Derived units depend on base units, so the accuracy of the base units is essential for precise measurements. In 1874 the BAAS introduced the coherent CGS system, with the centimeter, gram, and second as its base units of length, mass, and time. This systematization followed earlier independent efforts in France, England, and Prussia to standardize units of measurement.

In response to resolutions proposed during the world expositions in Paris (1855 and 1867) and London (1862), the French government initiated the establishment of an international metric system, leading to the creation of a commission with representatives from thirty countries. This collaboration culminated in the signing of the Meter Convention by seventeen nations on May 20, 1875, which officially designated the meter as the unit of length and the kilogram as the unit of mass. The convention also established the International Bureau of Weights and Measures (BIPM), responsible for maintaining the international prototypes of the meter and kilogram; its tasks include:

• The establishment of basic standards and scales for the measurement of the most important physical quantities, and preservation of the international prototypes;

• Comparisons of national and international standards;

• Coordination of the measurement techniques relevant to calibration;

• Performance and coordination of measurements of fundamental physical constants.

National Metrological Institutes (NMIs) serve a role on the national level similar to that of the BIPM internationally. Prominent NMIs include the National Institute of Standards and Technology (USA), the Physikalisch-Technische Bundesanstalt (Germany), and the National Physical Laboratory (UK). As of 2014, the Convention of the Meter, an intergovernmental organization, comprises fifty-six member states and forty-one associated states, including Belarus, Lithuania, and Ukraine.

The MKS (meter-kilogram-second) system was the first international system of units, based on three fundamental units: the meter, the kilogram, and the second. Strictly speaking, the MKS is a collection of units rather than a complete system, since it does not establish the relationships between its units, as illustrated in Fig. 2.1.

Fig 2.1 Base units of the MKS system, the first international system of units


The meter, an arbitrary unit of length, was established on the basis of the French experience and defined as the distance between two lines on a platinum-iridium alloy bar (90% platinum and 10% iridium), known as the international prototype of the meter. The kilogram, the unit of mass, was defined as the mass of 1/1000 m³ of water at its maximum density, with its prototype also made from the same alloy. In the MKS system, the second was defined as 1/86,400 of a mean solar day, an astronomical basis for time measurement. The decimal system is applied to all units in the MKS system and subsequent international measurement systems, except for time and angular measures.

Between 1878 and 1889, the BIPM commissioned thirty prototypes of the meter, which necessitated a precise temperature scale, as temperature was known to influence the standards. The first General Conference on Weights and Measures (CGPM) in 1889 approved the meter and kilogram as official physical standards; the original meter prototype, a Pt90Ir10 bar, is preserved at the International Bureau of Weights and Measures under specified conditions. The kilogram prototype from the same year remains the international mass standard. Additionally, the CGPM established the hydrogen temperature scale, calibrated with a constant-volume hydrogen gas thermometer at the melting point of ice (0°) and the boiling point of water (100°). In 1946, the International Committee for Weights and Measures (CIPM) introduced the MKSA system, incorporating the ampere alongside the meter, kilogram, and second as its four base units.

The MKSA system, originally comprising four base units, was not at first officially recognized by the CGPM. However, in 1954, the 10th CGPM approved an updated version of the MKSA system, expanding it to six base units by incorporating the kelvin and candela.

The International System of Units (SI)

The International System of Units, commonly referred to as the SI system (from the French Système international d'unités), is the measurement system in use today. It was officially adopted during the 11th General Conference on Weights and Measures in 1960 as an extension of the MKSA system used at that time. At the beginning the system included six base units (valid since 1954) and two supplementary units (the radian and the steradian).

In 1971, the 14th General Conference on Weights and Measures (CGPM) expanded the SI by adding the mole, the unit of amount of substance, bringing the total number of base units to seven. The current base units of the SI system are the meter, kilogram, second, ampere, kelvin, candela, and mole.

Base units are essential in the measurement system, as the accuracy of derived units relies on their precise realization. These base units form the foundation for defining derived units across various fields, including geometry, mechanics, electricity, thermodynamics, magnetism, optics, acoustics, and ionizing radiation. Metrological institutions aim to minimize the uncertainty of base unit realization, although this can be a costly process. In the current SI system there are seven base units, of which only three (the kilogram, second, and kelvin) are fully independent; the definitions of the other four (the meter, ampere, candela, and mole) depend on the kilogram and second.

A key benefit of a measurement system is the interconnectedness of physical quantities, where the unit of one quantity can be defined in terms of others. For instance, the ampere, the unit of electric current, can be expressed through measurements of force and length. Notably, the kelvin stands out as the only base unit in the SI system that is not related to any other base unit.

Derived units in the International System of Units (SI) are defined as products of powers of base units, forming a coherent set known as coherent SI units. This coherence means that derived units are expressed solely through base units, without any numerical factors other than one. For instance, the derived units pascal and joule exemplify this principle, being defined in terms of base units through multiplication or division alone.

Fig 2.2 Base units of the MKSA international system of measurement of 1946 and their interrelations


The International System of Units (SI) was established in 1960, but its base unit definitions have been adopted and revised by General Conferences on Weights and Measures both before and after this date. Here we present the definitions of the SI base units.

The metre is the length of the path travelled by light in vacuum during a time interval of 1/299 792 458 of a second.

This definition was adopted by the 17th CGPM in 1983. It implies that the speed of light in vacuum is exactly 299 792 458 m/s. The international standard of the meter is realized with an uncertainty of the order of 10⁻¹².
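Because the 1983 definition fixes the speed of light exactly, realizing a length reduces to measuring a time interval. A quick numerical check of the definition (illustrative arithmetic only, not part of the standard):

```python
# The metre via the fixed speed of light (exact by definition since 1983).
c = 299_792_458  # m/s, exact

# Light travels exactly 1 m in 1/c seconds:
t_metre = 1 / c
print(f"{t_metre * 1e9:.3f} ns")   # ≈ 3.336 ns per metre

# Conversely, in one nanosecond light covers roughly 0.3 m:
print(f"{c * 1e-9:.4f} m")
```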

The kilogram is the unit of mass; it is equal to the mass of the international prototype of the kilogram.

The kilogram was so defined by the 3rd CGPM in 1901; the mass of the international prototype is exactly one kilogram, m(K) = 1 kg. However, the prototype suffers irreversible surface contamination, accumulating nearly 1 μg of contaminants each year. As a result, the uncertainty of realization of the kilogram on the international scale is approximately 2×10⁻⁹.

The second is defined as the duration of 9,192,631,770 cycles of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium 133 atom. This definition refers to a cesium atom at rest at a temperature of absolute zero (0 K).

The above definition was adopted by the 13th CGPM in 1967. It implies that the hyperfine splitting in the ground state of the cesium 133 atom is exactly

Fig 2.3 Base units of the SI system and their interrelations


9 192 631 770 Hz: ν(hfs Cs) = 9 192 631 770 Hz. The uncertainty of realization of the second on the international scale is better than 10⁻¹⁵ [1].
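To put the cesium definition in numbers (illustrative arithmetic only):

```python
# The SI second: 9 192 631 770 periods of the Cs-133 hyperfine transition.
nu_cs = 9_192_631_770  # Hz, exact by definition

# Duration of a single cycle of the radiation:
one_period = 1 / nu_cs
print(f"{one_period * 1e12:.1f} ps")   # ≈ 108.8 ps

# A mean solar day (86 400 s) corresponds to this many cycles:
cycles_per_day = nu_cs * 86_400
print(cycles_per_day)   # 794 243 384 928 000
```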

The ampere is defined as the constant current that, when maintained in two infinitely long, parallel conductors of negligible circular cross-section placed 1 meter apart in vacuum, generates a force of 2×10⁻⁷ N per meter of length between them.

This definition was adopted by the 9th CGPM in 1948. It implies that the magnetic constant μ0, or vacuum permeability, is exactly 4π×10⁻⁷ H/m. The uncertainty of realization of the ampere on the international scale is about 9×10⁻⁸.

The kelvin, unit of thermodynamic temperature, is the fraction 1/273.16 of the thermo- dynamic temperature of the triple point of water.

From this definition, adopted by the 13th CGPM in 1967, it follows that the thermodynamic temperature of the triple point of water is exactly 273.16 K.

The above definition of the kelvin refers to water with the isotopic composition defined exactly by the following proportions: 0.000 155 76 mol of ²H per mole of ¹H, 0.000 379 9 mol of ¹⁷O per mole of ¹⁶O, and 0.002 005 2 mol of ¹⁸O per mole of ¹⁶O. The uncertainty of realization of the kelvin on the international scale is

The candela is the luminous intensity, in a given direction, of a light source that emits monochromatic radiation of frequency 540×10¹² Hz and has a radiant intensity in that direction of 1/683 watt per steradian.

This definition was adopted by the 16th CGPM in 1979. It implies that the spectral luminous efficacy of monochromatic radiation of frequency 540×10¹² Hz is exactly 683 lm/W: K = 683 lm/W = 683 cd·sr/W.

1 The mole is the amount of substance of a system which contains as many elementary entities as there are atoms in 0.012 kg of carbon 12; its symbol is "mol".

2 When the mole is used, the elementary entities must be specified and may be atoms, molecules, ions, electrons, other particles, or specified groups of such particles.

The definition of the mole was adopted by the 14th CGPM in 1971; it refers to unbound carbon-12 atoms at rest in their ground state. It implies that the molar mass of carbon 12 is exactly 12 grams per mole, M(¹²C) = 12 g/mol. The uncertainty of realization of the mole on the international scale is 2×10⁻⁹.

The International System of Units (SI) has undergone several revisions since its adoption some 50 years ago, the most notable being the addition of the mole as a base unit. The most recent edition of the SI Brochure, the eighth, reflects these modifications. It was published by the intergovernmental organization of the Meter Convention in 2006 and spans about 180 pages; three appendices elaborate on the main chapters, detailing the parameters of the standards and the conditions required for their maintenance.

Measurements and Standards of Length

Measurements of distance and length span an enormous range, from fractions of a millimeter to hundreds of kilometers. Richard Feynman noted that the full range extends from 10⁻¹⁵ m, the radius of a hydrogen nucleus, to 10²⁷ m, the estimated diameter of the universe. Historically, early measurements relied on natural standards, such as the foot, based on the length of a human foot, the cubit, defined as the distance from the fingertip to the elbow, and the inch.

Measurement standards evolved from subjective units based on human anatomy, such as the width of a finger or a stride, to larger units like the bow shot and the verst, or the maximum distance a voice could carry. Measures for distances traveled in a day on foot or horseback were even more subjective. For finer measurements, the barleycorn represented a length smaller than an inch, while the poppy seed, equated with its diameter, was smaller still. Additionally, sailors utilized the nautical mile, defined as one minute of arc of the Earth's circumference, to navigate vast distances accurately.
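The nautical mile's definition as one minute of arc can be checked directly. Assuming a meridian circumference of about 40 008 km (a rough modern figure, not taken from the source):

```python
# One nautical mile = one minute of arc along a meridian.
meridian_circumference_m = 40_008_000   # ≈ Earth's polar circumference, assumed
arc_minutes = 360 * 60                  # 21 600 minutes of arc in a full circle

nmile = meridian_circumference_m / arc_minutes
print(f"{nmile:.0f} m")   # close to the standard value of 1852 m
```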

Natural units of length, such as the inch or cubit, offered the advantage of being readily available to everyone; however, they varied significantly between individuals. This raised the question of whose body part defined the measurement: the cubit in ancient Egypt was based on the pharaoh's forearm, and the royal foot was used during Charlemagne's reign. The push for standardized measurements could also reflect a democratic spirit, as seen in the 1584 definition of a measuring rod in Oppenheim, Germany, which established a uniform length of 16 feet:

To establish a reliable measuring rod, allow sixteen individuals of varying heights to step out of the church in succession. The collective length of their footsteps will serve as a dependable standard for measurement.

By the end of the eighteenth century, France utilized numerous foot and cubit standards, creating confusion in trade. In Paris, the royal foot, known as the pied du roi, served as the official unit of length, measuring 32.484 cm and divided into 12 inches, 144 lines, and 1,728 points. This multitude of measurement standards significantly obstructed the growth of commerce.

In the late seventeenth and eighteenth centuries, discussions in France, England, and the United States centered on the establishment of a universal unit of length grounded in the laws of nature. Two distinct standards for this unit were proposed and evaluated during this period:

• A fraction of the Earth's circumference or radius; the Earth's size was first determined by the ancient Greek scholar Eratosthenes around 230 BC, whose pioneering work On the Measurement of the Earth laid the foundation for understanding the planet's dimensions;

• The length of a pendulum with a 1-s period of free oscillation.

The period of free oscillation of a pendulum of fixed length, studied by Galileo in 1583 and by Christiaan Huygens in 1656, is constant. Both scientists aimed to utilize this principle to build precise clocks, but only Huygens succeeded. Subsequent research by Jean Richer, in a study conducted in 1675, revealed that the oscillation period of a pendulum is slightly influenced by the local acceleration due to gravity, which varies over the Earth's surface and depends on the distance from the center of the Earth.

The acceleration of gravity g varies with latitude and elevation above sea level, which limits the accuracy of a pendulum-based length standard; Richer's findings thus diminished the appeal of the pendulum as a standard of the unit of length. The length of a pendulum oscillating with a period of one second, calculated to be 248.562 mm, follows from the formula

l = gT²/(4π²),  (2.1)

where l is the length of the pendulum, T its period of free oscillation, and g the acceleration of gravity (e.g., g = 9.81225 m/s² in Poznan, Poland).
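Formula (2.1) is easy to evaluate numerically. Using the g quoted for Poznan (the 248.562 mm figure in the text presumably used a slightly different local g):

```python
import math

# Length of a pendulum with a 1-s period of free oscillation: l = g*T^2/(4*pi^2).
g = 9.81225   # m/s^2, local acceleration of gravity in Poznan (value from the text)
T = 1.0       # s, period of free oscillation

l = g * T**2 / (4 * math.pi**2)
print(f"{l * 1000:.2f} mm")   # ≈ 248.55 mm, close to the 248.562 mm quoted above
```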

In May 1790, the National Constituent Assembly of France approved a resolution to create measurement units grounded in natural laws, and the Academy of Sciences was tasked with developing these units. By March of the following year, significant progress had been made in this initiative.

In 1791, a distinguished committee of French scholars, including Lagrange, Lavoisier, and Laplace, proposed a new standard for the unit of length, the meter. They defined the meter as one ten-millionth of the segment of the Earth's meridian that runs from the North Pole to the equator, passing through Paris. This definition was subsequently supported by meticulous measurements of the meridian, which served to establish a precise artifact standard of the meter.

Fig 2.4 Setting of the length of a measuring rod in Oppenheim (Germany, 16th century); 1 rod was equal to 16 feet

In 1799, a platinum bar with a rectangular cross-section, representing the unit of length as the distance between its end surfaces, was deposited in the Archives of the French Republic. During the 1867 conference in Berlin, the International Association of Geodesy proposed a uniform unit of length for Europe, gaining support from prestigious institutions such as the Russian Academy of Sciences and the French Bureau of Longitude. This collaboration led to the French government's initiative to create an international commission for a metric measurement system, culminating in the signing of the Meter Convention by seventeen countries in 1875. Ultimately, the original meridian-based standard was replaced by an arbitrary meter standard, based on the French precedent, as defined by the 1st General Conference on Weights and Measures in 1889:

One metre is defined as the distance between the centers of the main lines within the area marked by directional lines on the international prototype, measured at a temperature of 0 °C This measurement is supported at points located 0.22L from each end, where L represents the length of the prototype.

The physical prototype of the meter was a standard bar composed of 90% platinum and 10% iridium; it served as the definition for 70 years until it was replaced by an atomic standard. The platinum-iridium bar standard was defined at a temperature of 0 °C, which can be accurately achieved and maintained using a mixture of ice and water.

Measurements and Standards of Mass

Isaac Newton introduced the concept of mass to science. Values of mass range from that of elementary particles, like the electron at approximately 9.11×10⁻³¹ kg, to celestial bodies such as the Earth, with a mass of about 6×10²⁴ kg. The mass of astronomical objects is determined from the laws of gravitation; early efforts to determine the Earth's mass involved calculations using its previously estimated average density.

In the early centuries of commerce, bulk commodities like grain were measured by volume, using units such as gallons, bushels, or barrels, while precious metals and stones were measured by weight. Initially, weights were small and likely represented by coins made of precious metals. During the Frankish empire under Charlemagne, a 1-pound weight of 367.2 grams was created from mint alloy, from which 240 denarii were minted. Both 1-pound weights and 1.53-gram silver coins played significant roles in trade, including the first Polish denarius coin issued by Mieszko I.


In the latter half of the tenth century, Poland emerged as a significant entity, with historical studies revealing that Mieszko's denarius weighed 1.53 grams, matching the mass of Charlemagne's denarius.

The Meter Convention of 1875 established the meter as the unit of length and the kilogram as the unit of mass, defining one kilogram as the mass of 1/1000 m³ of water at its maximum density, at 4 °C. The international prototype of the kilogram (IPK), made from an alloy of 90% platinum and 10% iridium, is a cylindrical artifact with equal diameter and height, both 39 mm. It is the sole arbitrary artifact standard among the base unit standards of the SI system. The first General Conference on Weights and Measures in 1889 defined the role of the IPK as an artifact standard:

This prototype shall henceforth be considered a unit of mass.

In order to resolve the question as to the use of the terms of mass or weight, the 3rd CGPM in 1901 confirmed the validity of the previous definition:

The kilogram is the unit of mass; it is equal to the mass of the international prototype of the kilogram.

Fig 2.5 The standard of the unit of mass in the Polish Central Office of Measures (GUM) in Warsaw, the official BIPM copy no 51. The prototype is covered by two glass bells (courtesy of the GUM)


The international prototype of the kilogram gains weight at a rate of about 1 μg per year due to impurities accumulating on its surface. In response, the International Committee for Weights and Measures (CIPM) declared in 1988 that the reference mass of the prototype is the mass determined immediately after a specialized three-stage preparation: cleaning, washing, and drying. The IPK is cleaned manually with a chamois leather cloth, treated to reduce acidity and soaked twice in a bath of ethanol and ether for 48 hours, applied at a pressure of approximately 10 kPa. Washing uses double-distilled water, with hot steam directed at the prototype from a distance of 5 mm. The drying process consists of three phases, starting with the collection of water droplets

with filter paper, followed by drying with compressed air and passive drying. In the final phase, the IPK is stored in a safe with air access for two weeks. Consequently, the entire process of preparing the IPK for comparisons takes approximately three weeks.

The IPK is kept at the International Bureau of Weights and Measures (BIPM) in Sèvres. Nearly 100 platinum-iridium copies of the IPK have been produced, six of which are designated official copies and stored alongside the IPK at the BIPM. Additionally, the BIPM uses eight other platinum-iridium artifact standards for calibration purposes. The remaining eighty copies have been allocated to national laboratories worldwide, including copy No 51, held by Poland's national metrological laboratory, the Central Office of Measures (GUM).

The definition of the kilogram is crucial not only for mass measurement but also for the ampere, mole, and candela, so any uncertainty in the kilogram's definition propagates to these units as well. The current standard of the kilogram faces several challenges: limited accessibility, as the international prototype is available only in Sèvres; vulnerability to damage; and surface contamination of nearly 1 µg per year. The most significant issue is the long-term drift of its mass: since its adoption in 1889, periodic verifications have shown a mass change of 50 µg over a century, highlighting the need for a more stable and reliable definition.

This unexplained drift of 5×10⁻⁸ over 100 years in the mass of the platinum-iridium artifacts highlights the need to redefine the unit of mass. Following the third verification period, it has become clear that a new standard of the kilogram must be established. Current discussions suggest that the kilogram may be redefined in relation to either the Planck constant h or the Avogadro constant N_A.
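The scale of the problem is easy to quantify from the figures above (simple arithmetic, for illustration):

```python
# Relative drift of the IPK: about 50 µg over 100 years on a 1 kg artifact.
drift_kg = 50e-9   # 50 µg expressed in kg
mass_kg = 1.0      # nominal mass of the prototype

relative_drift = drift_kg / mass_kg
print(f"{relative_drift:.0e}")   # 5e-08, i.e. 5 parts in 10^8 per century

# The ~1 µg/year surface contamination corresponds to:
print(f"{1e-9 / mass_kg:.0e} per year")
```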

At its 10th meeting in 2007 the Consultative Committee for Mass and Related Quantities (CCM) at the CIPM discussed new definitions of the kilogram and the mole. Seven new definitions of the kilogram have been proposed, each referring to one or more physical constants.

In Poland, the national standard of mass is a BIPM-certified platinum-iridium artifact known as prototype kilogram No 51, acquired by the Central Office of Measures (GUM) in 1952. The national prototype is housed and used by the GUM Laboratory of Mass in Warsaw. Prototype No 51 has participated in two of the three international comparisons conducted so far; the first comparison took place between 1899 and 1900.

In the second international comparison (1939–1953) the mass of the Polish national standard was determined as 1 kg + 185 μg, and in the third comparison (1988–1992) as 1 kg + 227 μg. The uncertainty of the mass of the prototype was determined to be 2.3 μg.

Efforts have long been underway to redefine the kilogram in relation to a fundamental physical constant while ensuring its connection to other measurement units such as time, length, and voltage. The new standard of the kilogram must exhibit an uncertainty lower than that of the current international prototype, 5 parts in 10⁸, and ensure long-term stability. However, it appears unlikely that the kilogram will be redefined in terms of the Planck constant h at the upcoming 27th General Conference on Weights and Measures (CGPM); haste in redefining base units is not advisable in metrology.

Clocks and Measurements of Time

Astronomical phenomena, such as the day, the year, and the lunar cycle, have been used since antiquity to measure time, making a standardized unit of time more universally acceptable than common units of length or mass. Ancient Egyptian obelisks, which served as sundials, are enduring symbols of this early timekeeping practice.

Around 3500 BC, the Egyptians and Maya utilized obelisks to indicate the time of day by the shadow cast by the sun, employing a 12-hour division from sunrise to sunset. However, because the length of daylight changes throughout the year, the hours varied in duration, with summer hours longer than winter hours. This practice of using variable-length hours persisted in Europe until approximately the fifteenth century. Portable sundials were used in ancient Egypt as early as 1500 BC, and pocket sundials emerged in Europe during the early Middle Ages. An example of an elegant garden sundial can be seen in the Imperial Palace in Beijing.

Around 1500 BC, the first water clocks appeared in Egypt; they were also known in China and likely in Babylon. A water clock is a vessel with an opening in the bottom through which water flows out, time being indicated by the gradually falling water level. More sophisticated versions featured a gear train mechanism to move the hands.

Ancient civilizations thus measured time with sundials, water clocks, and sand hourglasses. Some water clocks released metal balls that fell onto a tin plate, the number of balls indicating the elapsed time; the Greeks employed such devices as early as 300 BC.

In ancient China, time-measuring instruments included sundials, water clocks, and candle clocks, the latter burning at a constant rate. The division of time into hours was significantly influenced by religious practices, as both Muslims and Christian monks required clocks to schedule prayers. A muezzin calls Muslims to prayer five times daily from a minaret, prayer taking precedence over sleep. The Persian scholar Muhammad ibn Ahmad al-Biruni authored the first book on time measurement, On the Reckoning of Time, in the early 11th century. Since the early Middle Ages, bells marked the seven canonical hours in Christian monasteries and churches, a practice introduced by Pope Sabinian around 605 CE.

The invention of mechanical clocks revolutionized time measurement, enabling accurate division of the day into minutes and seconds. Prior to this, sundials were the primary time-telling device, and the Arabs made significant improvements to their accuracy. Notably, Ibn Yunus, a renowned court astronomer in 11th-century Egypt, compiled tables of sundial shadow lengths with minute-by-minute precision. Later, Ulugh Beg, a prominent Uzbek astronomer and ruler of Samarkand, independently calculated sundial tables.

Fig 2.6 A garden sundial with a 24-h dial-plate


The first mechanical clocks emerged in Italian and French cities in the early fourteenth century, using a falling counterweight to maintain a uniform speed of rotation for time measurement. These clocks were commonly installed on town hall and church towers. A significant advancement in accuracy came with the pendulum clock of Christiaan Huygens, which used a pendulum to stabilize the rotation of the hands.

Huygens built his pendulum clock in 1656. The period T of oscillation of a mathematical pendulum with a free length l is determined by the relation:

T = 2π√(l/g),  (2.2)

where T is the period of oscillation, l the length of the pendulum, and g the acceleration of gravity.
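Richer's observation that pendulum clocks run at different rates at different places follows directly from (2.2): since T is proportional to 1/√g, a smaller local g lengthens the period and the clock loses time. A worked example with illustrative numbers (the g values are assumptions, not from the source):

```python
import math

def period(l, g):
    """Period of a mathematical pendulum, T = 2*pi*sqrt(l/g)."""
    return 2 * math.pi * math.sqrt(l / g)

l = 0.9940          # m, roughly a two-second pendulum (illustrative)
g_paris = 9.8094    # m/s^2, illustrative value for Paris
g_equator = 9.7803  # m/s^2, illustrative: g is smaller near the equator

# A clock regulated in Paris and moved to the equator runs slow by:
dT = period(l, g_equator) - period(l, g_paris)
error = 86_400 * dT / period(l, g_paris)
print(f"clock error ≈ {error:.0f} s/day")   # about two minutes per day
```

This is the order of magnitude of the daily loss Richer reported after moving a pendulum clock from Paris to Cayenne.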

The pendulum clock invented by Huygens achieved an accuracy of 1 minute per day, and later 10 seconds per day, surpassing all previous clocks in history. This innovation enabled the precise division of the day into equal hours, minutes, and seconds. Galileo had conceptualized the pendulum clock and sketched it in 1582, but never completed its construction; in his experiments on free fall he measured time using his pulse. In 1640 the astronomer Johannes Hevelius also exploited the fixed period of a pendulum for time measurement, albeit without building a clock.

The first unit of time defined in the international system of units was the second. According to the definition of 1901:

The second is 1/86 400 of the mean solar day.

Until 1956, the definition of the second was based on the mean solar day; however, variations in the length of the solar day prompted a new definition. The revised definition, adopted in 1956, was based on the astronomical ephemeris year, determined by the positions of celestial bodies relative to Earth.

The second is the fraction 1/31 556 925.9747 of the tropical year.

The invention of electronic quartz clocks marked a significant advancement in time measurement accuracy, building on Pierre Curie's discovery of the piezoelectric effect in 1880. In 1922, Walter G. Cady observed that quartz crystals, the most commonly used piezoelectric material, exhibit high stability of their electrical oscillation frequency. Cady also determined that the orientation of the crystal axis during the cutting process is crucial for both the oscillation frequency and its temperature stability.


Marrison at Bell Laboratories developed a crystal oscillator that became a frequency standard, leading to the creation of the first quartz clock in 1929, which counted the oscillator's signal. The quartz clock significantly outperformed mechanical clocks in accuracy. With the invention of the quartz watch in 1969, it became the most precise common measuring device, surpassing the accuracy of other everyday instruments, such as weighing scales, tape measures, thermometers, and electricity meters, by a considerable margin.

Enhancements in quartz clock accuracy were driven primarily by improved stability of the frequency of the quartz oscillator. The oscillation frequency of quartz is sensitive to temperature variations, which degrades the oscillator's performance as a reference frequency source. In quartz oscillators without temperature stabilization (XO), the relative frequency change Δf/f in response to temperature fluctuations is significant.

Quartz oscillators have significantly improved in accuracy by compensating for the temperature dependence of Δf/f, a dependence that is also exploited in quartz thermometers. Key parameters of these oscillators are given in Table 2.1. From the user's perspective, both long-term and short-term stability are essential. Long-term stability, characterized by accuracy and frequency drift over a year, is vital for oscillators used in metrological laboratories and telecommunication networks. Watches also require oscillators that keep accurate time over extended periods, but their quality demands are less stringent. Short-term stability is crucial for systems needing precise timing of individual or consecutive measurements, such as the Global Positioning System.

Table 2.1 Parameters of quartz oscillators [3, 15] (the table compares the oscillator types XO, TCXO, Oven OCXO, Double Oven OCXO, and Double Oven BVA OCXO, including the stability of frequency at τ = 1 s)


Quartz oscillators are divided into the following types by the method used for temperature stabilization:

• XO, quartz crystal oscillator without temperature compensation; the temperature coefficient k_t of the temperature dependence of the relative frequency change Δf/f is k_t > 10⁻⁷/°C;

• TCXO, temperature-compensated quartz crystal oscillator; temperature coefficient k_t > 5×10⁻⁸/°C;

• Oven OCXO, oven-controlled quartz crystal oscillator with heating and temperature stabilization by a thermostat; temperature coefficient k_t > 5×10⁻⁹/°C;

• Double oven OCXO, oven-controlled quartz crystal oscillator with heating and two-stage temperature stabilization by thermostats; temperature coefficient k_t > 2×10⁻¹⁰/°C;

• Double oven BVA OCXO, oven-controlled quartz crystal oscillator with heating, two-stage temperature stabilization and BVA ("electrodeless") quartz; temperature coefficient k_t > 5×10⁻¹¹/°C.
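The practical meaning of these temperature coefficients can be illustrated by the timing error they imply. The sketch below (a hypothetical calculation, not from the book) converts each k_t into a worst-case daily error for an assumed 10 °C temperature excursion, using Δf/f = k_t·ΔT:

```python
SECONDS_PER_DAY = 86_400

# Temperature coefficients k_t (1/°C) from the list above.
oscillators = {
    "XO": 1e-7,
    "TCXO": 5e-8,
    "Oven OCXO": 5e-9,
    "Double oven OCXO": 2e-10,
    "Double oven BVA OCXO": 5e-11,
}

delta_T = 10.0  # °C, assumed temperature excursion (illustrative)
for name, k_t in oscillators.items():
    df_over_f = k_t * delta_T              # relative frequency offset
    error_ms = df_over_f * SECONDS_PER_DAY * 1e3
    print(f"{name:22s} ~{error_ms:.4f} ms/day")
```

Under this assumption an uncompensated XO would drift on the order of 86 ms per day, while a BVA OCXO would drift by only tens of microseconds, which is why the latter approaches rubidium-standard territory.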

BVA quartz oscillators with two-stage temperature stabilization offer the highest accuracy; their excitation makes use of additional crystal planes. Developed at the University of Besançon in France, these oscillators are considered comparable in quality to rubidium atomic frequency standards. BVA OCXO oscillators produced by Oscilloquartz in Neuchâtel, Switzerland, are among the best quartz oscillators available; the 86007–B, for example, has a temperature coefficient k_t > 5×10⁻¹¹/°C and very good long-term stability. In 2004 NIST presented a compact, low-cost cesium atomic clock with an accuracy of 10⁻¹⁰, and several companies now offer miniature cesium and rubidium atomic clocks that outperform quartz oscillators; a rubidium clock from Quartzlock, for instance, achieves an accuracy of 5×10⁻¹¹ with excellent stability.

Since the verification of their high stability, atomic clocks have been used as time standards and are synchronized with one another. In the SI system:

The second is the duration of 9 192 631 770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium-133 atom.

The International Bureau of Time (Bureau International de l'Heure) in Paris established Atomic Time on 1 January 1958 at 0 h 0 min 0 s. In 1967 the second was officially defined on the basis of the cesium atomic clock, and this definition was incorporated into the international system of units.
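The definition above fixes the cesium hyperfine frequency exactly, so realizing one second amounts to counting exactly 9 192 631 770 periods of the radiation. A small illustrative calculation (assuming ideal counting, with no other error sources):

```python
# Cesium-133 hyperfine transition frequency, exact by definition of the SI second.
CS_FREQ_HZ = 9_192_631_770  # Hz

# Duration of a single period of the cesium transition radiation.
period_s = 1 / CS_FREQ_HZ

# Miscounting by one period over a one-second interval corresponds to a
# fractional error equal to period_s; a clock accurate to 1e-10 (like the
# 2004 NIST clock mentioned above) thus resolves roughly one period per second.
print(f"{period_s:.4e} s")
```

This shows why fractional accuracies around 10⁻¹⁰ to 10⁻¹¹ are the natural scale for comparing the quartz and atomic standards discussed in this chapter.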
