A Crisis in Physics: The Breakdown of Classical Physics

 

  

Anesa Hosein


Table of contents

 

Introduction

Black Body Radiation

Specific heats

Model of the atom

The Photoelectric Effect

Wave-Particle Duality

Theory of Complementarity

Conclusion

References

 

 


Introduction:

 

Mayday! Mayday! We seem to have a problem. Classical physics is about to be destroyed. Things are going to become probabilistic. Help us to stay deterministic!!

 

It seems that everything was going pretty cool during the late 1800s until Planck decided to do something very different and sent classical physics crashing. Scientists at the time thought the idea preposterous and felt he should never have suggested that radiation was quantised; it was quite easy for them to laugh at the whole thing. However, scientists like Debye, Einstein and Bohr, who soon took to the idea of quantisation, were able to turn the whole picture around. So, let us see how they did that, and how this theory of quantisation was able to break the foundations of classical physics.

 

Black Body Radiation

 

Explanations of the wavelength distribution of energy radiated by a black body had reached only partial success in the hands of Wien and of Rayleigh and Jeans [1]. Wien's formula agreed with experiment at short wavelengths and that of Rayleigh and Jeans at long wavelengths [1]. The formula derived by Rayleigh and Jeans predicted that the radiated energy tended to infinity at short wavelengths [1]. This had unsettled scientists for a while, and the impending phenomenon was named the ultraviolet catastrophe [1]. In 1900, Max Planck was able to settle their fears by deriving a formula for the distribution of wavelengths in a black body that seemed reasonable [1]. The only problem with Planck's explanation was his unorthodox assumption that the radiation being absorbed by the atoms came in discrete amounts [1]. Strictly speaking, the way Planck arrived at the formula was not entirely above board: he looked at the expected experimental results, fitted a formula to them, and only then made the assumption that the radiation was quantised [3]. Classically, it was impossible for radiation to be discrete, and this sent the whole classical physics world into mayhem. Fortunately enough, Planck's expression reduced to both Wien's formula and the Rayleigh-Jeans formula in their respective limits [1].
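For reference, the distribution Planck arrived at, written here in the standard textbook form rather than in the notation of [1], gives the energy density per unit wavelength as

    u(\lambda, T) = \frac{8\pi h c}{\lambda^5} \cdot \frac{1}{e^{hc/\lambda k T} - 1}

For hc/\lambda kT >> 1 (short wavelengths) this reduces to Wien's form (8\pi hc/\lambda^5)\,e^{-hc/\lambda kT}, and for hc/\lambda kT << 1 (long wavelengths) to the Rayleigh-Jeans form 8\pi kT/\lambda^4, which is the expression that blows up as the wavelength goes to zero.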

 

Specific heats

 

There were also problems in the area of specific heats: the Dulong-Petit law did not hold at low temperatures, and the application of quantisation came to the rescue once more. Classically, solid crystals were considered to be an assembly of atoms held together in a periodic array by certain attractive forces [4]. On acquiring thermal energy, the atoms vibrated like harmonic oscillators about their equilibrium positions [4]. Using Maxwell-Boltzmann statistics, the equipartition of energy gives each vibrational degree of freedom of the solid an average energy kT, so N atoms with three degrees of freedom each have a total energy of 3NkT [3, 4]. Therefore, the specific heat capacity (SHC) of a solid should be the constant 3Nk [3]. It was found, however, that the SHC did not remain constant as the temperature approached zero, but itself approached zero as well [3].
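In symbols, the classical argument (a standard sketch, not quoted directly from [3] or [4]) is simply

    E = 3NkT, \qquad C_V = \frac{dE}{dT} = 3Nk

For one mole of atoms this is 3R, about 25 J mol^{-1} K^{-1}, independent of temperature, and it is exactly this temperature independence that breaks down as T approaches zero.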

 

Einstein tried to resolve these problems by using Planck's theory: he considered the atoms to be quantum harmonic oscillators, that is, oscillators with discrete energy values, and included the frequency of the oscillators in the equation [4]. Debye, however, realised that Einstein's model did not give a physically accurate picture of the thermal motion of the lattice at low temperatures [4]. Debye treated the oscillators as coupled together rather than independent, since atoms oscillate relative to their neighbours [4]. He also treated the crystal as capable of propagating elastic waves whose frequency might vary over a wide range of values [4]. Debye stated that the continuum model may be used for all vibrational modes of the crystal, since if the wavelength of an incoming wave is larger than the interatomic spacing the crystal looks like a continuum [4]. He also considered it unrealistic to assign the same frequency to all 3N oscillators of the crystal, since long-wavelength motion may have low frequencies, as is the case for ordinary elastic waves at acoustic frequencies [4]. Debye further assumed that the number of distinguishable modes was cut off at 3N, to agree with the number of degrees of freedom for N atoms [4].
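For comparison, the standard forms of the two results (quoted from general textbook treatments and assumed here, rather than taken from [4] directly) are

    C_V^{Einstein} = 3Nk \left(\frac{\theta_E}{T}\right)^2 \frac{e^{\theta_E/T}}{\left(e^{\theta_E/T} - 1\right)^2}, \qquad C_V^{Debye} \approx \frac{12\pi^4}{5} Nk \left(\frac{T}{\theta_D}\right)^3 \quad (T \ll \theta_D)

Both expressions go to zero as T goes to zero, but Debye's T^3 law fits the low-temperature data for simple solids much better than Einstein's exponential fall-off.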

 

Model of the atom

 

With the discovery of radioactivity by Becquerel in 1896, new ideas about the nature of the atom emerged that could not be reconciled with classical concepts [5]. By probing the atom with alpha particles, its make-up could be determined [5]. With these results, Rutherford was able to make his model of the atom, which clearly did not respect the classical laws of physics [5]. His model consisted of a small, heavy, positively charged nucleus surrounded by electrons in such a way that the atom as a whole was neutral [5]. A statistical problem with this model was that it allowed many more degrees of freedom for the atoms in a solid, so the solid would have a higher specific heat capacity than observed [5]. Also, in the Rutherford model, if the electrons move around the nucleus, they must continuously accelerate towards it and give off radiation while doing so [5]. However, from observation it was known that atoms do not radiate continuously but have to be excited first by some form of energy [5]. Also, the radiation calculated from Rutherford's model was higher than anything ever observed [5].
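A rough estimate of how serious the radiation problem is (a standard back-of-the-envelope argument, not taken from [5]): an electron with acceleration a radiates power at the Larmor rate

    P = \frac{e^2 a^2}{6\pi\varepsilon_0 c^3}

Feeding in the centripetal acceleration of an electron orbiting a proton and letting the orbit shrink as energy is radiated away, the electron spirals into the nucleus in roughly 10^{-11} s, so a classical atom should not survive for even a nanosecond.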

 

It was Bohr in 1913 who used, you guessed it, Planck's assumption of quantisation to explain away the problems arising from Rutherford's model [5]. Bohr assumed that the electrons in an atom occupied certain well-defined energy states, and that when an electron was excited it would 'jump' to a higher state [5]. Only when an electron moved from a higher energy state to a lower energy state was light emitted [5]. Bohr made the additional assumption that the quantised energy states arose from the discreteness of the orbital angular momentum of the electron [5]. This model was later refined when Sommerfeld considered elliptical electron orbits instead of circular ones [1].
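In equation form, Bohr's assumptions (standard form, with n = 1, 2, 3, ...) read

    L = m_e v r = n\hbar, \qquad h\nu = E_i - E_f

and for the hydrogen atom they lead to the quantised energy levels

    E_n = -\frac{13.6\ \text{eV}}{n^2}

so light is emitted only at the discrete frequencies corresponding to jumps between these levels.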

 

Proof that the energy states of atoms were in fact quantised came from the Franck-Hertz experiment [2]. The original experiment was designed to measure the exchange of energy between electrons and the atoms of the gas under study [2]. Electrons emitted from a heated cathode were accelerated through a potential difference along a tube containing the gas at low pressure, towards the anode [2]. The electrons pass through holes in the anode and arrive at a plate held at a small retarding potential [2]. If an electron's energy is greater than the retarding potential, it contributes to the measured current [2]. Thus, the current was measured as a function of the accelerating potential difference [2]. It was found that as the potential difference increased, the current increased only up to a point and then dropped dramatically, indicating that there was some interaction between the electrons and the atoms [2]. The interpretation is that electrons with this energy are able to excite the atoms to an excited state, while electrons with less energy cannot; from this it was known that the atomic energy states are quantised.
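In the original version of the experiment, which used mercury vapour, the drops in current occur at accelerating voltages spaced roughly 4.9 V apart, matching the 4.9 eV gap to mercury's first excited state:

    e\,\Delta V = E_{excited} - E_{ground} \approx 4.9\ \text{eV}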

 

The Photoelectric Effect

 

The photoelectric effect is the emission of electrons from a metal as a result of its interaction with radiation. Classical wave theory suggested that light of any frequency should eject electrons from the surface of the metal [2]. However, experiments showed that there was a definite cut-off frequency below which no electrons were released [2]. Classical physicists also expected a measurable time delay between the arrival of the radiation and the emission of a photoelectron, during which the electrons would gradually absorb energy from the wavefront [2]. It was also found that the maximum energy of the emitted electrons depended only on the frequency of the light and not on its intensity [2]. This was very puzzling to scientists in the classical era, since they could not explain why electrons were released with no time lag once the light was above the cut-off frequency [2]. Einstein adapted the now popular Planck hypothesis that the energy of the radiation is quantised, and argued that light consists of energy packets called quanta or photons, each of energy hν [2]. Einstein also stated that a minimum amount of energy, called the work function φ, is needed to release an electron from the surface; that is, only photons with hν > φ (where φ is the work function of the metal) are able to release an electron from the metal [2]. He further argued that the difference between these energies appears as the kinetic energy of the electron [2]. Also, the intensity of the light beam is related to the number of quanta and not to the energy of each individual quantum [2]. He then concluded that the photoelectric effect is a surface phenomenon only [2].
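Einstein's argument is summed up by the photoelectric equation (standard form):

    h\nu = \phi + KE_{max}, \qquad \nu_0 = \frac{\phi}{h}

where ν₀ is the cut-off frequency: a photon below this frequency simply does not carry enough energy to free an electron, no matter how intense the beam is.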

 

Wave-Particle Duality

 

De Broglie argued that, just as waves behave as though they are particles (photons), particles under certain conditions behave as waves [2]. The evidence for the particle-like behaviour of waves came from the photoelectric effect [2]. Support for de Broglie's theory came from the Davisson-Germer experiment, which showed that electrons display wave properties [3]. Electrons were accelerated onto a crystal whose layers were about 1 Å apart [2]. A detector was placed at a specified angle to measure the intensity, or current, of the scattered electrons [2]. It was found that when the current produced was graphed against the kinetic energy of the accelerated electrons, interference had occurred, both constructive and destructive [2]. The interference involved in this experiment was believed to be between different parts of the wave associated with a single electron that had been scattered on contact with the crystal [2]. This was checked by lowering the rate at which electrons were released [2].
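The connection is the de Broglie relation; for an electron accelerated through a potential difference V (non-relativistic),

    \lambda = \frac{h}{p} = \frac{h}{\sqrt{2 m_e e V}}

so, for example, electrons accelerated through 54 V, the value used in the classic Davisson-Germer runs, have a wavelength of about 1.67 Å, comparable to the spacing of the crystal layers, which is why diffraction effects show up at all.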

 

The Compton scattering experiment in turn showed again that waves behave as particles [3]. This experiment studied the scattering of X-rays from graphite [2]. It measured the intensity of the scattered X-rays as a function of wavelength for various scattering angles [2]. It was found that part of the scattered radiation was shifted in wavelength, a shift which was called the Compton shift [2]. Classically, the incoming wave with a particular frequency should have set the electrons oscillating at that frequency, so the scattered radiation should have had the same frequency; however, some of the radiation given off had a lower frequency [2]. To explain this, Compton said that the X-rays were a collection of photons, each with energy hν [2]. When a photon interacts with a nearly free electron in the graphite, the photon recoils and makes up part of the scattered beam [2]. Some of its energy is given up to the electron during the interaction, lowering its frequency and therefore increasing its wavelength [2].
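Quantitatively, treating the process as a two-body collision between a photon and an electron gives the Compton shift formula

    \Delta\lambda = \lambda' - \lambda = \frac{h}{m_e c}\,(1 - \cos\theta)

where h/m_e c ≈ 0.0243 Å is the Compton wavelength of the electron, so for scattering through 90° the wavelength increases by about 0.024 Å regardless of the incident wavelength.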

 

 

Young's double-slit experiment shows the wave-particle dual nature of light [3]. If light passes through two slits, interference occurs and fringes are seen on the screen [3]. This is the wave behaviour of light [3]. If, however, the intensity is lowered to, say, one photon a minute, classically it would be expected that no fringes should be present, but they still appear [3]. Quantisation, unfortunately, cannot say why this happens, and no plausible reason for this behaviour has been given thus far, although many theories are floating around.
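For reference, the bright fringes appear where the path difference between the two slits is a whole number of wavelengths (the standard Young condition, assumed here rather than quoted from [3]):

    d \sin\theta = m\lambda, \qquad m = 0, \pm 1, \pm 2, \ldots

and the striking point is that the same fringe pattern builds up spot by spot even when only one photon is in the apparatus at a time.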

 

Theory of Complementarity:

 

The last topic that will be discussed is the theory of complementarity, or the principle of indeterminacy, developed by Bohr. This theory grew out of Heisenberg's uncertainty principle, which states that the momentum and position of a particle cannot both be measured simultaneously with arbitrary accuracy. Bohr extended this into the statement that it is impossible to measure a wave characteristic and a particle characteristic at the same time, which he developed from the commutation relations [2]. That is, if the commutator of two properties is non-zero, the properties are said to be complementary, and it is impossible to determine both simultaneously [3]. This completely threw classical physics out of the window, since classical physics was based on the principle that all the laws of physics were deterministic [2]. The new theory gave a totally new meaning to physics: the laws of physics are probabilistic [2].
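In modern notation (assumed here, rather than quoted from [2] or [3]), position and momentum do not commute, and the non-zero commutator sets a lower bound on the product of their uncertainties:

    [\hat{x}, \hat{p}] = i\hbar, \qquad \Delta x\,\Delta p \geq \frac{\hbar}{2}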

 

Conclusion:

 

Although classical physics does not hold for microscopic systems (atoms, electrons and so on), it still holds well for the macroscopic world. Therefore, it is not completely dead, as quantum physics might suggest. The world of quantum physics is going through problems similar to those of classical physics at the moment, and it would not be surprising if the next crisis were a crisis in quantum physics as well.

 


References

 

1 Smith, S M H (1965) 'A textbook of nuclear physics.' Pergamon Press Ltd, England. pp. 16, 27, 143.

5 Dicke, R H & Wittke, J P (1974) 'Introduction to quantum mechanics.' Addison-Wesley Publishing Company Inc, USA. pp. 3-4, 8-11.

 

 

Lecture Notes

 

 

2 Rodrigues, M (1996) 'Phy 212 - Modern Physics Lecture Notes.' Dept. of Physics, University of Guyana.

3 Williams, M (1998) 'Phy 412 - Quantum Mechanics Lecture Notes.' Dept. of Physics, University of Guyana.

4 Williams, M (1998) 'Phy 413 - Solid State Physics Lecture Notes.' Dept. of Physics, University of Guyana.

 
