
1 Physical Limits of Computing A Brief Introduction Dr. Michael P. Frank Dept. of Computer & Information Science & Engineering (Affil. Dept. of Electrical & Computer Engineering) University of Florida, Gainesville, Florida Presented at: 2004 Computing Beyond Silicon Summer School (Week 4) California Institute of Technology Pasadena, California, July 6-8, 2004

2 Abstract

Physical and computational systems share a number of common characteristics: both are special cases of the more general concept of dynamical systems. In fact, we can even show that there is a fundamental underlying unity between the physical and computational domains. In this talk, we will survey some ways to understand a variety of key physical concepts in computational terms. Due to this underlying unity, physical systems have firm limits on their computational capabilities, since a computation embedded within a physical system clearly cannot exceed the raw computational capabilities of the physical system itself. We review some of the known limits on information capacity, processing rate, and communication bandwidth.

3 Physics as Computation

Preview: most (or all) physical quantities can be validly reinterpreted in terms of information and computation.
- Physical entropy is incompressible information.
- Physical action is the total amount of (quantum-physical) computation.
- Physical energy is the rate of physical computation.
  - The various forms of energy correspond to physical computation that is occupied doing different kinds of things.
- Physical temperature is (proportional to) the physical rate of computing per bit of information capacity.
  - The clock speed for physical computation.
- Physical momentum is the amount of motional computation per unit distance translated…
  - There are analogous identities for angular momentum, velocity, etc.
- These identities can be made rigorous! We will sketch the arguments later if there is time…

4 Fundamental Physics Implies Various Firm Limits on Computing

Thoroughly confirmed physical theories (the theory of relativity, quantum theory) imply universal facts: the speed-of-light limit, the uncertainty principle, the definition of energy, reversibility, the 2nd law of thermodynamics, the adiabatic theorem, and gravity. These in turn affect quantities in information processing: communications latency, information capacity, information bandwidth, memory access times, processing rate, and energy loss per operation.

5 Some limits…

- Communications latency over distance d is at least t = d/c,
  despite spooky non-local-seeming quantum statistics.
- Information capacity for systems of given size & energy is finite,
  obtained by counting the number of distinct quantum states.
- Information bandwidth is limited for flows of given power and cross-sectional area,
  obtained from the capacity and propagation-velocity limits.
- Memory access times are limited by information density & velocity.
- Processing rate is limited by accessible energy, and indirectly by size;
  also limited by power constraints & energy efficiency.
- Energy efficiency is limited by the Landauer bound for irreversible computing;
  no technology-independent limits for reversible computing are yet known.
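The latency bound t = d/c from the first bullet is easy to make concrete. A minimal sketch (the chip and board distances are illustrative assumptions, not figures from the talk):

```python
# Speed-of-light latency bound: t = d / c.
c = 299_792_458.0  # speed of light, m/s

def min_latency_s(d_m: float) -> float:
    """Minimum one-way communication latency over a distance d (meters)."""
    return d_m / c

# Across a 3 cm chip vs. across a 30 cm motherboard (illustrative distances):
print(min_latency_s(0.03))  # ~1e-10 s, i.e. ~100 ps
print(min_latency_s(0.30))  # ~1e-9 s, i.e. ~1 ns
```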

6 Entropy and Information

The following definitions of the entropy content of a given physical system can all be shown to be essentially equivalent:
- The expected logarithm of the state improbability 1/p,
  given a probability distribution over system states.
- The expected size of the smallest compressed description of the system's state,
  using the best available description language & compressor.
- The expected amount of information in the state that cannot be reversibly decomputed,
  using the best available mechanism.
- The expected amount of a system's information capacity that is in use,
  i.e. that cannot be used to store newly-computed information for later retrieval.

7 Action and Amount of Computation

- In quantum mechanics, states are represented as complex-valued vectors v, and temporal transformations are represented by unitary operators (generalized rotations) U on the vector space.
  - The U's may be parameterized as U = e^(iHθ) (H Hermitian, θ real).
- We can characterize the magnitude of a given vector rotation Uv = e^(iHθ)v by the area swept out in the complex plane by the normalized vector components as θ is swept from 0 to a given value.
  - Important conjecture: this quantity is basis-independent!
- We can characterize the action performed by a given unitary transform operating on a set of possible v's as the maximum rotation magnitude over the v's.
  - Or, if we have a probability distribution over initial vectors, we can define an expected action accordingly.
- The connection with computation is provided by showing that it takes a minimum area (action) of π/4 to flip a bit,
  i.e., a minimum angle of π/2 to rotate to an orthogonal vector.
- It takes a minimum action of π/2 (annihilate/create pair) to move a state forward by 1 position along an unbounded chain.
  - The total action of a transform gives the total number of such operations.

8 Energy and Rate of Computation

- The energy of an eigenvector of H is the corresponding eigenvalue.
  - The average energy of a general quantum state follows directly from the eigenstate probabilities.
- The average energy is exactly the rate at which complex-plane area is swept out (action accumulated),
  in the energy basis, and also in other bases.
- Thus, if action is the amount of computation, then energy is the rate of computation.

9 Generalized Temperature

- The concept of temperature can be generalized to apply even to non-equilibrium systems,
  where entropy is less than the maximum.
- Example: consider an ideal Fermi gas.
  - The heat capacity per fermion is C = π²k²T/2μ,
    where μ = Fermi energy, k = log e (Boltzmann's constant), T = temperature.
  - The equilibrium temperature turns out to be T = (2/πk)(E_x μ)^(1/2), and thus C = πk(E_x/μ)^(1/2),
    where E_x = E − E₀ is the average energy excess per fermion relative to T = 0.
  - The equilibrium (maximum) entropy per fermion is S_max = ∫dS = ∫dQ/T = ∫dE_x/T = πk(E_x/μ)^(1/2) = C.
- Consider this to be the total information content I_tot = S_max = S + X (entropy plus extropy).
  - We thus have T = 2(E_x/I_tot): the temperature is simply 2× the excess energy per unit of total information content.
- Note that the expression E_x/I_tot is well-defined even for non-equilibrium states, where the entropy is S < S_max = I_tot.
  - Thus, we can validly ascribe a (generalized) temperature to such states.
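The identity T = 2(E_x/I_tot) follows algebraically from the two Fermi-gas expressions above; a quick numeric sanity check (in units where k = 1, so T is in energy units and entropy in nats; the values of μ and E_x are assumptions chosen only for the check):

```python
import math

mu = 7.0    # Fermi energy in eV, roughly copper-like (assumed value)
E_x = 0.01  # excess energy per fermion above the ground state, eV (assumed value)

T = (2.0 / math.pi) * math.sqrt(E_x * mu)  # equilibrium temperature
S_max = math.pi * math.sqrt(E_x / mu)      # equilibrium entropy per fermion, nats

# The slide's identity: T = 2 * E_x / I_tot, with I_tot = S_max.
assert abs(T - 2.0 * E_x / S_max) < 1e-12
print(T, S_max)
```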

10 Generalized Temperature as Clock Speed

- Consider systems such as the Fermi gas, where T = cE/I,
  where c is a constant of integration, E is the excess energy above the ground state, and I is the total physical information content.
- For such systems, we can say that the generalized temperature gives a measure of the energy content per bit of physical information content:
  E_b = c⁻¹Tb = c⁻¹k_B T ln 2.
- Since energy (we saw) gives the rate of computing, the temperature therefore gives the rate of computing per bit.
  - In other words, the clock frequency!
- For our case c = 2, room temperature corresponds to a maximum frequency of
  f_max = 2c⁻¹Tb/h = k_B(300 K)(ln 2)/h ≈ 4.3 THz,
  comparable to the frequency of room-temperature IR photons.
- A computational subsystem that is at a generalized temperature equal to room temperature can never update its digital state at a higher frequency than this!
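The 4.3 THz figure follows directly from the formula on the slide:

```python
import math

# Room-temperature clock-speed bound from the slide: f_max = k_B * T * ln(2) / h.
k_B = 1.380649e-23    # Boltzmann constant, J/K
h   = 6.62607015e-34  # Planck constant, J*s

def f_max_hz(T_kelvin: float) -> float:
    return k_B * T_kelvin * math.log(2) / h

print(f_max_hz(300.0))  # ~4.33e12 Hz, i.e. ~4.3 THz
```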

11 Information Limits

12 Some Quantities of Interest

We would like to know if there are limits on:
- Infropy density = bits per unit volume.
  Affects physical size, and thus propagation delay across memories and processors; also affects cost.
- Infropy flux = bits per unit area per unit time.
  Affects cross-sectional bandwidth, data I/O rates, and rates of standard-information input & effective entropy removal.
- Rate of computation = number of distinguishable-state changes per unit time.
  Affects the rate of information processing achievable in individual devices.

13 Bit Density: No classical limit

- In classical (continuum) physics, even a single particle has a real-valued position + momentum.
  - All such states are considered physically distinct.
  - Each position & momentum coordinate in general requires an infinite string of digits to specify:
    x = … meters, p = … kg·m/s.
  - Even the smallest system contains an infinite amount of information! No limit to bit density.
  - This picture is the basis for various analog computing models studied by some theoreticians.
- Wee problem: classical physics is dead wrong!

14 The Quantum Continuum

- In QM, there are still uncountably many describable states (mathematically possible wavefunctions),
  which can theoretically take infinite information to describe.
- But not all of this information has physical relevance!
  - States are only physically distinguishable when their state vectors are orthogonal.
  - States that are only indistinguishably different can only lead to indistinguishably different consequences (resulting states),
    due to the linearity of quantum physics.
  - There is no physical consequence from presuming an infinite number of bits in one's wavefunction.

15 Quantum Particle-in-a-Box

- Uncountably many continuous wavefunctions?
- No: we can express the wave as a vector over countably many orthogonal normal modes.
  - Fourier transform.
- High-frequency modes have higher energy (E = hf); energy limits imply they are unlikely.

16 Ways of Counting States

The entire field of quantum statistical mechanics is about this, but here are some simple ways:
- For a system with a constant number of particles:
  - # of states = numerical volume of the position-momentum configuration space (phase space),
    in units where h = 1.
    - Approached in the macroscopic limit.
  - Unfortunately, the number of particles is not usually constant!
- Quantum field theory bounds:
  - Smith-Lloyd bound. Still ignores gravity.
- General-relativistic bounds:
  - Bekenstein bound, holographic bound.

17 Smith-Lloyd Bound

- Based on counting field modes.
- S = entropy, M = mass, V = volume, q = number of distinct particle types.
- Lloyd's bound is tighter by a constant factor.
- Note: entropy density scales with the 3/4 power of mass-energy density.
  - E.g., increasing entropy density by a factor of 1,000 requires increasing energy density by 10,000×.
[Smith '95, Lloyd '00]

18 Examples with the Smith-Lloyd Bound

- For systems at the density of water (1 g/cm³), composed only of photons:
  - Smith's example: a 1 m³ box holds 6×10³⁴ bits = 60 kb/Å³.
  - Lloyd's example: a 1-liter "ultimate laptop" holds 2×10³¹ bits = 21 kb/Å³.
- Cool, but what's wrong with this picture?
  - The example requires very high temperature and pressure!
    - Temperature around half a billion Kelvins!!
    - Photonic pressure on the order of 10¹⁵ psi!!
  - "Like a miniature piece of the big bang." –Lloyd
  - Probably not feasible to implement any time soon!
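The temperature and pressure figures above can be sketched with the standard photon-gas relations u = aT⁴, s = (4/3)u/T, P = u/3. This is a hedged estimate, not the Smith-Lloyd formula itself (whose constant factor differs modestly), but it reproduces the order of magnitude of Lloyd's figure:

```python
import math

# Blackbody photon gas whose mass-energy density equals that of water (1 g/cm^3).
c   = 2.99792458e8   # speed of light, m/s
k_B = 1.380649e-23   # Boltzmann constant, J/K
a   = 7.5657e-16     # radiation constant, J m^-3 K^-4

u = 1000.0 * c**2        # energy density of water-density mass-energy, J/m^3
T = (u / a) ** 0.25      # required temperature, K
s = (4.0 / 3.0) * u / T  # entropy density, J/(K m^3)
bits_per_m3 = s / (k_B * math.log(2))
P_psi = (u / 3.0) / 6894.76  # radiation pressure, converted from Pa to psi

print(T)            # ~5.9e8 K: around half a billion kelvins
print(bits_per_m3)  # ~2e34 bits/m^3, i.e. ~2e31 bits per liter (Lloyd's figure)
print(P_psi)        # ~4e15 psi
```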

19 More Normal Temperatures

- Let's pick a more reasonable temperature, 1356 K (the melting point of copper):
  - The entropy density of light is only 0.74 bits/μm³!
    - Less than the bit density in a DRAM today!
  - The bit size is comparable to the wavelength of the optical-frequency light emitted by melting copper.
- Lesson: photons are not a viable information storage medium at ordinary temperatures.
  - Not dense enough.
  - CPUs that do logic with optical photons can't have their logic devices packed very densely.

20 Entropy Density of Solids

- Can easily be calculated from standard empirical thermochemical data.
  - Obtain the entropy by integrating heat capacity ÷ temperature as the temperature increases…
- Example result, for copper:
  - Has one of the highest entropy densities among pure elements at atmospheric pressure.
  - At room temperature: 6 bits/atom, 0.5 b/Å³.
  - At the boiling point: 1.5 b/Å³.
  - Vastly denser than light at comparable temperatures! Related to conductivity?
- Cesium has one of the highest bits-per-atom figures at room temperature, about 15. But only 0.13 b/Å³.
- Lithium has a high bits-per-mass figure, 0.7 bits/amu.
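The "integrate heat capacity ÷ temperature" recipe can be sketched numerically. A minimal sketch using the Debye model for copper (the Debye temperature θ_D ≈ 343 K is an assumed textbook value, not from the talk); it ignores the electronic heat capacity, so it lands somewhat below the empirical ~6 bits/atom:

```python
import math

THETA_D = 343.0  # K, Debye temperature of copper (assumed value)

def debye_c_over_t(T):
    """Debye heat capacity per atom divided by T, in units of k_B per kelvin."""
    x_max = min(THETA_D / T, 50.0)  # the integrand is negligible beyond x ~ 50
    n = 500
    dx = x_max / n
    integral = 0.0
    for i in range(1, n + 1):       # midpoint rule
        x = (i - 0.5) * dx
        e = math.exp(x)
        integral += (x**4 * e / (e - 1.0)**2) * dx
    return 9.0 * (T / THETA_D)**3 * integral / T

def entropy_bits_per_atom(T_final=300.0, steps=1000):
    """S = integral of (C/T) dT from 0 to T_final, converted from nats to bits."""
    dT = T_final / steps
    s_nats = sum(debye_c_over_t((i - 0.5) * dT) * dT for i in range(1, steps + 1))
    return s_nats / math.log(2)

print(entropy_bits_per_atom())  # ~5.3 bits/atom (empirical ~6, incl. electrons)
```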

21 Some Quantities of Interest

We would like to know if there are limits on:
- Infropy density = bits per unit volume.
  Affects physical size, and thus propagation delay across memories and processors; also affects cost.
- Infropy flux = bits per unit area per unit time.
  Affects cross-sectional bandwidth, data I/O rates, and rates of standard-information input & effective entropy removal.
- Rate of computation = number of distinguishable-state changes per unit time.
  Affects the rate of information processing achievable in individual devices.

22 Smith-Lloyd Bound

- Based on counting orthogonal field modes.
- S = entropy, M = mass, V = volume, q = number of distinct particle types.
- Lloyd's bound is tighter by a constant factor.
- Note: entropy density scales with the 3/4 power of mass-energy density.
  - E.g., increasing entropy density by a factor of 1,000 requires increasing energy density by 10,000×.
[Smith '95, Lloyd '00]

23 Whence this scaling relation?

- Note that in the field-theory limit, S ∝ E^(3/4). Where does this come from?
- Consider a typical frequency in the field spectrum, with wavelength λ.
  - Note that the minimum size of a given wavelet is ~ its wavelength.
  - The # of distinguishable wave-packet location states in a given volume is ∝ 1/λ³.
  - Each such state carries a little entropy ∝ the occupation number of that state (the # of photons in it).
- So we have ∝ 1/λ³ particles, each of energy ∝ 1/λ, giving energy E ∝ 1/λ⁴.
- S ∝ 1/λ³ and E ∝ 1/λ⁴ together give S ∝ E^(3/4).
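A quick numeric check of the S ∝ E^(3/4) scaling, using the photon-gas entropy density s = (4/3)a^(1/4)u^(3/4) (the standard blackbody relation, consistent with the derivation above):

```python
a = 7.5657e-16  # radiation constant, J m^-3 K^-4

def entropy_density(u):
    """Photon-gas entropy density s = (4/3) * a^(1/4) * u^(3/4)."""
    return (4.0 / 3.0) * a**0.25 * u**0.75

# Multiplying the energy density by 10,000 multiplies the entropy density by 1,000,
# exactly the example quoted under the Smith-Lloyd bound:
ratio = entropy_density(1e4) / entropy_density(1.0)
print(ratio)  # ~1000
```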

24 Whence the distribution?

- Could the use of more particles (with less energy per particle) yield greater entropy?
  - What frequency spectrum (power level, or particle number density as a function of frequency) gives the largest # of states?
  - Note there is a minimum particle energy due to the box size.
- No. The Smith-Lloyd bound is based on the blackbody radiation spectrum.
  - We know this spectrum has the maximum infropy among abstract states, because it is the equilibrium state.
    - Empirically verified in hot ovens, etc.

25 General-Relativistic Bounds

- The Smith-Lloyd bound does not take into account the effect of gravity.
- An earlier bound from Bekenstein derives a limit on entropy from black-hole physics:
  S < 2πER/ħc,
  where E = total energy and R = radius of the system.
- The limit is only attained by black holes!
  - Black holes have 1/4 nat of entropy per square Planck length of surface (event-horizon) area.
    - Minimum size of a nat: a square 2 Planck lengths on a side.
  - A 1-m radius black hole (mass ≈ that of Saturn) has an average entropy density of 4×10³⁹ b/Å³!
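The 1-m black-hole figure follows from the area law S = A/(4L_P²) nats plus the Schwarzschild relation M = Rc²/2G (standard formulas; the 1-m radius is the slide's example):

```python
import math

G   = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c   = 2.99792458e8   # speed of light, m/s
L_P = 1.616255e-35   # Planck length, m

R = 1.0                                          # horizon radius, m
A = 4.0 * math.pi * R**2                         # horizon area, m^2
S_bits = (A / (4.0 * L_P**2)) / math.log(2)      # area law, converted to bits
V_A3 = (4.0 / 3.0) * math.pi * R**3 * 1e30       # volume in cubic angstroms
M = R * c**2 / (2.0 * G)                         # Schwarzschild mass, kg

print(S_bits / V_A3)  # ~4e39 bits per cubic angstrom
print(M)              # ~6.7e26 kg (Saturn's mass is ~5.7e26 kg)
```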

26 The Holographic Bound

- Based on the Bekenstein black-hole bound.
- The maximum entropy within any surface of area A (independent of energy!) is A/(2L_P)².
  - L_P is the Planck length (see the lecture on units).
- This implies any 3D object (of any size) could be completely defined via a flat (2D) hologram on its surface having Planck-scale resolution.
- The bound is only really achieved by a black hole whose event horizon is that surface.

27 Do Black Holes Destroy Information?

- Currently, it seems that no one completely understands how information is preserved during black-hole accretion for later re-emission as Hawking radiation.
  - Via infinite time dilation at the surface?
- Some researchers (e.g., Hawking) claimed that black holes must be doing something irreversible in their interior (destroying information).
  - The arguments for this seem not very rigorous...
- The issue is not completely resolved, but I have many papers on it if you're interested.
  - Incidentally, Hawking recently conceded a bet on this.

28 Implications of Density Limits

- Minimum device size,
  and thus minimum communication latency (as per earlier).
- Minimum device cost, given a minimum cost of matter/energy.
- Implications for communications bandwidth limits (coming up).

29 Communication Limits

- Latency (propagation-time delay) limit from earlier, due to the speed of light.
  - Teaches us scalable interconnection technologies.
- Bandwidth (infropy rate) limits:
  - Classical information-theory limit (Shannon):
    a per-channel limit, given signal bandwidth & SNR.
  - Limits based on field theory (Smith/Lloyd):
    a limit given only area and power.
    Applies to I/O, cross-sectional bandwidths in parallel machines, and entropy removal rates.

30 Hartley-Shannon Law

- The maximum information rate (capacity) of a single wave-based communication channel is C = B log₂(1 + S/N), where:
  - B = bandwidth of the channel in frequency units,
  - S = signal power level,
  - N = noise power level.
- This law is not sufficiently powerful for our purposes!
  - It does not tell us how many effective channels are possible, given available power and/or area.
  - It does not give us any limit if we are allowed to increase bandwidth or decrease noise arbitrarily.
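A minimal sketch of the capacity formula, with illustrative channel parameters (the 1 MHz / 30 dB example is an assumption for demonstration):

```python
import math

# Hartley-Shannon capacity: C = B * log2(1 + S/N).
def channel_capacity_bps(bandwidth_hz: float, snr: float) -> float:
    return bandwidth_hz * math.log2(1.0 + snr)

# E.g., a 1 MHz channel at an SNR of 1000 (30 dB):
print(channel_capacity_bps(1e6, 1000.0))  # ~9.97e6 bps
```

Note how the formula illustrates the slide's complaint: capacity grows without bound as either B or S/N is increased, so the law alone gives no absolute limit.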

31 Density & Flux

- Note that any time you have:
  - a limit ρ on the density (per volume) of something,
  - & a limit v on its propagation velocity,
- this automatically implies:
  - a limit F = ρv on the flux (through a cross-section),
    by which I mean the rate per unit time per unit area.
- Note also that we always have the limit c on velocity!
  - At speeds near c we must account for relativistic effects.
  - Slower velocities are also relevant:
    - electron saturation velocity in various materials,
    - velocity of air or liquid coolant in a cooling system.
- Thus a density limit implies the flux limit F = ρc.
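The F = ρv bound is one multiplication; a minimal sketch (the 1 bit/nm³ medium is a hypothetical example, not a figure from the talk):

```python
# Density-times-velocity flux bound: F = rho * v, in bits/(m^2 s).
def flux_limit(density_bits_per_m3: float, velocity_m_s: float) -> float:
    return density_bits_per_m3 * velocity_m_s

c = 2.99792458e8  # speed of light, m/s

# E.g., a hypothetical medium storing 1 bit/nm^3 = 1e27 bits/m^3,
# streamed at the ultimate velocity limit c:
print(flux_limit(1e27, c))  # ~3e35 bits/(m^2 s)
```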

32 Relativistic Effects

- For normal matter (bound massive-particle states) moving at a velocity v near c:
  - Entropy density increases by the factor γ = (1 − (v/c)²)^(−1/2),
    due to relativistic length contraction.
  - But energy density increases by the factor γ²,
    from both length contraction & mass amplification.
  - So entropy density scales up only with the square root (1/2 power) of the energy density gained from high velocity.
- Note that light travels at c already, and its entropy density scales with energy density to the 3/4 power. Light wins as v → c,
  if you want to maximize entropy per unit energy flux.

33 Entropy Flux Using Light

- F_S = entropy flux, F_E = energy flux.
- σ_SB = Stefan-Boltzmann constant, π²k_B⁴/60c²ħ³.
- Derived from the same field-theory arguments as the density bound.
- Again, the blackbody spectrum optimizes the entropy flux for a given energy flux.
  - It is the equilibrium spectrum.
[Smith '95]

34 Entropy Flux Examples
Consider a 10 cm-wide, flat, square wireless tablet with a 10 W power supply.
- What's its maximum rate of bit transmission?
  - Independent of spectrum used, noise floor, etc.
Answer:
- Energy flux ≈ 10 W/(2·(10 cm)²) (use both sides)
- Smith's formula gives ≈ 2.2×10²¹ bps
What's the rate per square nanometer of surface?
- Only ~109 kbps! (ISDN speed, in a 100 GHz CPU?)
- 100 Gbps/nm² would need nearly 1 GW of power!
Light is not infropically dense enough for high-bandwidth communications between densely packed nanometer-scale devices at reasonable power levels!
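A rough cross-check of these numbers, using the standard blackbody photon-gas relation F_S = (4/3)σ^(1/4)F_E^(3/4) converted to bits via k_B·ln 2. This is a textbook result; the exact coefficient in Smith's bound as used on the slide may differ by a small O(1) factor, but the order of magnitude comes out the same:

```python
import math

sigma = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
k_B = 1.380649e-23       # Boltzmann constant, J/K

def entropy_flux_bits(F_E):
    """Blackbody entropy flux in bits/(m^2 s) for energy flux F_E in W/m^2.

    Uses the photon-gas relation F_S = (4/3) sigma^(1/4) F_E^(3/4)
    (entropy in J/K units), then converts to bits with k_B ln 2."""
    F_S = (4.0 / 3.0) * sigma**0.25 * F_E**0.75   # J/(K m^2 s)
    return F_S / (k_B * math.log(2))

area = 2 * 0.10**2            # both faces of a 10 cm x 10 cm tablet, m^2
F_E = 10.0 / area             # 10 W over both faces -> 500 W/m^2
total_bps = entropy_flux_bits(F_E) * area
print(f"{total_bps:.1e} bps")  # ~10^21 bps, same order as the slide's figure
```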

35 Entropy Flux w. Atomic Matter
Consider liquid copper (~1.5 b/Å³) moving along at a leisurely 10 cm/s…
- BW = 1.5×10²⁷ bps through the 10-cm-wide square!
- A million times higher bandwidth than with 10 W of light!
- 150 Gbps/nm² entropy flux!
  - Plenty for nano-scale devices to talk to their neighbors.
- Most of this entropy is in the conduction electrons...
  - Less conductive materials have much less entropy.
Lesson:
- For maximum bandwidth density at realistic power levels, encode information using states of matter (electrons) rather than states of radiation (light).
Exercise: Kinetic energy flux?
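The copper numbers are a direct density-times-velocity calculation and are easy to reproduce:

```python
# Entropy flux carried by flowing liquid copper:
# ~1.5 bits per cubic angstrom, moving at 10 cm/s
# through a 10 cm x 10 cm cross-section.
density = 1.5 / (1e-10)**3   # bits/m^3 = 1.5e30
v = 0.10                     # m/s
area = 0.10**2               # m^2 (one 10 cm x 10 cm face)

bw = density * v * area      # total bandwidth through the square
per_nm2 = density * v / 1e18 # flux per square nanometer (1 m^2 = 1e18 nm^2)

print(f"{bw:.1e} bps")                 # -> 1.5e+27 bps, as on the slide
print(f"{per_nm2/1e9:.0f} Gbps/nm^2")  # -> 150 Gbps/nm^2
```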

36 Some Quantities of Interest
We would like to know if there are limits on:
- Infropy density = bits per unit volume
  - Affects physical size, and thus propagation delay, across memories and processors. Also affects cost.
- Infropy flux = bits per unit area per unit time
  - Affects cross-sectional bandwidth, data I/O rates, and rates of standard-information input and effective entropy removal.
- Rate of computation = number of distinguishable-state changes per unit time
  - Affects the rate of information processing achievable in individual devices.

37 Speed Limits

38 The Margolus-Levitin Bound
The maximum rate at which a system can transition between distinguishable (orthogonal) states is 4(E − E₀)/h, where:
- E = average energy (expectation value of energy over all states, weighted by their probability)
- E₀ = energy of the lowest-energy or ground state of the system
- h = Planck's constant (converts energy to frequency)
Implication for computing:
- A circuit node can't switch between 2 logic states faster than this frequency determined by its energy.

39 Example of Frequency Bound
Consider Lloyd's 1-liter, 1-kg "ultimate laptop":
- Total gravitating mass-energy E of mc² ≈ 9×10¹⁶ J
- Gives a limit of ~5.4×10⁵⁰ bit-operations per second!
- If the laptop contains ~2×10³¹ bits (the photonic maximum),
  - each bit can change state at a frequency of ~2.5×10¹⁹ Hz (25 EHz)
  - 12 billion times higher-frequency than today's 2 GHz Intel processors
  - 250 million times higher-frequency than today's 100 GHz superconducting logic
But the Margolus-Levitin limit may be far from achievable!
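The Lloyd figures follow directly from E = mc² and the Margolus-Levitin rate 4E/h. A quick sanity check (the ~2×10³¹-bit photonic capacity is taken as given here, from Lloyd's analysis, rather than derived):

```python
h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m/s

m = 1.0                       # kg, Lloyd's "ultimate laptop"
E = m * c**2                  # total mass-energy, ~9e16 J
ops_per_s = 4 * E / h         # Margolus-Levitin limit, ~5.4e50 bit-ops/s
bits = 2.2e31                 # approximate photonic bit capacity (assumed)
f_per_bit = ops_per_s / bits  # ~2.5e19 Hz per bit, i.e. ~25 EHz

print(f"{ops_per_s:.1e} ops/s, {f_per_bit:.1e} Hz per bit")
```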

40 More Realistic Estimates
Most of the energy in complex stable structures is not accessible for computational purposes...
- Tied up in the rest masses of atomic nuclei,
  - which form anchor points for electron orbitals;
- in the mass and energy of core atomic electrons,
  - which fill up low-energy states not involved in bonding;
- and in that of electrons involved in atomic bonds,
  - which are needed to hold the structure together.
Conjecture: We can obtain tighter valid quantum bounds on infropy densities and state-transition rates by considering only the accessible energy.
- Energy whose state-infropy is manipulable.

41 More Realistic Examples
Suppose the following system is accessible: 1 electron confined to a (10 nm)³ volume, at an average potential of 10 V above the ground state.
- Accessible energy: 10 eV
- Accessible-energy density: 10 eV/(10 nm)³
- Maximum entropy in the Smith bound: 1.4 bits?
  - Not clear whether the bound is applicable to this case.
- Maximum rate of change: 9.7 PHz
  - 5 million × typical frequencies in today's CPUs
  - 100,000 × frequencies in today's superconducting logics
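The 9.7 PHz figure is just the Margolus-Levitin rate 4(E − E₀)/h applied to 10 eV of accessible energy:

```python
h = 6.62607015e-34    # Planck constant, J s
eV = 1.602176634e-19  # joules per electron-volt

E_acc = 10 * eV             # accessible energy above the ground state
f_max = 4 * E_acc / h       # Margolus-Levitin bound on state-change rate

print(f"{f_max/1e15:.1f} PHz")  # -> 9.7 PHz, as on the slide
```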

42 Summary of Fundamental Limits
