Entropy is really important... somehow? What exactly is it, and why does entropy appear in physics' most philosophical problems?
How can Graham's Number collapse my brain into a black hole?
Notes:
1. Notice that the definition of "unusable" depends on what we define as useful. Different definitions of "useful" differ from each other only by a fixed constant. Usable energy is relative, just like energy itself: only differences matter, and the definition can't change partway through a calculation. Ignoring this is a common basis for pseudoscientific claims, so it's good to be aware of. (A one-line worked example is included after these notes.)
2. F = U − TS is only one definition of free energy: the Helmholtz free energy, which applies when temperature and volume are held fixed. The reason there are many types of free energy is that there are several thermodynamic variables (temperature, entropy, pressure, volume, energy, amount of substance). That's too many degrees of freedom for one general equation, so commonly used assumptions lead to different, useful free-energy definitions. Others include the Gibbs free energy, which is used in chemistry for processes at fixed temperature and pressure. (The standard definitions are collected after these notes.)
3. The Carnot engine is the theoretically most efficient engine possible (its efficiency limit is written out after these notes). Viascience has a beautiful series going into more detail (Thermodynamics 4a is the video that begins on entropy).
4. Differential calculus does assume that gases are smooth and continuous, but for a small enough particle size the gas is still best approximated by a differential equation. Part of the reason people were so upset about the existence of atoms was a philosophical debate about the nature of reality.
5. Energy alone doesn't determine the number of possible microstates. Gas amount, potentials (like gravity), electric fields, etc. all contribute. This function assumes those are constant or zero, since the parameter that matters in this derivation is the total energy.
6. The function of microstates is just a count of all possible configurations of energy across particles. It doesn't matter exactly what is being sorted, whether it's baseballs in bins, pigeons in holes, or a gas's energy over its particles. Therefore, the same counting function can apply to both gases. (A small counting sketch is included after these notes.)
7. A fair die produces outcomes 1, 2, 3, 4, 5 and 6, each with probability 1/6. The information associated with each outcome is Q = −k log(1/6) = k log 6, and the Shannon entropy is S = log2(6) ≈ 2.58 bits. A weighted die produces outcomes 1, 2, 3, 4, 5 and 6 with probabilities 1/10, 1/10, 1/10, 1/10, 1/10 and 1/2. The information contents of the outcomes are k log 10 (≈ 3.32 bits) for each of the first five and k log 2 (1 bit) for the last. The Shannon entropy is then S = k(5 × (1/10) log 10 + (1/2) log 2) = k log √20 ≈ 2.16 bits. (A short computation verifying these numbers is included after these notes.)
8. Hawking radiation is not that simply proven. All matter particles are waves because of Quantum Field Theory, which doesn't take gravity into account. This off-the-cuff argument gives an intuition for why Hawking radiation might exist, but it took Stephen Hawking and others to derive it rigorously, in gravitational regimes beyond where QFT has proven itself.
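For note 1, a one-line worked equation (my own illustration, not from the video): shifting the reference point of "useful" energy by any constant C changes the free energy itself but cancels in the difference that actually drives a process.

```latex
% Note 1: only differences matter. Shift the definition of "useful" by a constant C:
\[
F' = F + C
\quad\Longrightarrow\quad
\Delta F' = (F_2 + C) - (F_1 + C) = F_2 - F_1 = \Delta F .
\]
```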
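For note 2, the two most commonly used free energies and the conditions under which each one is the relevant "usable" energy (standard textbook definitions, not a derivation from the video):

```latex
% Note 2: common free energies and their natural conditions.
\[
F = U - TS \qquad \text{(Helmholtz: relevant at fixed temperature and volume)}
\]
\[
G = U + PV - TS = H - TS \qquad \text{(Gibbs: relevant at fixed temperature and pressure)}
\]
```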
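For note 3, the Carnot efficiency limit (a standard result; the example temperatures are illustrative only, not taken from the video):

```latex
% Note 3: no engine running between a hot reservoir at T_hot and a cold
% reservoir at T_cold can beat the Carnot efficiency.
\[
\eta_{\mathrm{Carnot}} = 1 - \frac{T_{\mathrm{cold}}}{T_{\mathrm{hot}}},
\qquad
T_{\mathrm{hot}} = 600~\mathrm{K},\;
T_{\mathrm{cold}} = 300~\mathrm{K}
\;\Rightarrow\;
\eta_{\mathrm{Carnot}} = 0.5 .
\]
```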
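For note 6, a minimal Python sketch of the counting, assuming discrete energy quanta spread over distinguishable particles (the "stars and bars" count; the function names are mine, not from the video):

```python
from math import comb, log

def microstates(quanta: int, particles: int) -> int:
    """Count the ways to distribute `quanta` indistinguishable units of energy
    over `particles` distinguishable particles (stars and bars). The same
    formula counts baseballs in bins or pigeons in holes."""
    return comb(quanta + particles - 1, particles - 1)

def boltzmann_entropy(quanta: int, particles: int) -> float:
    """S = k ln(Omega), here in units where k = 1."""
    return log(microstates(quanta, particles))

# Doubling the energy of a tiny "gas" multiplies the number of microstates
# enormously, which is why entropy is defined with a logarithm.
print(microstates(10, 5))   # 1001
print(microstates(20, 5))   # 10626
print(boltzmann_entropy(20, 5) - boltzmann_entropy(10, 5))  # entropy increase
```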
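For note 7, a short Python check of the dice numbers (a straightforward Shannon-entropy calculation, not code from the video):

```python
from math import log2

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits,
    skipping zero-probability outcomes."""
    return -sum(p * log2(p) for p in probs if p > 0)

fair_die = [1/6] * 6
weighted_die = [1/10] * 5 + [1/2]

print(shannon_entropy_bits(fair_die))      # ~2.585 bits
print(shannon_entropy_bits(weighted_die))  # ~2.161 bits
```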
------------------
Timestamps:
0:00 - Intro
0:55 - Entropy from Thermodynamics (Free Energy)
3:06 - Entropy from Statistical Mechanics (Microstates)
6:40 - Entropy from information theory (Shannon Entropy)
10:35 - Making predictions with entropy
11:03 - Black hole entropy
12:13 - Maxwell's demon can't reduce entropy
14:25 - Numbers have mass
15:36 - Closing Thoughts
------------------
Music: Mark Tyner - Close To You