Tuesday, October 14, 2014

Voodoo Blog: Slight Return

How I Assume You're Reacting Right Now

After a hectic and unpleasant few weeks, this blog should be back in business! Previously, these kinds of bland posts have been followed by bursts of genuine activity, so I'm hopeful I can get back to almost daily writing.

Max Born and Wolfgang Pauli


The IITSTIAPW series is going pretty well! Someone asked me why I haven't discussed anything actually quantized (that is, discrete) yet, which is a very fair criticism. The reason is that I haven't yet introduced anything that makes the discrete nature of quantum physics seem natural. What I'm going to do next time is introduce a new concept, "the operator", through a specific example, "the particle number operator". This will make it obvious that one must be in a Hilbert Space, at the expense of making it somewhat mysterious that most "real" systems (that is, systems that we have) do not have a definite number of particles but rather a distribution over them. In fact, systems with definite numbers of particles are called Fock states and have weird properties. For instance, thanks to no-cloning theorems, only one person can observe a system in a Fock state, which means that if you encode a signal into a system in a Fock state (in binary polarization angles) an eavesdropper can't quietly listen in. Two people cannot look at the same thing if the image of that thing is in Fock space. The idea of splitting light comes from intuition bred on non-Fock light. I hope I can give examples that make that concept clear, if only to myself.

Never Ending Dandy, Baby

Well, the best TV show of the past two years, Space Dandy, has been over for some time now. Even though we all wait with bated breath for the announcement of a movie, a new season or at least a new batch of merch, perhaps we should take this time to reflect. I'd like to do a retrospective of Shinichiro Watanabe's work in general, starting with Macross Plus in 1994 (it seems like only yesterday, but it was twenty years ago) and ending with Space Dandy in 2014 (well, I haven't seen Terror In Resonance yet). It would be a bit of fun, dipping into more artistic endeavors online. But I doubt many would be interested.

Before I go, this one is just for me:

Tuesday, September 30, 2014

Is It The Same Temperature In All Possible Worlds? Part 4: Bra-Ket Notation


So, in the last installment of IITSTIAPW, I attempted to explain how the fact that quantum particles fluctuate along many paths to a target produces a picture that looks like the physical lines of force of an electric field. This might be called the "position field" of a particle. This field gives all the information there is about the position of a single particle. In this post, I will introduce some notation to help tame these thought experiments. I was hoping to get all the way to entanglement today, but not quite.

Actually, a gravitational potential is unrealistic, partly because most quantum particles are too light to really notice gravity. You could do this experiment with electric potentials though.

In our first thought experiment, we have a particle on a special angle iron. This angle iron is bent and placed on a support. Three sensors are placed on the angle iron, two in the crook of the metal and one on the edge of the right side. The left sensor is green, the right crook sensor is red and the edge sensor is orange. The red sensor turns on only if the orange sensor is activated first. If a classical particle were placed on either side, it would fall down the bend of the angle iron until it hit the red or green sensor (there is an extremely tiny stable region, but that can be safely ignored), though only the green would ever be activated. A quantum particle, however, has more choices, as illustrated below.

 Possible paths if we measure green (top) or red (bottom)

The particle is said to be \( \left| G \right\rangle \) if the green sensor is activated and to be \( \left| R \right\rangle \) if the red sensor is activated. The name of this notation is "ket", but it is read, essentially, "state G". So the above sentence should be read out loud "The particle is said to be in state G if the green sensor is activated and to be in state R if the red sensor is activated.". These are all the possible states of the system. As it turns out, the seemingly uniquely right method to represent quantum states is as complex vectors, linear algebra type vectors. While there is a result that explains why they are complex vectors given that they are vectors (long story short, it is because complex numbers can have the same length and still rotate), there is no result explaining why linear algebra should be used in the first place. The square of the length of these complex vectors gives the probability of a measurement outcome. The structure was guessed by a group of physicists and nobody's ever managed to make it go away. Amusingly, most of the physicists didn't actually know linear algebra beforehand and painstakingly reproved many old theorems in new, less expressive notation.

 More paths are like more lines of force, they represent a stronger field, a higher probability

By themselves, after we look at the results, we might say that each situation either happened or didn't. So we can without loss of generality take \( \left| G \right\rangle \) and \( \left| R \right\rangle \) to have a length of one. But before the sensors are turned on, things are more complicated. The above picture implies that we must be in some state \( \left| S \right\rangle = \alpha \left| G \right\rangle + \beta \left| R \right\rangle \) where \( | \beta |^2 \ll | \alpha |^2 \) and \( | \beta |^2 + | \alpha |^2 = 1\). \( \left| S \right\rangle \) is said to be a superposition of states. Notice that even though \( \left| S \right\rangle \) is written as the sum of two states, if you look above you'll see a single perfectly good field. Superpositions of states are still pure states; there is still only one particle fluctuating around. In a later post I'll cover multiple particles, in which case entanglement starts becoming an interesting issue.
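To make the arithmetic concrete, here is a minimal numerical sketch in plain Python. The value 0.1 for \( |\beta|^2 \) is an arbitrary choice of mine, standing in for "red is rarely measured":

```python
import math

# |G> and |R> as an orthonormal basis of C^2.
G = [1 + 0j, 0 + 0j]
R = [0 + 0j, 1 + 0j]

# Amplitudes with |beta|^2 << |alpha|^2 and |alpha|^2 + |beta|^2 = 1;
# the 0.1 is an arbitrary stand-in for "red is rarely seen".
beta = math.sqrt(0.1)
alpha = math.sqrt(1 - 0.1)

# The superposition |S> = alpha|G> + beta|R> is itself a unit vector.
S = [alpha * g + beta * r for g, r in zip(G, R)]
norm_sq = sum(abs(c) ** 2 for c in S)
print(norm_sq)   # 1.0 up to rounding
```

The point of the check is just that normalizing the basis states and demanding \( | \alpha |^2 + | \beta |^2 = 1 \) automatically makes the superposition a unit vector too.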

We measured R!

Let's say that we place a quantum particle on the crook of the angle iron and wait a while. Later, once things have settled down, we turn on the sensors and the red sensor clicks! That this happens at all is sufficient evidence that the universe is not classical. But quantum physics fixes more than that. We denote the situation that we measured the system in a particular state, say \( \left| R \right\rangle \), by \( \left\langle R \right| \). This new object is called a "bra" and is basically read "measuring state R". When you have a bra and a ket together, you have a bra-ket. A bra-ket is the probability amplitude for measuring state bra from state ket; the square of its length is the probability. For example, if you are definitely on the left side then you aren't on the right side and vice versa, so that \( \left\langle R | G \right\rangle = 0 \) and \( \left\langle G | R \right\rangle = 0 \). Let me write how that sentence should be read aloud. "If you are definitely on the left side then you aren't on the right side and vice versa, so that measuring state R from state G has probability zero and measuring state G from state R has probability zero.". Obviously, if you're at a sensor, then it clicks right away, so that \( \left\langle R | R \right\rangle = 1 \) and \( \left\langle G | G \right\rangle = 1 \). These states are "orthonormal" vectors by design. But what if I put the particle in the middle and waited a long time before turning on the sensors, like I did in the above experiment? Then the particle is in \(\left| S \right\rangle \), so that $$\left\langle R | S \right\rangle = \left\langle R \right| ( \alpha \left| G \right\rangle + \beta \left| R \right\rangle )$$ $$\left\langle R | S \right\rangle = \alpha \left\langle R | G \right\rangle + \beta \left\langle R | R \right\rangle$$ $$\left\langle R | S \right\rangle = \alpha \cdot 0 + \beta \cdot 1 = \beta $$ This is the most basic linear algebra structure of quantum mechanics.
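All of this bookkeeping is just complex dot products, which a few lines of Python can check. The numerical values of \( \alpha \) and \( \beta \) below are arbitrary examples, not anything from the experiment:

```python
import math

def braket(bra, ket):
    """<bra|ket>: conjugate the bra's components, then dot with the ket's."""
    return sum(b.conjugate() * k for b, k in zip(bra, ket))

G = [1 + 0j, 0 + 0j]                           # "at the green sensor"
R = [0 + 0j, 1 + 0j]                           # "at the red sensor"
alpha, beta = math.sqrt(0.9), math.sqrt(0.1)   # example amplitudes, |a|^2+|b|^2 = 1
S = [alpha * g + beta * r for g, r in zip(G, R)]

print(braket(R, G))   # 0: orthogonal states never mistake one another
print(braket(R, R))   # 1: a sensor always clicks on its own state
print(braket(R, S))   # beta: the amplitude from the text; |beta|^2 is the probability
```

The conjugation in `braket` is the whole content of turning a ket into a bra; everything else is the ordinary dot product.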

Very Formal

If I were being formal, I would go over the precise axioms of the so-called bra-ket notation. In addition, so far I have only worked with a single measurable, position. Instead, I'll just make promises. In the next few posts of this series, I will first go over quantum mechanics for multiple particles, then give the formal bra-ket axioms (which are, in essence, an axiom scheme for quantum mechanics) and finally extend the ideas in these posts to fields.

Saturday, September 27, 2014

Turbulence Theory Through Feynman Diagrams

Surfing Through Space

Are you a fluid mechanics type wanting to break into high energy physics? A high energy physics type looking to work in Fluid Mechanics? A dandy guy in space? Well, this one is aimed at the first three. Huge audience, really pandering.

R Kraichnan

Robert Kraichnan is a famous turbulence theorist, known for developing the Direct Interaction Approximation, which leads to the Kraichnan Spectrum (an enormous empirical success), and for his work in two dimensional turbulence, which has been so important in modeling global weather systems (technically, 2D turbulence had been worked on by Onsager; characteristically, nobody understood him until they were required to so as to pass their classes...). It's not as widely known that he got his start in high energy physics! As an undergrad at MIT, Kraichnan was required to turn in a sort of thesis project. An overachiever, Kraichnan founded the modern theory of the "graviton", a hypothetical particle whose existence would imply a geometry equivalent to General Relativity.

thanks a lot internet
 
Anyway, Kraichnan never stopped being a high energy physicist in many ways. He actually used Feynman diagrams and similar techniques to obtain many of his results, then reworked them with techniques his contemporaries were more comfortable with. As a result, many of his papers are very cryptic (deriving this stuff twice must have been exhausting). His talk "Interpretation of a Dynamical Approximation for Isotropic Turbulence" actually reveals a lot more of his method than his later, more obscure papers. I learned about this paper from this talk by Uriel Frisch. Frisch gives a good description of what the paper is about, as long as you already know what it is about. In particular, the three lines on page 6 represent the subscripts in Frisch's equation. Once you realize that, the others are a lot easier to read. One nice thing about this approach is that cross-correlations, so important in turbulence theory even as far back as G. I. Taylor (one could fairly say that there was no turbulence theory until Taylor started looking at cross-correlations...), come right out of the theory.

Anyway this is a really nice paper, but it is too late in the evening for me to attempt a real explanation. Just read it!

Wednesday, September 24, 2014

Is It The Same Temperature In Every Possible World? Part 3: The Pure Wave Function

EDIT: I uploaded a version that didn't include a major point I wanted to make.

A View of Mount Fuji
 
In the last post, we talked about how in quantum physics there are new ways for a particle to move. The new way I called "fluctuating": a particle can fluctuate to you in addition to moving in a more traditional way. I'll call the old method "rolling". This opens up new opportunities for a particle, since a particle that can fluctuate and roll can get into places that a particle that could only roll around couldn't. Take moving carbon around. I can roll you a lump of coal if you aren't too far uphill, but I can burn it and let it waft (fluctuate through the air) over to you no matter how high you are. Getting the coal to you on a large mountain is analogous to getting two protons to fuse. Since a proton can fluctuate, it can get close to another proton even if it doesn't have enough energy to overcome the electric repulsion by direct movement.


Imagine a ball sitting on the bottom of an angle iron. At either end of the iron we have detectors, so that we know that the ball is at one end of the angle iron at \( t_1 \) and at the other end at \( t_2 \). Classically, the ball can really only do one thing: roll along the base of the joint. This means that the classical view of the particle's position is something like this:

Rolling

But quantumly, the ball can go on paths other than the base. The ball can fluctuate up the side of the angle iron.


Fluctuating

All of these paths are consistent with the experiment/our knowledge of the world/the boundary conditions. Just like the cloud of coal particles that waft toward you, it isn't the case that this quantum particle goes on a particular one of the paths. From the perspective of the starting time, the future path is undetermined, from the perspective of the ending time, the past path is undetermined. But indeterminism isn't a theory, and we're now ready to see what it is that quantum theory does determine. Let's look at the above situation from the top.

The Top

Obviously, these aren't all of the paths that the particle fluctuates on. The drawing is topological. I draw more lines to denote heavily traveled areas, less to indicate regions that the particle rarely sees. Does that sound familiar?

Electric Field

This new mode of movement makes the position of a particle not a single point, but rather a field of "existence", or to use standard terminology, "probability". This field has many of the properties you'd expect a field to have, such as continuity, the existence of a current, etc. This field, for all intents and purposes, is the particle. The structure of quantum mechanics allows us to totally determine the field. It is this fact that allows us to determine precisely the predictions of quantum theory. Every particle is a geometry; the structure of the particle determines the shape of that field. For instance, the electric field is the geometry created by the existence of the photon.
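As a toy check of the "current" claim: for a plane wave, the standard probability current \( j = \operatorname{Im}(\psi^* \partial_x \psi) \) (written in units where \( \hbar = m = 1 \), an assumption of this sketch) comes out constant, a steady flow:

```python
import cmath

k, dx = 2.0, 1e-4      # wavenumber and step size; hbar = m = 1 by assumption

def psi(x):
    """A plane wave moving to the right with momentum k."""
    return cmath.exp(1j * k * x)

x0 = 0.37                                          # arbitrary sample point
dpsi = (psi(x0 + dx) - psi(x0 - dx)) / (2 * dx)    # central difference
rho = abs(psi(x0)) ** 2                            # probability density
current = (psi(x0).conjugate() * dpsi).imag        # j = Im(psi* dpsi/dx)

print(rho)      # ~1.0: the wave is spread uniformly
print(current)  # ~2.0: a steady current at speed k (= p/m in these units)
```

The density is flat and the current is everywhere \( k \), which is exactly what a uniform, steadily flowing field should look like.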

The Experiment Below

Experiments can be designed to show that this geometry can't be given by a classical particle. Let's say that the quantum particle is slow, much slower than light. A pair of sensors is placed on the angle iron mentioned above. The first sensor is placed in an area that is not accessible by classical movement, but the particle can fluctuate up there. When activated, it turns on the second sensor, placed in an area that is classically accessible. If the second sensor is ever activated at all, then you know that the particle moves in a quantum way. Nice and binary.

Now be careful about thinking about this. It is not the case that the particle randomly chooses one of these paths and follows it. It fluctuates out over all of these paths. As it turns out, there are experimental consequences of this. I'll go over this in a couple posts.

Finished With Step 0

Now that I've given the flavor of quantum mechanics, a couple of steps remain before we get into the interpretation matters. The above gives the quantum idea of a single particle, but in fact there are multiple particles in the universe. This gives rise to the density matrix and entanglement. More difficult: the above gives the field that is a single particle, but we know that in this world there are also fields. If particles quantize to be fields, what do fields quantize to? Finally, I need to show that the above arguments lead to the Schrodinger equation (there's a standard argument due to Feynman). But that can wait for later. See You Next Time!

Tuesday, September 23, 2014

Relaxing Music For Busy Times

Sometimes you just don't want to explain how movement by fluctuation leads to a field view of position. More rarely, you want to, but can't. This is one of those rare times. I keep calm by listening to music, all sorts of different music.

Back in the 40's and 50's, there was a huge Hawaiian music boom. Bing Crosby started bringing over steel guitarists for his records (which is how they ended up in country music despite not at all appearing in the underlying folk tradition). In the early days, the tradition of the steel guitar was strongly influenced by Jazz and Blues in addition to Hawaiian folk music. Of course, the recorded versions were also influenced by then-contemporary pop music (again, think Bing Crosby). One of these numbers sounded to me like an uptempo version of "You Gotta Walk That Lonesome Valley". The greatest master of the steel guitar in its infancy was Sol Hoopii, who came to the continent as a stowaway as a teenager. This video was made for distribution in churches; to keep parishioners' attention, they would occasionally play what were essentially little educational shorts. Sometimes they managed to record great art in the process!



That's all for today. Have Fun!

Friday, September 19, 2014

Is It The Same Temperature In Every Possible World? Part 2 - The Flavor Of Quantum Mechanics

Not This Stuff Yet

It has been decided that instead of making a strictly logical foundations post, I'm going to start by writing a post to give someone unused to quantum mechanics an idea of the general flavor. I'm going to do my best to make this maximally bland when it comes to theory, giving a couple of asides for those who already have this way of thinking in their head.

Animated Double Slit

The usual approach to quantum mechanics involves the double slit experiment. This is how Feynman approaches Quantum Mechanics in his lectures, and this is clearly how Bohm thought of the whole problem. The basic issue is this. If something is a wave, like water or possibly light, we know what will be seen on the other side. We know because Huygens taught us. If you want to know, look above. If something is a particle, then the tiny slits will only allow the particles through in one direction:


See the difference? Well, an issue comes up in interpreting the experimental results. The same things that look like particles in all - literally all - direct experiments nevertheless move like waves. This used to be called particle-wave duality, complementarity and other unclear names. And there is really no issue in classical quantum mechanics that can't be stated in terms of this experiment, so that's nice.
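For the wave side, Huygens' picture reduces to adding the amplitudes from the two slits and squaring. A tiny sketch (the slit separation and wavelength are arbitrary units I made up):

```python
import math, cmath

# Far-field two-slit pattern: add the (equal) amplitudes from the two slits, square.
wavelength, d = 1.0, 5.0   # made-up units; d is the slit separation

def intensity(theta):
    phase = 2 * math.pi * d * math.sin(theta) / wavelength   # path difference in radians
    amp = 1 + cmath.exp(1j * phase)
    return abs(amp) ** 2

print(intensity(0.0))                                # 4.0: central bright fringe
print(intensity(math.asin(wavelength / (2 * d))))    # ~0: first dark fringe
```

The dark fringes come entirely from the two amplitudes cancelling; a classical particle picture has nothing to cancel with, which is the whole puzzle.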

But Who Cares?

Despite the depth of this approach, I don't think it really gets at what people want to know about quantum mechanics. In addition, it doesn't obviously generalize to relativistic quantum mechanics, which you need if there are magnetic fields. So, I'm going to try a different method. If this fails to give you any intuitive view of quantum, feel free to blame me.

You And Me and Coal


Imagine that you are the blue person and I am the black person. I want to give you some carbon, and I'm holding some nice anthracite coal. Now, there are two ways that I can get this carbon to you. One is that I can move it to you on some path, perhaps by throwing it to you.

For Certain Values of "to you"

But that isn't always possible. For instance, if there is a fine grating between us, the above picture is impossible. What I can do then is burn the coal, and fan the smoke to you. Then you can reconstitute the smoke into carbon.

Less Dangerous, Less Efficient

In other words, there are two possible methods of getting the carbon to you. One is by direct movement, the other by fluctuation. In classical mechanics, fluctuation is treated as a limit of direct movement. In quantum mechanics, the two are given equal weight. It really is the case that something can get to you by traveling straightforwardly or by fluctuating over. The Schrodinger formulation emphasizes the wave-like movement, the Feynman formulation the diffusive movement. The rest of the difficulties of quantum mechanics are mere physics.

Let's go through a more directly quantum version of this idea. Let's look at some particle in a box! This box is divided into three parts, outside the box there is nothing. In the first part and last part the potential is zero. In the middle there is some potential to stop particles from moving:

The Potential

If there is any direct movement path for a quantum particle to get somewhere, then you call that region classically allowed. In those places, you expect to see waves, just like classical mechanics waves. If you have to fluctuate to get there, call it classically forbidden. In those places the distribution should fall off exponentially, just like classical mechanics diffusion. This is called the WKB approximation. This idea gives the following picture of where the particle could be:

The Distribution of Where the Particle Could Be

In the classically allowed regions, the particle moves around on easy classical paths. I represent that rapid movement with a big wave. In the classically forbidden region, the particle must fluctuate into weird areas. I represent that region by an exponentially decaying curve. But this is just formulation. I must now show how this quantum view of the world differs from the classical vision. I'll do this by proposing a thought experiment. At the beginning of time, the only particle in the universe is entirely at one location. Perhaps you, a god, have some sort of sensor that lets you tell this.

In the beginning, there was the Dirac Delta

We turn off the sensor, allowing the particle to move again. At first, it mostly moves in the left classically permitted zone, moving in the quick, easy way. But soon the fluctuation type movement starts becoming visible. At this point, it starts to look like the WKB approximation above.

This again.

Over time, the particle's movement and fluctuation start to balance out. At this point, it's spread evenly in the two classically permitted areas and, to a lesser extent, in the classically forbidden area. I'd make a picture, but I left all my scripts on another computer. You turn on a sensor in the right classically permitted area. You can get the expected waiting time from the probability, which is the area under the curve over the sensor. Though this is a classically permitted area, there is no classical path that brought you here. The history of this experiment is a quantum history. This might sound silly, and I tried to make it that way. But this has real experimental and theoretical consequences.

G Gamow

This situation is analogous to a nucleus undergoing fusion. The left classically permitted region is like free space, the right classically permitted region is like a fused nucleus. The bump in the middle represents the region where the incoming proton is too weak to get past the nucleus's electric repulsion. With only classical movement, fusion is not possible. But quantum physics permits a new type of movement: even if one can't move past the bump, one can fluctuate past.
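A sketch of that idea in code, using the crude square-barrier WKB estimate \( T \approx e^{-2 \kappa L} \). Real Gamow factors integrate \( \kappa \) over the sloping Coulomb barrier, and the units here are made up, so this is only the shape of the argument:

```python
import math

def tunneling_probability(E, V0, L, m=1.0, hbar=1.0):
    """Square-barrier WKB estimate T ~ exp(-2*kappa*L), barrier height V0, width L."""
    if E >= V0:
        return 1.0   # classically allowed: nothing to tunnel through
    kappa = math.sqrt(2 * m * (V0 - E)) / hbar
    return math.exp(-2 * kappa * L)

# Below the barrier the particle still gets through, just rarely,
# and doubling the width squares the (small) probability.
print(tunneling_probability(1.0, 2.0, 1.0))   # ~0.059
print(tunneling_probability(1.0, 2.0, 2.0))   # ~0.0035
```

Classically both numbers would be exactly zero; the exponential sensitivity to barrier width is what makes stars burn slowly instead of not at all.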

In the next post in this series, I'll show how this idea of movement by fluctuation leads to a path integral and maybe do some interpretation. In the mean time, I have a couple other ideas to try out. See you next time!

Thursday, September 18, 2014

Hypermodern Classical Mechanics

Hypermodern Classical in a Non-Scientific Context

Really great paper from Norden Huang of Hilbert-Huang Transform (HHT) fame. The opening sentence is great:

"Historically, there are two views of nonlinear mechanics: the Fourier and the Poincare. ...

This is thus the Fourier view: The system has a fundamental oscillation (the first-order solution) and bounded harmonics (all the higher-order solutions). Although this approach might be mathematically sound, and seems to be logical, the limitations of this view become increasingly clear on closer examination: First, the perturbation approach is limited to only small nonlinearity; when the nonlinear terms become finite, the perturbation approach then fails; Second, and more importantly, the solution obtained makes little physical sense. It is easily seen that the properties of a nonlinear equation should be different from a collection of linear ones; therefore, the two sets of solutions from the original equation and the perturbed ones should have different physical and mathematical properties.
...
Poincare's system provides a discrete description. It defines the mapping of the phase space onto itself. In many cases, Poincare mapping enables a graphical presentation of the dynamics. Typically, the full nonlinear solution is computed numerically. Then the dynamics are viewed through the intersections of the trajectory and a plane cutting through the path in the phase space. The intersections of the path and the plane are examined to reveal the dynamical characteristics. This approach also has limitations, for it relies heavily on the periodicity of the processes. The motion between the Poincare cuts could also be just as important for the dynamics. Both the Fourier and Poincare views have existed for a long time.

Only recently has an alternative view for mechanics, the Hilbert view, been proposed."

Okay, so it isn't surprising that Huang would see the world through an HHT filter, but still it's great to see what I thought of as a cool trick become the foundation of a worldview! If you're familiar with the HHT, then you can skip to page 22 (that is, 442 if you're going by page labels) where Huang et al. start going through the motions on real nonlinear systems. It's amusing that in spite of the fact that this is a paper that promises to be about fairly hardcore math, someone felt the need to explain what factoring x meant. The punchline is in the last paragraph of this section: "The Hilbert representation gives a true physical interpretation of the dynamics by indicating the instantaneous value of the frequency, and thus the Hilbert view is much more physical.".

The supposed increased physical insight begins to be used on page 30 (that is, 440), where Stokes waves are examined with HHT instead of Fourier transforms. The wave theory (perhaps, "point of view" would be better) of water waves is found to be insufficient: "In numerous controlled experiment ... the main frequency is seen to shift to a lower frequency ...". Obviously, the wave theory says the frequencies should stay still. They explain: "This decrease, however, occurs only at certain very local regions. Such local change makes the data nonstationary, the condition Fourier analysis is ill-equipped to deal with. ... [T]he Fourier spectral peak represents only the global mean frequency. It is not sensitive to the local change of frequency as in the local and discrete downshift phenomena.". The weakness of Fourier methods is connected to their linearity and the consequent uncertainty principle: "the only way the Fourier method can represent a local frequency change is through harmonics. But such a representation is no longer local.".
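To see the "instantaneous frequency" idea without any of the HHT machinery (there is no empirical mode decomposition here, and the toy signal is my own invention), one can build the analytic signal by hand with a DFT and read the frequency off the phase:

```python
import cmath, math

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def analytic_signal(x):
    """Zero the negative-frequency half of the spectrum, double the positive half."""
    N = len(x)
    X = dft(x)
    gain = [0.0] * N
    gain[0] = 1.0
    if N % 2 == 0:
        gain[N // 2] = 1.0
    for k in range(1, (N + 1) // 2):
        gain[k] = 2.0
    return idft([Xk * g for Xk, g in zip(X, gain)])

fs, f0, N = 100.0, 5.0, 200          # sample rate, tone frequency, sample count
x = [math.sin(2 * math.pi * f0 * n / fs) for n in range(N)]
z = analytic_signal(x)

# Instantaneous frequency: derivative of the (wrapped) phase, in cycles per second.
phases = [cmath.phase(v) for v in z]
inst_f = []
for a, b in zip(phases, phases[1:]):
    d = b - a
    while d <= -math.pi:
        d += 2 * math.pi
    while d > math.pi:
        d -= 2 * math.pi
    inst_f.append(d * fs / (2 * math.pi))

mid = inst_f[N // 4: 3 * N // 4]     # ignore the ends
print(sum(mid) / len(mid))           # ~5.0: the tone's frequency, read off locally
```

For a pure tone this just recovers the Fourier answer, but the point is that the estimate is local: for a nonstationary signal, the per-sample `inst_f` can drift where a single Fourier peak can only report the global mean.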

I think that Huang here makes a good argument that HHT is a good first tool, because it doesn't require a lot of assumptions and computing power is cheap. Huang definitely shows that there are situations where interpreting the Fourier spectra physically is difficult, and perhaps in these situations they are purely formal. The local, non-linear nature of the HHT is an advantage when systems aren't linear. So, HHT has no advantages for Quantum Mechanics, which is linear. However, this article is much too empirical (a rare problem indeed!). Advantages and disadvantages are shown by example, but how can anybody know if these are just special cases? In addition, the Poincare and Fourier approaches aren't antithetical. In order to demonstrate greatness, the interconnections between this Hilbert view and the other two need to be explored. Some fresh uncertainty principle would be nice! Also, the Poincare and Fourier views give a misleading picture because of their global nature, but they are motivated by deep physical theory - Fourier analysis by the Newton's-laws formulation of classical mechanics (everything is second order DEs) and Poincare by the Lagrangian/Hamiltonian view of classical mechanics (everything is motion in phase space). The physical motivation of the non-linear HHT should be delved into more deeply. It's given some room in the first part I said one could skip, but I think it should have been more prominent (perhaps I should read Bendat & Piersol ...). The local nature of HHT would make more sense if this were done, I think.

Anyway, this is a nice little paper full of neat applications of a cool modern scientific tool. A good distraction from both my real jobs and my quantum mechanics blogging (which I promise will happen). Enjoy!

Monday, September 15, 2014

A Quick One While I Am Away

A dandy guy in complex Hilbert space

I've been looking around for a nice authoritative foundation of quantum mechanics that isn't too interpretative. The classics are Dirac, von Neumann (haven't read...) and Feynman. I've got a couple of books, including Messiah (which I haven't read...), a book called "A First Course In Quantum Mechanics", and Roger Penrose's amazing but super thick Road To Reality (really good on complex analysis! I don't know so much about the physics). And there's a lot of really nice things like this.

This guy again. People are going to start getting annoyed.

Feynman is very dedicated to an ... interpretation of physics. Maybe style would be a better word than interpretation. Not just quantum mechanics, but all of physics. Feynman's ontology is that everything is particles; even fields are particles. This makes some things nice and some cruel and wicked. In addition, field theory is really pretty, especially if you take out that distracting physics. This very local view of reality was called by philosopher David Lewis the "Humean Mosaic", but Feynman, by denying the existence of fields, goes far beyond this. Lewis's strong localism doesn't deny that fields exist, because fields have totally local interpretations, as demonstrated by Weyl somewhere. (Pro-tip: if you ever have a physics theorem you know is true but can't prove at the moment, say Weyl did it and you'll probably be right). The question is whether this interpretation gives us anything extra. Working with fields can make some things a lot easier. For instance, a static magnetic field outside a superconductor in the Meissner state is obviously a variation on Laplace's Equation, because the field has to be zero on the superconductor. This is obvious if you think in terms of fields, but it is more mysterious (though still true) if you think in terms of the microscopic interactions on the surface of the superconductor.
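The Laplace point can be poked at numerically. A minimal Jacobi-relaxation sketch (the grid size, boundary values and iteration count are all arbitrary choices of mine): relax each interior point to the average of its neighbours, and the result obeys the maximum principle - the field sags smoothly from the applied value toward the zero pinned at the superconductor.

```python
# Solve Laplace's equation on a small grid by Jacobi relaxation:
# each interior point relaxes to the average of its four neighbours.
n = 10
grid = [[0.0] * n for _ in range(n)]
for j in range(n):
    grid[0][j] = 1.0   # "applied field" on the top edge
    # bottom edge stays 0: the field pinned to zero on the superconductor

for _ in range(500):   # iterate until (approximately) converged
    new = [row[:] for row in grid]
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            new[i][j] = 0.25 * (grid[i - 1][j] + grid[i + 1][j]
                                + grid[i][j - 1] + grid[i][j + 1])
    grid = new

interior = [grid[i][j] for i in range(1, n - 1) for j in range(1, n - 1)]
print(min(interior), max(interior))   # everything lies strictly between 0 and 1
```

This is the "think in terms of fields" shortcut in action: one boundary condition and an averaging rule determine everything, with no microscopic surface physics in sight.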

I'll let you all go with some music:



See you soon!

Saturday, September 13, 2014

Pseudo-Review: Theoretical Solid State Physics, Volume 1 - Perfect Lattices in Equilibrium


The reason this is just a pseudo-review is that while the subject is very important and very interesting, this textbook is very much a typical physics textbook, so there's not much to say about it. It's clear and covers the field as it existed in 1973 very well. There is lots of theory and plenty of empirics, so the student can get a good view of how things are done as well as why they are done that way. It's a bit dated; there's nothing on now-important topics like quantum heterostructures, but a student that can grasp this book will grasp those as well. The book presumes a strong knowledge of physics - especially quantum and statistical mechanics. This is about all I can think of in terms of actual reviewing, so now I will turn to some general observations.

A Minimal Riemann Surface, But Why?

As was mentioned in a previous post, solid state physics is an inherently quantum field. In fact, much of the mathematics of modern solid state physics comes from quantum field theory. In solid state physics, one can make the models of quantum field theory amusingly literal, resulting in such wacky creatures as spin ice. Reading about such things reminds me of the old anime Outlaw Star. In this sci-fi cartoon, there was an episode that was a riff on the old Hal Clement novel Mission of Gravity, in which a magnetic monopole plays a role in the main character's escape from a prison planet. I hope that reading and thinking about this helps me write the "Is Every Possible World The Same Temperature?" series of posts.

But such distractions aside, the format of this book had a lot of little things that bothered me. This is very much a physicist's book, where important theorems are used, not proven. One example is what is essentially the founding theorem of the perfect-lattice part of solid state physics. There is some wavefunction \( \Psi(\vec{x},t) \) that describes the overall behavior of a crystal. We start by assuming that the system is in equilibrium, so that \( \Psi(\vec{x},t) = \Psi(\vec{x}) \). We do this because: 1. There are fewer variables to deal with and 2. This situation is quite applicable to the real world.

We go on to assume the crystal is perfectly regular and large enough that border effects can be ignored. That is, there are lattice vectors \(\vec{v_1}, \vec{v_2}, \vec{v_3}\). We can examine the set of transforms \(T_{\vec{v_i}}\) such that: $$T_{\vec{v_i}}(\Psi(\vec{x})) = \Psi(\vec{x}+\vec{v_i}).$$If you continue along the crystal far enough, you'll land in a location that is indistinguishable from where you started. For instance, if you start on an atom, you'll end up on another atom with the same surroundings. This assumption comes from crystallography, and is applicable to some systems but not others. If it holds, there exists an N such that $$T_{\vec{v_i}}^N(\Psi(\vec{x})) = \Psi(\vec{x}+N \vec{v_i}) = \Psi(\vec{x}).$$This means that the transform \(T_{\vec{v_i}}\) is an N-th root of the identity (its eigenvalues are roots of unity), and we can use Lagrange's Theorem to bring in the powerful tools of group theory. Lagrange's Theorem and Fourier Analysis together give the theory of Bloch Waves.
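Here is the root-of-unity structure in a toy model of my own (not from the book): the translation operator on a ring of N sites satisfies \( T^N = 1 \), and the Bloch-type states \( \psi_k(n) = e^{2\pi i k n/N} \) are its eigenvectors, with eigenvalues that are N-th roots of unity.

```python
import cmath

N = 6  # sites on a ring; translating N times returns you to the start

def translate(psi):
    """Apply T: shift every amplitude one site along the ring."""
    return psi[-1:] + psi[:-1]

def bloch(k):
    """Bloch-type state psi_k(n) = exp(2*pi*i*k*n/N) on the ring."""
    return [cmath.exp(2j * cmath.pi * k * n / N) for n in range(N)]

psi = bloch(2)

# Applying T N times is the identity: T^N = 1.
out = psi
for _ in range(N):
    out = translate(out)

# psi is an eigenvector of T; read off the eigenvalue from one component.
eigenvalue = translate(psi)[0] / psi[0]
```

Fourier Analysis is then just the expansion of an arbitrary state in the eigenbasis these phases label.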

This is all simple enough, even obvious once you've seen group theory for a bit. But I still would have preferred the book to present it more like this, with Lagrange's Theorem and everything showing its face. I would like to find a book that is like Khinchin's Mathematical Foundations of Statistical Mechanics but for solid state stuff. In essence, what I really wanted was a book giving insight into math from a physics perspective, rather than what this book is: a book giving insight into physics using math. Oh well.

Writing this post has made me decide that the next post in the "Is Every Possible World The Same Temperature?" series must be on what quantum mechanics even is. Then when I want to make a statement about quantum physics, I can do it by dropping a pointer. I hope to build a maximally uninterpreted bare theory that all the future posts will try to get behind, hopefully all succeeding and all failing. See you soon!

Friday, September 12, 2014

John Fahey

I'm working on those quantum posts, which need to be pretty detailed. This is in addition to my real life. So today I'll just share one of America's great unknown artists:


Within his looping, repetitive, raga-like song style hides sophisticated musicianship:


Hopefully you find something you love:

Thursday, September 11, 2014

Is It The Same Temperature In Every Possible World? Part 1


It's strange, but people didn't argue about the interpretation of classical mechanics. The usual explanation is that classical mechanics is intuitive, which is obviously untrue - if it were, why would so many students be so bad at it? Even setting the classroom aside, psychologists have convincingly argued that our intuitive reactions to how objects move do not follow Newton's Laws. There are at least two facts about quantum mechanics that lead people to feel it needs interpretation:
  1. Not everything can be sharply defined at the same time. (what used to be called 'complementarity')
  2. Things are not, in general, continuous (what used to be called 'quanta')
Most of the unintuitive physics results come from the second fact. The idea that in order to get there, you must pass through here is a deeply held belief. One big reason we need mathematics is to force people - such as economists - to say it out loud.

However, the first fact clearly raises two questions. One is "Where does apparent sharpness come from?" and the other is "How does nature know what to make sharp?". The second question is very tough. There are a variety of answers. The original answer, enshrined in ancient texts, is that how you set up your experiment somehow decides what nature will make sharp. Explaining this led von Neumann to support the idea that the universe is guided by each human consciousness. Many leap all too quickly to reject this idea, but I will argue in a bit that it could conceivably be a theorem of quantum mechanics!

Harald and Niels Bohr

The most powerful man in early quantum mechanics was the great physicist Niels Bohr. Bohr's complete works run to 13 volumes, his scientific work covering 9 fat books. Throughout his graduate studies, Bohr worked on the problem of modeling solids (specifically, metals) with dynamical models. Einstein and Jean Perrin had convincingly shown that the atomic hypothesis was not just a convenient thought experiment, but a genuine description of the world. A couple of years later, Einstein published the first modern (that is, quantum) solid state physics model, a bit of a proof of concept showing that quantum mechanics could be used to do new physics. Bohr's thesis included a proof that classical physics could not explain ferromagnetism. The intuition is that in classical physics, jiggling atoms are as likely to go one way as another, and since their little magnetic poles can point at any angle, there will always be enough room for them all to point in different directions. This demonstrated that only quantum mechanics could explain solid state physics, which remains a very interesting and deeply quantum mechanical field!
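Bohr's thesis result is now usually called the Bohr-van Leeuwen theorem. A sketch of the standard modern argument (my gloss, not Bohr's original): the classical partition function in a magnetic field is $$Z = \int d^{3N}x \, d^{3N}p \; e^{-\beta H(\vec{x}, \, \vec{p} - e\vec{A}(\vec{x}))},$$ and the substitution \( \vec{p}\,' = \vec{p} - e\vec{A}(\vec{x}) \) (legitimate because each momentum integral runs over all values) removes the vector potential entirely. Z is therefore independent of the field, and the equilibrium magnetization vanishes: $$M = -\frac{\partial F}{\partial B} = \frac{1}{\beta}\frac{\partial \ln Z}{\partial B} = 0.$$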

But I digress. Bohr, when confronted with the question "How does nature know what to make sharp?", would say that there is nothing behind the experiment. All we do is observe; beyond that we can neither see nor speak. "Whereof one cannot speak, thereof one must be silent." - Wittgenstein. Bohr was influenced in this view by positivism, but many have argued that he may have been more influenced by other philosophies, such as existentialism. Regardless of whether this vision was positivistic, pragmatic or mystical, it was accepted as gospel for some time. In this series of posts, I will explain what the possible replacements of that silence could be. I'll have more on this subject in the fullness of time, including: 1) Why did people get bored with Bohr?, 2) What are the most prominent alternatives?, 3) What do the alternatives mean mathematically and philosophically? and 4) What can we believe?

Monday, September 1, 2014

Why Can't We Be Friends: The Difference Between Bayesian And Frequentist Statistics In Two Paragraphs

Universes as plenty as blackberries

Bayesians think data is deterministic and parameters random. This means that looking at all and only the data actually observed can teach you something about the process behind it. We have probabilistic thoughts about what process could have made that one and only sample. There is a massive set of possible worlds and we don't know which we are in - but everything is deterministic within that world. That set of worlds might be real (David Lewis, David Deutsch) or it might be subjective (Jaynes, Savage); it doesn't make a difference mathematically. That's why we need priors: to help us regulate that set. Bootstrapping and other sampling distribution techniques are unnecessary noise makers - we're in the world we are in, don't make up new ones. The world we're most likely in is the one that is most like the data we really see (the likelihood principle).

A primitive binary classifier

Frequentists (aka error statisticians) think data is random and parameters are deterministic. We know what world we are in, but have to eliminate noise, error, etc. The world itself is random! We don't need priors; we need an idea of how our data would change if we ran the experiment again from the same starting point. Bootstrapping and other sampling distribution techniques are good for doing exactly this. Who cares if a hypothesis matches our data really closely? Our data might be filled with noise and imperfection (no likelihood principle).

An unusually calm disagreement in the social sciences

Bayesian and Frequentist philosophies cannot be totally reconciled, because there exist tests that are maximally efficient in one and incoherent in the other. They are like elliptic and hyperbolic geometry in this way. Use of one theory or the other for a given situation is a philosophically deep choice. One doesn't have to dismiss either approach just because they disagree in some situations. Instead one has to be honest about the flaws (and, perhaps, to a lesser extent the strengths) that one's method of choice has for the problem at hand.
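A toy sketch of the two attitudes on the same data (everything here - the numbers, the vague prior, the known variance - is invented for illustration): the frequentist bootstraps the sample mean to simulate replications, while the Bayesian holds the data fixed and updates a conjugate normal prior.

```python
import random
import statistics

random.seed(0)
data = [random.gauss(10.0, 2.0) for _ in range(50)]  # the one sample we have
n = len(data)
xbar = statistics.fmean(data)

# Frequentist: the data are random, so resample them to see how the
# estimate would vary across hypothetical replications of the experiment.
boot_means = sorted(
    statistics.fmean([random.choice(data) for _ in range(n)])
    for _ in range(2000)
)
ci = (boot_means[50], boot_means[1949])  # ~95% percentile interval

# Bayesian: the data are fixed; update a prior over the unknown mean.
# Conjugate normal model with (assumed) known variance sigma^2 = 4.
sigma2 = 4.0
prior_mean, prior_var = 0.0, 100.0       # a vague prior, chosen arbitrarily
post_var = 1.0 / (1.0 / prior_var + n / sigma2)
post_mean = post_var * (prior_mean / prior_var + n * xbar / sigma2)
```

With a vague prior the two answers nearly coincide numerically; the disagreement is over what the interval means, not (here) where it sits.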

Sunday, August 31, 2014

Search Theory And Idle Capacity

Through Mainly Macro, a blog about macroeconomics from a New Keynesian perspective, I discovered this paper by Pascal Michaillat and Emmanuel Saez. The associated slides are a good summary. The paper is a very hardcore economic theory paper, aimed at a very hardcore economic question - why does unemployment lurch around the way it does? It's a difficult question. Why did people gainfully employed in 1928 suddenly find themselves without jobs for 11 years? No matter what your opinions are on this matter, this paper will benefit you by clarifying how whatever cause you favor got into the economy as a whole.


This paper takes a search theory perspective on the subject. This improves on the repeatedly cited Barro model because it allows Saez and Michaillat to use supply and demand to analyze the situation. In addition, they are able to use this model in a compartmentalized way. For instance, in this paper, they just use aggregate demand, but the model can be relaxed with different models of demand. Different ways of finding prices are examined in this paper, from crude price fixing to Nash bargaining. As far as I can tell, one could take any pricing mechanism and get a new version of this model as a result. Risk and uncertainty are abstracted out, meaning this is not a model of - say - the 2007-8 financial crisis (which in this model is just "some demand shock"), but rather of how the crisis got into the economy. Since prices are affected by risk and the pricing mechanism in this model is arbitrary, this would be a good place to put that. For instance, there is a great deal of difference (in productivity) in hiring different people for a given position, even a "low skill" position. How is the model affected by increasing uncertainty of hiring?


I don't know much about the relevant precursor model. I will say that as a mathematician, I think that the name "disequilibrium" is a bad choice of words. The word "equilibrium" means that there is no change over time; the use comes from mechanics and has been extended by physicists and mathematicians (and economists!) in appropriate ways. When Barro, Fisher, etc. call a model "disequilibrium", they mean that the economy is not assumed to respond rapidly to changes in the way the old-fashioned Alfred Marshall model does. That means this paper is an equilibrium analysis of a disequilibrium model, which is comprehensible only if you already know what this is all about. I am against this word - call me an antidisequilibriumist. But unfortunately antidisequilibriumism has little chance against established use.


Also, economists draw charts sideways. The variable x (market tightness, illustrated on slides 9 and 10) is the independent variable and quantity of a good is the dependent variable (with a maximum capacity k), yet the axes are drawn the other way around. They can't help it: Alfred Marshall did it, people had to do it to match him, then their students had to do it to match them, etc. But surely even economists can see that the only use for such charts is the discovery of landmines? Can't economists and publishers band together to rid us of this evil?

Why are they doing that? I've seen this show and I have no idea...

Because there is already a good explanation of the bones of the model (from a New Keynesian perspective) on the blog where I found this paper, I'll end by talking about myself. Though you can't tell from this post (in which I talked about famous crises), I have a really hard time thinking about economic models in direct relation to the world I am in. When I read the excellent textbook The Spatial Economy I found myself thinking a lot more about Sun Tzu than the highway system. When I read this paper, up until section 5 (the empirical part) I found myself thinking a lot more about feudal societies (such as the Tokugawa Shogunate) than modern fluctuations. It's strange, because the modern cases have actual time series and other empirical data - which is what I do! Perhaps the distant ones are just more fun.

Thursday, August 28, 2014

Feynman On What "Science" Is


In celebration of Caltech's decision to put The Feynman Lectures On Physics online, I thought I'd reproduce something he wrote in his wonderful little lecture QED: The Strange Theory of Light and Matter. The full lecture can be found here. As far as I know, this is the only lecture on quantum field theory aimed at a popular audience. I first read this book years ago, and I still find myself using Feynman's visualization tricks often. For instance, Feynman's visualization of complex numbers as little clocks attached to points helped me make sense of complex analysis. I've found that even the simple fact that \( e^z \) can pass from positive to negative without hitting zero can be a stumbling block for people raised on real numbers.
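The little-clocks picture is easy to check numerically. A trivial sketch of my own: walk \( e^{i\theta} \) from \( \theta = 0 \) to \( \pi \) and watch the value rotate from +1 to -1 while its magnitude never leaves 1.

```python
import cmath

# Sample e^{i*theta} along the path theta: 0 -> pi. The "clock hand"
# rotates from +1 to -1, but its length stays exactly 1 the whole way,
# so the value changes sign without ever passing through zero.
steps = 100
path = [cmath.exp(1j * cmath.pi * t / steps) for t in range(steps + 1)]

start, end = path[0], path[-1]
min_magnitude = min(abs(z) for z in path)
```

On the real line a continuous function must cross zero to change sign; in the plane the clock hand simply swings around it.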

Naturally, this lecture series starts out with a sort of mission statement and declaration of purpose. Reviewers prefer to skip these parts, reviewing the book/lecture/etc. they wanted rather than the one they actually got. And I'm sure that some will be disappointed that this book doesn't go into detail about, for instance, the mathematics of path integrals. But this throat clearing is an important part of engaging with an audience, even an audience that wants to see you. In doing so, Feynman actually produces an interesting thesis on the philosophy of science:

"The Maya Indians were interested in the rising and setting of Venus as a morning 'star' and as an evening 'star' - they were very interested in when it would appear. After some years of observation, they noted that five cycles of Venus were very nearly equal to eight of their 'nominal years' of 365 days (they were aware that the true year of seasons was different and they made calculations of that also). To make calculations, the Maya invented a system of bars and dots to represent numbers (including zero), and had rules by which to calculate and predict not only the risings and settings of Venus, but other celestial phenomena, such as lunar eclipses.

In those days, only a few Maya priests could do such elaborate calculations. Now, suppose we were to ask one of them how to do just one step in the process of predicting when Venus will rise as a morning star - subtracting two numbers. And let's assume that, unlike today, we had not gone to school and did not know how to subtract. How would the priest explain to us what subtraction is?

He could either teach us the numbers represented by the bars and dots and the rules for 'subtracting' them, or he could tell us what he is really doing: 'Suppose we want to subtract 236 from 584. First, count out 584 beans and put them in a pot. Then take out 236 beans and put them on one side. Finally, count the beans left in the pot. That number is the result of subtracting 236 from 584.'

You might say, 'My Quetzalcoatl! What tedium - counting beans, putting them in, taking them out - what a job!'

To which the priest would reply, 'That's why we have the rules for the bars and dots. The rules are tricky, but a much more efficient way of getting the answer than by counting beans. The important thing is, it makes no difference as far as the answer is concerned: we can predict the appearance of Venus by counting beans (which is slow, but easy to understand) or by using tricky rules (which is much faster, but you must spend years in school to learn them).'"
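The priest's two methods translate directly into code (my gloss, obviously not Feynman's): bean counting and the "tricky rules" of positional, borrow-by-borrow subtraction must give the same answer.

```python
def subtract_by_beans(a, b):
    """Count out a beans, take b away, count what's left in the pot."""
    pot = ["bean"] * a
    for _ in range(b):
        pot.pop()
    return len(pot)

def subtract_by_rules(a, b):
    """Positional subtraction with borrowing, digit by digit (a >= b)."""
    da = [int(d) for d in str(a)][::-1]   # least significant digit first
    db = [int(d) for d in str(b)][::-1]
    out, borrow = [], 0
    for i in range(len(da)):
        d = da[i] - (db[i] if i < len(db) else 0) - borrow
        borrow = 1 if d < 0 else 0
        out.append(d + 10 * borrow)
    return int("".join(str(d) for d in reversed(out)))
```

Same answer either way; the rules are just faster, which is the whole point of the parable.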

I think this is a wonderful metaphor for mathematical modeling, and it is a very good approach to teaching students. In my teaching, I very frequently encounter students completely uninterested in my subject - it would be more accurate to say that I occasionally find students interested in math. What I do to engage them is encourage them to see that what I am teaching is not abstracta to be vomited onto a test, but tools they can use as scientists and engineers. I notice that I get much more engagement when I do this (it seems to also produce higher grades, but I haven't done a regression or anything...).

Incidentally, Feynman earlier admits that his history of QED is a Whig history, and mentions a couple of other issues in then-contemporary philosophy of science. Feyerabend's claim that Feynman was philosophically ignorant was always complete horseshit, based on Feyerabend's unwillingness to confront new scientific and philosophical difficulties. For instance, Feynman's anti-foundationalism shows up in a later part of this section - to him, it's modeling all the way down. I don't know whether this is trivially true or exaggerated. Just wanted to beat that dead horse a little more.

Tuesday, August 26, 2014

Foundations and Other Unneccessary Things


The economist John Maynard Keynes once said "Madmen in authority, who hear voices in the air, are distilling their frenzy from some academic scribbler of a few years back." Practical men, to paraphrase, are usually under the spell of popular philosophy. A few weeks back I did a post on Wittgenstein's criticism of the logicist program. I concentrated on a technical aspect: he pointed out that the interpretation of quantification over infinite sets is left open (that is, there are multiple models for given sets of axioms), so the alleged foundations of mathematics don't specify a specific mathematical language. Modern mathematicians admit this, but don't care. I didn't go into as much detail about a stronger, but more philosophical, criticism. Principia Mathematica, Die Grundlagen der Arithmetik, etc. claimed to be the foundations of mathematics, but if we found an error in them (and an error was found in the Grundlagen), then we would dispose of the book and not of mathematics. In other words, in practice there is nothing special about axioms that makes them "below" theorems. Mathematics - and, Wittgenstein argues, science and even life in general - is more like a hyperbolic tower where everything leans on everything else than an inverted pyramid where everything rests on the bottom stone. I bring up Keynes because I realize now that there is no way to read Wittgenstein and not be affected. I may or may not be a Wittgensteinian, but he has affected how I see things in a fundamental way. I must keep this in mind when I enter into "foundational" controversies.


After my Jaynes post, I did a bit of re-reading of his big book. What is the value of Cox's Theorem? What makes it superior to the usual Kolmogorov Axioms? To the extent that Cox and Kolmogorov disagree, so much the worse for Cox (as far as I can tell). Kolmogorov's axioms are deliberately vague as to interpretation: they are models for statements about normalized mass or about subjective valuations of probability. Cox's theorem is no shorter or more intuitive. I don't think the interpretation of the functional equations as being about subjective degrees of belief is any more strongly suggested than it is by the Kolmogorov axioms (that is, it isn't at all). Why? We can interpret f as "the sand in this unit bucket outside this set", then recognize that being outside the outside is being inside, etc. Therefore, Cox isn't any better a foundation for subjective probability than Kolmogorov.
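For readers who haven't seen it, the payoff of Cox's functional equations is just the familiar product and sum rules (after a suitable rescaling of the plausibility function): $$p(A \wedge B \mid C) = p(A \mid C)\, p(B \mid A \wedge C), \qquad p(A \mid C) + p(\neg A \mid C) = 1,$$ from which finite additivity over disjoint alternatives follows.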

Azathoth

Cox's theorem isn't strong enough to constrain countable unions, which means that if it were The Real foundation of probability it would run into strange problems. As I said in the Wittgenstein post, mathematicians like to deal with the infinite by making it as much like the finite as we can without risking contradiction. Countable additivity is a way of doing this. If you have half a bucket of sand and half a bucket of sand, then you have (half plus half equals) one bucket of sand. That's additivity in a nutshell. But if you add up infinitely many ever-smaller scoops of sand in a limiting procedure, what happens? Under countable additivity, you get a bucket of sand - lucky you. Under finite additivity alone, the answer isn't defined. There's no reason to think that you won't add up bits of sand and get Azathoth. In other words, you give up the ability to compute such probabilities.
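In symbols, the whole dispute is over which disjoint unions the additivity axiom covers. Finite additivity demands, for pairwise disjoint \( A_1, \dots, A_n \), $$\mu\left(\bigcup_{i=1}^{n} A_i\right) = \sum_{i=1}^{n} \mu(A_i),$$ while countable additivity extends this to countably infinite disjoint families: $$\mu\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} \mu(A_i).$$ The sand example is the countable case with \( \mu(A_i) = 1/2^i \): only the stronger axiom licenses the conclusion that the scoops sum to exactly one bucket.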


The problems are even worse for the Bayesian, because finite additivity isn't consistent with conditionalization (hat tip: A Fine Theorem). Since finite additivity is all Cox's Theorem gives you, clearly it needs to be made more robust! (Unlike, say, Kolmogorov's Axioms.) Obviously, I strongly disagree with that paper's thesis that de Finetti gave "compelling reasons" to abandon countable additivity, and I regard de Finetti's examples of "intuitive priors" as bizarre. (Also, I find A Fine Theorem's Kevin Bryan's arguments even weaker. It isn't obvious to me that his hostile description of frequentist consistency is induction in any sense, much less a bad one...) The famous Bayesian Jaynes must have at least sensed this, because he was always combatively pro-countable additivity. But is his chosen foundation a castle built on sand? The answer is obvious to me: Jaynes just never cared about such things, and thought it a merely technical problem without deep import for the general theory (he says in the appendix that the only difference between his approach and Kolmogorov's is that Kolmogorov took an infinity-first method and he an infinity-last one).

Dr Fine, Dr Howard and Dr Howard in deep philosophical debate

This issue might be worth maintaining a low-level controversy over, and Kolmogorov put it in the right place: as a questionable but reasonable assumption. An "axiom", as we mathematicians say. Sure, countable additivity is so useful and so clearly correct in so many contexts that giving it up seems like giving up your legs. But science is a multithreaded being, and intellectual controversy often ends in clarification. In the Cox framework, though, finite additivity isn't a theorem; it's just a quirk of not constraining our function enough. That just doesn't feel like enough to me: if Kolmogorov, Doob et al. were wrong, it seems to me they must be wrong in a much deeper way. Anyway, that's enough about countable additivity.

As I said from the outset, it seems obvious to me that axioms are philosophical matters and arguing about them gets you into nothing but a Wittgensteinian language game. But there are differences between Kolmogorov and Cox about finite additivity (and about whether functional equations are more intuitive than measure theory). So maybe there is some small content there. Therefore, I will now e-beg for answers. Tell me about the wonders of Cox's Theorem, internet! I'm all ears!

Saturday, August 23, 2014

Quick Review: Phoenix Wright - Ace Attorney



Well, you can't spend all your time working hard. Even the sick, the dying and people in dire poverty have to entertain themselves. This despite the fact that I have a plethora of projects, a few ideas that need exploration to become projects, and of course all the time I ought to be spending improving myself in some way. Let me share with you some of what I spend that time doing:


The Ace Attorney series is a set of video games, members of a genre called visual novels. A visual novel is sort of like the old choose-your-own-adventure books or, more directly, like adventure games where the main form of puzzle is choosing dialog. I don't have much experience with this genre; in fact, this series is basically the only one I've played. Even then, I've only played the core part of the series, the Phoenix Wright trilogy. From the little research I did, it seems that Ace Attorney is a bit more like an adventure game in that there are many puzzles other than choosing dialog. The idea is that a visual novel will make up for its reduced emphasis on traditional video game design by telling a more compelling story. Again, I haven't played many, but I expect the quality of the stories varies wildly.

P Mason

In essence, these are murder mystery stories where you are cast in the role of the great detective yourself. If you've seen or read Perry Mason, you'll find the basic goals familiar. You have the clues, you have to solve the case and you have to force the killer into a dramatic courtroom confession. This participation, unique to the video game format, is what sets the series apart. Well, that and its off-the-wall humor.


The legal system in the game is vaguely based on the Japanese legal system (very judge oriented, confessions by the defendant aren't decisive, prosecutors take even the concept of losing a case as a personal insult, etc.) and beyond that is fairly absurd. I'd like to read a lawyer's comments, but what I'd really like to read is a philosopher of law talking about the series. The reason is that a good lawyer has deep knowledge of one kind of law, just as a good surgeon has knowledge of one species, but a philosopher of law would be better placed to comment on an arbitrary legal system - and the legal system of the Ace Attorney series is often very arbitrary!


Being a series of stories about murder, betrayal, decade-long schemes, endless lies, corruption at high levels of government and the innumerable small ills that accompany the big ones, the series is mostly a comedy. The characters are as over-the-top and broadly drawn as anything Dickens wrote. David Simon once said "Murderers lie because they have to; witnesses and other participants lie because they think they have to; everyone else lies for the sheer joy of it...". And so it is: everybody is filled with secrets, sometimes things embarrassing only to them and sometimes issues that are more comprehensible. One murder involved a low-rent tokusatsu studio. One young fan refused to tell what he saw because he was traumatized - not by seeing a murder, but because he thought he saw his favorite hero defeated! Truly we live in trying times.


Another recurring element in the series is mysticism. Not dark, esoteric lore in the vein of Eric Voegelin, nor more intellectual fare like the Shin Megami Tensei series: possession and spirit channeling are major elements of the series. This seems to be a trademark of producer and man in charge Shu Takumi, since he also made the excellent puzzle game Ghost Trick, which is about ... well, ghosts. If this bugs you, then so it goes.


Complaints, I have a few. The series is sometimes too linear. Many of the times I failed to present evidence, it was because a testimony had two problems and I tried to point them out in the wrong order. The series doesn't have a lot of replay value, especially since you'll spend your first run through asking everybody about everything. Well, anyway, back to work - this time for real!

Personal Note: My streak of posts full of dull pictures of scientists is broken! Now I will be less self-conscious.