Tony Smith's Home Page

Gravity and Black Holes at the Planck Energy, and

Simplex Physics above the Planck Energy:

The Discrete HyperDiamond Generalized Feynman Checkerboard and Continuous Manifolds are related by Quantum Superposition:

The Clifford algebra structure with Periodicity 8 of

the D4-D5-E6-E7-E8 VoDou Physics model

produces a Hyperfinite II1 Algebra and

a Many-Worlds Quantum Theory.

That structure may be a Conscious Choice made by our Zizzi Quantum Computer Universe.


Click Here to see about the Clifford Tensor Product Universe.


The basic underlying logic/set/geometry structure is similar to that of the Logic Alphabet of Shea Zellweger.


The Truth Quark, through its strong interaction with Higgs Vacua, may have two excited energy levels at 225 GeV and 173 GeV, above a ground state at 130 GeV. The 173 GeV excited state may exist due to the appearance of a Planck-energy vacuum with < phi_vac2 > = 10^19 GeV in addition to the low-energy Standard Model vacuum with < phi_vac1 > = 252 GeV.
The Planck length is the fundamental lattice link scale in the D4-D5-E6-E7-E8 VoDou Physics model.

According to John C. Baez and S. Jay Olson in their paper at gr-qc/0201030:

"... Ng and van Dam have argued that quantum theory and general relativity give a lower bound delta L > L^(1/3) L_P ^(2/3) on the uncertainty of any distance, where L is the distance to be measured and L_P is the Planck length. Their idea is roughly that to minimize the position uncertainty of a freely falling measuring device one must increase its mass, but if its mass becomes too large it will collapse to form a black hole. ... Amelino-Camelia has gone even further, arguing that delta L > L^(1/2) L_P ^(1/2) ... Here we show that one can go below the Ng-van Dam bound [ and the Amelino-Camelia bound ] by attaching the measuring device to a massive elastic rod. ...

[ while the Ng-van Dam ] result was obtained by multiplying two independent lower bounds on delta L, one from quantum mechanics and the other from general relativity, ours arises from an interplay between competing effects. On the one hand, we wish to make the rod as heavy as possible to minimize the quantum-mechanical spreading of its center of mass. To prevent it from becoming a black hole, we must also make it very long. On the other hand, as it becomes longer, the zero-point fluctuations of its ends increase, due to the relativistic limitations on its rigidity. We achieve the best result by making the rod just a bit longer than its own Schwarzschild radius.

... Relativistic limitations on the rod's rigidity, together with the constraint that its length exceeds its Schwarzschild radius, imply that zero-point fluctuations of the rod give an uncertainty delta L > L_P . ...".


FAR ABOVE THE PLANCK ENERGY,

 
the universe looks like a lot of indistinguishable points, 
say N points,  
all connected to each other like an (N-1)-dim simplex.  
 
Denote each indistinguishable point by T. 
 
A 2-dimensional projection of an (N-1)-dim simplex 
is a 2-dimensional N-polygon, or N-gon.  
In the limit as N goes to infinity, 
the N-gon goes to a circle, so 
the universe far above the Planck energy with large N 
looks approximately like a circle of points, 
the boundary of a disk filled with links 
connecting all the points to each other.  
 
 
The N-polygon is a 2-dimensional projection of 
the (N-1)-dimensional simplex on N vertices.  
 
At this level, physics is not easily definable 
in conventional terms, since there is no spacetime, 
there are no fermion or boson particles, 
and all particles are indistinguishable 
and linked to all other particles, 
so that no particle is closer to any one 
particle than to any other particle.  
 
Above the Planck energy the universe is somewhat like 
Stuart Kauffman's system of NK Boolean Networks 
(see his books At Home in the Universe (Oxford 1995) 
or The Origins of Order (Oxford 1993)) 
with Connectivity = K = N-1, in which  
everything is connected to everything else, and 
the Fitness Landscape is completely random.  
The Fitness Landscape roughly corresponds to the set or space of 
all states or histories over which summation leads to 
sum-over-histories Many Worlds quantum theory. 
 
In this case, since all points are indistinguishable, 
all the states or histories are indistinguishable, 
that is to say, the same, so that 
the universe is simpler than Kauffman's model 
in which the points are all distinguishable.  
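As a concrete (and purely illustrative) sketch of Kauffman's NK Boolean networks, here is a minimal Python toy model with K = N-1, so that every node's update depends on every other node; the node count, random seed, and synchronous update rule are arbitrary choices for illustration, not part of the physics model:

```python
import random

def nk_network(N, K, seed=0):
    """Random NK Boolean network: each of N nodes gets K inputs
    and a random Boolean function of those inputs (Kauffman's model)."""
    rng = random.Random(seed)
    inputs = [rng.sample([j for j in range(N) if j != i], K) for i in range(N)]
    # each node's rule: a random lookup table over the 2^K input patterns
    tables = [[rng.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronously update every node from its K inputs."""
    new = []
    for i in range(len(state)):
        idx = 0
        for j in inputs[i]:
            idx = (idx << 1) | state[j]
        new.append(tables[i][idx])
    return new

N = 6
inputs, tables = nk_network(N, K=N - 1)   # K = N-1: everything sees everything else
state = [1] * N                            # fully symmetric starting state
for _ in range(4):
    state = step(state, inputs, tables)
print(state)
```

With K = N-1 and random tables, the "fitness landscape" of such a network is completely random, which is the regime the text compares to the pre-Planck universe.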
 

  BREAK THE TOTAL SYMMETRY OF THE VOID   by allowing the points of the circle to be of two types, either the original type T or a second type of point, denoted by F,   so that T and F correspond to the True and False values of a classical Boolean logic truth table.   (You could just as well use, instead of T and F, V and T for the Hebrew words Vohu and Tohu (void and unformed), as in the particle physics models of Haim Harari (Phys. Lett. 86B (1979) 83-86), Michael Shupe (Phys. Lett. 86B (1979) 87-92), and Stephen Adler (Phys. Rev. D 21 (1980) 2903-2915), or
Yin and Yang) 
  
 
In the crude image, all the T points lie on 
one semicircle and all the F points lie on 
the other semicircle, and 
each part of every link between two points 
is colored according to where it lies in the 
half-disks bounded by the T and F semicircles.  
In fact, any point could be either T or F, 
and there are 4 kinds of links: 
TT, TF, FT, FF.  
 
The links correspond to the values of 
a classical Boolean logic truth table.  
 
Still, physics is not easily definable 
in conventional terms, but now there are 
two things, T and F, 
which could correspond to binary 1 and 0, 
so binary structures can be formed 
with an underlying Z2 discrete symmetry. 
 
This is a little closer to Kauffman's NK model 
with K = N-1, because now there are two types of 
vertices in the N-simplex.  No longer are we 
looking at one N-simplex with indistinguishable vertices, 
so no longer are the states or histories indistinguishable, 
and it makes sense to talk about 
ManyWorlds possible states or histories.   
 
Now we can distinguish among all 2^N different 
subsets of vertices of the N-simplex.  
If we ignore the F vertices and the links to them, 
we get a Cl(0,N) Clifford algebra. 
The underlying vector space of Cl(0,N) is R(0,N). 
If we ignore the T vertices and the links to them, 
we get a Cl(N,0) Clifford algebra.
The underlying vector space of Cl(N,0) is R(N,0). 
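The count of 2^N subsets matching the dimension of the Clifford algebra can be checked directly: each subset of the N generators labels one basis element. A small Python sketch (N = 5 is an arbitrary choice for illustration):

```python
from itertools import combinations
from math import comb

def clifford_basis(N):
    """Basis of a Clifford algebra on N generators: one basis element
    e_S for each subset S of the generators, so dim = 2^N."""
    gens = range(1, N + 1)
    return [frozenset(c) for k in range(N + 1) for c in combinations(gens, k)]

N = 5
basis = clifford_basis(N)
print(len(basis))                        # 2^5 = 32
# graded dimensions follow the binomial coefficients: 1 5 10 10 5 1
print([comb(N, k) for k in range(N + 1)])
```

The same counting applies whatever the signature, which is why Cl(0,N) and Cl(N,0) both have dimension 2^N.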
 
Consider R(N,N) = R(0,N) x R(N,0).  
The Clifford algebra of R(N,N) is the split Cl(N,N). 
 
Can we see any structure in this space of ManyWorlds states?
i.e., 
What sort of structure can we see in Cl(N,N) for large N? 
 
Let x denote the real tensor product 
and M(R,16) denote the 16x16 real matrix algebra.  
We have the periodicity properties:  
 
Cl(N,N+8) = Cl(N,N) x M(R,16) = Cl(N,N) x Cl(0,8) 
Cl(N+8,N) = Cl(N,N) x Cl(8,0) = Cl(N,N) x M(R,16) = Cl(N,N+8) 
 
We also have Cl(N-4,N+4) = Cl(N,N).  
 
Therefore the Cl(N,N) structure of the ManyWorlds states 
at high energies can be reduced to Cl(0,2N). 
Let p = 2N mod 8.  Then Cl(0,2N) can be "factored" into 
 
Cl(0,p) x Cl(0,8) x ... x Cl(0,8)  
 
Since any given N can be replaced by a larger N = 0 mod 8, 
the fundamental structure of the ManyWorlds states 
at high energy is that of a LOT of Cl(0,8) Clifford algebras, 
one for each of the ManyWorlds that appear in 
the ManyWorlds quantum model at low energy.    
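The dimension bookkeeping behind this periodicity-8 factorization can be sketched in a few lines of Python; this only checks that the dimensions match (each Cl(0,8) factor contributing dim 2^8 = 256), not the full algebra isomorphisms:

```python
def factor_dims(two_n):
    """Dimension check for Cl(0,2N) = Cl(0,p) x Cl(0,8) x ... x Cl(0,8),
    where p = 2N mod 8 and there are k = 2N div 8 copies of Cl(0,8)."""
    p = two_n % 8
    k = two_n // 8               # number of Cl(0,8) factors
    lhs = 2 ** two_n             # dim Cl(0,2N)
    rhs = (2 ** p) * (256 ** k)  # dim Cl(0,p) times k factors of dim 256
    return p, k, lhs == rhs

for two_n in (8, 16, 20, 24):
    print(two_n, factor_dims(two_n))
```

For 2N = 0 mod 8 the Cl(0,p) factor is trivial, leaving a pure tensor product of Cl(0,8)s, which is the point of the paragraph above.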
 
The structure at and below the Planck energy 
is based on Cl(0,8) and its Spin group  Spin(0,8) 
which is the basic 
building block of the D4-D5-E6-E7 physics model. 
 
Each "world", or state, of the ManyWorlds quantum structure 
of the D4-D5-E6-E7 physics model is 
made up of the vector, adjoint, and spinor representations 
of a copy of Spin(0,8).  
 
The root vector polytope for the Weyl-Coxeter 
group of Spin(0,8) is the 4-dimensional 24-cell  
which is the 4-dimensional polytope called by 
Shea Zellweger the "Logical Garnet".   
"Garnet" because one 3-dim projection is 
the rhombic dodecahedron of a garnet crystal. 
"Logical" because given two logical states, T and F, 
and their 4 truth table values TT, TF, FT, and FF, 
there are 2^4 = 16 truth table - state combinations, 
or Boolean logical binary relations.  
Shea Zellweger has a nice notation, 
or "logical alphabet", for these 16:  
In his geometrical picture, they correspond to 
the 16 "hypercubic" vertices of a 24-cell.  
The 8 "hyperoctahedral" vertices of a 24-cell 
correspond to the 2^3 = 8 possible "negations" 
or "mirror reflections" of the 3 parts of 
logical relations of the form A * B, 
which are like point-link-point substructures 
of the N-simplex. 
The structures of Shea Zellweger are similar to 
the 16 tetragrams of Ilm al-Raml (the Science of the Sands) 
attributed to the third Islamic prophet, Idris.  
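The 2^4 = 16 Boolean connectives of Zellweger's logical alphabet can be enumerated directly: each connective is a choice of output for the four input pairs. A small Python sketch (the choice of material implication as the worked example is mine):

```python
from itertools import product

# The 16 binary Boolean connectives: each is a choice of output
# for the four input pairs (T,T), (T,F), (F,T), (F,F).
pairs = [(True, True), (True, False), (False, True), (False, False)]
connectives = list(product([True, False], repeat=4))
print(len(connectives))   # 2^4 = 16

# e.g. material implication A -> B has truth table (T, F, T, T)
implies = tuple((not a) or b for a, b in pairs)
assert implies in connectives
print(implies)
```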
 
 
Above the Planck energy, for each state of the ManyWorlds, 
we have 4 types of links at each vertex:  TT, TF, FT, FF 
so that there are 2^4 = 16 possible combinations of 
types of links that can be attached to a given vertex, 
just like Shea Zellweger's 16-element logical alphabet. 
 
For vertices with all 4 types of links present, 
we have 4! = 24 possible permutations, 
and 2^3 = 8 "orientation-preserving" reflections, 
producing the 4! x 2^3 = 24 x 8 = 192-element 
Weyl group of Spin(0,8) 
whose root vector polytope is the 24-cell, 
Shea Zellweger's logical garnet.  
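Both counts above, the 24 vertices of the root vector polytope and the 192-element Weyl group, can be verified by brute force. A Python sketch using the standard realization of the D4 roots as the vectors +-e_i +- e_j in R^4 (signed coordinate permutations with an even number of sign flips form the Weyl group):

```python
from itertools import combinations, permutations, product

# 24 roots of D4: all vectors +-e_i +- e_j in R^4 with i < j.
roots = [tuple((si if k == i else sj if k == j else 0) for k in range(4))
         for i, j in combinations(range(4), 2)
         for si, sj in product((1, -1), repeat=2)]
print(len(roots))   # 6 coordinate pairs x 4 sign choices = 24: the 24-cell

# Weyl group of D4 = Spin(0,8) root system: permutations of the 4
# coordinates combined with an even number of sign flips,
# giving 4! x 2^3 = 192 elements.
group = [(perm, signs)
         for perm in permutations(range(4))
         for signs in product((1, -1), repeat=4)
         if signs.count(-1) % 2 == 0]
print(len(group))   # 192
```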
 
This is another way of seeing that the 
ManyWorlds picture above the Planck energy 
involves discrete superposition 
(i.e. no complex amplitude phase) 
of a LOT of Spin(0,8) structures, 
which 
when we go below the Planck energy 
will produce a LOT of spacetime-based 
ManyWorld histories or states 
along with complex amplitude phases 
that come from the low-energy structure 
of Spin(0,8) representations.  
 

  NOW COOL DOWN TO THE PLANCK ENERGY.   Physically, that means to break some of the links so that everything is no longer connected to everything else. How to decide which links to break? Since all the links are equivalent in distance, BREAK ALL THE LINKS. This leaves us with N discrete points, some of which are T and some of which are F, all disconnected. Our circle picture is now even less accurate, because it seems to indicate nearest neighbor connections which do not exist, but here it is anyway:  
 
The Kauffman picture is now an NK model with K = 0.
As Kauffman notes, for Connectivity = K = 0, 
his Fitness Landscape is too simple. 
Here, since we still have only two types of particle, 
things are even simpler than in the Kauffman model. 
 

  HOW CAN THE T AND F PARTICLES INTERACT AND COMBINE?   We have 2 types of particle, so let each particle have 2 types of links. We can distinguish between going from T to F and going from F to T, so let each type of link have 2 directions. In all, that means that each T and F can have as many as 4 attached links, 2 in and 2 out. They correspond to the truth table values of TT, TF, FT, FF.   Now consider combinations of T or F with 1, 2, 3, or 4 links to each.   Call each such configuration a "seed". (The terminology "seed" is due to Ben Goertzel. He and Kent Palmer and Onar Aam are sources of many of the useful ideas in this rough view of physics at very high energies. Compare these "seeds" to my Quantum Sets and to MetaClifford Algebras.)   1-LINK SEEDS, where 0 designates the T or F and 1 designates an outgoing link,  
0 - 1
 
form 1-dimensional chains that 
correspond to the positive integers Z+
for which each vertex has 2 nearest neighbors.  
 
 
2-LINK SEEDS, 
where -1 designates an incoming link, 
 
-1 - 0 - 1
 
form 1-dimensional chains that 
correspond to the integers Z 
for which each vertex has 2 nearest neighbors.  
 
 
3-LINK SEEDS, 
where w and w2 designate complex cube roots of 1, 
 
w \ 0 - 1 / w2
 
form 2-dimensional hexagonal lattices that 
correspond to the Eisenstein complex "integers" 
for which each vertex has 6 nearest neighbors.  
 
 
4-LINK SEEDS, 
where i designates a complex square root of -1, 
 
 
i | -1 - 0 - 1 | -i
 
form 2-dimensional square lattices that 
correspond to the Gaussian complex "integers" 
for which each vertex has 4 nearest neighbors.  
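The nearest-neighbor counts for the Gaussian and Eisenstein lattices can be checked by brute force over a small window of lattice points; a Python sketch (the window size is an arbitrary choice, and this is a numerical check, not a proof):

```python
# Nearest neighbours of 0 in each "seed" lattice, by brute force.
def nearest_count(points):
    """Count the nonzero lattice points at minimal distance from 0."""
    nonzero = [p for p in points if abs(p) > 1e-9]
    m = min(abs(p) for p in nonzero)
    return sum(1 for p in nonzero if abs(abs(p) - m) < 1e-9)

span = range(-3, 4)
gaussian = [complex(a, b) for a in span for b in span]        # Z[i]
w = complex(-0.5, 3 ** 0.5 / 2)                                # primitive cube root of 1
eisenstein = [a + b * w for a in span for b in span]           # Z[w]

print(nearest_count(gaussian))    # 4 nearest neighbours (square lattice)
print(nearest_count(eisenstein))  # 6 nearest neighbours (hexagonal lattice)
```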
 
 
WHAT IF TWO DIFFERENT 4-LINK SEEDS encounter each other? 
 
               i                     k
               |                     |
          -1 - 0 - 1            -j - 0 - j
               |                     |
              -i                    -k
 
If the planes of the two 4-LINK SEEDS are taken to be 
orthogonal to each other, 
they form the quaternions with basis {1,i,j,k}.  
 
THERE EXISTS A 4-DIM HYPERPLANE OF "INTEGRAL" QUATERNIONS 
IN WHICH EACH POINT HAS 6x4 = 24 NEAREST NEIGHBORS.  
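The 24 nearest neighbors of a point in the integral (Hurwitz) quaternion lattice are its 24 unit quaternions: the 8 units +-1, +-i, +-j, +-k plus the 16 half-units (+-1 +- i +- j +- k)/2. A Python sketch enumerating them (quaternions are represented here simply as 4-tuples of coordinates):

```python
from itertools import product

# The 24 unit Hurwitz ("integral") quaternions.
units = set()
for pos in range(4):                          # +-1, +-i, +-j, +-k
    for s in (1.0, -1.0):
        q = [0.0] * 4
        q[pos] = s
        units.add(tuple(q))
for signs in product((0.5, -0.5), repeat=4):  # (+-1 +- i +- j +- k)/2
    units.add(signs)

print(len(units))   # 8 + 16 = 24: one per nearest neighbour, the 24-cell again
# every unit has norm 1
assert all(abs(sum(c * c for c in q) - 1.0) < 1e-12 for q in units)
```

These 24 unit quaternions are exactly the vertices of the 24-cell that appeared above as the D4 root vector polytope.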
 
 
WHAT IF TWO DIFFERENT QUATERNION PAIRS OF 4-LINK SEEDS 
encounter each other? 
 
               i                     k
               |                     |
          -1 - 0 - 1            -j - 0 - j        
               |                     |
              -i                    -k
 
               I                     K
               |                     |
          -E - 0 - E            -J - 0 - J        
               |                     |
              -I                    -K
 
If the planes of the two QUATERNION PAIRS OF 4-LINK SEEDS 
are taken to be orthogonal to each other, 
they form the octonions with basis {1,i,j,k,E,I,J,K}.  
 
THERE EXIST 8-DIM HYPERPLANES OF "INTEGRAL" OCTONIONS 
IN WHICH EACH POINT HAS 240 NEAREST NEIGHBORS.  
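The 240 nearest neighbors can be checked against the 240 minimal vectors of the E8 lattice in one standard coordinate realization (chosen here for convenience): 112 vectors of the form +-e_i +- e_j and 128 vectors (+-1/2)^8 with an even number of minus signs. A Python sketch:

```python
from itertools import combinations, product

# Minimal (norm-2) vectors of the E8 lattice, standard realization.
vectors = set()
for i, j in combinations(range(8), 2):           # +-e_i +- e_j: 28 x 4 = 112
    for si, sj in product((1.0, -1.0), repeat=2):
        v = [0.0] * 8
        v[i], v[j] = si, sj
        vectors.add(tuple(v))
for signs in product((0.5, -0.5), repeat=8):     # (+-1/2)^8, even minus count: 128
    if signs.count(-0.5) % 2 == 0:
        vectors.add(signs)

print(len(vectors))   # 112 + 128 = 240 nearest neighbours
# every minimal vector has squared length 2
assert all(abs(sum(c * c for c in v) - 2.0) < 1e-12 for v in vectors)
```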
 
 
IF YOU TRY TO CONTINUE THE PROCESS TO HIGHER DIMENSIONS, 
you no longer have real division algebra structure, 
and no longer have alternative algebra structure, 
so higher-dimensional structures will not 
give useful physics models. 
 
 

  BELOW THE PLANCK LENGTH   the octonion Planck-length lattice structure coming from the T and F SEEDS produces the D4-D5-E6-E7 model of physics
that is somewhat like the above piece of jewelry: 
it has spacetime broken into 4-dim physical spacetime
and internal symmetry space;
it has fermion particles; and 
it has fermion antiparticles.  
 
It can be represented as a generalized Feynman checkerboard on 
a 4-dimensional HyperDiamond lattice spacetime.  
Since each vertex of a 4-dimensional HyperDiamond lattice is 
linked to 8 nearest neighbors (4 future lightcone, 4 past lightcone), 
it could be said to resemble a more complex version of 
a Kauffman NK model with K = Connectivity = 8.  
Kauffman suggests that Connectivity = K = 8 
is necessary for interesting structures.  
 

  HOW DOES THE D4-D5-E6-E7 MODEL LOOK in terms of the T and F points of the SEEDS?   (Ideas for these low energy correspondences come from discussions at Georgia Tech with David Finkelstein, Tang Zhong, Igor Kulikov, Bereket, and Li ChangLin.)   The correspondences between T and F points, octonions, and first-generation fermion particles are:  
e-neutrino:    FFF            1
down quarks:   FFT FTF TFF    I J K    (colors R B G)
up quarks:     TTF TFT FTT    i j k    (colors R B G)
electron:      TTT            E
 
 
The actions of gauge bosons in terms of T and F triples are: 
 
photon: leaves T and F unchanged. 
W+ and W- weak bosons: interchange ALL T and F in affected triple. 
Z0 weak boson: leaves T and F unchanged. 
6 charged gluons: permute T and F in affected triple. 
2 colorless gluons: leave T and F unchanged. 
8 charged gravitons: change individual T and F in affected triple. 
2 colorless gravitons: leave T and F unchanged.
 
 
Here graviton refers to Spin(5) fundamental spin-1 gravitons, 
prior to the action of the MacDowell-Mansouri mechanism, 
not the resulting phenomenological spin-2 gravitons. 
 

GRAVITY AT THE PLANCK ENERGY is being studied by Hawking and his students, who propose that creation of virtual pairs of Planck-energy Black Holes

( a phenomenon that should also occur in the D4-D5-E6-E7-E8 VoDou Physics model upon reaching the energy scale of its Planck length lattice )

should have physical consequences: 
 
     Macroscopic black holes should evaporate down to Planck size 
     and then disappear in the sea of virtual black holes; 
 
     The theta angle of QCD should go to zero, 
     with no axion needed, 
     since the virtual black holes would produce loss 
     of coherence between the theta vacua of different winding numbers, 
     and the vacuum state with lowest energy should be theta = 0
     (this is like the mass of the quark aligning the theta vacuum, 
     as described by Taekoon Lee in hep-ph/0006349); 
 
     Effective interactions with low (sub-Planck) energy fields 
     should be suppressed by factors of the Planck mass, 
     except for scalar fields.  
 
     Fermion 4-vertex effective interactions should be suppressed 
     by 2 powers of the Planck mass, so that 
     the K(L0) decay lifetime should be 10^7 years, which is probably unobservable now. 
 
     Fermion 6-vertex effective interactions should be suppressed 
     by 5 powers of the Planck mass, so that 
     the baryon decay lifetime should be 10^64 years, which is probably unobservable now.
 
     Spin-1 boson effective interactions should also be suppressed 
     to unobservable levels.  
 
     
     However, 
     the Higgs scalar field should get effective PHI^4 or PHI^2 PHI^2 
     interactions with coefficient of order 1, 
     so it is unlikely that the Higgs scalar will be observed directly.  
 
The results of Hawking and his students are based on 
the topology of a continuum 4-dimensional spacetime, 
in which the pairs of virtual black holes are due 
to S2xS2 structures of the second homology group, 
the first being excluded by requiring simply-connected spacetime 
and the third being excluded by Poincare duality with the first. 
 
The S2xS2 bubbles represent virtual black holes appearing and 
disappearing in pairs, just as 
a Euclidean electron moving in a circle in an electric field can 
be analytically continued to a Minkowski electron-positron pair 
accelerating away from each other in the electric field.  
Effectively, the electron tunnels through Euclidean space 
and emerges as a pair of real particles in Minkowski space. 
(see particularly pp. 4-5 of their paper).  
 
Bousso and Hawking have described, 
with respect to a conventional inflationary cosmology, 
pair creation of Black Holes,  
which then produce particles and antiparticles 
by their evaporation via Hawking radiation. 
Some of their description may be relevant 
to the Expanding Instanton Cosmology of the D4-D5-E6-E7 model. 
 


Zizzi Quantum Computer Universe

In gr-qc/0304032, Spacetime at the Planck Scale: The Quantum Computer View, Paola Zizzi says:

"...[ General Structure of Quantum Computer Universe, applicable to Pre-Universe PreGeometry existing prior to 4-dimensional physical spacetime ]...

... We assume that spacetime at the Planck scale is

  • discrete,
  • quantised in Planck units, and
  • "qubitised" (each pixel of Planckian area encodes one qubit).

Then, we formulate the Quantum Computer View of quantum spacetime. Within this model, one finds that quantum spacetime might be in an entangled state, and might quantum-evaluate Boolean functions which are the laws of Physics in their most fundamental form.

... we also include the issue of information, (more precisely quantum information) ...

... Spin networks are relevant for quantum geometry. ... spin networks are graphs ... with edges ... and vertices labeled by ...[ geometric objects related to spinors ]...

... we use the quantum version ... of the Holographic Principle ... each pixel of Planckian area, encodes a qubit ... This is a quantum memory register. To process the quantum information stored in the memory, it is necessary to dispose of a network of quantum logic gates (which are unitary operators). The network must be part of quantum spacetime itself, as it describes its dynamical evolution. The quantum memory plus the quantum network form a quantum computer (quantum computer view of quantum spacetime). In the QCV, some new features of quantum spacetime emerge:

  • i) The dynamical evolution of quantum spacetime is a reversible process, as it is described by a network of unitary operators. …
  • ii) During a quantum computational process, quantum spacetime can be in an entangled state, which leads to non-locality of spacetime itself at the Planck scale (all pixels are in a non-separable state, and each pixel loses its own identity).
  • iii) As entanglement is a particular case of superposition, quantum spacetime is in a superposed state, which is reminiscent of the Many-Worlds interpretation of Quantum Mechanics ...
  • iv) Due to superposition and entanglement, quantum spacetime can compute a Boolean function for all inputs simultaneously (massive quantum parallelism). We argue that the functions which are quantum-evaluated by quantum spacetime are the laws of Physics in their most fundamental, discrete and abstract form. Moreover, as the laws are the outputs of quantum measurements, their origin is probabilistic.
  • v) By scratch space management, we find that at the Planck scale it is possible to compute composed functions of maximal depth.
  • vi) The quantum information stored and processed by quantum spacetime prevents direct tests of the Planck scale.

... an event in quantum spacetime is an extended object without structure (a block). ... the quantum event encodes quantum information. ...

... each unit of Planck area (a pixel) is associated with a classical bit of information. ... In the quantum version of the holographic principle, a pixel encodes one quantum bit (qubit) of information. ... For example, the action of ...[a group G]... on the edge states ... gives ... equally superposed states ... When a surface is punctured by such a superposed state, a pixel of area is created, which encodes a qubit ... The elementary (Plankian) pixel can then be viewed as the surface of a unit (in Planck units) sphere in ..[ N ]... dimensions. The pixel is punctured (simultaneously) in the poles by an edge in the superposed state of spin down and spin up. Equivalently, a qubit corresponds to the surface of the N-dimensional unit sphere, where the logic states 0 and 1 correspond to the poles. This is the so-called Bloch sphere ...

... Having assumed that spacetime at the Planck scale encodes quantum information, the latter must be processed to give rise, as an output, to the universe as we know it. If so, quantum spacetime is not just a quantum memory register of n qubits: it is the whole thing, a quantum memory register plus a network of quantum logic gates. In other words, spacetime at the Planck scale must be in such a quantum state to be able to evaluate those discrete functions which are the laws of Physics in their discrete and most fundamental form. ...

... If the qubits encoded by pixels were superposed, the surface embedding a region of space would "exist" in many different states simultaneously. ... the idea of a superposed state of qubits associated to pixels, fits quite well in the Many-Worlds interpretation of Quantum Mechanics ... Spacetime at the Planck scale, like the state of a quantum computer, can only decohere at the end of every computational process, which terminates with a measurement. In the case of quantum spacetime, we should better say that decoherence is due to "self-measurement" (projection operators must be included in the quantum spacetime structure). ... spacetime itself would be spoiled of locality, at the Planck scale. In other words, two quantum events might be described by a single quantum state, each event losing its own identity. ...

... Let us consider a finite number N of pixels ... each one encoding one qubit ... the number of pixels of area of a certain surface S is equal to the number of punctures made by spin network edges ...[carrying representations of a group G]... onto S). The N qubits span a Hilbert space of dimension 2^N ...

... To be able to perform quantum computation, the qubits ... must be manipulated by some unitary transformations performed by quantum gates (the number of the gates is called the size of the network). ... The action of the [2x2] Hadamard gate H on the first qubit gives the superposed state ... If we take the superposed state as the control qubit ... and the second qubit of the memory as the target qubit ... the action of the [4x4] XOR gate is ... an entangled state of two qubits. ...
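[ The Hadamard-then-XOR construction Zizzi describes can be sketched in plain Python; the matrices and basis ordering below are the standard textbook ones, used here only as an illustration, not taken from her paper: ]

```python
import math

# 2x2 Hadamard and 4x4 XOR (CNOT) gates as plain nested lists.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]
I = [[1, 0], [0, 1]]
XOR = [[1, 0, 0, 0],   # CNOT: flips the target qubit
       [0, 1, 0, 0],   # when the control qubit is |1>
       [0, 0, 0, 1],
       [0, 0, 1, 0]]

def kron(A, B):
    """Tensor (Kronecker) product of two matrices."""
    return [[a * b for a in row_a for b in row_b]
            for row_a in A for row_b in B]

def apply(M, v):
    """Matrix-vector product."""
    return [sum(M[r][c] * v[c] for c in range(len(v))) for r in range(len(M))]

state = [1, 0, 0, 0]                    # two-qubit register in |00>
state = apply(kron(H, I), state)        # Hadamard on the first (control) qubit
state = apply(XOR, state)               # XOR/CNOT entangles the two qubits
print([round(a, 4) for a in state])     # (|00> + |11>)/sqrt(2)
```

[ The result is the maximally entangled Bell state (|00> + |11>)/sqrt(2), the simplest case of the entangled spacetime states described above. ]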

... A quantum logic gate on n qubits is a 2^n x 2^n unitary matrix U. Initially, all the qubits of a quantum register are set to |0> . By the action of the Walsh-Hadamard transform, the n input qubits are set into an equal superposition ...
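[ The Walsh-Hadamard step that sets the n input qubits into an equal superposition can be sketched with the standard fast transform; n = 3 is an arbitrary choice for illustration: ]

```python
import math

def walsh_hadamard(v):
    """In-place fast Walsh-Hadamard transform with 1/sqrt(2) normalization:
    applied to |00...0>, it produces an equal superposition over all
    2^n computational basis states."""
    n = len(v)
    h = 1
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                x, y = v[j], v[j + h]
                v[j], v[j + h] = (x + y) / math.sqrt(2), (x - y) / math.sqrt(2)
        h *= 2
    return v

n_qubits = 3
state = [0.0] * (2 ** n_qubits)
state[0] = 1.0                        # register initialized to |000>
walsh_hadamard(state)
print([round(a, 4) for a in state])   # eight equal amplitudes, each 1/sqrt(8)
```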

... The quantum computation of Boolean functions f is implemented by unitary operators U_f. ... Some extra registers (called scratch space) are also needed to store intermediate results. ... the number of required scratch registers, increases linearly with the depth of the composite function which has to be quantum computed. ... in order to compute highly composite functions, the first register (storing the argument) must have the smallest possible size, to leave room for the needed number of scratch registers. In particular, if n=1 (the Planck scale), the available scratch space has size N-1, and the highest level of composition for f is d=N when d-1 scratch registers, of one qubit each, sum up to the original register of size N. Thus, the quantum computation of highly composite functions must be performed close to the Planck scale, and the output (some global property of f) is obtained at macroscopic scales. ...

[ It seems to me that Zizzi's quantum computer ideas could be applied to a "pregeometry" of a lot of "fundamental vertices" with no particular aggregation in any particular dimensionality, but with (possibly varying) numbers of "pre-links" that could link up with other "pre-links" of other "fundamental vertices", and so then consciously self-compute (and therefore, in a sense, create) a maximally efficient quantum code for processing information about how our Universe evolves. If there turned out to be a maximally efficient code, something like the quantum Reed-Muller code [[ 256, 0, 24 ]], then the physics of our Universe might be most efficiently describable in terms of the D4-D5-E6-E7-E8 VoDou Physics model, which has among its fundamental structures the 256-dimensional Cl(1,7) Clifford algebra and the 24+3 = 27-dimensional exceptional Jordan algebra. The corresponding "fundamental spacetime" and "fundamental set of spinor representations" could describe fermions on each vertex and a "fundamental set of gauge Lie algebra representations" that could describe gauge group elements on each link, similar to the process described above, which is also describable in terms of Clifford Tensor Products. The interplay among possible evolutionary histories of our Universe could then be computed by itself much like a giant Quantum Game.

A research program along those lines might be:

 

However, in this paper Zizzi applies her ideas to a specific model of SU(2) and 4-dimensional spacetime. ]

... Spin networks are relevant for quantum geometry. ... spin networks are graphs embedded in 3-space, with edges labeled by spins and vertices labeled by intertwining operators. In loop quantum gravity, spin networks are eigenstates of the area and volume-operators ... We interprete spin networks as qubits when their edges are labeled by the spin-1/2 representation of SU(2). ... For example, the action of the unitary SU(2) matrix ... on the edge states ... gives ... equally superposed states ...[whereby]... spin network edges...[carry]... the 1/2 -representation of SU(2) ... The elementary (Plankian) pixel can then be viewed as the surface of a unit (in Planck units) sphere in three dimensions. The pixel is punctured (simultaneously) in the poles by an edge in the superposed state of spin down and spin up. Equivalently, a qubit corresponds to the surface of the 3-dimensional unit sphere, where the logic states 0 and 1 correspond to the poles. This is the so-called Bloch sphere ... There is clearly an analogy between the spin networks approach to quantum gravity and our Quantum Computer View of quantum spacetime. ...

... The state of n qubits is the unit vector in the 2^n-dimensional complex Hilbert space: C^2 x C^2 x ... x C^2 n times [tensor product] ...

[ Zizzi's specific model differs from the D4-D5-E6-E7-E8 VoDou Physics model in that Zizzi leads to the usual complex hyperfinite II1 von Neumann algebra factor instead of the generalized real hyperfinite II1 von Neumann algebra factor. ]

[ Zizzi's specific model also differs from the D4-D5-E6-E7-E8 VoDou Physics model in that Zizzi's network edges and vertices are labelled by SU(2) spins and intertwining operators, respectively, instead of Spin(1,7) Lie algebra generator Cl(1,7) bivectors and Cl(1,7) spinors. It also differs from the ideas of John Baez ( lines marked > are from a message from me ):

"... The quotient of Lie algebras e6/f4 is a vector space 
that can be naturally identified with H3_0(O).  
That's really cool!  

But the quotient of Lie groups E6/F4 is what matters for 
the spin foam models, and this is a bit "curvier" - 
it has a natural metric that's not flat.  

They are closely related, however: 
e6/f4 can be viewed as a tangent space of E6/F4.  

A baby example of the same phenomenon is this:

sl(2,C)/su(2) = R^3, 3d flat space
SL(2,C)/SU(2) = H^3, 3d hyperbolic space.

This is what we get if we replace the Jordan algebra H3(O) 
by the smaller Jordan algebra H2(C).

>I would like very much to be told how such a construction goes,
>because
>in my opinion such an H3_0(O) spin foam model
>should lead to,
>not just quantum gravity, but a Theory Of Everything.
Shhh!  That's supposed to be secret.   :-)

Yes, of course something like this is my goal, 
but I'm not eager to count my chickens 
before they are hatched, nor take my ideas to
market while they're still half-baked. ... 

... it's not quite that the 
"foam" is "made of 3-spheres".  
People do quantum field theory on a 3-sphere and 
get a spin foam model of (Riemannian) 4d quantum gravity.
The Lagrangian for this quantum field theory guarantees 
that Feynman diagrams can be interpreted as 4-simplices 
stuck together along their tetrahedral faces.  
The relevance of the 3-sphere is that L^2(S^3) 
can be decomposed as a direct sum of "simple" 
representations of Spin(4),
which are the representations which correspond 
to bivectors - the right thing for describing 
the quantum geometry of a triangle. 

One can and should see how this generalizes 
to a wide class of homogeneous spaces, and 
it should be especially fun 
for "exceptional" spaces like E6/F4.

Or maybe even bigger ones involving E7 and E8! ..." ...

and my related ideas:

"... My suggestion of spin foam

(structure group)/(automorphism group) = E6 / F4

was just an example, 
and there is in my opinion a better way.

As you say, you can look at E6 / F4
either at the Lie algebra level 
or at the Lie group level.

You say

"... the quotient of Lie groups E6/F4 is
     what matters for the spin foam models ...".

That does exist and 
is a 26-dim rank-2 symmetric space of type EIV,
with a compact realization 
as the set of OP2s in (CxO)P2
and a
noncompact realization as 
the set of hyperbolic OP2s in hyperbolic (CxO)P2.

However,
if you try to build a spin foam out of that,
I suspect that it is not as easy as 
the Spin(4)/Spin(3) = SU(2) = S3
case of making a foam of 3-spheres.

It seems to me that what you want 
for a single bubble/component of foam
would be something that has two characteristics:

1 - the 24-dim 3-octonion structure of 26-dim H3(O)o

2 - associative structure so that you can
    put a lot of bubbles together nicely

My candidate for that is the Clifford algebra Cl(8),
with graded structure

1   8  28  56  70  56  28   8   1

and total dimensionality 2^8 = 256 = 16x16 = (8x8)x(8x8)

Cl(8) has
1 - 3 octonions (vector 8, and two half-spinor 8s)

2 - associative product and periodicity factorization
    Cl(8N) = Cl(8) x ...(N times tensor product)... x Cl(8)

Actually,
I am not the only one using that factorization as the basis
of a fundamental physics model.

My friend and teacher David Finkelstein (who first taught me
about details of Clifford algebras back in the early 1980s)
is also working on a model in which a bunch (maybe it could
be called a foam??) of Cl(8)s is a starting point out of which
everything we see (spacetime, particles, etc) condenses.
David's web page is at URL
http://www.physics.gatech.edu/people/faculty/dfinkelstein.html
where he says, among other things:
"... The Spinorial Chessboard shows how the dynamics,
a large squad of chronons, can spontaneously break down
into a Maxwellian assembly of squads of 8 chronons each ...".
Although David's model differs from mine in some ways,
his "squad of 8 chronons" is the set of the 8 generators of Cl(8).

In both his model and my model, the "foam" is not of spacetime,
but a sort of pregeometric foam, and spacetime is derived
as a sort of "condensation".

In this picture, you are using algebra structure, 
not group structure,
so instead of starting with
a foam of group objects like 3-spheres
you start with
a bunch of algebra-tangents of group objects,
which you can, 
AFTER you put the algebra-tangents together,
THEN exponentiate up to make a big group foamy thing. ...". ]
[ Zizzi's model may be applicable to some of the phenomenology of Gravitation in our 4-dimensional physical spacetime. However, since SU(2) is not as big as SU(2)xSU(2) = Spin(1,3) or the Conformal Group Spin(2,4) = SU(2,2), a generalization, such as one using SU(2,2), may be necessary to describe Segal's Conformal Theory of Gravity. Here is some of what Zizzi says, indicating how such a program might proceed: ]...

... According to inflationary cosmological theories, the cosmological horizon has at present a radius R = 10^60 Lp, thus its surface area is A = 10^120 Lp^2, that is an area of 10^120 pixels, each one encoding one qubit. In the QCV, the cosmological horizon's surface can be interpreted as a quantum memory register of N = 10^120 qubits. Thus, spacetime at the Planck scale can compute a composite function of maximal depth d = 10^120. ...

... the recursive functions computed by quantum spacetime at the Planck scale are the laws of Physics in their discrete, abstract, and fundamental form. We, human beings, who are "derived from the laws", look like fixed points (in the sense of the Goedel diagonalization lemma), as we are part of the program and still we are aware that it is running ... (although we cannot grasp the whole of it). ... the output (the result of a measurement) appears randomly, thus the nature of the global properties of such laws is probabilistic. In fact, while the whole quantum computational process is deterministic, in the sense that time evolution is guaranteed by a unitary operator, the output is random, as measurement is a non-unitary operation. ...

... The Heisenberg time-energy uncertainty relation, delta E delta t > hbar, allows virtual particles of mass m = delta E / c^2 to come into existence for an interval of time delta t > hbar / delta E. The time-energy uncertainty relation is saturated at the Planck scale: Ep tp = hbar. This means that an object having a Planck mass ... and size equal to the Planck length ... cannot be anything else than a virtual object. The particle-like character of such a virtual object is enlightened by the fact that the Compton length Lc = hbar / m c, when calculated for the Planck mass, coincides with the Planck length ... Moreover, the Schwarzschild radius Ls = 2 G m / c^2, when calculated for the Planck mass, is twice the Planck length ... The factor 2 (although very often discarded in the literature ...) is very important, as the Schwarzschild radius of this object is bigger than its size, thus the object is a black hole. Then, at the Planck scale, we have virtual particle-like black holes/wormholes ... Notice, however, that the particle-like structure of such an object is defined at the very Planck length by the Compton length, while its black hole structure is defined at twice the Planck length by the Schwarzschild radius. The surface area of the event horizon of such a virtual Schwarzschild black hole is about four Planckian pixels, and encodes four qubits. In our model, lengths are quantised in Planck units: Ln = n Lp, and the quantised area is: A = a n^2 Lp^2. The value of the constant a is fixed to a = 4 ln2 by the Bekenstein formula: S = A / 4 Lp^2, where S = N ln2 is the information entropy, and N = n^2 is the number of qubits. Then our discrete area spectrum is: An = 4 ln2 n^2 Lp^2. ...
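The relations quoted above, Lc(Planck mass) = Lp and Ls(Planck mass) = 2 Lp, can be checked numerically; this is my own sketch using standard CODATA-style values for hbar, G, and c, not code from Zizzi's paper:

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J s
G    = 6.67430e-11       # Newton constant, m^3 kg^-1 s^-2
c    = 2.99792458e8      # speed of light, m/s

L_p = math.sqrt(hbar * G / c**3)   # Planck length
m_p = math.sqrt(hbar * c / G)      # Planck mass

L_c = hbar / (m_p * c)             # Compton length at the Planck mass
L_s = 2 * G * m_p / c**2           # Schwarzschild radius at the Planck mass

print(L_c / L_p)   # -> 1.0 : Compton length coincides with the Planck length
print(L_s / L_p)   # -> 2.0 : Schwarzschild radius is twice the Planck length

# Discrete area spectrum An = 4 ln2 n^2 Lp^2, in units of Lp^2:
areas = [4 * math.log(2) * n**2 for n in (1, 2, 3)]
print(areas)
```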

... For n=1, we get the area of the event horizon of a virtual Schwarzschild black hole, A1 = 4 ln2 Lp^2, which coincides with the minimal eigenvalue of the area spectrum calculated in loop quantum gravity when considered in the context of black hole entropy and in black hole quantisation models. However, in our scheme, it would not be possible to consider particle-like quantum black holes with masses larger than the Planck mass, as in that case the Compton length would be a fraction of the Planck length ... Then, black holes with masses larger than the Planck mass (already at twice the Planck mass) would lose any quantum particle-like feature, although they could be considered as the intermediate remnants of evaporating classical (large) black holes. ... At larger scales (about n = 10^20), where spacetime has already emerged as a continuum, one can start to speak about quantum field theory for small masses such that Lc >> Ls. ... the Compton length does not make sense for integer multiples of the Planck mass, while the Schwarzschild radius does not make sense for fractions of the Planck mass. ...

... encoding four qubits, and can perform quantum computation. Finally, we argue that the intrinsic non local aspect of quantum spacetime at the Planck scale discussed in this paper, might be due to virtual wormholes connecting Planckian pixels, as wormholes violate locality. In this view, virtual wormholes should be considered as tiny XOR gates. In other words, in the QCV, the scale which allows a quantum computing spacetime is the scale of quantum gravity, the Planck scale, which is the seat of quantum foam. ... A quantum black hole of Planck mass, comes into existence out of the vacuum, and then evaporates in Planck time, releasing a quantum of Planckian energy back to the vacuum. As this "virtual" process is due to quantum fluctuations of the vacuum, which are non-dissipative, it can be considered a reversible process, unless a measurement takes place. ... spacetime at the Planck scale may be viewed as a sea of virtual Planckian black holes. Probing the Planck scale would then mean losing data inside the event horizons of Planckian black holes. Or, even worse, the data might be relative to another world if, in measuring an entangled state, one is faced with a virtual wormhole. ...
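Zizzi's "tiny XOR gates" are, in quantum-computing terms, controlled-NOT gates. Here is a minimal sketch of my own (not from the paper) of the XOR action |c, t> -> |c, t XOR c> on two-qubit basis states, using a plain 4x4 matrix:

```python
import itertools

# CNOT (quantum XOR) as a 4x4 permutation matrix in the
# computational basis ordered |00>, |01>, |10>, |11>:
# it flips the target qubit t when the control qubit c is 1.
CNOT = [[1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

def apply(gate, state):
    """Matrix-vector product: apply a 4x4 gate to a length-4 state vector."""
    return [sum(gate[i][j] * state[j] for j in range(4)) for i in range(4)]

for c, t in itertools.product((0, 1), repeat=2):
    state = [0] * 4
    state[2 * c + t] = 1                        # basis state |c, t>
    out = apply(CNOT, state)
    print((c, t), '->', (c, out.index(1) % 2))  # target becomes t XOR c
```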

... from the linearity of the Planck scale level we obtain the nonlinearity of the classical macroscopic level ...[such as]... the non linear equations of General Relativity ... both non linearity and irreversibility, which have no home in the QCV, should be emergent features of spacetime. In the QCV, also locality is lost: "spacetime" itself is non local at the Planck scale, due to the entanglement of pixels/qubits. ... the QCV, spin networks, qubits, and virtual black holes are different aspects of the same building blocks of quantum spacetime at the Planck scale. In philosophical terms, spin networks (at least in their original form introduced by Penrose as purely combinatorial objects, without the introduction of causal sets ...) are fundamental in the sense that they are attributes of substance. Spin networks are the boundary (at the Planck scale) between the physical world and its foundations. Thus, in the QCV, also qubit states associated with spin networks are a different way of saying the same thing: quantum information on its own is an attribute of substance. However, when quantum computation is taken into account, and a (reversible) dynamical evolution arises, we are already a little bit upward the boundary between substance and the physical world. Virtual black holes (and wormholes) constitute mini quantum computers, which "prepare" the physical world. ...".

 


 
THE TRANSITION AT THE PLANCK ENERGY 
that the D4-D5-E6-E7 model of physics undergoes 
is a Phase Transition 
from 
a state in which every point is connected to 
a nearest neighbor proper subset of all other points, 
to 
a state in which every point is connected directly to 
every other point.   
 
A low-energy physical analogy is the phenomenon of Critical Opalescence 
( It looks something like the pearl handles

of the deputy sheriff revolver of my grandfather James Madison Smith, which had a defective firing pin mechanism. When I found that repair would be difficult, I threw away that revolver, but kept the pearl handles. )

at the critical point of a liquid-gas phase system, 
at which the correlation length among the particles 
of the system becomes infinite.  
 
To see how the phase transition is approached from 
below the Planck energy, 
consider the Feynman checkerboard picture.
 
Here is how Feynman did weights in his checkerboard: 
 
Consider a discrete lattice spacetime, 
lattice length  e  (one time-step),
a time interval T of N lattice time-steps, T = Ne. 
 
Also consider a particle of mass  m  =/= 0, 
because a massless particle is "stuck" on a light-cone 
trajectory and cannot "change directions". 
 
Then Feynman weights each change of direction in 
a given path from time t = 0 to time t = T 
by  i m e    (where i is the imaginary unit).  
 
Although Planck's constant h is not explicit in  i m e, 
you can say that it is implicit in the lattice interval  e. 
 
If e goes to zero, there is continuous spacetime, 
and the weighting goes to zero, 
BUT the number of changes goes to infinity, 
so you have to be careful about how you take 
the limit as e goes to zero.  
 
If e goes to T (the entire time interval you consider 
in your given experiment), THEN is when the 
ManyWorlds goes away due to fewer and fewer possible 
spacetime paths from t = 0  to t = T.  
 
In this case, e going to T, you could just as well 
say that you are designing experiments such that you 
are looking at shorter and shorter times T such that 
T goes to e.  That is equivalent to doing experiments 
at higher and higher energies, approaching the Planck energy.  
 
So the breakdown of spacetime ManyWorlds 
(due to fewer and fewer possible futures) 
occurs when you do Planck energy experiments.  
 
That is equivalent to a phase transition from
spacetime ManyWorlds with complex amplitude phases 
into a new regime ABOVE the Planck energy 
in which ManyWorlds is based on discrete superpositions 
of Spin(0,8) systems and their Cl(0,8) Clifford algebras 
all regarded as subsystems of one huge N-simplex 
with Cl(0,N) Clifford algebra, 
and everything connected to everything else.  
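One way to quantify the two phases described above is to count links: below the transition each point connects only to a nearest-neighbor subset, while in the N-simplex phase every pair of points is connected. A rough sketch of my own, with a hypercubic lattice standing in for the nearest-neighbor phase:

```python
def lattice_edges(n_per_side, dim):
    """Nearest-neighbor links in a dim-dimensional hypercubic
    lattice with n_per_side points along each side."""
    # each of the dim directions contributes (n-1) * n^(dim-1) links
    return dim * (n_per_side - 1) * n_per_side ** (dim - 1)

def simplex_edges(n_points):
    """Links in a complete graph (N-simplex): every point
    connected directly to every other point."""
    return n_points * (n_points - 1) // 2

points = 4 ** 4                # a small 4-dim lattice, 4 points per side
print(lattice_edges(4, 4))     # 768 nearest-neighbor links
print(simplex_edges(points))   # 32640 links when everything connects to everything
```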
 

  Compare this Simplex Physics to my Quantum Sets, to the HyperDiamond Feynman Checkerboard Model, and to MetaClifford Algebras, as well as the construction of Clifford Algebras from Set Theory.  
   

 

Tony Smith's Home Page
