Epistemic Discipline
Every claim on this page is a claim within the Manifold Relativity framework (Preprints v1–v15). Claims are classified as Derived (theorem-level within the framework), Proposed (formal framework-level extension, interpretation, or conjecture; in some cases derivable within the framework but not yet externally reviewed), Predicted (falsifiable empirical consequence), or Open (formally indexed as an unresolved problem). None of these are assertions about settled external physics. Status notes within entries may describe a result as conjectural in plain language; this is descriptive prose under the Proposed heading, not a separate label. The programme explicitly lists what it has not yet established.
Jump to:
The Central Idea ·
What We Have Derived ·
The Self-Audit ·
What We Propose ·
What We Predict ·
What Remains Open ·
What v12–v13 Added ·
What v14–v15 Added
The Central Idea
What the Programme Is Trying to Do
Start here. Everything else follows from this single thesis.
Q. What is Manifold Relativity trying to change first – the laws of physics, or the coordinates used to express them?
The coordinates. The programme's central claim is that many familiar pathologies – quantum uncertainty, Big Bang and black-hole singularities, the dark sector, wave-particle duality – are not independent mysteries of nature. They are artefacts of describing reality in the coordinate system native to biological observers: (x, y, z, t).
The programme proposes a coordinate transformation to the six-dimensional master manifold 𝒲, with coordinates (S, I, E, φ, C, A): Entropy, Fisher Information, Entanglement, Phase, Complexity, and Action. Standard spacetime is a projected submanifold of 𝒲 – not the fundamental arena. Manifold Relativity is first a coordinate reformulation programme; new predictions emerge from that shift.
Provenance: v1.02 · v2
Q. What is the historical pattern the programme is building on?
Every major paradigm shift in physics has been a coordinate change that dissolved an apparent mystery. The geocentric-to-heliocentric shift eliminated retrograde motion. Minkowski spacetime dissolved the Maxwell–Newton conflict. Kruskal–Szekeres coordinates showed the Schwarzschild horizon was a coordinate artefact, not a physical singularity.
The programme proposes that 𝒲 is the next such coordinate system – one in which the currently unresolved problems of physics are coordinate artefacts rather than fundamental mysteries.
Provenance: v1.02 · v2
Q. Why does the framework use an atlas of charts rather than one global coordinate system?
Because a single global chart is too strong an assumption. Beginning in v8, the programme explicitly models the universe as a 𝒲-manifold covered by an atlas of locally valid, observer-dependent charts connected by transition maps. Different observers or physical regimes access different spectral sectors of the same underlying manifold. The transition laws between charts are part of the physics, not just bookkeeping. This move also tells a physicist something important: the programme recognised an overreach in its early single-chart formulation and replaced it with standard differential-geometric structure.
Version 11 draws out the full consequence of this architecture: if physics is chart-relative, then any single detector or measurement context is structurally incomplete. What a low-temperature detector discards as noise may be coherent signal in a higher-temperature chart. The Multi-Chart Reconstruction Principle, introduced in v11, follows directly: the atlas is not merely a mathematical convenience for avoiding singularities – it is the reason that coordinated measurement across different thermal baselines is physically meaningful, not redundant.
Provenance: v8 · v8.1 · v11
Q. What is the “biological observer chart” U_bio?
It is the effective chart appropriate to warm, macroscopic observers. In v8, it is assigned a strict thermal resolution floor: spectral structure of the 𝒲-manifold at frequencies below k_BT/ℏ is mathematically inaccessible within this chart. The programme therefore treats part of what we call quantum uncertainty, entanglement non-locality, and the dark sector as properties of this chart's limited access – the shadow of manifold structure the chart cannot resolve.
Provenance: v8 · v8.1
Q. When does a chart fail, and what happens then?
A chart remains valid while the observer's thermal floor stays well below the local eigenvalue gap of the manifold's spectral structure. When that gap closes, the chart loses the ability to reconstruct local geometry faithfully and becomes singular. The framework gives three concrete examples. At S → 0 (Big Bang), the biological chart fails as polar coordinates fail at a pole. At C → C_max (black-hole interior), it fails as stereographic projection fails at the antipodal point. At T → T_P (Planck regime), the continuous phase φ becomes discrete and differential calculus fails. In each case a different chart is needed.
Provenance: v8 · v8.1
Q. What does v11 add to the central picture?
Version 11 formalises the converse of v10's chart-filtering question. V10 asked: how does an observer's chart filter an event? V11 asks: what happens to the part of the event that the chart filters out? The answer is the signal-noise inversion. What one detector classifies as noise – spectral structure lying above its thermal floor – is not absent from the 𝒲-manifold. It is event content whose native chart lies at a higher thermal baseline. A different detector operating at a higher temperature would resolve the same structure as signal. Signal and noise are chart-relative classifications, not absolute properties of the measured event.
This single observation has three consequences formalised in v11: a new mathematical object (the Cross-Chart Disagreement Vector |Δ₁,₂⟩), a formal proposition establishing that inter-chart disagreement encodes latent structure (Proposition 5.1), and a methodological principle stating that no single detector chart should be presumed privileged for complex event reconstruction.
Provenance: v11
What We Have Derived
Theorem-Level Results Within the Framework
These results are derived at theorem or formal-proof level within the framework's definitions – what follows necessarily from the 𝒲-manifold structure once the coordinates are accepted.
Q. Does the framework recover the known Hubbleβtemperature scaling from first principles?
Yes. The Hubble parameter in the radiation-dominated era is derived directly from the 𝒲-manifold metric:
H = √(8π³/90) · t_P · ω², where ω = k_BT/ℏ
This recovers the known H ∝ T² scaling with the Planck time as the unique proportionality constant – no free parameters. The Planck time is not inserted; it emerges as the structural bridge between the cosmological and quantum decoherence scales.
Provenance: v1.02
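The quoted relation is easy to sanity-check numerically. The sketch below is illustrative only: the constants are CODATA values, the function name is ours, and the prefactor corresponds to an effective degrees-of-freedom factor of order one. At T = 10¹⁰ K it gives H of order 0.1 s⁻¹, comparable to the standard radiation-era estimate H ≈ 1/(2t) at t ≈ 1 s, and doubling T quadruples H, which is exactly the claimed H ∝ T² scaling.

```python
import math

# Physical constants (SI, CODATA values)
HBAR = 1.054571817e-34   # J s
K_B  = 1.380649e-23      # J / K
T_P  = 5.391247e-44      # Planck time, s

def hubble_radiation_era(T):
    """H = sqrt(8*pi^3/90) * t_P * omega^2, with omega = k_B*T/hbar (quoted relation)."""
    omega = K_B * T / HBAR
    return math.sqrt(8 * math.pi**3 / 90) * T_P * omega**2

H1 = hubble_radiation_era(1e10)   # roughly 1 s after the Big Bang in standard cosmology
H2 = hubble_radiation_era(2e10)

print(f"H(1e10 K) = {H1:.3e} 1/s")    # order 0.1 1/s
print(f"H(2T)/H(T) = {H2 / H1:.6f}")  # exactly 4: the H proportional to T^2 scaling
```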
Q. Where does the speed of light come from?
The programme derives c = l_P / t_P as a graph-theoretic bandwidth limit. The fundamental substrate is modelled as a degree-3 Planck-scale expander graph G_P. A coherent entropy wave propagating across G_P can traverse at most one edge per Planck time before the graph's spectral gap collapses: by the expander mixing lemma, propagating faster destroys wave coherence and dissolves the signal into incoherent thermal noise. The speed of light is therefore the maximum coherence-preserving propagation rate the underlying graph can sustain. It is not assumed equal to l_P/t_P; that equality is the content of the theorem.
Provenance: v3 · v4
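This summary does not specify the substrate graph, so the sketch below uses the Petersen graph, a standard degree-3 graph with an exactly known spectrum, purely to make the notion of a spectral gap concrete. It is a stand-in, not the programme's graph.

```python
import numpy as np

# Adjacency matrix of the Petersen graph: a standard 3-regular graph used here
# only to illustrate what a spectral gap is.
A = np.zeros((10, 10))
for i in range(5):
    A[i, (i + 1) % 5] = A[(i + 1) % 5, i] = 1                  # outer 5-cycle
    A[5 + i, 5 + (i + 2) % 5] = A[5 + (i + 2) % 5, 5 + i] = 1  # inner pentagram
    A[i, 5 + i] = A[5 + i, i] = 1                              # spokes

eig = np.sort(np.linalg.eigvalsh(A))[::-1]
print(eig.round(6))          # known spectrum: 3 once, 1 five times, -2 four times
gap = eig[0] - eig[1]
print("spectral gap:", gap)  # 3 - 1 = 2; a large gap is what makes mixing fast
```

The expander mixing lemma bounds how quickly a walk on such a graph equidistributes in terms of this gap, which is the quantity the theorem says a faster-than-one-edge-per-tick signal would destroy.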
Q. What is the geometric origin of electromagnetism?
The compact periodic phase coordinate φ of the 𝒲-manifold supports a Kaluza–Klein reduction yielding the Einstein–Maxwell action with a coefficient determined entirely by the phase metric – no free parameters. U(1) gauge invariance is the freedom to locally redefine the entropy wave phase origin. The electromagnetic vector potential emerges from the off-diagonal metric component g_{Sφ} = −1/φ. Magnetic flux is the holonomy of the rotation map around cycles of the local phase graph. The framework does not require unobservable curled-up spatial dimensions; the “fifth dimension” is the entropy wave phase φ, a derived physical coordinate present since v1.
Provenance: v4
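For orientation, the textbook Kaluza–Klein ansatz has the shape below. Conventions vary between sources, and this is the standard form rather than the programme's exact metric, whose fifth coordinate is the phase φ rather than an unobservable spatial circle:

```latex
% Textbook Kaluza-Klein ansatz (conventions vary; shown only to fix ideas):
ds^2 = g_{\mu\nu}\,dx^\mu dx^\nu
     + e^{2\phi}\left(d\varphi + A_\mu\,dx^\mu\right)^2
% Dimensional reduction of the higher-dimensional Einstein-Hilbert action then
% yields 4D Einstein gravity, a Maxwell term F_{\mu\nu}F^{\mu\nu} with a
% dilaton-dependent coupling, and a kinetic term for the scalar \phi.
```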
Q. What is the dilaton field?
The dilaton scalar of the Kaluza–Klein reduction is identified as:
φ = ln(E_P / k_BT)
It is the cosmic temperature written as a geometric quantity – not a new particle. At the Planck epoch, φ = 0: gravity and electromagnetism were equally coupled. At the present epoch (T ≈ 2.7 K), φ ≈ 74: the phase dimension has compactified by a factor of e^74, concentrating the electromagnetic coupling. The gravitational-electromagnetic hierarchy is not fine-tuned; it is the age of the universe written in one number.
Provenance: v4 – Theorem (Dilaton Identification)
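The identification is easy to evaluate numerically. A minimal sketch with CODATA constants (the Planck-energy value and the function name are ours) gives φ ≈ 73 at T = 2.725 K, matching the quoted ≈ 74 up to the convention used for E_P:

```python
import math

K_B = 1.380649e-23  # J / K
E_P = 1.956e9       # Planck energy sqrt(hbar c^5 / G), J

def dilaton(T):
    """phi = ln(E_P / k_B T): the dilaton-as-temperature identification."""
    return math.log(E_P / (K_B * T))

phi_planck = dilaton(E_P / K_B)  # at the Planck epoch, T = E_P/k_B
phi_now = dilaton(2.725)         # at the present CMB temperature

print(f"phi(Planck epoch) = {phi_planck:.1f}")  # 0 by construction
print(f"phi(T = 2.725 K)  = {phi_now:.1f}")     # ~73, i.e. e^phi ~ 10^31
```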
Q. Why does the principle of least action work?
Standard physics takes the Euler–Lagrange variational principle as an axiom. Within the 𝒲-manifold, least-action paths are geodesics along the A-axis: straight lines in Action-coordinate space. The principle is the definition of a straight line in the correct coordinate system, not an independent postulate imposed on nature.
Provenance: v1.02
Q. What is the Wick rotation, geometrically?
The Wick rotation – the substitution t → iτ used throughout quantum field theory – has no accepted physical interpretation in standard physics. Within the 𝒲-manifold it is derived as a decoherence geodesic in the (S, A) plane: a real geometric trajectory in the entropy-action sector of the manifold.
Provenance: v1.02
Q. Is time treated as a fundamental dimension in this framework?
No. A core architectural principle of the framework, established from v1, is that time is chart-local rather than globally primitive. The programme models the local speed of time as Υ = dS/dτ|_{U_T}: the rate at which entropy is processed within an observer's chart. Time emerges from entropy dynamics rather than being posited as a background dimension.
Provenance: v1.02 · v8 · v9
Q. Does the framework say anything about why space has three dimensions?
Yes. In the discrete-geometric development, the programme derives three spatial dimensions as the unique dimensionality compatible with the cycle structure of the replacement-product graph, the isotropy requirements, and the holographic structural conditions of the projection mapping. This emerges from the graph’s combinatorial properties, not from a postulate.
Provenance: v3 · v5.3 (survived hardening step)
Q. What did v12 prove about the atlas structure?
Three narrow structural propositions from the v8–v10 definitions. (1) Nested Accessibility: higher-temperature observers access strictly smaller spectral sectors – the family {U_T} forms a monotone filtration indexed by temperature. (2) Identity Limit: the vertical comparison map between charts approaches the identity as the temperature separation vanishes. (3) Matched-Chart Consistency: when observer and event share the same chart, no vertical transformation is required and reconstruction distortion vanishes identically. Together these establish that the atlas has a coherent ordinary limit, and that the framework's novel content resides exclusively in the chart-mismatch regime.
Provenance: v12 – Propositions 3.1, 3.4, Theorem 3.6
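The three propositions are elementary to illustrate with diagonal spectral projections. The sketch below is a toy construction: the spectrum, cutoff values, and names are invented for illustration, with cutoffs standing in for whatever monotone function of temperature the framework assigns.

```python
import numpy as np

rng = np.random.default_rng(0)
evals = np.sort(rng.uniform(0.0, 10.0, 40))   # toy spectrum standing in for the operator

def proj(cutoff):
    """Spectral truncation projection onto eigenvalues <= cutoff (a toy chart)."""
    return np.diag((evals <= cutoff).astype(float))

P_lo, P_hi = proj(3.0), proj(7.0)

# (1) Nesting: composing the larger chart with the smaller returns the smaller.
print(np.allclose(P_hi @ P_lo, P_lo))                   # True

# (3) Matched charts: applying the same chart twice adds no further distortion.
print(np.allclose(proj(5.0) @ proj(5.0), proj(5.0)))    # True (idempotence)

# (2) Identity limit: as the cutoff separation shrinks, the comparison map
# P_hi - P_lo loses rank and approaches zero.
ranks = [int(np.trace(proj(5.0 + eps) - proj(5.0))) for eps in (2.0, 0.5, 0.01)]
print(ranks)   # non-increasing toward 0
```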
Q. What is the exact formula for truncation-induced mutual information?
Proposition 2.8 of v13 derives the exact mutual information created when a product thermal state is spectrally truncated: I(A:B) = ln Z_V − ⟨ln Q⟩_A − ⟨ln P⟩_B, where Q_i measures the B-weight accessible from each A-eigenstate and P_j vice versa. This is a Jensen gap measuring the non-uniformity of spectral accessibility across subsystems. The mutual information vanishes if and only if the truncation mask is a product set (Corollary 2.9). The non-product geometry of the truncation mask is the sole source of apparent correlation: spectral truncation couples subsystem indices through the threshold condition, creating apparent correlation in an initially uncorrelated state. This exact result applies to the product-thermal-state truncation setting analysed in v13; it is not yet a general proof for arbitrary Hamiltonians or full O31. Post-v15 status: verified to machine precision across all tested product-Hamiltonian cases (4×4, 9×9, 16×16), including those with composite spectral degeneracies, via the forensic evidence chain in the v15 Computational Addendum (Scripts 07b and 07d). The apparent counterexample reported in Script 05 of the v13 addendum was resolved as a basis-selection artifact in a degenerate subspace, not a failure of the proposition.
Provenance: v13 – Proposition 2.8, Corollary 2.9; v15 – Computational Addendum (verification to machine precision)
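The proposition as quoted can be verified directly for small diagonal spectra. The sketch below is our own reconstruction from the stated formula, not the v13 scripts: it truncates a product thermal state with a mask, computes the mutual information both directly and via ln Z_V − ⟨ln Q⟩_A − ⟨ln P⟩_B, and checks the product-mask case of Corollary 2.9.

```python
import numpy as np

def truncated_mutual_info(EA, EB, beta, mask):
    """Mutual information of a spectrally truncated product thermal state.

    Returns (direct, formula): the direct Shannon mutual information of the
    truncated joint distribution, and the quoted Proposition 2.8 expression
    I(A:B) = ln Z_V - <ln Q>_A - <ln P>_B.
    """
    W = np.exp(-beta * np.add.outer(EA, EB)) * mask   # kept Boltzmann weights
    ZV = W.sum()
    p = W / ZV
    pA, pB = p.sum(axis=1), p.sum(axis=0)
    nz = p > 0
    direct = float(np.sum(p[nz] * np.log(p[nz] / np.outer(pA, pB)[nz])))
    # Q_i: B-weight accessible from A-eigenstate i; P_j: vice versa.
    Q = (np.exp(-beta * EB)[None, :] * mask).sum(axis=1)
    P = (np.exp(-beta * EA)[:, None] * mask).sum(axis=0)
    formula = float(np.log(ZV)
                    - pA[Q > 0] @ np.log(Q[Q > 0])
                    - pB[P > 0] @ np.log(P[P > 0]))
    return direct, formula

EA = EB = np.arange(3.0)               # toy diagonal subsystem spectra
tri = np.add.outer(EA, EB) <= 2.0      # non-product truncation mask (threshold cut)
box = np.outer(EA <= 1.0, EB <= 1.0)   # product truncation mask

d, f = truncated_mutual_info(EA, EB, 0.5, tri)
print(f"non-product mask: direct={d:.10f}  formula={f:.10f}")   # equal, and > 0
d0, _ = truncated_mutual_info(EA, EB, 0.5, box)
print(f"product mask:     direct={d0:.3e}")                     # zero (Corollary 2.9)
```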
Q. What is the Space–Time Complementarity Corollary?
Derived from v12's Nested Accessibility (N_acc decreases with T) and the v1 temporal rate (c_S = k_BT/ℏ increases with T): spatial spectral richness and temporal processing rate are inversely related across temperature. Applied to cosmological cycle endpoints, this motivates the proposed asymptotic picture developed in v13 (Hypothesis 4.2): minimal spatial reconstruction at the hot endpoint, maximal spatial richness with a degenerate temporal coordinate at the cold endpoint.
Provenance: v13 – Corollary 4.1 (Derived); Hypothesis 4.2 (Proposed)
The Self-Audit
What the Programme’s Own Hardening Step Found
Preprint v5.3 contains a formal survival analysis: an explicit classification of all prior results into those strengthened by the transition to rigorous discrete mathematics, those unchanged, and those requiring qualification. A programme that names what it has had to qualify is more trustworthy than one that does not.
| Result | Verdict | Notes |
|---|---|---|
| Dilaton-as-temperature identification | Unchanged | Survived transition to discrete definitions intact. |
| Speed of light as spectral-gap limit | Unchanged | Expander mixing lemma argument holds in discrete setting. |
| Three spatial dimensions from cycle structure | Unchanged | Replacement-product graph cycle structure preserved. |
| Einstein–Maxwell action from KK reduction | Unchanged | Phase coordinate compactification argument intact. |
| M–σ relation derivation | Unchanged | Exponent 4.38 at typical galaxy mass survives. |
| 1D CFT entanglement entropy scaling | Strengthened | Rigorous discrete Fisher definition sharpens the cut-state argument. |
| Rotation map as Levi-Civita connection | Strengthened | Discrete graph formulation makes the identification more precise. |
| Full higher-dimensional Ryu–Takayanagi area law | Qualified | 1D scaling holds; higher-dimensional projection remains Open Problem O4. |
v15 – Second Documented Self-Correction: Computational Record Reconciliation
The v5.3 self-audit tabulated above was the programme's first documented self-correction: a theorem-level hardening cycle that re-derived, re-strengthened, or re-qualified each claim under tighter mathematical discipline. In v15 the programme performed a second kind of self-correction: not theorem hardening, but computational record reconciliation.
Script 05 of the v13 Computational Addendum reported a 3-site chain numerical value (0.2152715693) that disagreed with the analytical value from Proposition 2.8 (0.6069288341). Preprint v14 hypothesised this as a “domain boundary” – a regime where the analytical formula required restricted scope. The v15 forensic audit (Scripts 07b, 07c, 07d in the v15 Computational Addendum) reproduced both values, identified the mechanism as a methodological artifact of the original verification script's composite-eigenbasis projection across degenerate subspaces, and confirmed Proposition 2.8 to machine precision. The v14 “domain boundary” framing was explicitly retracted in v15.
The programme preserves the full evidence trail: the original positive v13 result, the discrepant Script 05 output, the failed intermediate probe (Script 07c, with its fixed-threshold design and noisy gap sequence), and the final mechanism confirmation (Script 07d). None of these intermediate artifacts are erased. This is the self-audit discipline operating at the computational record level, in addition to the theorem level. See the “What v14–v15 Added” section below for the full forensic narrative.
What We Propose
Formally Grounded Conjectures
These entries are formally grounded in the 𝒲-manifold structure. Most require further derivation – particularly, formal construction of the Dirac operator D_W – before they achieve theorem status. The v11 entries (wave-particle duality, Cross-Chart Disagreement Vector, Multi-Chart Reconstruction Principle) are derivable from the observer-filter map Π_T without requiring D_W; they carry the Proposed label. All entries below are framework-level proposals, with status notes specifying whether they are derivable extensions, interpretive proposals, or conjectural claims.
Q. How does the framework interpret the Heisenberg Uncertainty Principle?
The programme proposes that uncertainty is the Cramér–Rao bound on the Fisher information coordinate I of the 𝒲-manifold. The metric component g_II = ℏ²/4I is the quantum Fisher metric; the statistical Cramér–Rao lower bound on this metric is structurally identical to ΔxΔp ≥ ℏ/2. In v8, this is deepened: quantum uncertainty is identified as the non-commutative residue – eigenvalues of D_W below the biological chart's thermal floor that the chart cannot resolve. Uncertainty is not fundamental fuzziness; it is the shadow of what the observer chart cannot see.
Provenance: v1.02 · v5 · v8
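The ΔxΔp ≥ ℏ/2 bound itself is standard and easy to reproduce numerically, since a Gaussian packet saturates it. The sketch below (ħ = 1; grid parameters are ours) checks the saturated bound on a discretised Gaussian, with the momentum-space spread obtained via FFT:

```python
import numpy as np

# Discretised Gaussian wave packet; check that dx * dp = 1/2 (hbar = 1).
N, L = 4096, 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
psi = np.exp(-x**2 / 2)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalise

prob_x = np.abs(psi)**2 * dx
dx_spread = np.sqrt(np.sum(prob_x * x**2) - np.sum(prob_x * x)**2)

# Momentum-space amplitudes via FFT; p = 2*pi * FFT frequency.
p = 2 * np.pi * np.fft.fftfreq(N, d=dx)
phi_p = np.fft.fft(psi)
prob_p = np.abs(phi_p)**2 / np.sum(np.abs(phi_p)**2)
dp_spread = np.sqrt(np.sum(prob_p * p**2) - np.sum(prob_p * p)**2)

uncertainty = dx_spread * dp_spread
print(f"dx * dp = {uncertainty:.6f}")   # ~0.5, the saturated Heisenberg bound
```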
Q. How does the framework interpret wave-particle duality?
The programme proposes that duality is a chart artefact. The entropy wave propagating in the (S, φ) sector of the 𝒲-manifold is the primary object; the discrete Planck-scale graph G_P is the substrate. In v11, this is formalised as a consequence of the observer-filter mechanism. The observer-filter map Π_T projects the full event state |ψ⟩ onto the eigenspace of the candidate Dirac operator D_W below the thermal cutoff λ_max(T). In the biological observer chart U_bio, λ_max(T_bio) lies below the eigenvalue range associated with the phase coordinate φ. The φ structure – which governs the periodic entropy wave – is treated as falling below the operative chart threshold and is therefore not reconstructed in that chart; the remaining projection onto the Action and Fisher Information axes produces a localised, particle-like reconstruction. At a higher thermal baseline T′, the phase structure lies within the resolved band and the same event reconstructs as wave-like. Neither is the fundamental object; the 𝒲-manifold entropy wave is.
Provenance: v3 · v4 · v8 · v11
Q. What does the framework say a singularity is?
The programme proposes that Big Bang and black-hole singularities are chart pathologies – failures of the observer's coordinate map – not failures of physical reality. They are Jacobian singularities: breakdowns of the chosen description at the edges of its validity domain, precisely analogous to the failure of latitude-longitude coordinates at the Earth's poles. The Big Bang is proposed as the boundary of the entropy coordinate's domain (S → 0). The black-hole interior is proposed as the saturation point of the complexity coordinate (C → C_max). In neither case does the underlying 𝒲-manifold become undefined; a different chart is needed.
Provenance: v1.02 · v2 · v8
Q. How does the framework interpret time dilation?
Building on the derived principle that time is chart-local, the framework proposes that relativistic time dilation is the local observer chart registering a different rate of entropic processing. Two charts at different thermal or kinematic states generate entropy at different rates; what we measure as “time slowing down” is the difference in those rates projected onto the biological observer’s reconstruction of a shared timeline.
Provenance: v1.02 · v8 · v9
Q. What is the dark sector in this framework?
Dark energy is hypothesised to be the spectral action cost Tr f(D_W/Λ) – the cost of the manifold's self-description. Dark matter is hypothesised to be the gravitational footprint of the non-commutative residue of the 𝒲-algebra: spectral structure that exists in the manifold but falls below the biological chart's thermal resolution floor. The 95% of the energy budget we cannot directly observe is, on this proposal, not a population of invisible particles – it is the portion of the manifold's structure that the biological chart shadows.
Provenance: v6 · v8
Q. How does the framework interpret quantum non-locality?
The programme proposes that non-locality is a projection illusion. The fundamental arena G_P is an expander graph with logarithmic diameter: the graph-theoretic path length between any two nodes is extremely short relative to the total number of nodes. Two particles that appear billions of light-years apart in the smooth 4D spacetime projection may be adjacent vertices in the underlying G_P topology. They are not communicating faster than light; they are standing next to each other in the fundamental geometry.
Provenance: v3 · v5 · v6
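Logarithmic diameter is a generic property of expander families and can be illustrated with a standard explicit example: vertices Z_p with neighbours x ± 1 and x⁻¹ mod p, a well-known degree-3 expander-type construction (again a stand-in, not the programme's graph). A plain cycle on the same vertices has diameter p/2; adding the single inversion chord collapses it to low double digits:

```python
from collections import deque

def diameter(adj):
    """Exact graph diameter via breadth-first search from every vertex."""
    n, best = len(adj), 0
    for s in range(n):
        dist = [-1] * n
        dist[s] = 0
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if dist[v] < 0:
                    dist[v] = dist[u] + 1
                    q.append(v)
        best = max(best, max(dist))
    return best

p = 1021  # prime modulus
# Degree-3 graph on Z_p: x ~ x+1, x-1, x^(-1) mod p (0 mapped to itself).
adj = [{(x + 1) % p, (x - 1) % p, pow(x, p - 2, p) if x else 0} for x in range(p)]
cycle = [{(x + 1) % p, (x - 1) % p} for x in range(p)]

d_cycle = diameter(cycle)
d_expander = diameter(adj)
print("plain cycle diameter:", d_cycle)     # 510: grows linearly with p
print("with inversion chord:", d_expander)  # low double digits: logarithmic growth
```

The design point is the contrast: the cycle's diameter is Θ(p), while the inversion edges give the short cross-graph paths that the non-locality proposal appeals to.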
Q. How does the framework model decoherence?
Decoherence is not introduced as an external collapse postulate. It is modelled as a geometric trajectory in the (S, A) sector of the 𝒲-manifold – the same sector in which the Wick rotation appears as a decoherence geodesic. The quantum-to-classical transition is, on this proposal, a path in entropy-action space: geometry rather than a separate physical law.
Provenance: v1.02
Q. What is the Cross-Chart Disagreement Vector, and why does it matter?
It is the primary new formal object introduced in v11. For two detectors with thermal baselines T₁ < T₂ measuring the same event |ψ⟩, the Cross-Chart Disagreement Vector is defined as |Δ₁,₂⟩ = (Π_{T₂} − Π_{T₁})|ψ⟩ – the difference between the two chart projections of the event. Proposition 5.1 (v11) establishes that if the event has non-zero spectral amplitude in the eigenvalue band (λ_max(T₁), λ_max(T₂)] of D_W, then the disagreement vector is non-zero and its norm encodes event structure that neither detector alone recovers. The spectral support condition is important: if the event has no structure in that specific inter-chart band, the two detectors will agree perfectly even at different temperatures. Disagreement is not automatically information; it is information only when the event occupies the spectral gap between the two charts.
The supporting operator R₁→₂ = Π_{T₂}(I − Π_{T₁}) isolates what is noise in chart 1 and signal in chart 2. From the spectral nesting property (Π_{T₂}Π_{T₁} = Π_{T₁} for T₁ < T₂), this simplifies to R₁→₂ = Π_{T₂} − Π_{T₁}, showing that R₁→₂|ψ⟩ = |Δ₁,₂⟩ exactly.
Provenance: v11
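The algebra of the disagreement vector and the supporting operator is straightforward to check with toy diagonal projections. The spectrum, cutoffs, and state below are invented for illustration:

```python
import numpy as np

evals = np.linspace(0.1, 5.0, 50)   # toy spectrum of a diagonal candidate operator
rng = np.random.default_rng(1)
psi = rng.standard_normal(50)
psi /= np.linalg.norm(psi)          # toy event state

def chart(lmax):
    """Observer-filter projection onto eigenvalues <= lmax (toy chart)."""
    return np.diag((evals <= lmax).astype(float))

P1, P2 = chart(2.0), chart(4.0)     # two thermal baselines with nested cutoffs

delta = (P2 - P1) @ psi             # cross-chart disagreement vector
R = P2 @ (np.eye(50) - P1)          # "noise in chart 1, signal in chart 2"

print(np.allclose(P2 @ P1, P1))     # True: spectral nesting
print(np.allclose(R, P2 - P1))      # True: R simplifies via nesting
print(np.allclose(R @ psi, delta))  # True

# The disagreement is exactly the event's amplitude in the inter-chart band:
band = (evals > 2.0) & (evals <= 4.0)
print(np.linalg.norm(delta), np.linalg.norm(psi[band]))   # identical norms

# No support in the band means the two charts agree perfectly:
psi0 = np.where(band, 0.0, psi)
print(np.linalg.norm((P2 - P1) @ psi0))                   # 0.0
```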
Q. What is the Multi-Chart Reconstruction Principle?
A methodological consequence of the signal-noise inversion, stated formally in v11: no single detector-native chart should be presumed privileged for the complete reconstruction of a complex event. Coordinated inference across multiple detector-native charts with distinct thermal baselines may recover event structure inaccessible to any individual chart. The principle implies that some of what experimental practice calibrates away – treating inter-chart disagreement as noise to be minimised – may be the signal whose native chart lies above the instrument's thermal floor. The programme does not prescribe a specific reconstruction algorithm; that is future work. It provides the principled reason to expect that cross-chart inference is informationally richer than single-chart inference.
Provenance: v11
Q. What does the framework propose about the arrow of time?
The microscopic laws of physics are time-reversible, yet macroscopic experience is irreversible. The programme proposes, in v10 as an ansatz, that the candidate Dirac operator D_W governs a fundamentally unitary and reversible base geometry. Irreversibility is proposed to arise because macroscopic observers are confined to thermal charts with a strict spectral truncation floor Π_T. Low-eigenvalue structural information is continuously projected out of the observer's accessible chart as the universe evolves. What we experience as irreversibility is the unavoidable, continuous loss of below-floor spectral information through the observer's truncated lens.
Provenance: v10 – Ansatz
What We Predict
Falsifiable Empirical Consequences
These are representative predictions drawn from the programme across v1–v11. Note: Version 11's primary contribution is methodological and interpretive rather than a new set of numbered empirical predictions. V11 reframes existing predictions (particularly the chart-mismatch residuals from v10) through the cross-chart disagreement formalism and motivates a class of multi-chart measurement protocols; these are labelled Conjectural pending experimental protocol design. v14–v15 did not primarily add new empirical predictions; they clarified the computational status of existing claims (see “What v14–v15 Added” below).
P1 – Hubble–Decoherence Relation. In the radiation-dominated era: H = √(8π³/90) · t_P · ω², with ω = k_BT/ℏ and the Planck time as the unique proportionality constant. A precise quantitative relation not built into ΛCDM by construction. Testable against precision cosmological surveys. (v1.02)
P13 – M–σ Relation Bending. The black-hole mass–velocity-dispersion exponent is mass-dependent, ranging from 4.0 to 5.0, with the value 4.38 at typical galaxy mass derived from first principles. The preprint presents this as consistent with existing survey data (McConnell & Ma 2013); independent verification remains an open empirical task. (v2)
P19 – Entanglement Entropy Scaling. For n-particle entangled states with n > 3, entanglement entropy scales logarithmically with n, not linearly as pairwise-additive models predict. Testable with current ion-trap and photonic entanglement experiments. (v6)
P20 – Measurement Uncertainty Floor. Ultra-precision measurements at millikelvin temperatures should show a systematic uncertainty floor scaling as ℏ/k_BT_apparatus that does not decrease with improved instrumentation – a non-commutative residue effect, not an engineering limitation. (v6 · sharpened v12)
P23 – Chart-Mismatch Reconstruction Residual. For two co-located detectors at different effective temperatures, after standard calibration, the working hypothesis is that any residual reconstruction disagreement scales with a thermal information-geometric separation measure, for which the Fisher information distance is a computable first proxy. First-sensitive variables: timing resolution, spectral channel fragmentation, inferred resonance lifetimes. Null result: if the residual is consistent with zero across all accessible temperature separations, the mechanism is falsified. (v12 – Prediction 4.1)
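P23 names the Fisher information distance as a computable first proxy but prescribes no protocol. As one possible reading, for a classical thermal family p_i(T) ∝ e^(−E_i/k_BT) the Fisher–Rao length between two temperatures is ∫√I(T) dT with I(T) = Var(E)/T⁴ (units k_B = 1). The sketch below computes that length for a toy spectrum; everything here is our illustration, not the v12 definition:

```python
import numpy as np

E = np.arange(8.0)   # toy energy spectrum (units with k_B = 1)

def energy_var(T):
    """Variance of energy in the thermal distribution at temperature T."""
    w = np.exp(-E / T)
    p = w / w.sum()
    mean = p @ E
    return p @ (E - mean) ** 2

def fisher_distance(T1, T2, steps=2000):
    """Fisher-Rao length of the thermal family between T1 and T2:
    I(T) = Var(E)/T^4, distance = integral of sqrt(I(T)) dT (trapezoid rule)."""
    Ts = np.linspace(T1, T2, steps)
    y = np.sqrt(np.array([energy_var(T) for T in Ts])) / Ts**2
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(Ts)))

d12 = fisher_distance(1.0, 2.0)
d13 = fisher_distance(1.0, 3.0)
print(f"d_F(1.0, 2.0) = {d12:.4f}")
print(f"d_F(1.0, 3.0) = {d13:.4f}")                       # larger separation, larger distance
print(f"d_F(1.0, 1.0+1e-6) = {fisher_distance(1.0, 1.0 + 1e-6):.2e}")  # vanishes
```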
What v12–v13 Added
The Programme’s Recent Advances
Versions 12 and 13 mark a transition from architectural development to computation and consolidation. V12 renamed the series, proved structural propositions, and stated falsification criteria. V13 performed the first actual calculations in the programme’s history.
Q. Why was the series renamed from “Entropy Waves” to “Manifold Relativity”?
The mature framework does not model entropy as a local scalar field propagating on standard spacetime. The historical title “Entropy Waves” invited precisely the category of critique that addresses a theory the programme does not propose. From v12, the programme is Manifold Relativity: Observer-Dependent Spectral Accessibility.
Provenance: v12
Q. Can the programme be falsified?
Yes. V12 states explicit falsification criteria: (1) If co-located detectors at different temperatures show no chart-mismatch reconstruction residual (P23). (2) If the qualitative sub-additivity mechanism from spectral truncation fails to generalise beyond the current toy/scaling regime, or if no viable exact composition law emerges from further analysis (O31/O38). (3) If the spectral filtration is not monotone. (4) If the millikelvin measurement floor is not found (P20). (5) If the Hubble-decoherence scaling fails (P1). The programme states explicit failure conditions.
Provenance: v12 – Section 5
Q. What did the O31 computation find?
V13 performed the first actual computation in the programme. Composite toy Dirac operators (4 to 16 dimensions) were spectrally truncated and the resulting entropy composition was tested. Confirmed: spectral truncation of product states always creates sub-additive entropy; the full spectrum recovers exact additivity; sub-additivity strength scales monotonically with truncation severity. Not confirmed: the specific κ-addition functional form is not uniquely selected – alternative sub-additive models fit comparably well. The qualitative mechanism of the v7 bridge conjecture is supported; the exact quantitative composition law remains open.
Provenance: v13 – Section 2
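The two confirmed findings – sub-additivity under truncation and exact additivity at full spectrum – are easy to reproduce for a toy spectrum. The sketch below is our own construction, not the v13 scripts: it truncates a product thermal state to the lowest total energies and compares S_joint with S_A + S_B.

```python
import numpy as np

def entropies_after_truncation(E, beta, keep):
    """Truncate the 2-subsystem product thermal state to the `keep` lowest
    total energies, renormalise, and return (S_joint, S_A + S_B)."""
    tot = np.add.outer(E, E)
    order = np.argsort(tot, axis=None)
    mask = np.zeros(tot.size, dtype=bool)
    mask[order[:keep]] = True
    p = np.exp(-beta * tot) * mask.reshape(tot.shape)
    p /= p.sum()
    pA, pB = p.sum(axis=1), p.sum(axis=0)
    H = lambda q: float(-np.sum(q[q > 0] * np.log(q[q > 0])))
    return H(p.ravel()), H(pA) + H(pB)

E = np.arange(4.0)   # toy subsystem spectrum
results = {k: entropies_after_truncation(E, 0.5, k) for k in (16, 12, 8, 4)}
for k, (S, Ssum) in results.items():
    # deficit = mutual information: 0 at full spectrum, > 0 under truncation
    print(f"keep={k:2d}  S_joint={S:.4f}  S_A+S_B={Ssum:.4f}  deficit={Ssum - S:.4f}")
```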
Q. What is the 2D EFE vacuity correction?
The v13 planning cycle proposed solving for g_IE by requiring the (I, E) sector's Ricci tensor to recover the vacuum Einstein equations. This fails: in two dimensions, the Einstein tensor vanishes identically for any metric. The “backward EFE” approach places no constraint on g_IE. The actual constraints must operate through the 3D replacement-product lift (new Open Problem O37), the determinant condition, Ryu–Takayanagi compatibility, and rotation-map convergence.
Provenance: v13 – Section 3
Q. What is the 3+1 Emergence Hypothesis?
V13 organises the spatial and temporal claims into a unified statement (Hypothesis 5.1): (x, y, z) arise from the observer's reconstruction of the accessible (I, E)-sector geometry, lifted to three dimensions through the replacement product. The temporal coordinate t arises as a chart-dependent ordering parameter associated with the entropy processing rate c_S(T). Different observer charts reconstruct different effective 3+1 descriptions. Cosmological expansion is reinterpreted as spectral enrichment as the universe cools. This hypothesis is Proposed, not Derived – its derivation depends on resolving O1 and O37.
Provenance: v13 – Hypothesis 5.1
Q. Does the programme preserve unexpected or negative computational results?
Yes. V13 ships with a computational addendum (five Python scripts, raw outputs, and a README) preserving the full evidence trail: the initial positive signal, the scaling extension, the epistemic correction (catching a tautological overclaim), the independent-κ test (showing non-uniqueness), and the analytical formula derivation. Misleading first-pass successes, weaker alternative fits, and negative checks are part of the scientific record, not discarded as noise. The v13 Computational Addendum is preserved unchanged and is now joined by a separate v15 Computational Addendum (see “What v14–v15 Added” below) containing the forensic evidence chain that resolves the v13/v14 3-site discrepancy.
Provenance: v13 – Computational Addendum; v15 – Computational Addendum (separate companion)
What v14–v15 Added
The Reconciliation Cycle
v14 and v15 form a coupled reconciliation cycle, not a new-claims cycle. v14 hardened v13 under referee pressure and introduced a “domain boundary” remark (rem:domain) to bound the scope of Proposition 2.8 in light of a 3-site numerical discrepancy reported in Script 05 of the v13 Computational Addendum. v15 performed a forensic audit, identified the discrepancy as a methodological artifact of the original verification script, and explicitly retracted the v14 “domain boundary” framing. The resolution is the strongest single piece of adversarial-review evidence the programme has produced so far.
Q. What is the v14βv15 cycle about?
It is a reconciliation / retraction cycle at the Computational epistemic tier, not a new-claims cycle. v14 was a referee-hardened submission draft; v15 is a computational-record reconciliation and retraction release. Neither version introduces new theoretical content, new Open Problems, or new empirical predictions. The substantive contribution of v15 is a forensic evidence chain (Scripts 07b, 07c, 07d in the v15 Computational Addendum) that reproduces the v13/v14 3-site numerical discrepancy, diagnoses its mechanism, and confirms Proposition 2.8 to machine precision in the product-Hamiltonian regime.
Provenance: v14 β referee-hardened submission draft; v15 β “Computational Record Reconciliation and Retraction”
Q. What did v14 add?
v14 was a referee-hardening pass on v13. Its main additions were: domain-of-validity bounding for Proposition 2.8 via a new Remark (rem:domain), motivated by the 3-site chain discrepancy reported in Script 05 of the v13 Computational Addendum; expanded literature positioning relative to adjacent work; and conclusion-section sharpening on the Ξ±-vs-ΞΊ non-uniqueness question for O38. The rem:domain framing hypothesised that the 3-site discrepancy marked a “domain boundary” beyond which Proposition 2.8 required restricted scope. This framing was explicitly retracted in v15 after the forensic audit.
Provenance: v14 β Remark rem:domain (later retracted); literature positioning; O38 conclusion sharpening
Q. Did the analytical formula fail at the 3-site chain in v13?
No, but the programme briefly thought it did. Script 05 of the v13 Computational Addendum reported a numerical mutual information of 0.2152715693 at the 3-site chain (9Γ9, Ξ²=0.5, 4/9 retention), while the analytical value from Proposition 2.8 was 0.6069288341. Preprint v14 hypothesised this as a “domain boundary” where the analytical formula broke down under some interaction-related structure. The v15 forensic audit proved this was a methodological inconsistency inside a degenerate subspace, not a theorem failure.
The mechanism is as follows. The 3-site composite Hamiltonian has a spectrum {β2β2, ββ2, ββ2, 0, 0, 0, +β2, +β2, +2β2} with multiplicities 1, 2, 3, 2, 1. Only 2 of the 9 composite eigenvectors are pure product states; the remaining 7 are superpositions inside the degenerate subspaces. The original Script 05 used numpy.linalg.eigh, which returns an arbitrary orthonormal basis within each degenerate subspace. When the 4/9 retention threshold partially retained a degenerate subspace, the truncated state produced by the numerical solver was not the same truncated state referenced by the analytical coordinate mask β the two were related by an arbitrary basis rotation. Both values were correct, but they described two different truncated states.
When the composite degeneracies were explicitly lifted by an asymmetric perturbation (Script 07d), the gap between the analytical and numerical values collapsed from 3.92 Γ 10β1 at Ξ΅ = 0 to β€ 10β15 by Ξ΅ β₯ 10β3. The v14 “domain boundary” hypothesis was explicitly retracted, and Proposition 2.8 is now verified to machine precision across all tested product-Hamiltonian cases, including those with composite spectral degeneracies.
Provenance: v15 β Β§C.2 (forensic probe narrative); Remark rem:domain (rewritten); v15 Computational Addendum Β§4 (Script 07b), Β§6 (Script 07d)
Q. What are Scripts 07b, 07c, and 07d?
The three-script forensic evidence chain embedded in the v15 Computational Addendum:
- Script 07b β Mechanism Diagnosis. A four-stage forensic probe. Stage 1 reproduces the three method values at the 3-site case (Method 1 analytical, Method 2A numerical in product basis, Method 2B numerical in composite eigenbasis), confirming that Methods 1 and 2A both give 0.6069288341 while Method 2B gives 0.2152715693. Stage 2 diagonalises the composite Hamiltonian and confirms that 2 of 9 eigenvectors are pure product states and 7 are superpositions within degenerate subspaces. Stage 3 sweeps the full v13 test matrix and finds three Method 2B disagreement cases, each exactly where the truncation threshold partially retains a composite degenerate subspace. Stage 4 records the final diagnosis. Build-gate for the v15 addendum PDF.
- Script 07c β Asymmetric Perturbation with Fixed Threshold. The preserved failed intermediate probe. Designed to lift the composite degeneracies via an asymmetric perturbation (which it does successfully), but held the threshold ΞΈ = β2 fixed across the sweep, causing the retention count to drift as eigenvalues crossed the threshold. The resulting gap sequence is noisy rather than monotonically convergent. Retroactively framed in the v15.0 manuscript as a failed intermediate probe; preserved unchanged in the addendum per the programme’s discipline of not erasing intermediate findings. Not a build-gate.
- Script 07d β Asymmetric Perturbation with Fixed Retention. The mechanism confirmation. Same asymmetric perturbation as 07c, but with the retention count fixed at N = 4 (the v13 4/9 case) by selecting the top N composite eigenvalues by magnitude at each Ξ΅. The gap collapses cleanly from 3.92 Γ 10β1 at Ξ΅ = 0 to β€ 10β15 by Ξ΅ β₯ 10β3. Two supplementary sweeps at N = 2 and N = 6 act as negative controls and show machine-precision agreement throughout, confirming that the discrepancy mechanism is specific to partial-degenerate-subspace retention, not to degeneracy in general. Build-gate for the v15 addendum PDF.
Together the three scripts constitute a reproducible forensic record: one probe that fails (07c), one that diagnoses (07b), one that confirms the mechanism (07d). The full source and outputs are embedded verbatim in the v15 Computational Addendum.
Provenance: v15 Computational Addendum β Β§Β§4β6 (script narratives); Appendices C and D (embedded source and outputs)
Q. What is Proposition 2.8 verified in, and what is it not verified in?
v15 establishes Proposition 2.8 in the product-Hamiltonian regime DAB = DA β π + π β DB across all tested retention levels and system sizes (4Γ4, 9Γ9, 16Γ16), including cases with composite spectral degeneracies. Under a properly-constructed product-basis coordinate mask, the numerical mutual information agrees with the analytical formula to machine precision at every tested case.
v15 does not establish Proposition 2.8 for genuinely interacting composite Hamiltonians (i.e., Hamiltonians that cannot be written as the sum of local-only terms). Extension of the verification chain to that regime is part of Open Problem O38 and remains open. The v15 cycle sharpens the understanding of what the question is, but does not close it.
Provenance: v15 β Β§C.2 (scope statement); v15 Computational Addendum Β§7 (Scope of Verification)
Q. Did v15 close any Open Problems?
No. v15 is a record-reconciliation release at the Computational epistemic tier. It neither closes O38 (the Ξ±/ΞΊ functional-form determination) nor introduces any new Open Problems. The mechanism behind the v13/v14 3-site discrepancy is now fully understood, but the analytical functional-form question for O38 is unchanged. No O40 is minted. The programme’s Open Problems list after v15 is structurally identical to the list after v13 and v14: O1 (with the O37 dependency), O4, O31 (with the O38 dependency), O35, O36, O37, O38, O39, and the External Review frontier. The post-v15 update to O38 is a status sharpening only β not a closure β and is reflected in the “What Remains Open” section below.
Provenance: v15 β Β§13 (Open Problems, unchanged from v14)
Q. Where can I find the v15 Computational Addendum?
The v15 Computational Addendum is a separate self-contained PDF companion to preprint v15.0, available on the programme website at manifold-relativity-programme.org under the v15 materials. The PDF is a 36-page single-file artifact containing the forensic chain narrative, the full source of Scripts 07b, 07c, and 07d as typeset listings, the verbatim console outputs of all three scripts, an in-PDF integrity hash table, and a build-and-verification note. The document is published under a three-tier payload-hash integrity model: the title page displays the SHA-256 of the embedded content payload (byte-identical to the source files used at build time), a per-file hash table appears in Appendix B, and the final published-PDF MD5/SHA-256 are listed on the website download page alongside the PDF itself.
Provenance: v15 Computational Addendum β published 2026-04
What Remains Open
The Programme’s Explicit Frontier
A credible programme states clearly what it has not yet established. The following are open problems formally indexed in the preprint series.
- O1 β Spatial Metric gIE(I,E) (Core): Derivation of the off-diagonal spatial metric component. The naive backward-EFE approach is vacuous in 2D (v13 correction). The constraint must operate through the 3D replacement product lift (O37). Status: open, constraint structure clarified.
- O2 β Full Einstein Field Equations: Recovery from the Ricci tensor of the emergent 3D spatial metric. Depends on O1 and O37.
- O4 β Higher-Dimensional RyuβTakayanagi: The 1D CFT scaling is established; the complete area-law in arbitrary higher dimensions requires a projection theorem mapping discrete cut sets to codimension-2 extremal surfaces in emergent spatial geometry.
- O16 β Strong & Weak Forces: Unification requiring non-zero gIΟ and gEΟ couplings beyond the electromagnetic KK reduction established in v4.
- O25 / DW Construction (Core): Formal construction of the π²-manifold Dirac operator as a full spectral triple. Version 10 introduced the candidate bare operator DW(0); this is a directional advance, not a completed proof.
- O31 β Composition Law from Spectral Truncation: Status: qualitative mechanism supported at toy/scaling level (v13). Sub-additivity from truncation is confirmed across 4β16 dimensions; exact ΞΊ-addition functional form not uniquely selected. Exact composition law remains open (O38).
- O35 β Neutrino Stress Test: Neutrinos provide a bounded stress test of the chart-matching framework. Indexed in v11.
- O36 β RG-Flow Identification: Whether the vertical comparison maps satisfy the formal properties of a Wilsonian RG transformation. Indexed in v12.
- O37 β 3D Metric from Replacement Product (New, v13): Construct the explicit three-dimensional spatial metric from the (I,E) base metric and the v3 replacement product cycle structure. Required before O1 can be constrained by EFE recovery.
- O38 β O31 Functional Form (New, v13): Determine analytically whether the sub-additive defect takes the specific ΞΊ-addition form, or a different sub-additive form, or a more complex function. Test with non-product Hamiltonians. Post-v15 status: the mechanism behind the v13/v14 3-site discrepancy is now understood (resolved in the v15 Computational Addendum as a basis-selection artifact in a degenerate subspace, not a failure of Proposition 2.8); the analytical functional-form question itself remains open.
- O39 β Temporal Emergence Map (New, v13): Rigorous construction of the temporal coordinate from the (S,Ο) sector, comparable in precision to the spatial emergence programme.
- External Review: The programme’s internal multi-node peer-review process (CAC) is rigorous and adversarial, but endogenous. Community response and expert critique are the necessary next frontier that internal review cannot substitute for.
Manifold Relativity Programme Β· Preprints v1βv15
Paul E. Sorvik, Principal Investigator Β· ORCID 0009-0008-5717-7110
manifold-relativity-programme.org
Developed through the Collaborative Augmented Consciousness (CAC) methodology
Claude (Anthropic) Β· Gemini (Google DeepMind) Β· ChatGPT (OpenAI)
All claims on this page are assertions within the Manifold Relativity framework.
Epistemic labels β Derived Β· Proposed Β· Predicted Β· Open β are maintained throughout. Status notes may use “conjectural” as descriptive prose under the Proposed heading.