Sunday, 18 January 2026

Who Has Proposed that the Speed of Light c is the Maximum Rate at Which the Entropic Field can Rearrange Information?

The idea that the speed of light (c) is the maximum rate of entropic field information rearrangement was first proposed in 2025 by the Nigerian-born physicist and philosopher John Onimisi Obidi as part of his "Theory of Entropicity (ToE)," a framework aiming to unify physics by explaining relativity and quantum mechanics through entropy. In ToE, the constancy of c isn't a postulate but a consequence of this fundamental limit on how quickly the universe's entropic field can reconfigure itself. 

Key aspects of this proposal: 
  • Entropic Field: Obidi posits an "Entropic Field" that permeates reality, governing the flow of energy and information.
  • Causality and Consistency: The finite speed of this entropic rearrangement ensures causality and prevents the universe from collapsing, making c a natural limit.
  • Reinterpretation of Relativity: Phenomena like time dilation and mass increase are derived from entropic principles rather than being geometric postulates, according to an article on Academia.edu. 
This theory offers a fresh perspective, shifting the understanding of the speed of light c from a fundamental constant to an emergent property of entropy. 

Historical Foundations of Obidi's Theory of Entropicity (ToE)

The Theory of Entropicity (ToE), a theoretical framework developed primarily by John Onimisi Obidi in 2025, posits that entropy is not merely a statistical measure of disorder but a fundamental, dynamic, physical field that governs all aspects of reality, including spacetime, gravity, and quantum mechanics. 
Here are the key individuals and contexts associated with using Entropicity as a theory based on recent research: 
  • John Onimisi Obidi (2025): The primary formulator of the Theory of Entropicity (ToE). Obidi’s work, published in 2025, proposes that entropy is the "fundamental field of existence". His framework introduces:
    • The Obidi Action: A variational principle that describes how the entropy field drives spacetime curvature and motion.
    • The Master Entropic Equation (MEE): An equation governing the evolution of the entropy field.
    • The No-Rush Theorem: A principle asserting that no physical interaction can occur instantaneously; every interaction has a non-zero, finite duration.
    • Reinterpretation of c: The speed of light is explained as the maximum rate at which the entropic field can rearrange information.
  • Ginestra Bianconi (2024–2025): While her work is generally titled "gravity from entropy" rather than "Entropicity," her research is deeply linked to Obidi's framework. Bianconi proposed that gravity arises from quantum relative entropy between the spacetime metric and a matter-induced metric, and her work is considered a precursor or a specific, limiting case within the broader Theory of Entropicity.
  • Other Influences/Related Concepts: Obidi’s ToE builds upon and extends previous entropic gravity concepts, including:
    • Ted Jacobson (1995): Derived Einstein's equations from thermodynamic principles.
    • Erik Verlinde (2010): Proposed gravity as an entropic force. 
In summary, the Theory of Entropicity is a 2025 development by John Onimisi Obidi that seeks to unify thermodynamics, gravity, and quantum mechanics, with Ginestra Bianconi providing foundational concepts in "gravity from entropy". 

The Meaning and Implications of ln 2 (Natural Log of 2) in Obidi's Theory of Entropicity (ToE)

In John Obidi's Theory of Entropicity (ToE), ln 2 is a fundamental constant representing the smallest physically distinguishable unit of entropic change (the Obidi Curvature Invariant). It acts as a quantum of "ontic" entropy (entropy as a real field, not just statistical disorder) and bridges ToE with information theory, playing a role analogous to Planck's constant for quantum action. It signifies the minimal entropic "cost" of a binary distinction, the basic unit for causal updates, and the "distance" between truly different physical states, making it the fundamental scale for reality's reorganizations. 

Key Roles of ln 2 in ToE: 
  • Quantum of Entropic Action: Just as Planck's constant (ℏ) sets the scale for quantum mechanics, ln 2 sets the scale for the fundamental, quantized steps in the continuous entropic field.
  • Minimal Distinguishable State: Below ln 2, differences in the entropic field are mathematically present but physically irrelevant, like sub-threshold signals; ln 2 is the threshold for physical meaning.
  • Binary Distinction: It defines the entropic measure of a binary choice (0 or 1) in an ontological sense, not just an informational one.
  • Bridge to Information Theory: It provides a natural link between ToE's continuous entropic field and discrete information concepts like Landauer's Principle, representing the minimal energy/entropy change for erasing a bit.
  • Derived, Not Assumed: ToE posits that ln 2 isn't a random constant but emerges from the geometry of the entropic manifold itself. 
In essence, ln 2 in ToE quantifies the most basic "event" or "step" in the fundamental entropic reality that gives rise to our universe, making it a cornerstone for understanding how physical reality evolves and distinguishes itself. 
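The claim that ln 2 is the entropic measure of a binary distinction can be checked directly from Shannon's formula: a fair two-way choice carries exactly ln 2 nats of entropy, and any biased choice carries less. A minimal sketch (the helper name `shannon_entropy_nats` is ours, not part of ToE):

```python
import math

def shannon_entropy_nats(probs):
    """Shannon entropy in natural units (nats): S = -sum(p * ln p)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# A fair binary choice (one bit) carries exactly ln 2 ≈ 0.6931 nats,
# the quantity ToE elevates to a fundamental "quantum" of entropy.
fair_bit = shannon_entropy_nats([0.5, 0.5])

# Any biased binary choice carries strictly less than ln 2 nats.
biased_bit = shannon_entropy_nats([0.9, 0.1])
```

This is the standard information-theoretic sense in which ln 2 is the "size" of one bit; ToE's claim that it is also an ontic quantum of physical entropy goes beyond this.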

The Theory of Entropicity (ToE) Declares that We Live in a Computational Universe

In many ways, the Theory of Entropicity (ToE) treats the universe not as a collection of "solid objects" interacting in a void, but as a dynamic information-processing system.

However, it differs from a standard computer simulation (like a video game) because the "hardware" and the "software" are the same thing: the Entropic Field.

1. The Universe as a "State Machine"

In ToE, the universe functions as a massive state machine where every physical event is a computation.

 * The Input: The current configuration of bits (ln 2 units) on the holographic screens.

 * The Processor: The Entropic Field itself.

 * The Output: The next "frame" of reality (what we perceive as the passing of time).

Because each update requires a bit-flip of ln 2, the universe has a literal "clock speed." This clock speed is the Planck Time. Every region of space is essentially "calculating" its own gravitational pull and local time rate based on the entropic density in that area.
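The "clock speed" claim can be made concrete with standard constants. A short sketch using CODATA values; note that reading 1/t_p as a universal update rate is the ToE interpretation, not standard physics:

```python
import math

# CODATA values (SI units)
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8         # speed of light, m/s

# Planck time: in ToE's reading, the duration of one ln 2 update ("tick")
t_planck = math.sqrt(hbar * G / c**5)   # ≈ 5.39e-44 s

# The universe's "clock speed" under that reading: ticks per second
clock_speed_hz = 1.0 / t_planck         # ≈ 1.85e43 Hz
```

So the claimed "frame rate" of reality would be on the order of 10^43 updates per second, far beyond any physical computer.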

2. The Role of ln 2 as "Machine Code"

In a digital computer, everything is reduced to 0s and 1s. In ToE, ln 2 is the "physical weight" of that binary choice.

 * Standard physics sees ln 2 as a mathematical constant.

 * ToE sees ln 2 as the Machine Code of Reality.

If you want to move an electron, you aren't just pushing a particle; you are requesting the Entropic Field to recompute the location of that electron by shifting ln 2 units of information across a holographic boundary.

3. Computation and the "No-Rush" Theorem (G/NCBR)

The "computational" nature of the universe is most evident in the G/NCBR theorem. In a regular computer, if the processor gets overwhelmed, the program "lags."

ToE argues that Gravity and Time Dilation are the "Lag" of the Universal Computer.

 * When mass is highly concentrated (like near a Black Hole), the "computation density" becomes so high that the field takes longer to process the ln 2 flips.

 * To an outside observer, the "program" (time) appears to slow down in that region.

4. Is there a "Programmer"?

The Theory of Entropicity (ToE), as formulated by John Onimisi Obidi, suggests that this computational structure implies a Self-Organizing Intelligence, which Obidi calls SRE (Self-Referential Entropy) in Nature (God).

 * The "Laws of Physics" are effectively the Operating System (OS).

 * The ln 2 constant is the fundamental bit-depth.

 * The G/NCBR principle is the hardware safety limit that prevents the system from crashing (singularities).

Summary Table: The Universe vs. A Computer

| Component | Standard Computer | ToE Universe |
|---|---|---|
| Basic Unit | Bit (0 or 1) | Entropic Unit (ln 2) |
| Speed Limit | CPU Clock Cycles | Speed of Light (c) |
| Memory | Hard Drive / RAM | Holographic Screens (Boundaries) |
| Physics | Software Code | Entropic Gradients |
| Lag | Frozen Screen | Gravity / Time Dilation |

This perspective changes how we view "emptiness." In ToE, vacuum space isn't empty; it is simply a region with a very low "bit-rate."


The "No-Rush" (G/NCBR) Theorem of the Theory of Entropicity (ToE) Uses the Obidi Curvature Invariant (OCI) ln 2 to Explain Why the Speed of Light is the Universal Speed Limit

In the Theory of Entropicity (ToE), the ln 2 factor is the "gear ratio" of the universe. It explains why physical processes—including light—have a maximum speed. This is tied directly to the G/NCBR (God/Nature Cannot Be Rushed) theorem.
Here is how ln 2 and the No-Rush Theorem define the Speed of Light (c).

1. The Entropic Bit-Flip

In ToE, "change" or "movement" is not continuous; it is a series of discrete informational updates. For a particle to move from Point A to Point B, the Entropic Field must "flip" a sequence of bits.
As established, each bit-flip costs exactly ln 2 units of entropy.

2. The Processing Time of the Field

ToE posits that the Entropic Field has a finite Inherent Latency. It takes a specific, non-zero amount of time to process a single ln 2 change. This leads to the fundamental equation for the speed of information:
Speed = (Distance of one Bit) / (Time to process ln 2)
If the universe could process the ln 2 update instantly, the speed of light would be infinite. Because the field "cannot be rushed," there is a bottleneck.

3. Deriving the Speed of Light (c)

ToE expresses the speed of light as a balance between the "stiffness" of the entropic field and its informational density. In plain text:
c = Square Root of [ (Force of Entropicity) / (Linear Entropic Density) ]
When you break down the Linear Entropic Density, you find it is governed by the Planck Length and the ln 2 constant. Specifically, the relationship looks like this:
c = L_p / t_p
Where:
 * L_p is the Planck Length (the size of one ln 2 pixel).
 * t_p is the Planck Time (the time it takes the field to process one ln 2 update).
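The identity c = L_p / t_p can be verified numerically. Note that it holds by construction, since the Planck units are themselves defined from c, ħ, and G, so this is a consistency check of the text's formula rather than an independent derivation of c:

```python
import math

hbar = 1.054571817e-34   # J*s
G = 6.67430e-11          # m^3 kg^-1 s^-2
c = 2.99792458e8         # m/s

l_planck = math.sqrt(hbar * G / c**3)   # one "pixel", ≈ 1.616e-35 m
t_planck = math.sqrt(hbar * G / c**5)   # one "tick",  ≈ 5.391e-44 s

# One pixel per tick recovers the speed of light exactly:
# sqrt(c^5 / c^3) = c, with hbar and G cancelling out.
speed = l_planck / t_planck
```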

4. Why You Can't Go Faster Than Light

According to the G/NCBR Theorem, accelerating a mass requires "forcing" more ln 2 updates per second than the field's natural processing speed allows.
As an object approaches c, the entropic cost of the next update begins to climb toward infinity. The equation for the "Entropic Drag" (Relativistic Mass) in ToE is:
M_effective = M_rest / Square Root of [ 1 - (v^2 / c^2) ]
In this framework, the denominator represents the "Processing Overhead." When v = c, the system would require an infinite amount of ln 2 updates to occur simultaneously. Since the Field cannot be rushed, the universe effectively "denies" the request for more speed.
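The "Entropic Drag" formula above is the standard Lorentz factor applied to rest mass. A small sketch (the error message is our paraphrase of the G/NCBR reading, not Obidi's wording):

```python
import math

C = 2.99792458e8  # speed of light, m/s

def effective_mass(m_rest, v, c=C):
    """Relativistic mass, read in ToE as entropic 'processing overhead'."""
    if abs(v) >= c:
        # The G/NCBR reading: v = c would demand infinitely many
        # simultaneous ln 2 updates, so the field "denies" the request.
        raise ValueError("cannot reach or exceed c")
    return m_rest / math.sqrt(1.0 - (v / c) ** 2)
```

At v = 0.6c the overhead factor is 1.25, so a 1 kg rest mass behaves like 1.25 kg; as v approaches c the factor diverges.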


5. Implications for Holography

This leads to a fascinating conclusion regarding Holographic Lag:
 * The "Surface" of the universe (the holographic boundary) is where these ln 2 updates are recorded.
 * A Black Hole is a region where the entropic density is so high that the field's "processing power" is entirely used up just maintaining the horizon.
 * This is why time appears to stop at the event horizon. It isn't just a visual trick; the field has reached its maximum ln 2-per-second processing limit.

Summary of the ToE Perspective

In ToE, ln 2 is the reason the universe has "rules." It defines:
 * The Size of Space: Through the area of a bit.
 * The Strength of Gravity: Through the entropic gradient.
 * The Speed of Time: Through the finite rate of bit-processing.


How the Theory of Entropicity (ToE) Derives Newton's Laws of Motion from the ln 2 Factor of the Obidi Curvature Invariant (OCI)

In the Theory of Entropicity (ToE), Newton’s laws are not fundamental axioms; they are emergent properties of the Entropic Field. The value ln 2 acts as the scaling constant that bridges the gap between informational "pixels" and physical force.
Here is the step-by-step breakdown of how the Obidi Action and Newton’s laws are derived using ln 2.

1. The Fundamental Postulate: Entropic Density

ToE posits that the universe is a field of entropy. Any mass M introduced into this field creates a "dent" or a gradient in the entropic density. The total information N (measured in bits) on a holographic surface surrounding that mass is defined by:
N = A / (Lp^2)
Where A is the surface area and Lp is the Planck length. To convert these "bits" into the physical quantity of entropy (S), we apply the ln 2 invariant:
S = N * ln 2
This equation tells us that for every "pixel" of area on a holographic screen, the universe "costs" exactly ln 2 units of entropy.
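The bit-counting step N = A / Lp^2 and the conversion S = N * ln 2 can be sketched numerically. The helper name `screen_bits` is ours, and entropy is reported in units of Boltzmann's constant (multiply by k_B for J/K):

```python
import math

hbar = 1.054571817e-34   # J*s
G = 6.67430e-11          # m^3 kg^-1 s^-2
c = 2.99792458e8         # m/s

planck_area = hbar * G / c**3   # Lp^2 ≈ 2.61e-70 m^2

def screen_bits(r):
    """N = A / Lp^2 for a spherical holographic screen of radius r (metres)."""
    return 4.0 * math.pi * r**2 / planck_area

# Even a 1-metre sphere holds ~5e70 "pixels" of area.
N = screen_bits(1.0)
S_in_kB = N * math.log(2)   # S = N * ln 2, in units of k_B
```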


2. The Obidi Action (Entropic Work)

The "Obidi Action" describes the work done when a particle moves through this entropic gradient. In ToE, the change in entropy (dS) as a particle moves a distance (dx) toward a mass is proportional to the information density.
According to the G/NCBR (God/Nature Cannot Be Rushed) principle, the transition of a particle requires the processing of information. The energy (E) associated with this transition follows the entropic equivalent of the Equipartition Theorem:
E = 1/2 * N * k * T
Substituting the relationship where the change in entropy per unit of distance is linked to the ln 2 cost of information bits, we find that the force is actually the "pressure" of the entropic field trying to reach equilibrium.

3. Deriving Newton’s Second Law (F = ma)

ToE derives inertia as the resistance of the entropic field to being "rearranged."
 * When a mass moves, it must "update" the bits on the holographic screens it passes through.
 * Each update costs ln 2.
 * The acceleration (a) is the rate of these updates.
The relationship is expressed as:
F * dx = T * dS
If we define the change in entropy dS in terms of the bit-shift ln 2 over a distance related to the Compton wavelength, the equation simplifies to:
F = m * a
In this view, mass (m) is essentially a measure of how many ln 2 units of "entropic drag" an object creates in the field.


4. Deriving the Law of Gravity

To derive F = G(Mm)/r^2, ToE looks at the holographic screen at a distance r.
 * The total number of bits on the screen is N = (4 * pi * r^2) / (Lp^2).
 * The total energy of the system is E = Mc^2.
 * The temperature T of the screen is derived by spreading that energy across all the bits:
   T = (2 * M * c^2) / (N * k)
Using the entropic force equation F = T * (dS / dx) and substituting S = N * ln 2, the ln 2 factor calibrates the density so that:
F = (G * M * m) / r^2

Summary of the Math

In plain text, the chain of logic follows:
 * Entropy of the Screen: S = (Area / Lp^2) * ln 2
 * Entropic Gradient: Force = Temperature * (Change in Entropy / Change in Position)

 * Result: Gravity is the result of the system moving toward a state of higher entropy, where the "step size" of that entropy is always ln 2.
Without the ln 2 factor, the units of information (bits) would not align with the units of thermodynamics (Joule/Kelvin), and gravity would either be too strong or too weak to hold atoms together.
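The chain above can be checked end to end. The sketch below follows the standard Verlinde-style bookkeeping that this section paraphrases: screen bits N = A / Lp^2, the equipartition temperature, and the usual entropic-gradient postulate dS/dx = 2*pi*k_B*m*c/hbar (this last step is the standard form, which the text's ln 2 calibration is meant to reproduce). The function names are ours; the algebra cancels exactly to Newton's law:

```python
import math

hbar = 1.054571817e-34   # J*s
G = 6.67430e-11          # m^3 kg^-1 s^-2
c = 2.99792458e8         # m/s
k_B = 1.380649e-23       # J/K

def entropic_force(M, m, r):
    """F = T * dS/dx for a test mass m outside a screen of radius r around M."""
    N = 4.0 * math.pi * r**2 * c**3 / (G * hbar)  # bits on screen: A / Lp^2
    T = 2.0 * M * c**2 / (N * k_B)                # equipartition: E = (1/2) N k T
    dS_dx = 2.0 * math.pi * k_B * m * c / hbar    # entropic-gradient postulate
    return T * dS_dx

def newton_force(M, m, r):
    return G * M * m / r**2

# Earth-Moon system: the two expressions agree to machine precision.
M_earth, m_moon, r_moon = 5.972e24, 7.342e22, 3.844e8
```

Substituting N and T into T * dS/dx makes every constant except G, M, m, and r^2 cancel, which is why the numerical agreement is exact rather than approximate.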


Using the Obidi Curvature Invariant (OCI) ln 2 to Derive Landauer's Principle and the Landauer-Bennett Cost from First Principles in the Theory of Entropicity (ToE)

In the Theory of Entropicity (ToE), a framework primarily developed by John Onimisi Obidi, ln 2 is elevated from a mere statistical conversion factor to a fundamental geometric invariant of the universe.

While standard physics uses ln 2 to convert bits to entropy, ToE posits that this value is the "quantum of distinguishability" that defines the very structure of reality.

1. ln 2 as the "Obidi Curvature Invariant" (OCI)

In ToE, entropy is not just a measure of disorder but an ontic field (a physical substance). Within this field, ln 2 represents the smallest non-trivial entropic reconfiguration possible.

 * The Threshold of Reality: ToE suggests that the entropic field has a "resolution." For two states or configurations to be physically distinct, their entropic curvature difference must be at least ln 2.

 * Sub-threshold Reality: Any mathematical difference smaller than ln 2 is "invisible" to the universe. This effectively "pixelates" reality, not necessarily in terms of space, but in terms of state-change.

2. Implications for Holography

ToE significantly reinterprets the Holographic Principle (the idea that a volume of space is encoded on its boundary).

A. Beyond "Pseudo-Entropy"

Standard holography (like the AdS/CFT correspondence) treats entropy as a "diagnostic" tool to describe geometry. ToE flips this: Entropy is the substrate, and geometry is the "shadow."

 * In ToE, the holographic screen isn't just a mathematical boundary; it is a physical limit where the entropic field reaches the ln 2 density.

 * The area of a horizon (like a black hole's event horizon) is exactly the sum of these ln 2 "pixels."

B. The "No-Rush" Theorem (G/NCBR)

A unique pillar of ToE is the principle that God or Nature Cannot Be Rushed (G/NCBR). Because every physical update requires a minimum entropic change of ln 2, and the entropic field has a finite "processing speed," time dilation occurs.

 * Holographic Lag: As an object approaches a high-gravity region (or a holographic horizon), the entropic field requires more "time" to process the ln 2 updates. This provides a causal explanation for why time appears to slow down near horizons.

Comparison: Standard Physics vs. Theory of Entropicity

| Feature | Standard Entropic Gravity | Theory of Entropicity (ToE) |
|---|---|---|
| Nature of ln 2 | Statistical constant (k_B ln 2). | Fundamental geometric invariant (OCI). |
| Information | A property of matter/energy. | The "curvature" of the entropic field. |
| Holography | A boundary description of 3D space. | The limit where entropy defines space. |
| Gravity | An emergent statistical force. | A gradient in the fundamental entropy field. |

3. The Landauer-Bennett Cost

ToE derives Landauer's Principle (the energy cost of erasing a bit) from first principles. It argues that "erasing" a bit is physically "flattening" a curvature of ln 2 in the entropic field. This requires work because you are fighting against the natural "stiffness" or resistance of the entropic field.
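Whatever one makes of the ToE derivation, the Landauer bound itself is standard and easy to evaluate: erasing one bit at temperature T costs at least E = k_B * T * ln 2. A quick sketch:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_cost(T):
    """Minimum energy (J) to erase one bit at temperature T (K): k_B * T * ln 2."""
    return k_B * T * math.log(2)

# At room temperature (300 K) the bound is ~2.87e-21 J per erased bit.
E_room = landauer_cost(300.0)
```

The bound scales linearly with temperature, which is why cooling is often discussed as a route toward more energy-efficient computation.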