Mathematical truths, such as 2+2=4 under Peano’s axioms and the Pythagorean theorem in Euclidean geometry, are not just considered objectively true; they are objectively true, not merely in theory but in reality, so long as their axioms themselves hold in reality. This is because they are logically deduced from clearly defined axioms and rules. These truths do not rely on observation or interpretation; they are guaranteed by the structure of the system to which they belong.
In mathematics, once the definitions and axioms are established (for example, the basic properties of numbers in Peano’s axioms, or the properties of flat space in Euclidean geometry), certain statements, such as the Pythagorean theorem, can be logically proven to follow from them. These statements, or theorems, are then universally true within that formal system with absolute certainty: they cannot turn out to be false in reality as long as the system’s definitions and axioms hold true in reality. For example, unless Euclidean principles are violated in the real world (as they are in curved space, which requires Riemannian geometry to model accurately), the Pythagorean theorem is guaranteed to hold universally, not just in theory but also in practice.
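To make the flat-space caveat concrete, here is a minimal Python sketch (an illustrative check of our own, not part of the formal argument) that verifies the Pythagorean relation numerically for a 3-4-5 right triangle laid out in Euclidean coordinates; on a curved surface the same check would fail, which is exactly the Riemannian caveat above.

```python
import math

# A right triangle with legs 3 and 4, placed on a flat (Euclidean) plane.
a, b = 3.0, 4.0
hypotenuse = math.dist((0.0, 0.0), (a, b))  # straight-line Euclidean distance

# In flat space the measured distance matches the Pythagorean prediction
# c = sqrt(a^2 + b^2), up to floating-point rounding.
assert math.isclose(hypotenuse, math.sqrt(a**2 + b**2))
print(hypotenuse)  # 5.0
```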
A truth of this kind differs from a hypothesis or hearsay, which may happen to be true but lacks the rigorous, step-by-step proof that mathematics provides. Mathematical truths are built by applying the rules of inference of first-order logic to reach conclusions that follow logically from the axioms, making them universally valid within the theoretical framework of the system.
Therefore, the only way a mathematical truth in a formal system could fail to hold in practice, in the real world, is if one of the definitions or axioms does not accurately represent reality. For example, 2 + 2 is always 4 unless one of Peano’s axioms is violated by the situation being modeled.
This is why, once definitions and axioms are aligned with reality, disagreement is impossible. I cannot lie to you, nor can you lie to me, unless our axioms and definitions fail to align with reality. Thus, if we limit axioms to objective, independently verifiable real-world facts, lying becomes provably impossible within that formal system.
How to Never Lose Money
In the context of a sound formal system, and assuming no errors in the proofs, all correctly deduced logical claims, including every correct corollary, lemma, and theorem, are guaranteed to hold universally within that system. Since mathematical proofs are independently verifiable, the probability that a logically deduced claim such as 2 + 2 = 4 is false, given that the Peano axioms from which it is deduced by standard first-order inference rules hold true, is zero.
This means that as long as the axioms accurately reflect reality, the theorems derived from them are guaranteed to hold not only within the formal system but also in the real world. When Peano’s axioms are valid descriptions of how natural numbers behave in reality, 2 + 2 = 4 is assured to be true both theoretically and practically. Thus, barring any violation of Peano’s axioms, 2 + 2 = 4 is guaranteed to hold in reality, not just within the abstract formal system.
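To make the derivation tangible, the following Python sketch (our own illustrative encoding; the names Nat, Zero, Succ, and add are not from any standard library) builds the naturals Peano-style, as iterated successors of zero, defines addition by the usual recursion, and checks that 2 + 2 reduces to 4 purely by unfolding that recursion:

```python
from dataclasses import dataclass

class Nat:
    """A Peano-style natural number: either zero or the successor of a natural."""

@dataclass(frozen=True)
class Zero(Nat):
    pass

@dataclass(frozen=True)
class Succ(Nat):
    pred: Nat

def add(n: Nat, m: Nat) -> Nat:
    """Peano addition: n + 0 = n and n + S(m) = S(n + m)."""
    if isinstance(m, Zero):
        return n
    assert isinstance(m, Succ)
    return Succ(add(n, m.pred))

two = Succ(Succ(Zero()))
four = Succ(Succ(Succ(Succ(Zero()))))

# Unfolding the recursion step by step reproduces the proof that 2 + 2 = 4.
assert add(two, two) == four
```

The assertion passes because the computation mirrors the inference steps sanctioned by the axioms; nothing empirical is consulted.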
This is where our “Wall Street-style” inference rules become stricter and more formal than the free-for-all, wild-goose-chase approach currently permitted under the less rigorous “child’s play” rules of inference used by some theoretical mathematicians. For those constrained by the demands of working on Wall Street, our stricter inference rules are essential. While we often remind our clients that investments can always result in losses, our approach—particularly in statistical arbitrage—ensures that we don’t lose our own money. I speak confidently from my experience trading at RBC and running my own hedge fund. Our former colleagues at Renaissance Technologies also engage in statistical arbitrage—look up their strategies. If you want to avoid losing money, as we do, you must adhere to rules that are more stringent than those you may be accustomed to—such as those outlined in this white paper.
As Don Corleone—the Godfather—famously says, “It's an old habit. I spent my whole life trying not to be careless. Women and children can be careless, but not men.” Similarly, on Wall Street, carelessness can lead to repercussions that go far beyond financial losses, often resulting in long prison sentences, as seen in high-profile cases involving figures like Sam Bankman-Fried, Michael Milken, and others. As practicing mathematicians in the financial industry, we cannot afford mistakes—and we don’t make any—because we follow rigorous, fail-proof inference rules.
To borrow a line from another iconic movie, Carlito’s Way (1993): Carlito Brigante tells David Kleinfeld, “Dave, you’re a gangster now. A whole new ballgame. You can’t learn about it in school.” Well, in our school of applied Wall Street-style mathematics, you can. The term “old man Funt” comes from Ilf and Petrov’s classic 1931 book The Golden Calf, where Funt is a character whose role is to take the fall for a fraudulent businessman, much like Joe Jett’s role at Kidder Peabody when I started trading stat-arb there. Mathematicians don’t take the fall; a Funt does.
So, what distinguishes our inference rules from those used by others who risk legal repercussions—or, alternatively, don’t play the game at all, meaning they have no money? There are two primary distinctions. First, we don’t mistake hypotheses for axioms. Our axioms must be self-evidently true, as any math book requires. For instance, Milton Friedman proposed the hypothesis that the Federal Reserve caused the Great Depression. While plausible and likely accurate, it remains a hypothesis and could, by definition, be disproven. This is why, on Wall Street, we rely on the Arrow-Debreu framework—a formal system that mitigates the risks associated with conflating hypotheses with axioms, a common pitfall in other approaches.
We use the self-evident axiom that the Great Depression was caused by deflation. Therefore, any volatility in the price level is detrimental to economic growth. This is evidenced by the fact that central banks universally fear deflation above all else and work vigorously to prevent excessive inflation. This is not merely theoretical—it is an objective, real-world fact. Wall Street-style axioms don’t lie about reality. We may lie to our girlfriends, but we do not lie to our clients when we tell them they may lose their money, and we never lie to ourselves by mistaking hypotheses for axioms. Ever.
Second, the requirement that nothing contradict reality naturally extends to the inference rules. In reality, everything is defined dually, in relation to its opposite; nothing exists in isolation. Hot is defined in relation to cold, love to hate, and even at the fundamental level of theoretical physics everything is a duality, as exemplified by wave-particle duality. This duality is mirrored in all properly structured formal systems, such as algebra, which is based on Peano’s arithmetic and represents reality through an object-action duality.
In Peano’s arithmetic, the object is the absence-existence duality, represented by 0 and 1, and the action is the addition-subtraction duality, represented by the operations “+” and “−”. From this foundation, everything else is defined recursively in terms of actions (addition and subtraction) on objects (natural numbers). Multiplication is the recursive application of addition, and division is its dual, the recursive application of subtraction. Root-exponent relationships follow the same pattern: exponentiation is the recursive application of multiplication, with roots as its dual. All of these concepts are described and defined by Peano’s axioms, from which all other claims are logically proven.
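A minimal sketch of this recursive object-action structure, written with ordinary Python integers purely for readability (the helper names mul and div are our own illustrative choices):

```python
def mul(a: int, b: int) -> int:
    """Multiplication as the recursive application of addition: a * b = a + a * (b - 1)."""
    return 0 if b == 0 else a + mul(a, b - 1)

def div(a: int, b: int) -> int:
    """Floor division of a non-negative a by a positive b as the recursive
    application of subtraction: count how many times b can be removed from a."""
    return 0 if a < b else 1 + div(a - b, b)

assert mul(3, 4) == 12  # 3 + 3 + 3 + 3
assert div(12, 4) == 3  # 12 - 4 - 4 - 4 reaches 0 after three subtractions
```

Recursion on one argument is exactly how Peano’s axioms define these operations; the code simply spells that recursion out.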
Thus, the dual aspect of keeping our formal inference rules consistent with reality is this: just as all axioms must be self-evidently true and must not contradict reality, so too must the inference rules be consistent with it. This requires that everything be dually defined, as in Peano’s arithmetic. The principle extends to geometry (which models actions through the line-point duality), trigonometry (the sine-cosine duality), and other branches of mathematics, such as optimization, where every problem has a corresponding dual.
Therefore, “Wall Street-style” inference rules add a dual mandate:
Not only: Axioms must be self-evidently true.
But also: Everything must be properly and dually defined.
That’s it. Beyond this, we adhere rigorously—or, you could say, "religiously"—to the existing formal rules of inference used in first-order logic, which model relationships dually, such as "if cause, then effect," reflecting the inherent duality that underpins causal relationships observed in reality.
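As a concrete instance of such a rule, here is a small check of ours (plain Python booleans, nothing more) that modus ponens, the schema behind “if cause, then effect,” can never lead from true premises to a false conclusion:

```python
from itertools import product

def implies(p: bool, q: bool) -> bool:
    """Material implication: "if p then q" is false only when p holds and q fails."""
    return (not p) or q

# Modus ponens as a schema: from "if P then Q" together with P, infer Q.
# Checking all four truth-value assignments confirms the schema is a tautology.
assert all(implies(implies(p, q) and p, q) for p, q in product([False, True], repeat=2))
```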
An illustrative example of the necessity of dual consistency under our Wall Street-style inference rules involves applying mathematical concepts to reality. Peano’s axioms, which define the natural numbers, include the principle that every natural number n has a unique successor n′, implying an infinite sequence of numbers. However, when applied to physical scenarios, such as counting the moons of Mars, of which there are two, Phobos and Deimos, the concept of infinity does not correspond to the finite reality of celestial bodies. Although the statement 2 + 2 = 4 is theoretically sound within Peano’s axioms, applying it in the context of Mars’s moons would incorrectly presuppose a set of four moons, contradicting the observed reality of two. Here, external applicability fails, revealing the need for empirical alignment in applied mathematics.
Thus, in a dually consistent formal system for Mars’s moons, Peano’s second axiom (the successor principle) would be adapted to limit n to 2, achieving a fully sound, externally consistent system; one way to picture the adaptation is sketched below. This example illustrates why we don’t lose money on Wall Street: we don’t throw darts at the board by assuming 2 + 2 = 4 without proof. We only bet on sure things.
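The sketch: an addition whose results saturate at the largest count the domain can actually exhibit. The cap of 2 and the helper name bounded_add are hypothetical choices of ours, not a standard construction.

```python
MAX_MOONS = 2  # Mars has exactly two moons: Phobos and Deimos

def bounded_add(a: int, b: int, cap: int = MAX_MOONS) -> int:
    """Addition whose successor chain stops at `cap`, so results saturate
    at the largest count the modeled domain can actually contain."""
    return min(a + b, cap)

# Within this bounded system, "2 + 2" can never name a set of four moons.
assert bounded_add(2, 2) == 2
```

Whether saturation is the right adaptation is a modeling choice; the point is only that the axioms are brought into line with the observed domain before any theorem is applied to it.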