Mathematical Rigour

I. Building Blocks

i. Rigour

Rigour is the practice of adhering absolutely to certain constraints: maintaining strict consistency with predefined parameters. The mathematical definitions of conjecture, proof, theorem and law are used, i.e. a conjecture has withstood the scrutiny of existing knowledge, a proof is unfalsifiable, a theorem has been proven, and a law defines the boundaries of the environment and is beyond scrutiny. This is in contrast to their scientific counterparts. Additionally, a truth is a multidisciplinary abstraction that has remained standing in the face of all of the aforementioned. The application of a theorem or a truth is itself subject to rigour.

ii. Intellectual Rigour

Intellectual rigour is a process of thought which is consistent, contains no self-contradiction, and takes into account the entire scope of available knowledge on the topic. To deal with a topic or case rigorously is to deal with it comprehensively, thoroughly and completely, leaving no room for inconsistency.

iii. Mathematical Proof

A mathematical proof is an inferential argument for a mathematical statement, showing that the stated assumptions logically guarantee the conclusion. The argument may use other previously established statements, such as theorems; but every proof can, in principle, be constructed using only certain basic or original assumptions known as axioms, along with the accepted rules of inference. Proofs are examples of exhaustive deductive reasoning which establish logical certainty, to be distinguished from empirical arguments or non-exhaustive inductive reasoning which establish "reasonable expectation".
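The distinction above can be made concrete with a classic example: the proof that the square root of two is irrational, in which the stated assumptions and accepted rules of inference alone guarantee the conclusion. This is a standard illustration, not one drawn from the text.

```latex
\begin{theorem}
$\sqrt{2}$ is irrational.
\end{theorem}
\begin{proof}
Suppose, for contradiction, that $\sqrt{2} = p/q$ for coprime integers
$p, q$ with $q \neq 0$. Squaring gives $p^2 = 2q^2$, so $p^2$ is even,
hence $p$ is even; write $p = 2k$. Substituting yields $4k^2 = 2q^2$,
so $q^2 = 2k^2$ and $q$ is also even, contradicting the coprimality of
$p$ and $q$. Hence no such $p/q$ exists.
\end{proof}
```

Note that every step appeals only to prior definitions (evenness, coprimality) and the rules of inference; no empirical observation enters the argument.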

iv. Deterministic Algorithms

A deterministic algorithm is an algorithm that, given a particular input, will always produce the same output, with the underlying machine always passing through the same sequence of states. Formally, a deterministic algorithm computes a mathematical function; a function has a unique value for any input in its domain, and the algorithm is a process that produces this particular value as output. Deterministic algorithms can be defined in terms of a state machine: a state describes what a machine is doing at a particular instant in time. State machines pass in a discrete manner from one state to another. Just after we enter the input, the machine is in its initial state or start state. If the machine is deterministic, this means that from this point onwards, its current state determines what its next state will be; its course through the set of states is predetermined.
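The state-machine view can be sketched in a few lines. The following toy automaton (the names run_dfa and TRANSITIONS are illustrative, not from the text) accepts binary strings containing an even number of 1s; because each state and input symbol determines exactly one next state, the same input always produces the same sequence of states.

```python
# A deterministic finite automaton tracking the parity of 1s seen so far.
# (state, input symbol) -> next state: fully determined, no choice points.
TRANSITIONS = {
    ("even", "0"): "even",
    ("even", "1"): "odd",
    ("odd", "0"): "odd",
    ("odd", "1"): "even",
}

def run_dfa(bits: str) -> list[str]:
    """Return the full sequence of states visited for a given input."""
    state = "even"          # the start state, entered just after input
    trace = [state]
    for bit in bits:
        state = TRANSITIONS[(state, bit)]   # next state is predetermined
        trace.append(state)
    return trace

# Determinism: the same input always yields the same course through states.
assert run_dfa("1101") == run_dfa("1101")
assert run_dfa("1101")[-1] == "odd"   # three 1s -> odd parity
```

The trace makes the definition visible: the machine's course through the set of states is fixed entirely by the input.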

II. Maxims

i. Pessimistic view of Probability

The downside of a statistical improbability should be treated with the same gravity as an absolute certainty, but the upside of a statistical probability should not be relied upon at all [see Solved Game].

ii. Simulation and heuristics

Simulation and heuristics may highlight flaws, but they should not be used to establish confidence.

iii. Tests

Unlike experiments, tests should be deterministic and conclusive.
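One way to make a test deterministic and conclusive is to check every input in a declared finite domain, rather than sampling. The sketch below illustrates this with a hypothetical clamp function; the function and domain are assumptions for illustration, not from the text.

```python
def clamp(x: int, lo: int, hi: int) -> int:
    """Restrict x to the closed interval [lo, hi]."""
    return max(lo, min(x, hi))

# Exhaustive, deterministic test: every input in the declared domain is
# checked, so a pass is conclusive for that domain -- no random sampling,
# and the same verdict on every run.
DOMAIN = range(-100, 101)

for x in DOMAIN:
    y = clamp(x, -10, 10)
    assert -10 <= y <= 10                  # result always lies in bounds
    assert y == x or x < -10 or x > 10     # unchanged when already in bounds
```

Contrast this with an experiment that samples a handful of random inputs: the experiment can only build expectation, while the exhaustive test settles the question for the whole domain.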

iv. Durability

A truth should be durable, at least for as long as its validity is relied upon.

v. Timeline

A truth should be established in the context of infinity and eternity.

vi. Abstraction and application

If an abstraction is applied in practice, the application should meet the same rigorous standards as the abstraction.

vii. Style

The purpose of style is to remove bloat, avoid fragility and prevent decay, not to add to them.

viii. Total Evidence

In reckoning a probability we must take into account all the available information.

ix. Statistical Probability

Statistical probability has nothing to do with the likelihood of something occurring and everything to do with the number of times it will occur at infinity.
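On this view, the limiting frequency can be computed exactly by counting outcomes, with no simulation at all. A minimal sketch, using two fair dice as an assumed example (the names outcomes and favourable are illustrative):

```python
from fractions import Fraction
from itertools import product

# The long-run frequency of an event is the count of favourable outcomes
# over the count of all equally likely outcomes -- an exact, deterministic
# computation. Example: the frequency with which two fair dice sum to 7.
outcomes = list(product(range(1, 7), repeat=2))         # all 36 ordered rolls
favourable = sum(1 for a, b in outcomes if a + b == 7)  # exactly 6 of them
p = Fraction(favourable, len(outcomes))
assert p == Fraction(1, 6)
```

Using exact rational arithmetic rather than floating point keeps the result a statement about the count at infinity, not an approximation.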

x. Probability within limits

The optimal interaction with, or exposure to, a less-than-infinite view of a known probability is knowable, deterministic and should be treated with rigour.

xi. Solved Game

A game is considered solved when all of its possible states can be deterministically traced to all valid available outcomes.
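A minimal sketch of a solved game, using single-pile Nim (take 1 to 3 stones; taking the last stone wins) as an assumed example: every reachable state is traced deterministically to its outcome, so the value of every position is known exactly rather than estimated.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def wins(stones: int) -> bool:
    """True if the player to move wins with perfect play.
    A position is winning iff some legal move reaches a losing position."""
    return any(not wins(stones - take)
               for take in (1, 2, 3) if take <= stones)

# With 0 stones the player to move has already lost; solving every state
# up to 50 shows that exactly the multiples of 4 are losing positions.
assert not wins(0)
assert all(wins(n) == (n % 4 != 0) for n in range(50))
```

Because the recursion visits every state and every move, the conclusion is a certainty about the whole game tree, not a probabilistic claim.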

xii. Classic Fundamental Laws

Every effort should be made to keep classic fundamental laws at the forefront of thought, especially when they contradict the latest developments.

xiii. The Self-Sustaining nature of Truth

A truth is self-sustaining: nothing can be done to improve its truthfulness. If the outcome ever needs correcting, this can only be done by returning all constituent parts to the basic laws and to the fundamental truth itself.

xiv. The Self-Authenticating nature of Truth

In the absence of perfect knowledge, the validity of a truth is not subject to argument, awareness or observability.

xv. Durability

A truth does not decay when it is extrapolated between abstraction layers.

xvi. Defining the domain

There will always be things we don't know, but "not knowing what we don't know" highlights a flaw in the definition of the domain rather than a lack of knowledge. Once a domain has been defined, everything about it becomes knowable.

xvii. Testing Rigour

If the last few tests are considered to be more complicated or less valuable than the first few tests then cherry-picking has taken place.

xviii. Boundaries

A truth explicitly states its boundaries.

ζ

Further reading:

Border between what we know and what we don't know
Dealing with non-determinism
Recon with reality
Deterministic testing
Transparency and verifiability
Simplicity
Not context sensitive
Provable and proven
Consistency
Independence and interaction
Mark to market
What constitutes a complete body of knowledge
Truth in maths vs arithmetic
Scale
Solving equations by collapsing abstraction e.g. into a domain-specific language
Moving between abstraction layers
Valid expressions of truth and a humane representation of thought
Multidisciplinary communication
Ethics
Self-inflicted side effects. Iatrogenics and escalating flip-flops
Outliers
Accounting practice : Mathematical rigour over generally accepted
Consistent Impact (setting a precedent that is larger than the problem)
Asymmetrical cost
Measurement, discipline, consistency, enforcement
Respectable outcome
Unanimous vs Unilateral or Activist
Wholesomeness
Statistical inference
Consistent Weight
The role of regression testing, simulation and heuristics.
Valid application of abstract truths.
Immutable
Limits at infinity
Black swans, fooled by randomness
Expected utility and the dire consequences of being 99.99% right

Glossary

Total Evidence

In reckoning a probability, we must take into account all the information we have.