LangInteger

Note for Constructive Logic

This is a note from my study of Constructive Logic by Frank Pfenning. Logic is the study of the principles of valid inference and demonstration, and it constitutes an important area in the disciplines of philosophy and mathematics.

1 Inference Rule and Natural Deduction

Two principal notions of logic are propositions and proofs. One approach to building foundations for these notions is to understand the meaning of a proposition by understanding its proofs. The system of inference rules that arises from this point of view is natural deduction. The cornerstone of the foundation of logic is a clear separation of the notions of judgment and proposition.

The most important judgment form in logic is "$A~\mathit{is~true}$", in which $A$ is a proposition. A judgment is evident if we in fact know it, or in other words, if we have a proof of it.

The general form of an inference rule and the conjunction introduction rule are
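
A sketch of these two, in standard notation:

$$\dfrac{J_1 \quad J_2 \quad \cdots \quad J_n}{J}\,\mathsf{name} \qquad\qquad \dfrac{A~true \quad B~true}{A\wedge B~true}\,\wedge I$$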

in which $A$ and $B$ are schematic variables, $J_1\dots J_n$ are called the premises, and the judgment $J$ is called the conclusion. The inference rules for the logic connectives $\wedge$, $\vee$, $\supset$, $\top$, $\bot$ are:
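
A sketch of the standard rules, where $u$ and $w$ label hypotheses discharged by the rule:

$$\dfrac{A~true \quad B~true}{A\wedge B~true}\,\wedge I \qquad \dfrac{A\wedge B~true}{A~true}\,\wedge E_L \qquad \dfrac{A\wedge B~true}{B~true}\,\wedge E_R$$

$$\dfrac{\begin{array}{c}\overline{A~true}^{\,u}\\ \vdots\\ B~true\end{array}}{A\supset B~true}\,\supset I^u \qquad \dfrac{A\supset B~true\quad A~true}{B~true}\,\supset E$$

$$\dfrac{A~true}{A\vee B~true}\,\vee I_L \qquad \dfrac{B~true}{A\vee B~true}\,\vee I_R \qquad \dfrac{A\vee B~true\quad \begin{array}{c}\overline{A~true}^{\,u}\\ \vdots\\ C~true\end{array}\quad \begin{array}{c}\overline{B~true}^{\,w}\\ \vdots\\ C~true\end{array}}{C~true}\,\vee E^{u,w}$$

$$\dfrac{}{\top~true}\,\top I \qquad \dfrac{\bot~true}{C~true}\,\bot E$$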

The judgments, propositions and inference rules we have defined so far collectively form a system of natural deduction. It is a minor variant of a system introduced by Gentzen, whose aim was to devise rules that model mathematical reasoning as directly as possible.

So far, we have defined the meaning of the logical connectives by their introduction rules, which is the so-called verificationist approach. Another common way to define a logic connective is by a notational definition, which gives the meaning of the general form of a proposition in terms of another proposition whose meaning has already been defined. Some common notational definitions in intuitionistic logic are sketched below.
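
Two common examples, as a sketch, are negation and logical equivalence:

$$\neg A \;\triangleq\; A\supset\bot \qquad\qquad A\equiv B \;\triangleq\; (A\supset B)\wedge(B\supset A)$$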

We sometimes give a notational definition and then derive introduction and elimination rules for the connective. It should be understood that these rules have a different status from those that define a connective.

2 Harmony of Logical Connectives

In the verificationist definition of the logical connectives, the introduction and elimination rules we defined are in harmony. To show this, we establish two properties: local soundness and local completeness.

  • Local soundness shows that the elimination rules are not too strong: no matter how we apply elimination rules to the result of an introduction, we cannot gain any new information. We use the notation below (sketched after this list) to show the local reduction of a deduction $\mathcal{D}$ to another deduction $\mathcal{D’}$.
  • Local completeness shows that the elimination rules are not too weak: there is always a way to apply elimination rules so that we can reconstitute a proof of the original proposition from the results by applying introduction rules. We use the notation below (sketched after this list) to show the local expansion of a deduction $\mathcal{D}$ to another deduction $\mathcal{D’}$.
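
As a sketch, the two notations are:

$$\begin{array}{c}\mathcal{D}\\ A~true\end{array} \;\Longrightarrow_R\; \begin{array}{c}\mathcal{D’}\\ A~true\end{array} \qquad\qquad \begin{array}{c}\mathcal{D}\\ A~true\end{array} \;\Longrightarrow_E\; \begin{array}{c}\mathcal{D’}\\ A~true\end{array}$$

where $\Longrightarrow_R$ marks a local reduction and $\Longrightarrow_E$ a local expansion.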

Another criterion we would like to apply uniformly is that both introduction and elimination rules do not refer to any propositional constants or connectives besides the one we are trying to define, since such references could create a dangerous dependency of the various connectives on each other.

Connectives are properly defined only if their introduction and elimination rules are in harmony, in the sense that they are locally sound and complete. Next, we show how to prove these properties.

Proof for local soundness:

  • Conjunction $\wedge$
    • $\wedge~E_L$ (the reduction for this case is sketched after this list)
    • $\wedge~E_R$
  • Implication $\supset$
    • $\supset E$
  • Disjunction $\vee$
    • $\vee E$
  • Truth $\top$. There is no elimination rule, so there is no case to check for local soundness.
  • Falsehood $\bot$
    • $\bot~E$. Local soundness is trivially satisfied since we have no introduction rule for $\bot$.
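
For example, the local reduction for conjunction ($\wedge E_L$ applied to the result of $\wedge I$), as a sketch:

$$\dfrac{\dfrac{\begin{array}{c}\mathcal{D}\\ A~true\end{array}\quad\begin{array}{c}\mathcal{E}\\ B~true\end{array}}{A\wedge B~true}\,\wedge I}{A~true}\,\wedge E_L \quad\Longrightarrow_R\quad \begin{array}{c}\mathcal{D}\\ A~true\end{array}$$

The $\wedge E_R$ case is symmetric, reducing to $\mathcal{E}$.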

Proof for local completeness:

  • Conjunction $\wedge$ (the expansion for this case is sketched after this list)
  • Implication $\supset$
  • Disjunction $\vee$
      • In the proof here the elimination rule is applied last rather than first. This is in the nature of disjunction elimination in natural deduction: it represents the step of using the knowledge of $A\vee B~true$ and eliminating it to obtain the hypotheses $A~true$ and $B~true$ in the two cases.
  • Truth $\top$
  • Falsehood $\bot$
      • It seems there is no case to consider since there is no introduction rule for $\bot$. Nevertheless, the local expansion sketched below is the right one.
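
As a sketch, the local expansions for conjunction and for falsehood (the latter being the one referred to just above) are:

$$\begin{array}{c}\mathcal{D}\\ A\wedge B~true\end{array} \quad\Longrightarrow_E\quad \dfrac{\dfrac{\begin{array}{c}\mathcal{D}\\ A\wedge B~true\end{array}}{A~true}\,\wedge E_L \quad \dfrac{\begin{array}{c}\mathcal{D}\\ A\wedge B~true\end{array}}{B~true}\,\wedge E_R}{A\wedge B~true}\,\wedge I$$

$$\begin{array}{c}\mathcal{D}\\ \bot~true\end{array} \quad\Longrightarrow_E\quad \dfrac{\begin{array}{c}\mathcal{D}\\ \bot~true\end{array}}{\bot~true}\,\bot E$$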

3 Proofs as Programs

The computational interpretation of constructive proofs, for the propositional fragment of logic, is called the Curry-Howard isomorphism. We use the following judgment to illustrate the relationship between proofs and programs:
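
As a sketch, the judgment is commonly written with a proof term $M$ annotating the proposition:

$$M : A$$

read either as "$M$ is a proof of the proposition $A$" or as "$M$ is a program of the type $A$".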

This dual interpretation of the same judgment is the core of the Curry-Howard isomorphism:

  • Conjunction $A\wedge B$ corresponds to the product type $A\times B$
  • Truth $\top$ corresponds to the unit type with exactly one element
  • Implication $A\supset B$ corresponds to the function type $A\rightarrow B$
  • Disjunction $A\vee B$ corresponds to the sum type $A+B$
  • Falsehood $\bot$ corresponds to the empty type with no elements
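
To make the correspondence concrete, here is a small sketch in Haskell (the language and the function names are my own choice, not notation from the course):

```haskell
import Data.Void (Void, absurd)

-- Conjunction as a pair: the ∧E_L rule projects out the first component.
andElimL :: (a, b) -> a
andElimL (x, _) = x

-- Implication as a function: ⊃I is a lambda, ⊃E is application.
impElim :: (a -> b) -> a -> b
impElim f x = f x

-- Disjunction as a sum: ∨E is case analysis on Either.
orElim :: Either a b -> (a -> c) -> (b -> c) -> c
orElim (Left x)  f _ = f x
orElim (Right y) _ g = g y

-- Truth as the unit type: ⊤I is its single element.
truthIntro :: ()
truthIntro = ()

-- Falsehood as the empty type Void: ⊥E says anything follows from it.
exFalso :: Void -> a
exFalso = absurd
```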

4 Intuitionistic Instead of Classical

The specific interpretation of the truth judgment underlying these rules is intuitionistic or constructive. This differs from the classical or Boolean interpretation of truth.

Classical logic is based on the principle that every proposition must be true or false. It accepts the proposition $A\vee(A\supset B)$ as true for arbitrary $A$ and $B$. In contrast, intuitionistic logic is based on explicit evidence, and evidence for a disjunction requires evidence for one of the disjuncts. TBD: intuitionistic logic has a direct connection to functional computation, which classical logic lacks.

Classical logic can be characterized by a number of equivalent axioms:

  • Proof by Contradiction
  • Law of the Excluded Middle: $A\vee \neg A$
  • Double-Negation Elimination: $\neg\neg A\supset A$
  • Peirce’s Law: $((A\supset B)\supset A)\supset A$

We might consider making the logic we have seen so far classical by adding a rule that corresponds to one of these axioms, like:
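
For instance, a rule asserting the law of the excluded middle could be added as a sketch (the rule name is arbitrary):

$$\dfrac{\quad}{A\vee\neg A~true}\,\mathsf{XM}$$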

TBD: there is a double-negation translation that translates any classical theorem into one that is intuitionistically valid.

5 Sequent Calculus

Sequent calculus is another formal system for proof search in natural deduction. It is designed to exactly capture the notion of a verification.

Verifications ($A\uparrow$) and uses ($A\downarrow$) are a pair of concepts that capture the proof-search strategy of applying introduction rules from the bottom up and elimination rules from the top down. Verifications are defined for each introduction rule, and uses are defined for each elimination rule.
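
A sketch of a few of these rules: the implication verification, one conjunction use, and the rule that turns a use of an atomic proposition $P$ into a verification:

$$\dfrac{\begin{array}{c}\overline{A\downarrow}^{\,u}\\ \vdots\\ B\uparrow\end{array}}{A\supset B\uparrow}\,\supset I^u \qquad \dfrac{A\wedge B\downarrow}{A\downarrow}\,\wedge E_L \qquad \dfrac{P\downarrow}{P\uparrow}\,{\downarrow}{\uparrow}$$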

When constructing a verification, we are generally in a state of the following form:
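
Roughly, such a state can be pictured as hypotheses available for use at the top and the goal still to be verified at the bottom:

$$\begin{array}{ccc} A_1\downarrow & \cdots & A_n\downarrow\\ & \vdots & \\ & C\uparrow & \end{array}$$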

A sequent is a local notation for such a partially complete verification. We write
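
A sketch of the notation:

$$A_1~\mathsf{left},\ \dots,\ A_n~\mathsf{left} \Longrightarrow C~\mathsf{right}$$

which is usually abbreviated as $A_1,\dots,A_n\Longrightarrow C$, or $\Gamma\Longrightarrow C$ with $\Gamma$ standing for the collection of antecedents.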

The judgments on the left are assumptions called antecedents; the judgment on the right is the conclusion, called the succedent. The rules that define the $A\,\mathsf{left}$ and $A\,\mathsf{right}$ judgments are systematically constructed from the introduction and elimination rules.

  • Introduction rules already work from the conclusion to the premises (bottom-up), the mapping is straightforward
  • Elimination rules work top-down, so they have to be flipped upside-down in order to work as sequent rules

The right and left rules are constructed one by one from verifications and uses, for $\wedge$, $\top$, $\supset$, $\vee$, $\bot$:
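
A sketch of these rules, in the form used in the proofs below (the principal formula of a left rule is kept among the premises, and $P$ stands for an atomic proposition):

$$\dfrac{}{\Gamma, P \Longrightarrow P}\,init \qquad \dfrac{\Gamma\Longrightarrow A \quad \Gamma\Longrightarrow B}{\Gamma\Longrightarrow A\wedge B}\,\wedge R \qquad \dfrac{\Gamma, A\wedge B, A\Longrightarrow C}{\Gamma, A\wedge B\Longrightarrow C}\,\wedge L_1 \qquad \dfrac{\Gamma, A\wedge B, B\Longrightarrow C}{\Gamma, A\wedge B\Longrightarrow C}\,\wedge L_2$$

$$\dfrac{\Gamma, A\Longrightarrow B}{\Gamma\Longrightarrow A\supset B}\,\supset R \qquad \dfrac{\Gamma, A\supset B\Longrightarrow A \quad \Gamma, A\supset B, B\Longrightarrow C}{\Gamma, A\supset B\Longrightarrow C}\,\supset L$$

$$\dfrac{\Gamma\Longrightarrow A}{\Gamma\Longrightarrow A\vee B}\,\vee R_1 \qquad \dfrac{\Gamma\Longrightarrow B}{\Gamma\Longrightarrow A\vee B}\,\vee R_2 \qquad \dfrac{\Gamma, A\vee B, A\Longrightarrow C \quad \Gamma, A\vee B, B\Longrightarrow C}{\Gamma, A\vee B\Longrightarrow C}\,\vee L$$

$$\dfrac{}{\Gamma\Longrightarrow \top}\,\top R \qquad \dfrac{}{\Gamma, \bot\Longrightarrow C}\,\bot L$$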

6 Local Completeness/Soundness - Global Version

As with the introduction and elimination rules, connectives are properly defined only if their left and right rules are in harmony. To show this, we introduce two theorems.

Identity Theorem: For any proposition $A$, we have $A\Longrightarrow A$. This is the global version of the local completeness property for each individual connective. This can be explained as:

  • If we assume $A~left$ we can prove $A~right$
  • The left rules of the sequent calculus are strong enough so that we can reconstitute a proof of $A$ from the assumption A

Proof: By induction on the structure of $A$.

  • Case: $A$ is an atomic proposition, $A=P$. Then $P\Longrightarrow P$ by rule $init$.
  • Case: $A = A_1 \wedge A_2$. By i.h. on $A_1$ and weakening, we have $A_1\wedge A_2,A_1\Longrightarrow A_1$. By i.h. on $A_2$ and weakening, we have $A_1\wedge A_2,A_2\Longrightarrow A_2$. Then the derivation concludes as sketched below.
  • Case: $A=A_1\supset A_2$. By i.h. on $A_1$ and weakening, we have $A_1\supset A_2,A_1\Longrightarrow A_1$. By i.h. on $A_2$ and weakening, we have $A_1\supset A_2,A_1,A_2\Longrightarrow A_2$. Then the derivation concludes as sketched below.
  • Case: $A=A_1\vee A_2$. By i.h. on $A_1$ and weakening, we have $A_1\vee A_2,A_1\Longrightarrow A_1$. By i.h. on $A_2$ and weakening, we have $A_1\vee A_2,A_2\Longrightarrow A_2$. Then the derivation concludes as sketched below.
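
The concluding derivations for the three compound cases above, reconstructed as a sketch from the stated induction hypotheses:

$$\scriptsize\dfrac{\dfrac{A_1\wedge A_2, A_1\Longrightarrow A_1}{A_1\wedge A_2\Longrightarrow A_1}\,\wedge L_1 \quad \dfrac{A_1\wedge A_2, A_2\Longrightarrow A_2}{A_1\wedge A_2\Longrightarrow A_2}\,\wedge L_2}{A_1\wedge A_2\Longrightarrow A_1\wedge A_2}\,\wedge R$$

$$\scriptsize\dfrac{\dfrac{A_1\supset A_2, A_1\Longrightarrow A_1 \quad A_1\supset A_2, A_1, A_2\Longrightarrow A_2}{A_1\supset A_2, A_1\Longrightarrow A_2}\,\supset L}{A_1\supset A_2\Longrightarrow A_1\supset A_2}\,\supset R$$

$$\scriptsize\dfrac{\dfrac{A_1\vee A_2, A_1\Longrightarrow A_1}{A_1\vee A_2, A_1\Longrightarrow A_1\vee A_2}\,\vee R_1 \quad \dfrac{A_1\vee A_2, A_2\Longrightarrow A_2}{A_1\vee A_2, A_2\Longrightarrow A_1\vee A_2}\,\vee R_2}{A_1\vee A_2\Longrightarrow A_1\vee A_2}\,\vee L$$

(The cases for $\top$ and $\bot$ follow directly from the rules $\top R$ and $\bot L$.)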

Cut theorem: If $\Gamma\Longrightarrow A~right$ and $\Gamma,A~left\Longrightarrow C$ then $\Gamma\Longrightarrow C$. This is the global version of local soundness, which can be explained as:

  • If we have a proof of $A~right$, we are licensed to assume $A~left$.
  • the left rules are not too strong

Proof: By induction on the structure of the cut formula $A$, the derivation $\mathcal{D}$ of $\Gamma\Longrightarrow A$, and the derivation $\mathcal{E}$ of $\Gamma,A\Longrightarrow C$. In this constructive proof we show how to transform $\mathcal{D}$ and $\mathcal{E}$ into a deduction of $\Gamma\Longrightarrow C$.
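
Schematically (writing $\mathcal{F}$ for the deduction being constructed; the name is only used here for illustration):

$$\begin{array}{c}\mathcal{D}\\ \Gamma\Longrightarrow A\end{array} \quad\text{and}\quad \begin{array}{c}\mathcal{E}\\ \Gamma,A\Longrightarrow C\end{array} \qquad\text{into}\qquad \begin{array}{c}\mathcal{F}\\ \Gamma\Longrightarrow C\end{array}$$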

We organize the proof by the combination of the last inference rules applied in $\mathcal{D}$ and $\mathcal{E}$. To simplify the case analysis, we classify the cases as follows:

  • The last inference rule is an initial sequent in $\mathcal{D}$ or $\mathcal{E}$
    • Case 1: the last inference rule in $\mathcal{D}$ is an initial sequent
    • Case 2: the last inference rule in $\mathcal{E}$ is an initial sequent using the cut formula $A$
    • Case 3: the last inference rule in $\mathcal{E}$ is an initial sequent not using the cut formula $A$
  • Neither $\mathcal{D}$ nor $\mathcal{E}$ ends in an initial sequent
    • Case 4: $A$ is the principal formula of the last inference rule (which is not an initial sequent) in both $\mathcal{D}$ and $\mathcal{E}$
    • Case 5: $A$ is not the principal formula of the last inference rule (which is not an initial sequent) in $\mathcal{D}$
    • Case 6: $A$ is not the principal formula of the last inference rule (which is not an initial sequent) in $\mathcal{E}$

The proofs for the six cases are as follows.

  • Case 1: $\mathcal{D}$ is an initial sequent. Then $A=P$ and $\Gamma=\Gamma’,P$, so $\mathcal{E}$ is a deduction of $\Gamma’,P,P\Longrightarrow C$. By the contraction theorem, we have $\Gamma’,P\Longrightarrow C$, which shows $\Gamma\Longrightarrow C$.
  • Case 2: $\mathcal{E}$ is an initial sequent using the cut formula $A$. Then $C=A=P$, so $\mathcal{D}$ itself is a deduction of $\Gamma\Longrightarrow C$.
  • Case 3: $\mathcal{E}$ is an initial sequent not using the cut formula $A$. Then $\Gamma=\Gamma’,P$ and $C=P$. By rule $init$ and weakening, we have $\Gamma’,P\Longrightarrow P$, so $\Gamma\Longrightarrow C$.
  • Case 4: $A$ is the principal formula of the last inference in both $\mathcal{D}$ and $\mathcal{E}$. There are no cases for $\top$ and $\bot$, since each of them has only a right rule or only a left rule, not both.

    • Case 4.1 $A = A_1\wedge A_2$

      By weakening on $\mathcal{D}$ (adding $A_1$ to the context), we get a structurally identical deduction $\mathcal{D’}$ of the form $\scriptsize\dfrac{\begin{array}{c}\mathcal{D_1}\\ \Gamma,A_1\Longrightarrow A_1\end{array}\quad\begin{array}{c}\mathcal{D_2}\\ \Gamma,A_1\Longrightarrow A_2\end{array}}{\Gamma,A_1\Longrightarrow A_1\wedge A_2}\wedge R$.

      Consider the following induction hypotheses (for the sub-case where $\mathcal{E}$ ends in $\wedge L_1$ with premise $\mathcal{E_1}$ of $\Gamma,A_1\wedge A_2,A_1\Longrightarrow C$; the $\wedge L_2$ sub-case is symmetric, using $A_2$ and $\mathcal{D_2}$): by i.h. on the cut formula $A_1\wedge A_2$ with $\mathcal{D’}$ and $\mathcal{E_1}$, we get $\Gamma,A_1\Longrightarrow C$; then by i.h. on the smaller formula $A_1$ with $\mathcal{D_1}$ and this result, we get $\Gamma\Longrightarrow C$.

    • Case 4.2 $A=A_1\supset A_2$. Consider the following induction hypothesis.
    • Case 4.3 $A=A_1\vee A_2$. Consider the following induction hypothesis.
  • Case 5: $A$ is not the principal formula of the last inference in $\mathcal{D}$. In this case $\mathcal{D}$ must end in a left rule, which we analyze case by case.
    • Case 5.1 $\wedge L_1$. Then we have:
      • $\Gamma=\Gamma’,A_1\wedge A_2$, and the premise of the $\wedge L_1$ is a deduction $\mathcal{D_1}$ of $\Gamma’,A_1\wedge A_2,A_1\Longrightarrow A$
      • By weakening on $\mathcal{E}$, we get a deduction $\mathcal{E’}$ of $\Gamma’,A_1\wedge A_2,A_1,A\Longrightarrow C$
      • By i.h. on $A$, $\mathcal{D_1}$ and $\mathcal{E’}$, we have $\Gamma’,A_1\wedge A_2,A_1\Longrightarrow C$
      • By rule $\wedge L_1$, we have $\Gamma’,A_1\wedge A_2\Longrightarrow C$, thus $\Gamma\Longrightarrow C$
    • Case 5.2 $\wedge L_2$, the proof is symmetric to 5.1
    • Case 5.3 $\supset L$. Then we have:
      • $\Gamma=\Gamma’,A_1\supset A_2$, and the premises of the $\supset L$ are deductions $\mathcal{D_1}$ of $\Gamma’,A_1\supset A_2\Longrightarrow A_1$ and $\mathcal{D_2}$ of $\Gamma’,A_1\supset A_2,A_2\Longrightarrow A$
      • By i.h. on $A$, $\mathcal{D_2}$ and (a weakening of) $\mathcal{E}$, we get $\Gamma’,A_1\supset A_2,A_2\Longrightarrow C$
      • By $\supset L$ on $\mathcal{D_1}$ and the above result, we get $\Gamma’,A_1\supset A_2\Longrightarrow C$, which shows that $\Gamma\Longrightarrow C$
    • Case 5.4 $\vee L$. Then we have:
      • $\Gamma=\Gamma’,A_1\vee A_2$, and the premises of the $\vee L$ are deductions $\mathcal{D_1}$ of $\Gamma’,A_1\vee A_2,A_1\Longrightarrow A$ and $\mathcal{D_2}$ of $\Gamma’,A_1\vee A_2,A_2\Longrightarrow A$
      • By i.h. on $A$, $\mathcal{D_1}$ and (a weakening of) $\mathcal{E}$, we have $\Gamma’,A_1\vee A_2,A_1\Longrightarrow C$
      • By i.h. on $A$, $\mathcal{D_2}$ and (a weakening of) $\mathcal{E}$, we have $\Gamma’,A_1\vee A_2,A_2\Longrightarrow C$
      • By $\vee L$ on the above two results, we have $\Gamma’,A_1\vee A_2\Longrightarrow C$, which shows that $\Gamma\Longrightarrow C$
    • Case 5.5 $\bot L$. Then $\Gamma = \Gamma’,\bot$. Applying $\bot L$, we have $\Gamma’,\bot\Longrightarrow C$, which shows $\Gamma\Longrightarrow C$
  • Case 6: $A$ is not the principal formula of the last inference in $\mathcal{E}$. In this case, $\mathcal{E}$ must end in a left or right rule that does not involve $A$, which we analyze case by case.
    • Case 6.1 $\wedge R$
      • $C=C_1\wedge C_2$
      • By i.h. on $A$, $\mathcal{D}$ and $\mathcal{E_1}$, we have $\Gamma\Longrightarrow C_1$
      • By i.h. on $A$, $\mathcal{D}$ and $\mathcal{E_2}$, we have $\Gamma\Longrightarrow C_2$
      • By rule $\wedge R$ on the above two results, we have $\Gamma\Longrightarrow C_1\wedge C_2$, thus $\Gamma\Longrightarrow C$
    • Case 6.2 $\supset R$
      • $C=C_1\supset C_2$
      • By i.h. on $A$, $\mathcal{D}$ and $\mathcal{E_1}$, we have $\Gamma,C_1\Longrightarrow C_2$.
      • Applying $\supset R$ to the above result, we have $\Gamma\Longrightarrow C_1\supset C_2$, thus $\Gamma\Longrightarrow C$
    • Case 6.3 $\vee R_1$
      • $C=C_1\vee C_2$
      • By i.h. on $A$, $\mathcal{D}$ and $\mathcal{E_1}$, we have $\Gamma\Longrightarrow C_1$.
      • Applying $\vee R_1$ to the above result, we have $\Gamma\Longrightarrow C_1\vee C_2$, thus $\Gamma\Longrightarrow C$
    • Case 6.4 $\vee R_2$. The proof is symmetric to 6.3
    • Case 6.5 $\top R$. Then $C = \top$. Applying $\top R$, we have $\Gamma\Longrightarrow \top$, which shows $\Gamma\Longrightarrow C$
    • Case 6.6 $\wedge L_1$. Then we have:
      • $\Gamma=\Gamma’,A_1\wedge A_2$
      • By i.h. on $A$, $\mathcal{D}$, $\mathcal{E_1}$, we have $\Gamma’,A_1\wedge A_2,A_1\Longrightarrow C$
      • Applying $\wedge L_1$ to the above result, we get $\Gamma’,A_1\wedge A_2\Longrightarrow C$, which shows $\Gamma\Longrightarrow C$
    • Case 6.7 $\wedge L_2$, the proof is symmetric to 6.6
    • Case 6.8 $\supset L$
      • $\Gamma=\Gamma’,A_1\supset A_2$
      • By i.h. on $A$, $\mathcal{D}$, $\mathcal{E_1}$, we have $\Gamma’,A_1\supset A_2\Longrightarrow A_1$
      • By i.h. on $A$, $\mathcal{D}$, $\mathcal{E_2}$, we have $\Gamma’,A_1\supset A_2, A_2\Longrightarrow C$
      • Applying $\supset L$ to the above two results, we get $\Gamma’,A_1\supset A_2\Longrightarrow C$, which shows $\Gamma\Longrightarrow C$
    • Case 6.9 $\vee L$
      • $\Gamma=\Gamma’,A_1\vee A_2$
      • By i.h. on $A$, $\mathcal{D}$, $\mathcal{E_1}$, we have $\Gamma’,A_1\vee A_2,A_1\Longrightarrow C$
      • By i.h. on $A$, $\mathcal{D}$, $\mathcal{E_2}$, we have $\Gamma’,A_1\vee A_2, A_2\Longrightarrow C$
      • Applying $\vee L$ to the above two results, we get $\Gamma’,A_1\vee A_2\Longrightarrow C$, which shows $\Gamma\Longrightarrow C$
    • Case 6.10 $\bot L$. Then $\Gamma = \Gamma’,\bot$. Applying $\bot L$, we have $\Gamma’,\bot\Longrightarrow C$, which shows $\Gamma\Longrightarrow C$

7 Cut Elimination

Cut elimination is the theorem connecting the cut-free sequent calculus and the sequent calculus with cut. It states that if $\mathcal{D}$ is a deduction of $\Gamma\Longrightarrow C$ possibly using the cut rule, then there exists a cut-free deduction $\mathcal{D’}$ of $\Gamma\Longrightarrow C$. That is:
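
As a sketch, writing $\Gamma\stackrel{+}{\Longrightarrow}C$ for a sequent derivable possibly using the cut rule (the $+$ superscript merely marks derivations that may use cut):

$$\begin{array}{c}\mathcal{D}\\ \Gamma\stackrel{+}{\Longrightarrow} C\end{array} \qquad\leadsto\qquad \begin{array}{c}\mathcal{D’}\\ \Gamma\Longrightarrow C\end{array}$$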

The following constructive proof builds $\mathcal{D’}$.

Proof: by induction on the structure of $\mathcal{D}$.

  • Case $init$: let $\mathcal{D’} = \mathcal{D}$, which contains no cut.
  • Case $\top R$ and Case $\bot L$ are the same as the above case.
  • Case $\wedge L_1$: By i.h. on $\mathcal{D_1}$, there is a cut-free deduction $\mathcal{D_1’}$ of $\Gamma’,B_1\wedge B_2,B_1\Longrightarrow A$. Applying $\wedge L_1$ to the result, we build the deduction $\mathcal{D’}$ of $\Gamma\Longrightarrow A$.
  • Case $\wedge R$: By i.h. on $\mathcal{D_1}$, there is a cut-free deduction $\mathcal{D_1’}$ of $\Gamma\Longrightarrow A_1$.
    By i.h. on $\mathcal{D_2}$, there is a cut-free deduction $\mathcal{D_2’}$ of $\Gamma\Longrightarrow A_2$.
    Applying $\wedge R$ to the above results, we build the deduction $\mathcal{D’}$ of $\Gamma\Longrightarrow A$.
  • Case $\supset L$, Case $\supset R$, Case $\vee R_1$, Case $\vee R_2$ and Case $\vee L$ are similar to the above two cases.
  • Case $cut$: By i.h. on $\mathcal{D_1}$, there is a cut-free deduction $\mathcal{D_1’}$ of $\Gamma\Longrightarrow B$.
    By i.h. on $\mathcal{D_2}$, there is a cut-free deduction $\mathcal{D_2’}$ of $\Gamma,B\Longrightarrow A$.
    Applying the cut theorem (admissibility of cut) to the above results, we build the cut-free deduction $\mathcal{D’}$ of $\Gamma\Longrightarrow A$.

Reference