\documentclass{article}
\renewcommand\baselinestretch{0.95}
\topmargin -0.65in
\oddsidemargin -0.65in
\evensidemargin -0.65in
\textwidth 7.8in
\textheight 9.3in
\usepackage{amsfonts}
\def\spo{\hspace{1ex}}
\def\eq{\mathord{=}}
\def\eqd{\mathord{\spo\spo=_{\mbox{\footnotesize\it def}}\spo\spo}}
\def\ne{\mathord{\not\approx}}
\def\lthen{\mathord{\spo \rightarrow \spo}}
\def\liff{\mathord{\spo \leftrightarrow \spo}}
\def\land{\mathord{\spo \wedge \spo}}
\def\lor{\mathord{\spo \vee\spo}}
\def\Mthen{\mathord{\spo\spo\Rightarrow\spo\spo}}
\def\Miff{\mathord{\spo\spo\Leftrightarrow\spo\spo}}
\def\A{\mathord{{\mathfrak A}}}
\def\sA{\mathord{^{\A}}}
\def\Cn{\mathord{\mbox{Cn}}}
\def\B{\mathord{{\mathfrak B}}}
\def\G{\mathord{{\cal G}}}
\def\N{\mathord{{\mathfrak N}}}
\def\M{\mathord{{\mathfrak M}}}
\def\Nat{\mathord{{\mathbb N}}}
\newcommand\R{{\mathbb R}}
\newcommand\Z{{\mathbb Z}}
\newcommand\Lang{{\cal L}}
\newcommand\E{\mbox{\em Exp}}
\newcommand\Mod{\mbox{\hspace{1pt}Mod\hspace{1pt}}}
\newcommand\Th{\mbox{\hspace{1pt}Th\hspace{1pt}}}
\newcommand\Con{\mbox{\hspace{1pt}Con\hspace{1pt}}}
\def\AS{\mathord{{\mathbb A}_S}}
\def\AL{\mathord{{\mathbb A}_L}}
\def\AE{\mathord{{\mathbb A}_E}}
\def\sN{\mathord{^{\N}}}
\def\sA{\mathord{^{\A}}}
\def\sB{\mathord{^{\B}}}
\def\NS{\mathord{\N_S}}
\def\NL{\mathord{\N_L}}
\def\NA{\mathord{\N_A}}
\def\NM{\mathord{\N_M}}
\def\NE{\mathord{\N_E}}
\newcommand\PAE{{\mathbb P}{\mathbb A}_E}
\def\St{\mathord{{\cal S}t}}
\def\un{\underline}
\def\a{\mathord{\vec{a}}}
\def\app{\mathord{a^{\prime\prime}}}
\def\b{\mathord{\vec{b}}}
\def\c{\mathord{\vec{c}}}
\def\n{\mathord{\vec{n}}}
\def\p{\mathord{\vec{p}}}
\def\q{\mathord{\vec{q}}}
\def\Th{\mathord{Th}}
\def\v{\mathord{\vec{v}}}
\def\w{\mathord{\vec{w}}}
\def\x{\mathord{\vec{x}}}
\def\y{\mathord{\vec{y}}}
\def\z{\mathord{\vec{z}}}
\def\tp{\mathord{t^\prime}}
\def\tpp{\mathord{t^{\prime\prime}}}
\def\sp{\mathord{s^\prime}}
\def\bp{\mathord{b^\prime}}
\def\<{\mathord{\langle}}
\def\>{\mathord{\rangle}}
\def\Prime{\mathord{\mbox{prime}}}
\def\lh{\mathord{\mbox{lh}}}
\def\tf{\mathord{\tilde{f}}}
\def\seqno{\mathord{\mbox{seq$\_$no}}}
\def\restrict{\mathord{|\hspace{-4pt}\stackrel{\backslash}{\hspace{2pt}}}}
\def\twotree{\mathord{\mbox{2-Tree}}}
\def\form{\mathord{\mbox{formula}}}
\def\sent{\mathord{\mbox{sentence}}}
\def\branch{\mathord{\mbox{branch}}}
\def\leaf{\mathord{\mbox{leaf}}}
\def\tail{\mathord{\mbox{tail}}}
\def\appendleaf{\mathord{\mbox{append@leaf}}}
\begin{document}
\centerline{\LARGE\bf Math Logic 2}
\centerline
{\Large\bf $\Sigma_1$-definability, Recursiveness, and Incompleteness}
\vspace{15pt}
\centerline{\large\bf Prologue}
\vspace{7pt}
The following notes contain a reworking of Enderton's Chapter 3, \S\S
3-5. My reason for creating them is that I have trouble keeping track
of what properties Enderton is describing at any particular moment.
So below I concentrate on relations and functions on the natural
numbers definable by formulas of three different forms, called
$\Delta_0$, $\Sigma_1$, and $\Pi_1$. These will be defined below.
But an intuition is this:
\begin{description}
\item [$\Delta_0$ formulas] are formulas whose truth depends, obviously,
upon only the arithmetic properties of
a known finite number of natural numbers --- and thus whose truth could
be checked by a (fast enough) computer.
\item [$\Sigma_1$ formulas] are of the
form $\exists x_1\exists x_2 \ldots \exists x_k\phi(\x,\y)$, where
$\phi$ is $\Delta_0$; thus the truth of a $\Sigma_1$ formula can be
verified by conducting a simple search for $\x$ satisfying $\phi$,
and by shouting ``aha!'' when you've found such an $\x$. (Don't worry
about the fact that you'll keep on searching forever if you never find
such an $\x$.)
\item [$\Pi_1$ formulas] are of the form
$\forall x_1\forall x_2\ldots\forall x_k\phi(\x,\y)$; thus, they are
equivalent to the negations of $\Sigma_1$ formulas.
\item [$\Delta_1$-definable relations] are relations which have both
$\Pi_1$ and $\Sigma_1$ definitions, i.e., relations $R$ where both $R$
and its complement have $\Sigma_1$ definitions. We can test for the
truth of such a formula by timesharing between the search described
above corresponding to the $\Sigma_1$ definition of $R$ and the search
corresponding to the $\Sigma_1$ definition of $\overline{R}$. Because
any tuple $\y$ is in either $R$ or $\overline{R}$, one of these two
searches has to end sooner or later. When the first one ends, we'll
have our answer of whether $R(\y)$ holds.
\end{description}
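The ``timesharing'' test for a $\Delta_1$ set can be sketched concretely. Below is a minimal Python sketch on a toy example of my own choosing: $R$ is the set of even numbers, with $\Sigma_1$ definitions $\exists x\,(n\eq 2x)$ for $R$ and $\exists x\,(n \eq 2x+1)$ for $\overline{R}$; the function name \verb|in_R| is mine.

```python
# Timesharing two Sigma_1 searches to decide a Delta_1 set.
# Toy example: R = the even numbers.
#   n in R        iff (exists x) n = 2x
#   n in R's complement iff (exists x) n = 2x + 1
# Each pass of the loop tries one candidate witness x for BOTH searches;
# for a genuine Delta_1 set, one of the two searches must eventually halt.
def in_R(n):
    x = 0
    while True:
        if n == 2 * x:          # witness for the Sigma_1 definition of R
            return True
        if n == 2 * x + 1:      # witness for the Sigma_1 definition of the complement
            return False
        x += 1                  # next round of the timeshared search
```

Note that the loop terminates for every $n$ precisely because every $n$ lies in $R$ or in $\overline{R}$, exactly as in the prose above.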
\vspace{1cm}
\centerline{\large\bf Ye Notes}
\begin{enumerate}
\item Standard {\em abbreviations} when we write formulas --- in
addition to using usual infix order instead of prefix order
when it improves readability:
\[\begin{array}{|c|c|}\hline
\mbox{\bf Abbreviation} & \mbox{\bf Abbreviates} \\ \hline
x \ne y & \neg x \eq y \\
x \le y & (x \eq y \lor x < y) \\
x > y & y < x \\
x \ge y & (x \eq y \lor y < x) \\\hline
\forall x<t\spo\phi & \forall x\,(x<t \lthen \phi) \\
\exists x<t\spo\phi & \exists x\,(x<t \land \phi) \\\hline
\end{array}
\]
\item {\bf Defn:} [Coding finite sequences]
The code for the {\em empty} sequence --- called $\<\>$
--- is 1. (This is a special case of what follows below ---
with $k=-1$ --- under the understanding that the product of 0
numbers is 1.)
The code for a non-empty sequence $(a_0,a_1,\ldots,a_k)$ ---
called $\<a_0,a_1,\ldots,a_k\>$ --- is
\[2^{a_0+1}\cdot 3^{a_1+1} \cdots p_k^{a_k+1},\]
where $p_i$ denotes the $i$th prime (so $p_0=2$, $p_1=3$, \ldots).
{\bf Comment:} There are many convenient ways to encode a
finite sequence of natural numbers with a natural number.
Another way is the following: Let $\sigma_i$ ($0\le i\le k$)
denote the base 2 representation of $a_i$, considered as a
string of 0's and 1's. Now let the code for the sequence be
the number with {\em base 3} representation
$2\sigma_02\sigma_12\sigma_22\ldots2\sigma_k2$.
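Both encodings can be tried concretely. A minimal Python sketch (the function names \verb|code| and \verb|code_base3| are mine):

```python
def nth_prime(i):
    """Return p_i, the i-th prime, with p_0 = 2 (naive trial division)."""
    count, n = -1, 1
    while count < i:
        n += 1
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return n

def code(seq):
    """Prime-power code: <a_0,...,a_k> = 2^(a_0+1) * 3^(a_1+1) * ... * p_k^(a_k+1)."""
    c = 1
    for i, a in enumerate(seq):
        c *= nth_prime(i) ** (a + 1)
    return c

def code_base3(seq):
    """Alternative code: the number whose base-3 numeral is
    2 sigma_0 2 sigma_1 2 ... 2 sigma_k 2, where sigma_i is a_i in binary."""
    digits = "".join("2" + format(a, "b") for a in seq) + "2"
    return int(digits, 3)
```

For example, $\<1,2\> = 2^2\cdot 3^3 = 108$ under the first coding, while under the second the one-element sequence $(1)$ becomes the base-3 numeral $212$, i.e., 23.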
The following theorem is expected of almost any sequence coding
method:
{\bf Thrm:} If $\<a_0,\ldots,a_k\> = \<b_0,\ldots,b_l\>$
then $k=l$ and $a_0=b_0,a_1=b_1,\ldots,a_k=b_k$.
{\bf Proof:} This follows from the uniqueness of factorizations.
{\bf N.B.:} Why are 0, 3, and 21 {\em not} codes for sequences?
{\bf The important point here} is that, once we show that the
encoding and decoding functions are recursive, we can refer to
{\em finite sequences} of natural numbers while actually
quantifying only over single natural numbers.
(G\"odel had to work a good deal harder to get this
since he worked over $\N=(\Nat,0,S,<,+,\cdot)$ [see \S3.7];
Enderton makes life easier for us by including exponentiation
as a function in the structure.)
\item {\bf Thrm:} For each fixed $k\ge -1$, the function mapping
$a_0,a_1,\ldots,a_k$ to $\<a_0,a_1,\ldots,a_k\>$ is
$\Sigma_1$-definable.
{\bf Proof:} In fact, it's trivially $\Delta_0$-definable:
$\phi_k(a_0,\ldots,a_k,b) \eqd
b = 2^{a_0+1}\cdot 3^{a_1+1} \cdots p_k^{a_k+1}$.
\item {\bf Defn:} Let
$(a)_b = \max x\,(p_b^{x+1}\,|\,a)$, taking $(a)_b=0$ when there
is no such $x$ (in particular when $a=0$). Then for $b\le k$,
$(\<a_0,\ldots,a_k\>)_b =
(2^{a_0+1}\cdot 3^{a_1+1}\cdots p_k^{a_k+1})_b = a_b$,
and for $b>k$,
$(\<a_0,\ldots,a_k\>)_b =
(2^{a_0+1}\cdot 3^{a_1+1}\cdots p_k^{a_k+1})_b = 0$,
so $(a)_b$ is a decoding function.
% \[(a)_b \eqd \left\{\begin{array}{ll}
% 0 & \mbox{if }a=0 \\
% \max x =
% 2^{a_0+1}\cdot 3^{a_1+1}\cdots p_b^{a_b+1}$
%
% {\bf Thrm:} $(a)_b$, as a function of $a$ and $b$, is
% $\Sigma_1$ definable.
%
% {\bf Proof:}
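The decoding function $(a)_b$ can be sketched in Python (the name \verb|entry| and the small prime table are mine):

```python
PRIMES = [2, 3, 5, 7, 11, 13, 17, 19]  # p_0, p_1, ... (enough for small demos)

def entry(a, b):
    """(a)_b: the greatest x with PRIMES[b]**(x+1) dividing a,
    or 0 if PRIMES[b] does not divide a (and, in particular, for a = 0)."""
    if a == 0 or a % PRIMES[b] != 0:
        return 0
    x = 0
    while a % PRIMES[b] ** (x + 2) == 0:
        x += 1
    return x
```

On the code $\<1,2\> = 2^2\cdot 3^3$ this recovers the entries 1 and 2 and returns 0 past the end of the sequence.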
\item {\bf Defn:} For any sequence code $\<a_0,\ldots,a_m\>$,
the {\em length} of $\<a_0,\ldots,a_m\>$ is $m+1$.
{\bf Thrm:} There is a $\Sigma_1$-definable function $\lh$
such that, for any sequence code $a = \<a_0,\ldots,a_m\>$,
$\lh(a)=m+1$.
{\bf Proof:} $\lh(a)= \mu m(a=0 \lor p_m\not|a)$ works.
{\bf Defn:} Let $\lh$ be the above function.
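The $\mu$-search in the proof can be run directly (the name \verb|lh| mirrors the macro; the prime table is mine):

```python
PRIMES = [2, 3, 5, 7, 11, 13, 17, 19]  # p_0, p_1, ...

def lh(a):
    """The mu-search from the proof: the least m with a == 0 or p_m not dividing a.
    On a sequence code <a_0,...,a_m> this returns m+1."""
    m = 0
    while a != 0 and a % PRIMES[m] == 0:
        m += 1
    return m
```

For example, $\lh(\<0,0\>)=\lh(2\cdot 3)=2$, and $\lh(1)=\lh(0)=0$.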
\item\label{sequence-number-recursive}
{\bf Defn:} A natural number $a$ is a {\em sequence number}
(or {\em sequence code}) if there are natural numbers
$m,a_0,\ldots,a_m$ such that $a = \<a_0,\ldots,a_m\>$.
{\bf Thrm:}
The set of sequence numbers is recursive.
{\bf Proof:} Exercise.
{\bf Defn:} Let $\seqno(a)$ be a $\Sigma_1$-formula asserting that
$a$ is a sequence number.
\item {\bf Defn:} For natural numbers $a=\<a_0,\ldots,a_m\>$
and $b$,
\[a\restrict b = \left\{\begin{array}{ll}
\<a_0,\ldots,a_{b-1}\> &\mbox{if }b\le m\\
a &\mbox{otherwise.}
\end{array}\right.
\]
{\bf Thrm:}
The function $a\restrict b$ is a $\Sigma_1$-definable function
of 2 arguments.
{\bf Proof:}
$a\restrict b = \mu s\,[(b\ge\lh(a) \land s=a) \lor (\seqno(s)
\land \lh(s)=b \land \forall j<b\,((s)_j = (a)_j))]$.
\item~\label{definition-of-primitive-recursion}
{\bf Defn:} A $k+1$-ary function $f$ on $\Nat$ is defined from a
$k+2$-ary function $g$ by {\em primitive recursion} if, writing
$\tf(n,\b)$ for $\<f(0,\b),f(1,\b),\ldots,f(n-1,\b)\>$, we have
$f(n,\b) = g(\tf(n,\b),n,\b)$ for all $n$ and $\b$. (The simplest
case, $k=0$, where $g$ is a binary function: $f(n)=g(\tf(n),n)$.)
Unwinding the definition:
\begin{eqnarray*}
f(0,\b) &=& g(\<\>,0,\b) \\
f(1,\b) &=& g(\<f(0,\b)\>,1,\b)\\
f(2,\b) &=& g(\<f(0,\b),f(1,\b)\>,2,\b)\\
\vdots& & \vdots \\
f(n,\b) &=& g(\tf(n,\b),n,\b) \\
\vdots& & \vdots
\end{eqnarray*}
{\bf N.B.:} We can prove by induction on $n$ that there is a unique
value $f(n,\b)$ meeting the above definition. Thus
$f$ is a well-defined function.
{\bf N.B.:} Why is this a very specialized case of definition by
recursion? We {\em do} allow defining $f(n,\b)$ by using (recursively)
any or all of the values $f(0,\b),f(1,\b),\ldots,f(n-1,\b)$ ---
but note what we do {\em not} allow. First,
we do not allow simultaneous recursion on two variables,
as in $f(0,0)=2; f(m,n) = f(m-1,n)^{f(m,n-1)}$ which does not match
the above form. Second, we do not allow changing any of the other
variables in the recursive call, as in
$f(0,a) = 2^a; f(n,a)= 3+f(n-1,2a)$. We even require the recursion
to be on the first argument of $f$, though this is really no
restriction.
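The course-of-values scheme above can be sketched in Python. Here $g$ consults the whole history $\<f(0,\b),\ldots,f(n-1,\b)\>$; my choice of example makes $f$ the Fibonacci function ($\b$ is unused):

```python
def g(history, n, b):
    """g(<f(0,b),...,f(n-1,b)>, n, b); here it yields the Fibonacci numbers."""
    if n < 2:
        return n
    return history[n - 1] + history[n - 2]   # g may consult ANY earlier values

def f(n, b=None):
    """f defined from g by the course-of-values scheme in the notes."""
    history = []                     # will become <f(0,b), f(1,b), ..., f(n,b)>
    for i in range(n + 1):
        history.append(g(history, i, b))
    return history[n]
```

Note how the loop literally builds $\tf(n,\b)$ stage by stage, matching the unwound equations above.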
\item~\label{theorem-on-primitive-recursion}
{\bf Thrm:} Suppose that a $k+1$-ary function $f$ is defined
from a $k+2$-ary function $g$ by primitive recursion. If
$g$ is $\Sigma_1$-definable, so is $f$.
{\bf Proof:}
We first show that $\tf$ is $\Sigma_1$-definable.
For
\[\tf(a,\b)=\mu s\,
[\seqno(s) \land \lh(s)=a \land
\forall i<a\,((s)_i = g(s\restrict i,i,\b))].\]
Then $f(a,\b)=g(\tf(a,\b),a,\b)$ is $\Sigma_1$-definable as well.
\item\label{concatenation}
{\bf Defn:} For sequence codes $a=\<a_0,\ldots,a_m\>$ and
$b=\<b_0,\ldots,b_n\>$,
\[a*b = \<a_0,\ldots,a_m,b_0,\ldots,b_n\>.\]
{\bf Thrm:} The function $*$ is $\Sigma_1$ definable.
{\bf Proof:}
\[a*b = \mu s [\seqno(s) \land \lh(s) = \lh(a)+\lh(b) \land
\forall i<\lh(a) ((s)_i = (a)_i) \land
\forall i<\lh(b) ((s)_{i+\lh(a)} = (b)_i)].\]
Just as with sums and products, we can define the concatenation
as $i$ runs from 0 to $a-1$ of the values of some function; see
Enderton, page 215.
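Concatenation on sequence codes can be checked concretely. A Python sketch (names are mine; here \verb|star| decodes and re-encodes rather than running the $\mu$-search of the proof):

```python
PRIMES = [2, 3, 5, 7, 11, 13, 17, 19]  # p_0, p_1, ...

def code(seq):
    """<a_0,...,a_k> = 2^(a_0+1) * 3^(a_1+1) * ... * p_k^(a_k+1)."""
    c = 1
    for i, a in enumerate(seq):
        c *= PRIMES[i] ** (a + 1)
    return c

def star(a_code, b_code):
    """a * b on sequence codes, via decode-then-re-encode."""
    def decode(c):
        seq, m = [], 0
        while c != 1:
            x = 0
            while c % PRIMES[m] == 0:   # strip off p_m^(a_m+1)
                c //= PRIMES[m]
                x += 1
            seq.append(x - 1)
            m += 1
        return seq
    return code(decode(a_code) + decode(b_code))
```

The defining property is then \verb|star(code(u), code(v)) == code(u + v)|.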
%\newpage
\item {\bf HOMEWORK E:}
\begin{enumerate}
\item Prove Theorem~\ref{characteristic-function}.
\item Prove Theorem~\ref{sequence-number-recursive}.
\item Prove the case of Theorem~\ref{theorem-on-primitive-recursion}
for $g$ a binary function, as in the second part of
Definition~\ref{definition-of-primitive-recursion}.
\item Enderton, \S3.3 (page 216) \# 7
\item\label{exercise-3.3.8}
[Ordinary definition by primitive recursion]
Show that if $g,h$ are $\Sigma_1$-definable functions on $\Nat$,
so is $f$ defined by $f(0,\b)=g(\b); f(a+1,\b)=h(f(a,\b),a,\b)$.
\item\label{definition-by-cases} {\bf Definition by Cases:}
Show that if $R$ is a recursive relation and $g,h$ are
$\Sigma_1$ definable functions, then so is
\[f(\a) = \left\{\begin{array}{ll}
g(\a) & \mbox{if } \a\in R \\
h(\a) & \mbox{otherwise.}
\end{array}\right.\]
% \item Do problem \S3.3\#11 in Enderton, replacing the term
% ``representable'' with ``recursive.''
%
\item\label{sigma-1-inductive-definability}
\begin{enumerate}
\item
Suppose $R$ is a recursive set (unary relation) and that
$f_1,\ldots,f_n$ are $\Sigma_1$-definable functions. (The
$f_i$'s may well have different numbers of argument places;
say that $f_i$ is $k_i$-ary.) The {\em closure} of $R$ under
the $f_i$'s is the smallest set $S$ such that (1) $R\subseteq
S$ and (2) for each $f_i$ and each $x_1,\ldots,x_{k_i}\in S$,
$f_i(x_1,\ldots,x_{k_i})\in S$. This set $S$ can be constructed
recursively as follows:
\begin{eqnarray*}
S_0 &=& R \\
S_{i+1} &=& S_i \cup \bigcup_{1\le j\le n}
\{f_j(x_1,\ldots,x_{k_j}): x_1,\ldots, x_{k_j}\in S_i\} \\
S &=& \bigcup_{i\in\Nat} S_i
\end{eqnarray*}
Prove that $S$ is $\Sigma_1$-definable (i.e., r.e.).
\item Suppose that, in addition, the functions $f_1,\ldots,f_n$ are
{\em increasing}; that is, whenever
$f_i(x_1,\ldots,x_{k_i}) = y$,
$y>x_1$, $y>x_2$, \ldots, and $y>x_{k_i}$.
Prove that the set $S$ above is then recursive.
\end{enumerate}
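For intuition, the stage construction can be run concretely in a toy case of my own choosing (not part of the exercise): $R=\{2\}$ and one increasing binary function $f(x,y)=x\cdot y$, so the closure is the set of powers of 2 that are $\ge 2$. Because $f$ is increasing, deciding $n\in S$ only ever requires elements $\le n$:

```python
def in_closure(n):
    """Membership in the closure of {2} under f(x, y) = x * y.
    Since f is increasing, values above n can never help put n into S,
    so we may discard them and the search terminates."""
    S = {2}
    changed = True
    while changed:
        changed = False
        for x in list(S):
            for y in list(S):
                v = x * y
                if v <= n and v not in S:   # increasing => ignore values > n
                    S.add(v)
                    changed = True
    return n in S
```

The cutoff \verb|v <= n| is exactly the point of part (b): for increasing functions the unbounded stage search collapses to a bounded one.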
\end{enumerate}
%\vspace{15pt}
\newpage
%\vspace{1cm}
\centerline{\large\bf Arithmetization of Syntax}
\vspace{15pt}
\item One of G\"odel's insights was to observe that, in number theory
(an object being discussed in first order logic), we could find
a way to talk about the formulas of first order logic themselves
and, to an extent, their satisfaction or provability. In a
sense, G\"odel found a way partially to bridge the
object-language/meta-language gap.
To this end we start out by assigning integers to the various
symbols of the language of $\N_E$ (including the ``logical
symbols''). Any ``reasonable'' assignment will do; Enderton
chooses the symbols in the table below. (I added $\bot$ since
it's key in Huth \& Ryan's natural deduction proof system.) Now
recall that every sentence is equivalent to a sentence whose
only propositional connectives are $\neg$ and $\lthen$ (and I
also allow $\bot$) and whose only quantifiers are $\forall$;
thus we may, as Enderton does, treat every other formula as an
{\em abbreviation} for a formula built up under this
restriction. That's why the other connectives and $\exists$
don't appear on the table.
\[\begin{array}{|c@{:}c|c@{:}c|}\hline
\mbox{\bf Symbol}&\mbox{\bf Number}&\mbox{\bf Symbol}&\mbox{\bf Number}
\\\hline
\forall & 0 & ( & 1 \\
0 & 2 & ) & 3 \\
S & 4 & \neg & 5 \\
< & 6 & \lthen & 7 \\
+ & 8 & = & 9 \\
\cdot & 10 & , & 11 \\
E & 12 & \bot & 13 \\
v_n & 14+n\\\hline
\end{array}
\]
Such numbering, in general, is called {\em G\"odel numbering.}
Enderton calls the function given above $h$, i.e., $h(\forall)=0$.
In the defining formulas below the symbol $h$ doesn't actually
occur. When I refer to, say, $h(\forall)$, I'm really referring to
the number 0; when I refer to $h(v_{17})$, I'm really referring to the
number 31. When I refer to ``some variable'', I'm really referring
to ``some number $\ge 14$.''
Having G\"odel numbered the individual symbols, we G\"odel
number the set of strings $s_0\ldots s_n$ of such symbols in the
by-now fairly obvious fashion; recall that we are using an actual
formula, not its abbreviation. (Does anyone remember the full
rules for parenthesization any more?)
\begin{description}
\item [For a string $s_0\ldots s_n$ of language symbols]
{\bf -- expressions -- (e.g., a formula):}
\[\#(s_0\ldots s_n) = \<h(s_0),h(s_1),\ldots,h(s_n)\>
= 2^{h(s_0)+1}\cdot 3^{h(s_1)+1}\cdots p_n^{h(s_n)+1}.\]
(Note that the G\"odel number of a formula is a natural number.)
\item [For a set $\Phi$ of strings:] $\#(\Phi)=\{\#(\phi):\phi\in\Phi\}$.
(Note that the G\"odel number of a set of formulas is a set of
natural numbers.)
\item [For a sequence] {\bf $\alpha_0,\ldots,\alpha_m$ of expressions:}
\[\G(\alpha_0,\ldots,\alpha_m) =
\<\#(\alpha_0),\ldots,\#(\alpha_m)\>=
2^{\#(\alpha_0)+1}\cdot 3^{\#(\alpha_1)+1}\cdots
p_m^{\#(\alpha_m)+1}.\]
\item [For a natural deduction proof:]
We don't need to store the justifications for each step; we
can infer them --- we just exhaustively check the current
step against each legal combination of previous steps to see
whether the current step is justified from those by a single
application of a single proof rule. I won't go through that
here --- but you should {\em sketch} such a construction in
your mind.
We do, however, need to store nested subproof structure ---
sort of. To do this, just as we did in the soundness
proof for natural deduction proofs, we shall consider a proof
as a sequence of {\em sequents}
$\chi_0,\chi_1,\ldots,\chi_k\vdash\psi$. We may assume that,
when we add hypotheses, they always get added onto the left.
So, for example, $\rightarrow$-introduction is expressed as
\[\begin{array}{ccc}
\chi_0,\chi_1,\ldots,\chi_k &\vdash& \psi \\\hline
\chi_1,\ldots,\chi_k &\vdash& (\chi_0\rightarrow\psi)
\end{array}\]
We shall represent such a sequent as
$\G(\chi_0,\chi_1,\ldots,\chi_k,\psi)$.
Now that we have the additional hypotheses explicitly represented,
we no longer need to represent the fact that subproofs are
{\em consecutive} sequences of steps. (The soundness proof
showed we didn't need to enforce that.) All that we need to
require is that the proof be a finite sequence of sequents, and
that each sequent follows from 0 or more preceding sequents
by the deduction rules.
Note that the $\forall$-introduction rule is strange in this
format. For $\forall$-introduction we created a subproof
with {\em no} additional assumptions but with a {\em new
variable} ``flagged.'' In this context that will translate
as no additional assumption at all --- just a check to make
sure that the variable is ``new.'' And ``new'' means just
that it doesn't appear in any of the ``hypotheses'' of the
sequent.
In order to keep the description of a proof structure relatively
straightforward, we assume that only the copy rule is allowed
to access a statement {\em outside} the current subproof ---
{\sl i.e.}, a statement whose hypotheses do not exactly match
the hypothesis where we are working.
Now the G\"odel number for a proof $P$ consisting of sequents
$s_1,\ldots,s_k$ is $\G(P)=2^{s_1+1}\cdot 3^{s_2+1}\cdots$
as usual.
\end{description}
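The symbol and expression numbering above can be sketched in Python. The ASCII stand-ins for the logical symbols are my own (\verb|'A'| for $\forall$, \verb|'~'| for $\neg$, \verb|'>'| for $\lthen$, \verb|'*'| for $\cdot$, \verb|'_'| for $\bot$), and only the fixed symbols of the table --- not the variables $v_n$ --- are handled:

```python
PRIMES = [2, 3, 5, 7, 11, 13, 17, 19]  # p_0, p_1, ...

# The function h from the table, on ASCII stand-ins for the symbols.
H = {'A': 0, '(': 1, '0': 2, ')': 3, 'S': 4, '~': 5, '<': 6,
     '>': 7, '+': 8, '=': 9, '*': 10, ',': 11, 'E': 12, '_': 13}

def godel(expr):
    """#(s_0 ... s_n) = 2^(h(s_0)+1) * 3^(h(s_1)+1) * ... * p_n^(h(s_n)+1)."""
    c = 1
    for i, s in enumerate(expr):
        c *= PRIMES[i] ** (H[s] + 1)
    return c
```

For example, the term $S0$ gets $\#(S0)=\<h(S),h(0)\>=2^{4+1}\cdot 3^{2+1}$.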
\item {\bf Thrm:} The following sets of G\"odel numbers are
recursive:
\begin{enumerate}
\item {\bf The set $Var$ of G\"odel numbers of variables $x$:} by
$x>S^{13}0$.
\item\label{term-numbers}
{\bf The set of G\"odel numbers of terms:}
This is a corollary of
Exercise~\ref{sigma-1-inductive-definability}. The set of
numbers of constant symbols is $\{2\}$, and every finite set
is recursive. Thus $\{2\}\cup Var$ is recursive. (Which result
am I quoting?)
Now the set of terms is the closure of $\{2\}\cup Var$ under the
functions creating larger terms from smaller ones:
\begin{eqnarray*}
f_S(\#(t)) &=& h(S) * \#(t) \mbox{ (Recall that $*$
is used here for concatenation.)} \\
f_+(\#(t_1),\#(t_2)) &=& h(+)*\#(t_1)*\#(t_2) \\
f_{\cdot}(\#(t_1),\#(t_2)) &=& h(\cdot)*\#(t_1)*\#(t_2) \\
f_E(\#(t_1),\#(t_2)) &=& h(E)*\#(t_1)*\#(t_2)
\end{eqnarray*}
The first concatenates a fixed constant, $h(S)=4$, onto the
front of its input string, so it is $\Sigma_1$-definable by
Theorem~\ref{concatenation}. The others are similarly seen
to be recursive. Moreover, it is trivial to check that all
four are increasing: for example, $f_S(\#(t)) \ge h(S)\cdot\#(t)$,
and since $\#(t)>0$ (0 was not a sequence code), it follows that
$f_S$ is increasing.
\item {\bf The set of G\"odel numbers of atomic formulas:} Again, we'll
use the fact that the concatenation operator is increasing ---
and if $a*b = a$ or $b*a = a$ then $b$ must be the empty sequence.
So
\[x \in AtomForm \Miff \exists y,z < x
(y \in Term \land z\in Term \land (x= h(\approx)*y*z \lor
x = h(<)*y*z)).\]
\item {\bf The set of G\"odel numbers of formulas:}
Similar to Theorem~\ref{term-numbers}.
\end{enumerate}
\item {\bf Thrm:} The following functions on G\"odel numbers are
$\Sigma_1$-definable.
\begin{enumerate}
\item {\bf The function $f(n)=\#(S^n0)$.}
\begin{eqnarray*}
f(0) &=& \<h(0)\> = \<2\> \\
f(n+1) &=& \<h(S)\>*f(n)
\end{eqnarray*}
which is $\Sigma_1$-definable by Exercise~\ref{exercise-3.3.8}.
\item\label{parsing}
{\bf The parsing functions which,} given the G\"odel number
of a term $S t_1$, $+ t_1 t_2$, $\cdot t_1 t_2$, or $E t_1
t_2$ or atomic formula $\approx t_1 t_2$ or $< t_1 t_2$,
return (i) the G\"odel number of the first operation, $S$,
$+$, $\cdot$, $E$, $\approx$, or $<$, (ii) the arity of that
operator, (iii) the first operand ($t_1$ above), and (iv), if
the operator is binary, the second operand.
{\bf The parsing functions which,} given the G\"odel number of
a formula, determine whether it is atomic, an implication
$\phi\lthen\psi$, a negation $\neg\phi$, or a universal
quantification $\forall x \phi$, and return $\phi$ and also
$\psi$ or $x$ if appropriate.
{\bf Proof:} Homework.
\item {\bf The substitution function $Sb$ where} for $\alpha(x)$ a
term or formula, $t$ a term substitutable for $x$ in $\alpha$,
and $\alpha(t)$ the result of the substitution,
$Sb(\#(\alpha(x)),\#(x),\#(t)) = \#(\alpha(t))$.
This function is defined by primitive recursion, broken into
cases using the functions above. The fact that it is in fact
an instance of primitive recursion depends upon the form of
the definition --- note that it does match the required form
for primitive recursion --- plus the fact that if $a$ is a
proper substring of $b$, then $\#(a)<\#(b)$.
\begin{eqnarray*}
Sb(\#(\approx t_1t_2),\#(x),\#(t))
&=& \left\{\begin{array}{ll}
\#(\approx t t) = \<h(\approx)\>*\#(t)*\#(t) &\mbox{if }x=t_1=t_2\\
\#(\approx t t_2)=\<h(\approx)\>*\#(t)*\#(t_2)&
\mbox{if }x=t_1\not=t_2\\
\#(\approx t_1 t)=\<h(\approx)\>*\#(t_1)*\#(t)&
\mbox{if }x=t_2\not=t_1\\
\#(\approx t_1 t_2)=\<h(\approx)\>*\#(t_1)*\#(t_2)&
\mbox{otherwise}
\end{array}\right.\\
Sb(\#(< t_1t_2),\#(x),\#(t)) &=&
\left\{\begin{array}{ll}
\#(< t t) = \<h(<)\>*\#(t)*\#(t) & \mbox{if }x=t_1=t_2\\
\#(< t t_2)=\<h(<)\>*\#(t)*\#(t_2)&\mbox{if }x=t_1\not=t_2\\
\#(< t_1 t)=\<h(<)\>*\#(t_1)*\#(t)&\mbox{if }x=t_2\not=t_1\\
\#(< t_1 t_2)=\<h(<)\>*\#(t_1)*\#(t_2)&\mbox{otherwise}
\end{array}\right.\\
Sb(\#((\neg\phi)),\#(x),\#(t)) &=& \<h((),h(\neg)\>*
Sb(\#(\phi),\#(x),\#(t))*\<h())\> \\
Sb(\#((\phi\rightarrow\psi)),\#(x),\#(t)) &=&
\<h(()\>*Sb(\#(\phi),\#(x),\#(t))
*\<h(\rightarrow)\>*
Sb(\#(\psi),\#(x),\#(t))*\<h())\> \\
Sb(\#(\forall y\phi),\#(x),\#(t)) &=&
\left\{\begin{array}{ll}
\#(\forall y\phi) & \mbox{if }x=y \\
\<h(\forall),h(y)\> *
Sb(\#(\phi),\#(x),\#(t)) & \mbox{otherwise.}
\end{array}\right.
\end{eqnarray*}
In the last clause above I used Nerode and Shore's
parenthesization. Writing the actual $\Sigma_1$ formula is
left for the student!
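For intuition, the same recursion can be carried out on parse trees rather than on G\"odel numbers. The tuple representation below is my own stand-in for the coded syntax (writing the actual $Sb$ on codes remains the exercise above):

```python
# Toy syntax as nested tuples (my stand-in for Goedel numbers):
#   ('var', n), ('0',), ('S', t), ('+', t1, t2), ('<', t1, t2),
#   ('~', phi), ('->', phi, psi), ('all', n, phi)
def Sb(alpha, n, t):
    """Substitute term t for variable v_n throughout alpha,
    leaving occurrences bound by a quantifier on v_n untouched."""
    op = alpha[0]
    if op == 'var':
        return t if alpha[1] == n else alpha
    if op == 'all':
        if alpha[1] == n:            # v_n is bound here: substitute nothing
            return alpha
        return ('all', alpha[1], Sb(alpha[2], n, t))
    # every other case: recurse on the immediate subterms/subformulas,
    # mirroring the clause-by-clause primitive recursion above
    return (op,) + tuple(Sb(sub, n, t) for sub in alpha[1:])
```

Note that each recursive call is on a proper subexpression, which is the tree-level analogue of the fact that $\#(a)<\#(b)$ for $a$ a proper substring of $b$.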
\item {\bf The function $\tail$} which, given a sequence code
$\<a_0,a_1,\ldots,a_m\>$, returns its tail (a.k.a.\ CDR),
that is, $\<a_1,\ldots,a_m\>$: Left for students.
\end{enumerate}
\item {\bf Thrm:} The following relations are recursive:
\begin{enumerate}
\item {\bf The relation $Fr$ such that $\<\#(\alpha),\#(x)\>\in Fr$ iff
$x$ occurs free in $\alpha$:}
$Sb(a,b,\#(0))\not= a$ defines $Fr$; $Sb(a,b,\#(0)) = a$
defines $\overline{Fr}$.
\item {\bf The set of G\"odel numbers of sentences:} $\#(a)$ is a sentence
iff, for all $\#(b)<\#(a)$, $b$ doesn't not occur free in $a$.
\item {\bf The set of ({\em syntactically correct}) sequents} $s$ as
described above. This just checks that $s$ is a sequence code,
$\lh(s)\ge 1$, $(s)_{\lh(s)-1}$ is (the G\"odel number of) a
formula, and for each $i<\lh(s)-1$, $(s)_i$ is either (the
G\"odel number of) a formula or $3^j$ for some $j$ in $Var$.
\item {\bf For each proof rule, the set of all valid applications
of that proof rule.}
We have to do this proof rule by proof rule --- I'll go through
a couple of examples and leave the rest for you. (But remember
that we aren't using $\exists$, $\wedge$, and $\vee$ here, so
we don't have to worry about those rules.)
\begin{itemize}
\item {\em Using an assumption}\/: $g =
\G(\alpha_0,\alpha_1,\ldots,\alpha_k)$ is a legal sequent if
$\alpha_k$ is one of the hypotheses --- one of the
$\alpha_i$'s for $i<k$.