General Recursive Function

From BusyBeaverWiki

Revision as of 05:03, 11 April 2026

General recursive functions (GRFs), also called µ-recursive functions or partial recursive functions, are the collection of partial functions ℕ^k → ℕ that are computable. This class is the same no matter which Turing-complete system of computation is used to define computability. See Wikipedia:general recursive function for background.

Historically, GRFs were defined as the smallest class of partial functions ℕ^k → ℕ that includes zero, successor, and all projections and is closed under composition, primitive recursion, and minimization (see formal definitions below). In the rest of this article, this is the formulation that we focus on exclusively. In this way, GRFs can be considered a Turing-complete model of computation. In fact, it is one of the oldest Turing-complete models, first formalized by Kurt Gödel and Jacques Herbrand in 1933, 3 years before λ-calculus and Turing machines.

BBµ(n) is a Busy Beaver function for GRFs:

BBµ(n) = max{ f() : f ∈ GRF_0, |f| = n }

where f ∈ GRF_k means that f : ℕ^k → ℕ is a k-ary GRF and |f| is the "structural size" of f (defined below). In other words, BBµ(n) is the largest number computable via a 0-ary function (a constant) with a limited "program" size. It is more akin to the traditional Sigma score for a Turing machine than to the Step function, in the sense that it maximizes over the produced value, not the number of steps needed to reach that value.

Definition

Structure

Define GRF_k inductively based on the following construction rules: start with Atoms and combine them using Combinators.

Atoms
  • Zero: ∀k, Z_k ∈ GRF_k is the constant 0 function: Z_k(x_1, …, x_k) = 0
  • Successor: S ∈ GRF_1 is the successor function: S(x) = x + 1
  • Projection: ∀k, ∀1 ≤ i ≤ k, P^k_i ∈ GRF_k is a projection function: P^k_i(x_1, …, x_k) = x_i
Combinators
  • Composition: ∀k, m, ∀h ∈ GRF_m, ∀g_1, …, g_m ∈ GRF_k, C(h, g_1, …, g_m) ∈ GRF_k is the composition or substitution of the g's into h: C(h, g_1, …, g_m)(x_1, …, x_k) = h(g_1(x_1, …, x_k), …, g_m(x_1, …, x_k))
  • Primitive Recursion: ∀k, ∀g ∈ GRF_k, ∀h ∈ GRF_{k+2}, R(g, h) ∈ GRF_{k+1} is primitive recursion using g as the base case and h as the inductive step.
  • Minimization / Unlimited Search: ∀k, ∀f ∈ GRF_{k+1}, M(f) ∈ GRF_k is the µ-operator, which allows unlimited search.
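These five rules are small enough to interpret directly. The following Python sketch is an illustrative closure-based encoding (the names Z, S, P, C, R, M mirror the article; everything else is an assumption of this sketch, not wiki code):

```python
# A minimal interpreter for the five construction rules. The closure-based
# encoding is an illustrative choice; names follow the article.

def Z(k):
    """Zero: the k-ary constant-0 function Z_k."""
    return lambda *xs: 0

def S(x):
    """Successor: S(x) = x + 1."""
    return x + 1

def P(i, k):
    """Projection P^k_i (1-indexed, as in the article)."""
    return lambda *xs: xs[i - 1]

def C(h, *gs):
    """Composition: feed every g the same arguments, pass the results to h."""
    return lambda *xs: h(*(g(*xs) for g in gs))

def R(g, h):
    """Primitive recursion over the first argument (evaluated iteratively)."""
    def rec(x1, *rest):
        v = g(*rest)                  # base case
        for i in range(x1):           # step: h(prior count, prior value, ...)
            v = h(i, v, *rest)
        return v
    return rec

def M(f):
    """Minimization: least i with f(i, ...) == 0; loops forever if none exists."""
    def mu(*xs):
        i = 0
        while f(i, *xs) != 0:
            i += 1
        return i
    return mu

# Example: Add := R(P^1_1, C(S, P^3_2)) (the Add macro later in the article)
# computes x + y.
Add = R(P(1, 1), C(S, P(2, 3)))
print(Add(3, 4))  # → 7
```

Note that M is the only construct able to loop forever, matching the remark in the Minimization section.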

Primitive Recursion

R models a typical iterative function definition over ℕ.

Base case: R(g, h)(0, x_2, x_3, …, x_k) = g(x_2, x_3, …, x_k)

Iterative case (for x_1 > 0): R(g, h)(x_1, x_2, …, x_k) = h(x_1 − 1, v, x_2, x_3, …, x_k) where v = R(g, h)(x_1 − 1, x_2, x_3, …, x_k).

R can be evaluated recursively, following its definition directly. Or it can be evaluated iteratively over its first argument, starting with 0 (and thus a call to g), then 1, 2, 3, etc. until x_1 is reached, each time calling h with the prior iteration count as its first argument and the result of the prior call as its second.
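The two evaluation strategies can be sketched side by side (plain Python; g and h are ordinary callables standing in for GRFs, and the Tri-style base/step pair at the end is an illustrative choice):

```python
# Two equivalent ways to evaluate R(g, h), per the paragraph above.
# g and h are ordinary Python callables standing in for GRFs.

def R_recursive(g, h):
    """Follow the defining equations directly (top-down)."""
    def rec(x1, *rest):
        if x1 == 0:
            return g(*rest)
        v = rec(x1 - 1, *rest)        # result of the prior call
        return h(x1 - 1, v, *rest)    # prior iteration count, prior result
    return rec

def R_iterative(g, h):
    """Iterate bottom-up: 0 (a call to g), then 1, 2, ... up to x1."""
    def rec(x1, *rest):
        v = g(*rest)
        for i in range(x1):
            v = h(i, v, *rest)
        return v
    return rec

# Both strategies agree; here g/h are chosen so R(g, h) computes Tri:
g = lambda: 0                         # Tri(0) = 0
h = lambda i, v: v + i + 1            # Tri(i+1) = Tri(i) + (i+1)
print(R_recursive(g, h)(5), R_iterative(g, h)(5))  # → 15 15
```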

Minimization

M(f)(x_1, …, x_k) ≃ min{ i : f(i, x_1, …, x_k) = 0 }

In computational language, when M(f) is evaluated it can be considered to calculate f(i,x1,...,xk) with i=0, then i=1, then i=2 etc. until one of the calls to f returns 0. It returns the value of i which first gave a result of 0. If no first argument causes f to return 0, M(f) doesn't return. (This is the only way for a GRF to not halt.)
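As a concrete sketch of this search (Python stand-ins; the example anticipates the InvTriCeil macro defined later in the article):

```python
# The mu-operator as an unbounded search. The example computes the least x
# with Tri(x) >= y, which is what the InvTriCeil macro below does.

def M(f):
    """Least i with f(i, *xs) == 0; diverges if no such i exists."""
    def mu(*xs):
        i = 0
        while f(i, *xs) != 0:
            i += 1
        return i
    return mu

tri = lambda x: x * (x + 1) // 2               # Tri(x)
# RMonusTri(i, y) = y ∸ Tri(i) is 0 exactly when Tri(i) >= y:
inv_tri_ceil = M(lambda i, y: max(y - tri(i), 0))
print(inv_tri_ceil(11))  # → 5, since Tri(4) = 10 < 11 <= Tri(5) = 15
```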

Macros

In order to improve readability we define the following macros. For all f ∈ GRF_1:

Macro | arity | Definition | Size | Function
Plus constant: Plus[n] | 1 | Plus[1] := S; Plus[n+1] := C(S, Plus[n]) | 2n − 1 | λx. x + n
Constant: K_k[n] | k | K_k[0] := Z_k; K_k[n] := C(Plus[n], Z_k) | 2n + 1 | λx_1 … x_k. n
Iteration: Rep[f, n] | 1 | Rep[f, n] := R(K_0[n], C(f, P^2_2)) | |f|+2n+4 | λx. f^x(n)
Iteration of successor: RepSucc[f] | 2 | RepSucc[f] := R(S, C(f, P^3_2)) | |f|+4 | λxy. f^x(y + 1)
Diagonalization: RepDiag[f] | 1 | RepDiag[f] := C(RepSucc[f], S, S) | |f|+7 | λx. f^(x+1)(x + 2)
Ackermann iteration: Ack[n, f] | 1 | Ack[0, f] := f; Ack[n+1, f] := Rep[Ack[n, f], 1] | 6n+|f| |
Knuth base 2 up-arrows: Knuth2[n] | 1 | Knuth2[0] := Rep[Plus[2], 0]; Knuth2[n+1] := Rep[Knuth2[n], 1] | 6n + 7 | λx. 2↑^n x
Ackermann Diagonalization on Triangular nums: ADT[n] | 1 | ADT[0] := Tri; ADT[n+1] := RepDiag[ADT[n]] | 7n + 7 | λx. > 10↑^n(x + 1)
Polygonal: Poly[n] | 1 | Poly[n] := R(Z_0, R(S, C(Plus[n], P^3_2))) | 2n + 5 | λx. (n/2)x(x − 1) + x
Tri | 1 | Tri := Poly[1] = R(Z_0, R(S, C(S, P^3_2))) | 7 | λx. x(x + 1)/2
Square | 1 | Square := Poly[2] | 9 | λx. x²
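A quick numeric sanity check of the RepDiag row (plain Python stand-ins for Tri and RepDiag, not the combinator encodings) reproduces three small champion values C(RepDiag[Tri], K_0[m]) = RepDiag[Tri](m) listed in the Champions section:

```python
# Numeric stand-ins for Tri and RepDiag (not the combinator encodings),
# reproducing three champion values from the Champions section.

def iterate(f, n, x):
    """Compute f^n(x)."""
    for _ in range(n):
        x = f(x)
    return x

tri = lambda x: x * (x + 1) // 2                         # Tri
rep_diag = lambda f: lambda x: iterate(f, x + 1, x + 2)  # RepDiag[f]: λx. f^(x+1)(x+2)

# C(RepDiag[Tri], K_0[m]) evaluates RepDiag[Tri](m):
print(rep_diag(tri)(1))  # → 21
print(rep_diag(tri)(2))  # → 1540
print(rep_diag(tri)(3))  # → 26357430
```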

Champions

n | BBµ(n) | Champion | Champion Found | Holdouts Proven
1 | = 0 | Z_0 | Shawn Ligocki, 8 Dec 2025 | By hand
2 | = 0 | M(Z_1), M(P^1_1), C(Z_0) | Shawn Ligocki, 8 Dec 2025 | By hand
3 | = 1 | K_0[1] | Shawn Ligocki, 8 Dec 2025 | By hand
4 | = 1 | C(K_0[1]) | Jacob Mandelson, 3 Apr 2026 |
5 | = 2 | K_0[2] | Jacob Mandelson, 3 Apr 2026 |
6 | = 2 | C(K_0[2]) | Jacob Mandelson, 3 Apr 2026 |
7 | = 3 | K_0[3] | Jacob Mandelson, 3 Apr 2026 |
9 | ≥ 4 | K_0[4] | |
11 | ≥ 5 | K_0[5] | |
13 | ≥ 6 | K_0[6] | |
14 | ≥ 7 | C(RepDiag[S], K_0[2]) | Shawn Ligocki, 10 Apr 2026 |
16 | ≥ 9 | C(RepDiag[S], K_0[3]) | Shawn Ligocki, 10 Apr 2026 |
17 | ≥ 10 | C(Tri, K_0[4]) | Shawn Ligocki, 9 Dec 2025 |
18 | ≥ 21 | C(RepDiag[Tri], K_0[1]) | Jacob Mandelson, 9 Apr 2026 |
20 | ≥ 1540 | C(RepDiag[Tri], K_0[2]) | Jacob Mandelson, 9 Apr 2026 |
22 | ≥ 26'357'430 | C(RepDiag[Tri], K_0[3]) | Jacob Mandelson, 9 Apr 2026 |
24 | ≥ 64'449'908'476'890'321 > 10^16.8 | C(RepDiag[Tri], K_0[4]) | Jacob Mandelson, 9 Apr 2026 |
25 | > 10^10^10^6.899 (≈ 10↑↑3.8388) | C(ADT[2], K_0[1]) | Jacob Mandelson, 9 Apr 2026 |
27 | > 10↑↑6.0834 | C(ADT[2], K_0[2]) | Jacob Mandelson, 9 Apr 2026 |
29 | > 10↑↑8.1944 | C(ADT[2], K_0[3]) | Jacob Mandelson, 9 Apr 2026 |
31 | > 10↑↑10.2800 | C(ADT[2], K_0[4]) | Jacob Mandelson, 9 Apr 2026 |
32 | > 10↑↑10↑↑8.1944 | C(ADT[3], K_0[1]) | Jacob Mandelson, 9 Apr 2026 |
34 | > 10↑↑↑4 | C(ADT[3], K_0[2]) | Jacob Mandelson, 9 Apr 2026 |
36 | > 10↑↑↑5 | C(ADT[3], K_0[3]) | Jacob Mandelson, 9 Apr 2026 |
38 | > 10↑↑↑6 | C(ADT[3], K_0[4]) | Jacob Mandelson, 9 Apr 2026 |
7k+11 | > 10↑^k 2 | C(ADT[k], K_0[1]) | Jacob Mandelson, 9 Apr 2026 |
7k+13 | > 10↑^k 3 | C(ADT[k], K_0[2]) | Jacob Mandelson, 9 Apr 2026 |
7k+15 | > 10↑^k 4 | C(ADT[k], K_0[3]) | Jacob Mandelson, 9 Apr 2026 |
7k+17 | > 10↑^k 5 | C(ADT[k], K_0[4]) | Jacob Mandelson, 9 Apr 2026 |
6k+17 | ≥ 2↑^k 4 | C(Knuth2[k], K_0[4]) | Shawn Ligocki, 8 Dec 2025 |
6k+19 | ≥ 2↑^k 5 | C(Knuth2[k], K_0[5]) | Shawn Ligocki, 8 Dec 2025 |
6k+21 | ≥ 2↑^k 6 | C(Knuth2[k], K_0[6]) | Shawn Ligocki, 8 Dec 2025 |

Macro Bounds

ADT[k]

  • ADT[0](n) = Tri(n) = n(n + 1)/2 > (1/2)n²
  • ADT[0]^k(n) > 2(n/2)^(2^k)
  • ADT[1](n) = Tri^(n+1)(n + 2) > 2((n + 2)/2)^(2^(n+1)) = 2((n + 2)²/4)^(2^n) > A·10^(10^(n/A)) where A = 1/log10(2) (if n ≥ 5; for smaller values, this can be calculated directly)
  • ADT[1]^k(n) > A·(10↑↑2k)^(n/A) > 10↑↑2k (if n ≥ 4)
  • ADT[2](n) = ADT[1]^(n+1)(n + 2) > 10↑↑(2n + 2) (if n ≥ 2)
  • ADT[k](n) > 10↑^k(n + 1) (proof by induction on k with the ADT[2] bound above as the base case)
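The ADT[1] bound can be spot-checked numerically for small n (Python; exact integers on the left of the inequality, floating point on the right — an illustrative check, not a proof):

```python
# Spot-check of ADT[1](n) > A * 10^(10^(n/A)), A = 1/log10(2), for small n.
# Exact integer arithmetic for ADT[1]; the lower bound uses floats, which
# limits the check to n <= 8 before the float overflows.
import math

tri = lambda x: x * (x + 1) // 2

def iterate(f, n, x):
    """Compute f^n(x)."""
    for _ in range(n):
        x = f(x)
    return x

adt1 = lambda n: iterate(tri, n + 1, n + 2)   # ADT[1](n) = Tri^(n+1)(n+2)

A = 1 / math.log10(2)
for n in range(5, 9):
    assert adt1(n) > A * 10 ** (10 ** (n / A))
print("bound holds for n = 5..8")
```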

Utilizing Minimization

All the current champions are primitive recursive functions. In other words, none uses the minimization combinator M. This fundamentally limits their growth rate: no primitive recursive function can grow faster than the Ackermann function, and we can see that above, where the asymptotic growth of the known BBµ bound is Ackermann growth: BBµ(6k+17) ≥ 2↑^k 4.

But, like the traditional BB function, BBµ grows uncomputably fast, so eventually it must surpass every primitive recursive function. In order to do that, it needs to use the M combinator. However, in order to do arbitrary computation, you need a way to store arbitrarily large amounts of data in a single integer and extract it back out. In other words, you need to implement a pairing function. Thus there is value in finding small pairing/unpairing functions. A set of pairing functions is a triple ⟨Pair, Left, Right⟩ such that for all a, b: Left(Pair(a, b)) = a and Right(Pair(a, b)) = b. When a function consumes both the left and right values, common subexpression elimination can be used to reduce the number of operations below that of calling Left and Right individually. The smallest known pairing functions are:
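In plain arithmetic, the triangular-number pairing used below looks like this (Python stand-ins for Pair, InvTriCeil, Left, and Right, not the combinator encodings):

```python
# The shifted Cantor pairing from the table below, in plain arithmetic
# (Python stand-ins, not the combinator encodings).

tri = lambda x: x * (x + 1) // 2          # Tri

def pair(x, y):
    """Pair(x, y) = Tri(x+y) + x + 1."""
    return tri(x + y) + x + 1

def inv_tri_ceil(y):
    """Least t with Tri(t) >= y, as M(RMonusTri) computes by search."""
    t = 0
    while tri(t) < y:
        t += 1
    return t

def left(p):
    """Left(p) = Pred(p) ∸ TriP(InvTriCeil(p)), with TriP(t) = Tri(t-1)."""
    t = inv_tri_ceil(p)
    return (p - 1) - tri(t - 1)

def right(p):
    """Right(p) = Tri(InvTriCeil(p)) ∸ p."""
    t = inv_tri_ceil(p)
    return tri(t) - p

# Round trip:
assert all(left(pair(x, y)) == x and right(pair(x, y)) == y
           for x in range(8) for y in range(8))
print(pair(2, 3), left(18), right(18))  # → 18 2 3
```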

Smallest Pairing Functions
Macro | arity | Definition | Size | Function
Pair | 2 | Pair := C(AddS, C(Tri, Add), P^2_1) | 20 | λxy. (x + y)(x + y + 1)/2 + x + 1
Left | 1 | Left := C(RMonus, C(TriP, InvTriCeil), Pred) | 38 | Left(Pair(x, y)) = x
Right | 1 | Right := C(RMonus, P^1_1, C(Tri, InvTriCeil)) | 36 | Right(Pair(x, y)) = y
LRCall[f] (f ∈ GRF_2) | 1 | LRCall[f] := C(LRpart3[f], InvTriCeil, P^1_1) | 61+|f| | LRCall[f](Pair(x, y)) = f(x, y)

Where these are based on the following definitions:

Macros
Macro | arity | Definition | Size | Function
Addition: Add | 2 | Add := R(P^1_1, C(S, P^3_2)) | 5 | λxy. x + y
AddS | 2 | AddS := R(S, C(S, P^3_2)) | 5 | λxy. x + y + 1
Predecessor: Pred | 1 | Pred := R(Z_0, P^2_1) | 3 | λx. x ∸ 1
Monus: RMonus | 2 | RMonus := R(P^1_1, C(Pred, P^3_2)) | 7 | λxy. y ∸ x
Triangular numbers: Tri | 1 | Tri := R(Z_0, AddS) | 7 | λx. x(x + 1)/2
TriP | 1 | TriP := R(Z_0, Add) | 7 | TriP(x + 1) = Tri(x)
Inverting Tri: RMonusTri | 2 | RMonusTri := C(RMonus, C(Tri, P^2_1), P^2_2) | 18 | λxy. y ∸ Tri(x)
InvTriCeil | 1 | InvTriCeil := M(RMonusTri) | 19 | λy. min{x : Tri(x) ≥ y}
Combined LRCall: RightPiece | 3 | RightPiece := R(P^2_2, C(Pred, P^4_2)) | 7 | λxyz. z ∸ x
LeftPiece | 3 | LeftPiece := C(RMonus, C(S, P^3_2), P^3_1) | 12 | λxyz. x ∸ (y + 1)
LRpart1[f] | 3 | LRpart1[f] := C(f, LeftPiece, RightPiece) | 20+|f| | λxyz. f(x ∸ (y + 1), z ∸ x)
LRpart2[f] | 3 | LRpart2[f] := C(LRpart1[f], P^3_3, P^3_1, R(P^2_1, C(S, P^4_2))) | 28+|f| | λxyz. f(z ∸ (x + 1), (x + y) ∸ z)
LRpart3[f] | 2 | LRpart3[f] := C(LRpart2[f], C(TriP, P^2_1), P^2_1, P^2_2) | 40+|f| | λxy. f(y ∸ (TriP(x) + 1), (TriP(x) + x) ∸ y)
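As a sanity check on the construction, the λ-forms of TriP, InvTriCeil, and LRpart3 compose to recover f from a paired value (Python stand-ins for the λ-forms, not the combinator encodings; monus is truncated subtraction):

```python
# Checking that LRpart3/LRCall recover f(x, y) from Pair(x, y), using the
# λ-forms in the tables above (Python stand-ins, not combinator encodings).

monus = lambda a, b: max(a - b, 0)             # a ∸ b (truncated subtraction)
tri = lambda x: x * (x + 1) // 2               # Tri
trip = lambda x: tri(x - 1) if x > 0 else 0    # TriP: TriP(x+1) = Tri(x)

def inv_tri_ceil(y):
    """Least t with Tri(t) >= y."""
    t = 0
    while tri(t) < y:
        t += 1
    return t

pair = lambda x, y: tri(x + y) + x + 1         # Pair

def lr_call(f):
    """LRCall[f] = λp. LRpart3[f](InvTriCeil(p), p)."""
    def lrpart3(t, p):
        # λxy. f(y ∸ (TriP(x) + 1), (TriP(x) + x) ∸ y)
        return f(monus(p, trip(t) + 1), monus(trip(t) + t, p))
    return lambda p: lrpart3(inv_tri_ceil(p), p)

f = lambda x, y: 100 * x + y                   # an arbitrary 2-ary test function
assert all(lr_call(f)(pair(x, y)) == f(x, y)
           for x in range(6) for y in range(6))
print(lr_call(f)(pair(3, 4)))  # → 304
```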