## Recursive algorithms

### September 14, 2013

A *recursive* algorithm calls itself with a subset of the initial data until a base case is reached (e.g. the data cannot be split further).

The canonical example is the factorial operation: $n! = n \cdot (n-1)!$, with base case $0! = 1! = 1$.

A straightforward Python function that implements this idea follows:

```python
def fact(x):
    if x <= 1:
        return 1
    else:
        return x * fact(x - 1)
```

In practice, recursion is a natural fit for divide-and-conquer (DNQ) algorithms (i.e. those in which the initial problem is recursively split into more manageable sub-problems), and can substantially reduce the overall running time, as discussed in the next section.
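A classic DNQ algorithm is merge sort, sketched below (this implementation is my own illustration, not from the original post): the input is split in half, each half is sorted recursively, and the two sorted halves are merged in linear time.

```python
def merge_sort(data):
    """Sort a list by recursively splitting it and merging the sorted halves."""
    if len(data) <= 1:  # base case: the data cannot be split further
        return data
    mid = len(data) // 2
    left = merge_sort(data[:mid])    # recurse on each half
    right = merge_sort(data[mid:])
    # Merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```

Each call makes two recursive calls on half the data and does linear preprocessing/merging work, which is exactly the pattern the Master Theorem below analyzes.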

**Analysis: the Master Theorem**

An asymptotic estimate of the running time of a recursive DNQ algorithm with *deterministically equal size subproblems* is given by the following recurrence: $T(n) = a\,T(n/b) + O(n^d)$, where $n$ is the data size, $a$ and $b$ are the number of recursive calls per level (i.e. the branching factor of the recursion tree) and the associated reduction in input size, and $O(n^d)$ describes the computational effort associated with preprocessing (i.e. all that is done before the recursions begin).

One can then identify three cases (for simplicity, $n$ is assumed to be a power of $b$):

- If $d > \log_b a$ then $T(n) = O(n^d)$
- If $d = \log_b a$ then $T(n) = O(n^d \log n)$
- If $d < \log_b a$ then $T(n) = O(n^{\log_b a})$
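As a small illustration (this helper is my own sketch, not part of the original post), the three cases can be encoded directly by comparing $d$ against the critical exponent $\log_b a$:

```python
import math

def classify(a, b, d):
    """Return the asymptotic class of T(n) = a*T(n/b) + O(n^d)
    according to the Master Theorem (illustrative helper)."""
    crit = math.log(a, b)  # critical exponent log_b(a)
    if math.isclose(d, crit):
        return f"O(n^{d} log n)"
    if d > crit:
        return f"O(n^{d})"
    return f"O(n^(log_{b} {a}))"

# Merge sort: a = 2, b = 2, d = 1, so d = log_2(2) and T(n) = O(n log n)
print(classify(2, 2, 1))
```

For instance, binary search ($a = 1$, $b = 2$, $d = 0$) falls in the middle case and gives $O(\log n)$, while merge sort ($a = 2$, $b = 2$, $d = 1$) gives $O(n \log n)$.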