In the previous post we saw two iterative sorting algorithms: selection sort and insertion sort. Both work by using a loop to put the right values in the right places, one at a time. They are two of the simplest sorting algorithms, but they're not very efficient: both take quadratic time, so for long lists they're too slow.(1)
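As a reminder, an iterative sort like the ones above can be sketched in a few lines. Here is one way to write insertion sort in Python (the function name and details here are illustrative, not necessarily the code from the previous post):

```python
def insertion_sort(values):
    """Sort a list in place by inserting each value into the sorted prefix."""
    for i in range(1, len(values)):
        current = values[i]
        j = i - 1
        # Shift larger values one place to the right to make room for `current`.
        while j >= 0 and values[j] > current:
            values[j + 1] = values[j]
            j -= 1
        values[j + 1] = current
    return values
```

The nested loop is the source of the inefficiency: in the worst case, each of the n values is compared against every value before it.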
When we need a search algorithm, the data structure we're searching matters more than what we're searching for. We could even say that linear search and binary search work on different data structures: linear search works on any list, while binary search requires a list kept in sorted order.(2)
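The contrast is easy to see side by side. The following sketch (standard textbook versions, not necessarily the code from the earlier post) shows that binary search's halving step only makes sense because the list is sorted:

```python
def linear_search(items, target):
    """Check each item in turn; works on any list, sorted or not."""
    for index, item in enumerate(items):
        if item == target:
            return index
    return None

def binary_search(sorted_items, target):
    """Repeatedly halve the search range; requires a sorted list."""
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1   # target can only be in the upper half
        else:
            high = mid - 1  # target can only be in the lower half
    return None
```

If the input to `binary_search` is not sorted, the halving logic draws wrong conclusions and the function may miss values that are present.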
Like all important ideas, recursion can be thought about in many ways, and there is no single “correct” or “best” way to think about it. Rather, each way of thinking is useful for some purposes, but not helpful in every situation.
So far, we have seen:
- What recursion is,
- How to find recursive structures in problems, and
- How to write simple recursive functions to solve those problems.
We haven’t yet thought about how recursive functions actually work: what happens when a recursive function is called? In this post we’ll see three different ways to think about how recursive calls are computed.
A linked list is a recursive structure — a linked list node has a value, and a pointer to the next linked list node. This means we can solve problems on linked lists using structural recursion. In this post we’ll see three examples, to demonstrate the process of writing a structurally recursive algorithm.
Recursion is one of the most difficult ideas in programming, but sometimes it can be simple. A recursive function is a function which can call itself. Some recursive functions are very puzzling, but others are so natural that you can understand them even without knowing what recursion is.
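The factorial function is a standard example of this natural kind (chosen here for illustration, not necessarily the post's own example): the code reads almost exactly like the mathematical definition n! = n × (n−1)!, with 0! = 1.

```python
def factorial(n):
    """Compute n! the natural recursive way: n! = n * (n-1)!, and 0! = 1."""
    if n == 0:
        return 1                     # base case
    return n * factorial(n - 1)      # the function calls itself on a smaller input
```

Even a reader who has never heard the word "recursion" can follow this: 5! is 5 times 4!, and so on down to 0!.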