Read Sections 5.1–5.6 in Chapter 5 of the textbook Explorations in Computing.
For example, if we call sum_odd([3, 1, 4, 1, 5, 9, 2]), then the function should return the value 19 since 3 + 1 + 1 + 5 + 9 = 19.
Hint: what should the function return if the list is empty? What should it return if the first item in the list is odd? What should it return if the first item is not odd?
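One way the hints above translate into code is the recursive sketch below. The function name sum_odd comes from the problem statement; the three-case structure (empty list, odd first item, even first item) follows the three hint questions, though your own solution may differ.

```ruby
# A recursive sketch of sum_odd, following the three hints:
# an empty list sums to 0; an odd first item is included;
# an even first item is skipped.
def sum_odd(list)
  return 0 if list.empty?          # empty list: nothing to add
  rest = sum_odd(list[1..-1])      # recursively sum the remaining items
  if list[0].odd?
    list[0] + rest                 # first item is odd: include it
  else
    rest                           # first item is even: skip it
  end
end
```

For example, sum_odd([3, 1, 4, 1, 5, 9, 2]) returns 19, matching the worked example above.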
One way to convince yourself that the algorithm works is to show that every possible position in the ordered list, including both the values themselves (when the key is found) and the positions between values (when the key is not found and the binary search should return nil), can be derived by following the algorithm for updating mid, low, and high. To do this, we're going to construct a complete binary search tree. First we introduce some notation for depicting nodes in the search tree:

  def binsearch(list, key)
    low = -1
    high = list.length
    while low+1 < high
      mid = (low+high) / 2
      if key == list[mid] then
        return mid
      elsif key < list[mid] then
        high = mid
      else
        low = mid
      end
    end
    return nil
  end


(i) In the diagram below, we've filled in the numeric values for three of the nodes for the case where the key is "Aardvark". The binary search always begins with low = -1 and high = 5, giving mid = 2, which puts us at "Charlie". So all searches of this list start at "Charlie". If the key is less than this value, the algorithm tells us to set high equal to mid, so we have a high of 2 and a low of -1, which gives a mid of 0, taking us to node "Alpha", as shown. If we go left from "Alpha" (because the key is less than "Alpha"), the next node has a high of 0 and a low of -1. Since now low+1 = high, the algorithm returns nil, making this node a terminal (leaf) node. Print out the diagram, or redraw it yourself, and complete the tree by following the algorithm to fill in all the missing values, so that every node has either two or three numbers written next to it. For example, you might choose a key of "Bravo" and see how the algorithm gets to that node. Then choose a key of "Banana" and see at which leaf node the algorithm ends. And so on. Include your complete annotated tree in the pages you hand in.
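To check the values you fill in by hand, it can help to instrument binsearch so it records (low, high, mid) at every step. The sketch below does exactly that; the five-name list is hypothetical (only "Alpha", "Bravo", and "Charlie" appear in the original diagram), so substitute the actual list from your printout.

```ruby
# A version of binsearch that also returns the sequence of
# (low, high, mid) triples it visits, one per loop iteration.
def binsearch_trace(list, key)
  low = -1
  high = list.length
  steps = []
  while low + 1 < high
    mid = (low + high) / 2
    steps << [low, high, mid]         # record this node of the search tree
    if key == list[mid]
      return [mid, steps]             # key found at position mid
    elsif key < list[mid]
      high = mid                      # go left
    else
      low = mid                       # go right
    end
  end
  [nil, steps]                        # leaf node: low + 1 == high
end

# Hypothetical 5-element list standing in for the one in the diagram:
names = ["Alpha", "Bravo", "Charlie", "Delta", "Echo"]
pos, steps = binsearch_trace(names, "Aardvark")
# Visits (-1, 5, 2) at "Charlie", then (-1, 2, 0) at "Alpha",
# then stops with low = -1, high = 0 and returns nil.
```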
(iii) Notice that all the nonterminal (branch) nodes contain elements of the list. What pattern do you notice in their mid values?
Non-Destructive Merge Sort Algorithm:
1. Sort the first half of the array using merge sort.
2. Sort the second half of the array using merge sort.
3. Merge the two sorted halves to get a sorted result array.
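The three steps above can be sketched in Ruby as follows. The names msort and merge are our own; "non-destructive" here means the input array is left unmodified and a new sorted array is returned.

```ruby
# Merge two already-sorted arrays into one sorted array.
def merge(a, b)
  result = []
  i = j = 0
  while i < a.length && j < b.length
    if a[i] <= b[j]
      result << a[i]; i += 1
    else
      result << b[j]; j += 1
    end
  end
  result + a[i..-1] + b[j..-1]   # append whichever half has leftovers
end

# Non-destructive merge sort: steps 1-3 from the outline above.
def msort(list)
  return list.dup if list.length <= 1            # base case: already sorted
  mid = list.length / 2
  merge(msort(list[0...mid]), msort(list[mid..-1]))
end
```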
The quick sort algorithm picks the middle element of the list (list[list.length/2]), which we will call the pivot. All of the remaining elements that are less than the pivot go into one list, and all of those that are greater than or equal to the pivot go into a second list. Then we sort these two sublists (recursively). The final sorted list is the first sorted sublist, followed by the pivot, followed by the second sorted sublist.
For example, if we want to sort the list
list = [56, 42, 82, 75, 18, 58, 27, 61, 84, 41, 21, 15, 71, 90, 33]
we create two lists, separating the elements based on the pivot 61:
list1 = [56, 42, 18, 58, 27, 41, 21, 15, 33]
list2 = [82, 75, 84, 71, 90]
Each sublist is sorted recursively (final results shown for each sublist):
list1 = [15, 18, 21, 27, 33, 41, 42, 56, 58]
list2 = [71, 75, 82, 84, 90]
and the final result is
[15, 18, 21, 27, 33, 41, 42, 56, 58] + [61] + [71, 75, 82, 84, 90] => [15, 18, 21, 27, 33, 41, 42, 56, 58, 61, 71, 75, 82, 84, 90]
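The worked example above can be reproduced with a sketch like the one below. The name qsort is our own; removing the pivot occurrence before partitioning ensures each recursive call handles a strictly shorter list.

```ruby
# Quick sort as described above: the pivot is the middle element;
# smaller items form list1, items >= the pivot form list2.
def qsort(list)
  return list.dup if list.length <= 1             # base case
  pivot_index = list.length / 2
  pivot = list[pivot_index]
  # everything except the pivot occurrence itself:
  rest = list[0...pivot_index] + list[pivot_index + 1..-1]
  list1 = rest.select { |x| x < pivot }            # less than the pivot
  list2 = rest.select { |x| x >= pivot }           # greater than or equal
  qsort(list1) + [pivot] + qsort(list2)
end

qsort([56, 42, 82, 75, 18, 58, 27, 61, 84, 41, 21, 15, 71, 90, 33])
# => [15, 18, 21, 27, 33, 41, 42, 56, 58, 61, 71, 75, 82, 84, 90]
```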
Consider this quick sort algorithm sorting a list of n integers. Into what two sublists would this algorithm split the list [56, 42, 18, 58, 27, 41, 21, 15, 33]?
This algorithm is recursive. What is the base case for this recursion? (In other words, when does quick sort stop partitioning a list into two sublists?) BE CAREFUL: This is a little trickier than it seems.
The two sublists list1 and list2 need not be of the same length. One could even be empty. It depends on the value of the pivot element. Explain what sort of input would ensure that this algorithm would have an order of complexity of O(n log n). Give a simple example to justify your answer.
Explain how an unlucky sequence of pivots could give this algorithm a worst-case complexity of O(n^2). Give a simple example to justify your answer.