CS Grade 9-12 Answer Key

CS: Merge Sort and Quick Sort Algorithms

Comparing divide-and-conquer sorting strategies

Answer Key
Name:
Date:
Score: / 12


Instructions: Read each problem carefully. Show your work in the space provided. For tracing problems, write the array after each important step.
  1.

    Trace one full merge sort on the array [8, 3, 5, 1]. Show how the array is split and then merged into sorted order.

    Keep splitting until each subarray has one element, then merge in sorted order.

    The array splits into [8, 3] and [5, 1], then into [8], [3], [5], and [1]. The pairs merge as [3, 8] and [1, 5]. Finally, [3, 8] and [1, 5] merge into [1, 3, 5, 8].
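The trace above can be sketched in code. This is a minimal recursive merge sort (an illustrative sketch, not part of the worksheet): it splits until each piece has one element, then merges pairs in sorted order, exactly as in the trace.

```python
def merge_sort(arr):
    """Recursively split the array, then merge the sorted halves."""
    if len(arr) <= 1:
        return arr  # a single element is already sorted
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])    # e.g. [8, 3] -> [3, 8]
    right = merge_sort(arr[mid:])   # e.g. [5, 1] -> [1, 5]
    # Merge the two sorted halves by repeatedly taking the smaller front value.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([8, 3, 5, 1]))  # [1, 3, 5, 8]
```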
  2.

    Trace the partition step of quick sort on the array [6, 2, 8, 4, 1] using 6 as the pivot. Write the values that should go to the left and right of the pivot after partitioning.

    Compare each of the other values with the pivot value 6.

    The values less than 6 are 2, 4, and 1, so they go to the left of the pivot. The value greater than 6 is 8, so it goes to the right. After partitioning, one valid arrangement is [2, 4, 1, 6, 8].
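A simple list-based partition (one of several valid partition schemes, shown here only as a sketch) reproduces this arrangement, because list filtering keeps the smaller values in their original order:

```python
def partition_around_pivot(rest, pivot):
    """Place values smaller than the pivot on its left, larger on its right."""
    left = [x for x in rest if x < pivot]    # values less than the pivot
    right = [x for x in rest if x > pivot]   # values greater than the pivot
    return left + [pivot] + right

values = [6, 2, 8, 4, 1]
pivot = values[0]                 # pivot 6, as in the problem
rest = values[1:]                 # the values to partition
print(partition_around_pivot(rest, pivot))  # [2, 4, 1, 6, 8]
```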
  3.

    Explain why merge sort is called a divide-and-conquer algorithm.

    Merge sort is called divide-and-conquer because it divides the array into smaller subarrays, solves each smaller sorting problem recursively, and then combines the sorted subarrays by merging them.
  4.

    For the array [9, 7, 5, 3, 1], suppose quick sort always chooses the first element as the pivot. Explain why this is a poor pivot choice for this array.

    A good pivot usually divides the array into two smaller parts of similar size.

    Choosing the first element as the pivot is poor because 9 is the largest value, so one side of the partition is empty and the other side contains almost all the remaining elements. This creates unbalanced partitions and can lead to O(n^2) running time.
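A short check (an illustrative sketch, not from the worksheet) makes the imbalance visible: with 9 as the pivot, every remaining value falls on one side and the other side is empty.

```python
values = [9, 7, 5, 3, 1]
pivot = values[0]  # first-element pivot: 9, the largest value
smaller = [x for x in values[1:] if x < pivot]
larger = [x for x in values[1:] if x > pivot]
# One side holds all remaining elements, the other is empty:
print(smaller, larger)  # [7, 5, 3, 1] []
```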
  5.

    What is the worst-case time complexity of merge sort, and why does it have that complexity?

    The worst-case time complexity of merge sort is O(n log n). The algorithm makes about log n levels of recursive splitting, and each level performs O(n) total work to merge the elements.
  6.

    What is the worst-case time complexity of quick sort, and what type of partitioning causes it?

    Think about what happens if each recursive call only removes one element from the problem.

    The worst-case time complexity of quick sort is O(n^2). This happens when each partition is extremely unbalanced, such as when the pivot is always the smallest or largest element.
  7.

    Compare the extra memory usage of merge sort and quick sort in a typical array implementation.

    Merge sort usually requires O(n) extra memory because it creates temporary arrays during merging. Quick sort is often done in place and usually needs only O(log n) extra memory for the recursion stack when partitions are balanced.
  8.

    The array [4, 1, 6, 2, 5, 3] is being sorted using merge sort. After the array is split into [4, 1, 6] and [2, 5, 3], what are the sorted versions of those two halves before the final merge?

    Sort each half independently before combining them.

    The sorted version of [4, 1, 6] is [1, 4, 6]. The sorted version of [2, 5, 3] is [2, 3, 5]. These two sorted halves are ready for the final merge.
  9.

    Merge the two sorted lists [2, 6, 9] and [1, 5, 7, 10]. Show the final sorted list.

    The final sorted list is [1, 2, 5, 6, 7, 9, 10]. This is found by repeatedly choosing the smaller front value from the two sorted lists.
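The merge step described above can be written as a short function (a sketch for illustration): keep an index into each list, append whichever front value is smaller, then copy over whatever remains.

```python
def merge(a, b):
    """Merge two sorted lists by repeatedly taking the smaller front value."""
    result = []
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            result.append(a[i])
            i += 1
        else:
            result.append(b[j])
            j += 1
    # One list is exhausted; append the leftovers from the other.
    result.extend(a[i:])
    result.extend(b[j:])
    return result

print(merge([2, 6, 9], [1, 5, 7, 10]))  # [1, 2, 5, 6, 7, 9, 10]
```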
  10.

    A quick sort implementation chooses a random pivot. Explain one reason this can improve performance compared with always choosing the first element.

    Random choices make it harder for a specific input order to always produce bad partitions.

    Choosing a random pivot can improve performance because it reduces the chance of repeatedly making very unbalanced partitions. This makes the average running time more likely to be O(n log n).
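One way to sketch random pivot selection (a simple, non-in-place version for illustration, not a production implementation) uses Python's `random.choice`; even on the already-sorted-in-reverse array from problem 4, a random pivot usually avoids the worst case.

```python
import random

def quick_sort(arr):
    """Quick sort with a randomly chosen pivot (simple, not in-place)."""
    if len(arr) <= 1:
        return arr
    pivot = random.choice(arr)  # random pivot: bad splits become unlikely
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quick_sort(left) + middle + quick_sort(right)

print(quick_sort([9, 7, 5, 3, 1]))  # [1, 3, 5, 7, 9]
```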
  11.

    A sorting algorithm is stable if equal values keep their original relative order. Is standard merge sort usually stable, and why?

    Standard merge sort is usually stable when the merge step takes the left element first when two values are equal. This preserves the original relative order of equal values from the input.
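Stability can be demonstrated by sorting (key, label) pairs (an illustrative sketch with made-up labels): because the merge uses `<=` and so takes the left item first on ties, equal keys keep their original relative order.

```python
def merge_sort_pairs(items):
    """Merge sort on (key, label) pairs.

    Using '<=' takes the left item first when keys tie,
    which preserves the original order of equal keys (stability).
    """
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort_pairs(items[:mid])
    right = merge_sort_pairs(items[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i][0] <= right[j][0]:   # '<=' keeps the left (earlier) item first
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

pairs = [(2, 'a'), (1, 'b'), (2, 'c'), (1, 'd')]
# 'b' stays before 'd', and 'a' stays before 'c':
print(merge_sort_pairs(pairs))  # [(1, 'b'), (1, 'd'), (2, 'a'), (2, 'c')]
```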
  12.

    You need to sort a very large array, and you want guaranteed O(n log n) worst-case time. Between merge sort and quick sort, which would you choose and why?

    Focus on the guarantee in the worst case, not just average performance.

    Merge sort is the better choice if guaranteed O(n log n) worst-case time is required. Quick sort has O(n log n) average time, but it can degrade to O(n^2) in the worst case with poor pivot choices.
LivePhysics™.com CS - Grade 9-12 - Answer Key