Ch. 10 Algorithm Efficiency & Sorting
O( ): Big-Oh
An algorithm is said to take O(f(n)) time if its running time is upper-bounded by c·f(n), e.g., O(n), O(n log n), O(n²), O(2ⁿ), …
Formal definition: O(f(n)) = { g(n) | ∃c > 0, n₀ ≥ 0 s.t. ∀n ≥ n₀, cf(n) ≥ g(n) }
Strictly speaking, g(n) ∈ O(f(n)) is the correct notation, but by convention we write g(n) = O(f(n)).
Intuitive meaning: g(n) = O(f(n)) ⇒ g grows no faster than f
Ω(f(n)): the functions that grow at least at the rate of f(n); symmetric to O(f(n))
Θ(f(n)): the functions that grow at exactly the rate of f(n)
Time requirements as a function of the problem size n
A comparison of growth-rate functions
A comparison of growth-rate functions
Types of Running-Time Analysis
Worst-case analysis: analysis for the worst-case input(s)
Average-case analysis: analysis averaged over all inputs; more difficult to carry out
Best-case analysis: analysis for the best-case input(s); mostly not meaningful
Running Time for Search in an Array
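The best/worst/average distinction above is easiest to see on a concrete search routine. A minimal C sketch of linear search (the function name and sample data are my own, not from the slides):

```c
/* Linear search: return the index of target in a[0..n-1], or -1 if absent.
   Best case: 1 comparison (target at index 0).
   Worst case: n comparisons (target absent or last).
   Average case: about n/2 comparisons, i.e., O(n). */
int linearSearch(const int a[], int n, int target) {
    for (int i = 0; i < n; ++i)
        if (a[i] == target)
            return i;
    return -1;
}
```

For example, searching {29, 10, 14, 37, 13} for 37 returns index 3 after four comparisons, while searching for a missing key costs all n comparisons.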
Sorting Algorithms
Selection Sort: one iteration
Find the largest item
Swap it into the rightmost place
Exclude the rightmost item
Repeat this iteration until only one item remains
[Figure: one iteration, finding the largest item]
Worst case: O(n²)   Average case: O(n²)
selectionSort(theArray[], n) {
  for (last = n-1; last >= 1; last--) {
    largest = indexOfLargest(theArray, last+1);
    Swap theArray[largest] and theArray[last];
  }
}

indexOfLargest(theArray, size) {
  largest = 0;
  for (i = 1; i < size; ++i)
    if (theArray[i] > theArray[largest])
      largest = i;
  return largest;
}

Running time is governed by the total number of for-loop iterations across the two functions: indexOfLargest is called n-1 times, and each successive call does one step less work than the previous one.
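The pseudocode above translates almost line for line into C; a runnable sketch (with the swap written out explicitly):

```c
/* Index of the largest element in theArray[0..size-1]. */
int indexOfLargest(const int theArray[], int size) {
    int largest = 0;
    for (int i = 1; i < size; ++i)
        if (theArray[i] > theArray[largest])
            largest = i;
    return largest;
}

/* Selection sort: move the largest remaining item to the rightmost
   unsorted position, then shrink the unsorted region by one.
   Theta(n^2) comparisons in every case. */
void selectionSort(int theArray[], int n) {
    for (int last = n - 1; last >= 1; --last) {
        int largest = indexOfLargest(theArray, last + 1);
        int tmp = theArray[largest];           /* swap largest and last */
        theArray[largest] = theArray[last];
        theArray[last] = tmp;
    }
}
```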
Bubble Sort
Worst case: O(n²)   Average case: O(n²)
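The slide states only the complexities; a common C formulation is sketched below (the early-exit flag, which gives O(n) on already-sorted input, is an addition of mine, not from the slides):

```c
#include <stdbool.h>

/* Bubble sort: adjacent out-of-order pairs are swapped; after pass k
   the k largest items are in their final places on the right.
   O(n^2) in the worst and average cases. */
void bubbleSort(int a[], int n) {
    bool swapped = true;
    for (int pass = 1; pass < n && swapped; ++pass) {
        swapped = false;
        for (int i = 0; i < n - pass; ++i) {
            if (a[i] > a[i + 1]) {
                int tmp = a[i];                 /* swap neighbors */
                a[i] = a[i + 1];
                a[i + 1] = tmp;
                swapped = true;                 /* array not yet sorted */
            }
        }
    }
}
```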
Insertion Sort
Worst case: 1+2+···+(n-2)+(n-1) = n(n-1)/2, i.e., O(n²)
Average case: ½ (1+2+···+(n-2)+(n-1)) = n(n-1)/4, i.e., O(n²)
An insertion sort partitions the array into two regions
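The two-region view can be sketched directly in C: a[0..i-1] is the sorted region, and each new item is shifted into place within it (names are my own):

```c
/* Insertion sort: the sorted region grows from the left; each item
   from the unsorted region is inserted among a[0..i-1] by shifting
   larger sorted items one slot to the right. */
void insertionSort(int a[], int n) {
    for (int i = 1; i < n; ++i) {
        int key = a[i];                /* next item from the unsorted region */
        int j = i - 1;
        while (j >= 0 && a[j] > key) {
            a[j + 1] = a[j];           /* shift a sorted item right */
            --j;
        }
        a[j + 1] = key;                /* insert into the sorted region */
    }
}
```

On a reverse-sorted input the while loop runs i times per item, giving the 1+2+···+(n-1) worst case above.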
Mergesort Algorithm mergeSort(S)
{ // Input: sequence S with n elements
  // Output: sorted sequence S
  if (S.size() > 1) {
    Let S1, S2 be the 1st half and 2nd half of S, respectively;
    mergeSort(S1);
    mergeSort(S2);
    S ← merge(S1, S2);
  }
}

Algorithm merge(S1, S2) {
  // Merge the two sorted sequences S1 and S2 into a single sorted sequence S
}
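A concrete C version of the algorithm, using the auxiliary temporary array shown on a later slide (the index arithmetic and function names are mine):

```c
#include <stdlib.h>
#include <string.h>

/* Merge the sorted halves a[lo..mid] and a[mid+1..hi] into tmp,
   then copy the merged run back into a. */
static void merge(int a[], int tmp[], int lo, int mid, int hi) {
    int i = lo, j = mid + 1, k = lo;
    while (i <= mid && j <= hi)
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i <= mid) tmp[k++] = a[i++];   /* leftover left half  */
    while (j <= hi)  tmp[k++] = a[j++];   /* leftover right half */
    memcpy(a + lo, tmp + lo, (size_t)(hi - lo + 1) * sizeof(int));
}

static void mergeSortRec(int a[], int tmp[], int lo, int hi) {
    if (lo < hi) {                         /* more than one element */
        int mid = lo + (hi - lo) / 2;
        mergeSortRec(a, tmp, lo, mid);     /* sort 1st half */
        mergeSortRec(a, tmp, mid + 1, hi); /* sort 2nd half */
        merge(a, tmp, lo, mid, hi);
    }
}

/* Entry point: O(n log n) in every case. */
void mergeSort(int a[], int n) {
    int *tmp = malloc((size_t)n * sizeof(int));
    if (tmp == NULL) return;               /* allocation failed */
    mergeSortRec(a, tmp, 0, n - 1);
    free(tmp);
}
```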
Animation (Mergesort)
[Animation frames: 7 2 9 4 is split into 7 2 | 9 4, then 7 | 2]
Animation (Mergesort)
[Animation frames: merging back up: 7 2 → 2 7, 9 4 → 4 9, then 2 7 | 4 9 → 2 4 7 9]
A mergesort with an auxiliary temporary array
A mergesort of an array of six integers
A worst-case instance of the merge step in mergesort
4/04/2018 (Lecture 10)
Quicksort Algorithm quickSort(S)
{ // Input: sequence S with n elements
  // Output: sorted sequence S
  if (S.size() > 1) {
    x ← pivot of S;
    (L, R) ← partition(S, x);  // L: left partition, R: right partition
    quickSort(L);
    quickSort(R);
    return L • x • R;  // concatenation
  }
}

Algorithm partition(S, x) {
  // Put the items of S smaller than x into partition L
  // and the items greater than or equal to x into partition R
}
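An in-place C sketch of the same idea, using the Lomuto scheme (one of the many partitioning methods the later slide alludes to; the last element serves as pivot, a choice of mine):

```c
/* Swap helper. */
static void swapInt(int *x, int *y) { int t = *x; *x = *y; *y = t; }

/* Lomuto partition of a[lo..hi] around pivot a[hi]: items smaller
   than the pivot end up on the left (L), items >= pivot on the
   right (R); returns the pivot's final index. */
static int partition(int a[], int lo, int hi) {
    int pivot = a[hi];
    int i = lo;                       /* boundary of the "< pivot" region */
    for (int j = lo; j < hi; ++j)
        if (a[j] < pivot)
            swapInt(&a[i++], &a[j]);
    swapInt(&a[i], &a[hi]);           /* place the pivot between L and R */
    return i;
}

/* Quicksort: O(n log n) on average, O(n^2) in the worst case. */
void quickSort(int a[], int lo, int hi) {
    if (lo < hi) {
        int p = partition(a, lo, hi);
        quickSort(a, lo, p - 1);      /* sort left partition L  */
        quickSort(a, p + 1, hi);      /* sort right partition R */
    }
}
```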
Animation (Quicksort)
A partition with a pivot
There are many ways to partition; this is just one of them.
Radix Sort radixSort(A[ ], d)
{ // Sort n d-digit integers in the array A[]
  for (j = d downto 1)
    Do a stable sort on A[] by the jth digit;
}

Stable sort: a sort in which items with equal keys keep their original relative order after sorting.
Input:                      0123 2154 0222 0004 0283 1560 1061 2150
After 1st pass (ones):      1560 2150 1061 0222 0123 0283 2154 0004
After 2nd pass (tens):      0004 0222 0123 2150 2154 1560 1061 0283
After 3rd pass (hundreds):  0004 1061 0123 2150 2154 0222 0283 1560
After 4th pass (thousands): 0004 0123 0222 0283 1061 1560 2150 2154
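This trace can be reproduced with one stable counting-sort pass per digit. A C sketch (using counting sort for the per-digit stable sort is my choice; the slides only require some stable sort):

```c
#include <stdlib.h>
#include <string.h>

/* LSD radix sort for n non-negative integers of at most d decimal
   digits: one stable counting-sort pass per digit, least significant
   digit first. O(d * n) overall. */
void radixSort(int a[], int n, int d) {
    int *out = malloc((size_t)n * sizeof(int));
    if (out == NULL) return;                 /* allocation failed */
    int div = 1;
    for (int pass = 0; pass < d; ++pass, div *= 10) {
        int count[10] = {0};
        for (int i = 0; i < n; ++i)          /* histogram of digit values */
            ++count[(a[i] / div) % 10];
        for (int k = 1; k < 10; ++k)         /* prefix sums = end positions */
            count[k] += count[k - 1];
        for (int i = n - 1; i >= 0; --i)     /* back-to-front placement */
            out[--count[(a[i] / div) % 10]] = a[i];  /* keeps it stable */
        memcpy(a, out, (size_t)n * sizeof(int));
    }
    free(out);
}
```

The back-to-front placement loop is what makes each pass stable: items with the same digit come out in the same relative order they went in.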
Comparison of Sorting Efficiency
Algorithm        Worst Case   Average Case
Selection Sort   n²           n²
Bubble Sort      n²           n²
Insertion Sort   n²           n²
Mergesort        n log n      n log n
Quicksort        n²           n log n
Radix Sort       n            n
Heapsort         n log n      n log n