This blog post is a part of the "Data Structures and Algorithms" series. In this post, we'll be learning about arrays, time complexity and basic sorting algorithms.

INTRODUCTION TO ARRAYS

An array is a collection of items stored at contiguous memory locations, and its elements can be accessed randomly using indices. Arrays are used to store similar types of elements, i.e. the data type must be the same for all elements. They can be used to store collections of primitive data types such as int, float, double, char, etc.

dsa1.png
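
For example, here is a minimal C++ sketch (with arbitrary values) that declares an integer array and accesses its elements by index:

#include <iostream>

int main() {
	// An array of 4 integers stored at contiguous memory locations
	int arr[4] = {3, 2, 4, 1};

	// Elements are accessed randomly using 0-based indices
	std::cout << arr[0] << std::endl;	// prints 3
	std::cout << arr[2] << std::endl;	// prints 4

	return 0;
}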


BRIEF DESCRIPTION OF TIME COMPLEXITY

The efficiency of algorithms is important in competitive programming. Usually, it is easy to design an algorithm that solves the problem slowly, but the real challenge is to invent a fast algorithm. If the algorithm is too slow, it will get only partial points or no points at all.

The time complexity of an algorithm estimates how much time the algorithm will use for some input. The idea is to represent the efficiency as a function whose parameter is the size of the input. By calculating the time complexity, we can find out whether the algorithm is fast enough without implementing it.

It is represented as O(…), where "…" represents some function of n (n being the size of the input, e.g. the size of an array/string).

  • COMPLEXITY CLASSES:
Big O or O(…): It describes the worst-case scenario. It gives an upper bound on the running time of an algorithm.
Big Omega or Ω(…): It describes the best-case scenario. It gives a lower bound on the running time of an algorithm.
Big Theta or Ɵ(…): It describes a tight bound: the running time is bounded both above and below by the same function, so it is the most precise description of an algorithm's time complexity.
O(n): A linear algorithm goes through the input a constant number of times. This is often the best possible time complexity, because it is usually necessary to access each input element at least once before reporting the answer.

LOOPS:

The complexity of the following code is O(n):

for (int i = 1; i <= n; i++){
	// code
}
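
By comparison, a nested loop runs its inner statements about n times for each of the n iterations of the outer loop, so the following sketch is O(n^2), which is the complexity class of the sorting algorithms discussed below:

for (int i = 1; i <= n; i++){
	for (int j = 1; j <= n; j++){
		// code
	}
}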


Sorting


Bubble Sort #1

It is one of the simplest sorting algorithms: it compares adjacent elements and swaps their positions if they are not in the intended order.


Working


-First cycle-

Starting from the beginning of the array, compare the first and the second elements. If the first element is greater than the second, swap them. Now compare the second and the third elements. This process is repeated till the end of the array, and at the end of this cycle the largest element is placed at the end of the array.

-Second cycle-

Again starting from the beginning, elements are compared, but this time only till the second-last element, i.e. till the (N-1)th element. These cycles are repeated till our array is sorted.

Example :-

First Cycle:-

dsa6.png

The first and the second elements are compared. As 3 > 2, they are swapped.

dsa3.png

In the next comparison 3 is not greater than 4, so no swapping is done.

dsa4.png

Here swapping is done as 4 > 1, and the first cycle is completed. At the end of this cycle, '4' is placed at its appropriate position, so in the next cycles there is no need to compare with the last element.

dsa12.png

Second Cycle:-

Similar to the first cycle, comparisons are done.

dsa10.png

dsa17.png


Algorithm


bubbleSort(array)
	for cycle = 0 to indexOfLastUnsortedElement - 1
		for i = 0 to indexOfLastUnsortedElement - cycle - 1
			if array[i] > array[i + 1]
				swap array[i] and array[i + 1]
end bubbleSort


Complexity


dsa21.png

Total number of comparisons => (n - 1) + (n - 2) + (n - 3) +.....+ 1 = n(n - 1) / 2
Big-Oh => O(n^2)
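
For example, for an array of n = 4 elements, the cycles perform 3 + 2 + 1 = 6 comparisons, which matches n(n - 1) / 2 = 4 * 3 / 2 = 6.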


Code
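
Here is a minimal C++ sketch of the bubble sort described above; the function signature and the driver code are just for illustration, using the example array {3, 2, 4, 1} from the figures:

#include <iostream>
#include <utility>

// Bubble sort: repeatedly compare adjacent elements and swap them if they
// are out of order; each cycle places the largest remaining element at the
// end of the unsorted part.
void bubbleSort(int arr[], int n) {
	for (int cycle = 0; cycle < n - 1; cycle++) {
		for (int i = 0; i < n - cycle - 1; i++) {
			if (arr[i] > arr[i + 1]) {
				std::swap(arr[i], arr[i + 1]);
			}
		}
	}
}

int main() {
	int arr[] = {3, 2, 4, 1};		// the array used in the example above
	int n = sizeof(arr) / sizeof(arr[0]);

	bubbleSort(arr, n);

	for (int i = 0; i < n; i++)
		std::cout << arr[i] << " ";	// prints: 1 2 3 4
	std::cout << std::endl;
	return 0;
}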



Selection Sort #2

It is one of the simplest sorting algorithms. It divides the input data into two parts: a sorted part at the left end and an unsorted part at the right end. The smallest element is selected from the unsorted part and swapped with the leftmost element of the unsorted part.


Working


-First cycle-

Starting from the base of the array, iterate over the unsorted array keeping track of the minimum value. Swap the minimum element and the first element in the unsorted array.

-Second cycle-

Again starting from the first element of the unsorted part, the minimum element is chosen and swapped with the leftmost element of the unsorted part, but this time only N-1 elements are checked while choosing the minimum.

These cycles are done till our array is sorted.

Example :-

dsa9.png

First Cycle:-

The minimum element is chosen from 3, 2, 4, 1; therefore, in the first cycle 1 is chosen and swapped with 3.

dsa2.png

Second Cycle:-

The next minimum element is 2, but it is already at its intended place. In the next cycle, 3 is chosen as the minimum element and exchanged with 4.

dsa14.png

So, in the next cycles there is no need to do any comparison, as the array is already sorted.


Algorithm


selectionSort(array)
	for cycle = 0 to indexOfLastUnsortedElement - 1
		minIndex = cycle
		for i = cycle + 1 to indexOfLastUnsortedElement
			if array[i] < array[minIndex]
				minIndex = i
		swap array[cycle] and array[minIndex]
end selectionSort


Complexity


dsa21.png

Total number of comparisons => (n - 1) + (n - 2) + (n - 3) +.....+ 1 = n(n - 1) / 2
Big-Oh => O(n^2)


Code
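
Here is a minimal C++ sketch of the selection sort described above; again, the driver code and the example array {3, 2, 4, 1} are just for illustration:

#include <iostream>
#include <utility>

// Selection sort: in each cycle, find the minimum of the unsorted part and
// swap it with the leftmost element of the unsorted part.
void selectionSort(int arr[], int n) {
	for (int cycle = 0; cycle < n - 1; cycle++) {
		int minIndex = cycle;
		for (int i = cycle + 1; i < n; i++) {
			if (arr[i] < arr[minIndex]) {
				minIndex = i;
			}
		}
		std::swap(arr[cycle], arr[minIndex]);
	}
}

int main() {
	int arr[] = {3, 2, 4, 1};		// the array used in the example above
	int n = sizeof(arr) / sizeof(arr[0]);

	selectionSort(arr, n);

	for (int i = 0; i < n; i++)
		std::cout << arr[i] << " ";	// prints: 1 2 3 4
	std::cout << std::endl;
	return 0;
}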



Insertion Sort #3

In this sorting technique, the array is virtually divided into a sorted part and an unsorted part. Elements from the unsorted part are picked and inserted at the appropriate position in the sorted part.


Working


-First cycle-

Assume the first element in the array is sorted. Take the second element and store it as the 'Key'. Compare the first element with the Key, and if the first element > Key, copy the first element to the second element's place.
Now place the value of the Key in the place of the first element. With this, the first and second elements are sorted.

-Second cycle-

Now take the third element as the Key and compare it with the second element. If the second element > Key, copy the value of the second element to the third position.
If the copying was done, compare the Key with the first element. If the first element > Key, repeat the above; else place the value of the Key at the second position.
With this cycle, the first 3 elements are sorted.

Similar cycles are repeated until the array is sorted.

Example

dsa6.png

First Cycle:-

6 (the second element) is taken as the Key, and it is compared with the elements to its left. For now there is only 8 on the left. As 8 > 6, 8 is copied to the second place, and then the Key is placed at the first position.

dsa7.png

Second Cycle:-

3 (the 3rd element) is taken as the Key. It is then compared with the elements on the left one by one until its appropriate location is found. In this example 3 < 6 < 8, so 3 is placed at the first position.

dsa16.png

Third Cycle:-

5 (the 4th element) is taken as the Key. Here, when 5 is compared with 3, 3 comes out to be less than 5, and 5 (the Key) is placed at the second position.

dsa15.png

Fourth Cycle:-

Here the 5th element is taken as the Key. The Key is compared with all the elements on its left and placed at its appropriate position.

dsa5.png


Algorithm


insertionSort(array)
	mark first element as sorted
	for each unsorted element X
		'extract' the element X
		for j <- lastSortedIndex down to 0
			if element at index j > X
				move element at index j to the right by 1
			else
				break loop
		insert X into the vacated position
end insertionSort

Complexity

dsa22.png

Total number of comparisons and shifts (worst case) => 2 * (1 + 2 + 3 + ..... + (n - 2) + (n - 1)) = n(n - 1)
Big-Oh => O(n^2)

Code
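
Here is a minimal C++ sketch of the insertion sort described above; the array values are illustrative (the first four match the example figures, the fifth is arbitrary):

#include <iostream>

// Insertion sort: take each element as the 'Key', shift the larger sorted
// elements one position to the right, and insert the Key at its place.
void insertionSort(int arr[], int n) {
	for (int i = 1; i < n; i++) {
		int key = arr[i];
		int j = i - 1;
		// Move sorted elements greater than key one step to the right
		while (j >= 0 && arr[j] > key) {
			arr[j + 1] = arr[j];
			j--;
		}
		arr[j + 1] = key;		// place the key at its position
	}
}

int main() {
	int arr[] = {8, 6, 3, 5, 1};		// first four values from the example
	int n = sizeof(arr) / sizeof(arr[0]);

	insertionSort(arr, n);

	for (int i = 0; i < n; i++)
		std::cout << arr[i] << " ";	// prints: 1 3 5 6 8
	std::cout << std::endl;
	return 0;
}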


Additional Resources: