In programming, an algorithm is a set of instructions or a procedure for solving a specific problem or achieving a specific task. Algorithms can be expressed in any programming language and can be as simple as a sequence of basic operations or as complex as a multi-step process involving different data structures and logic. The main goal of an algorithm is to take input, process it, and produce the expected output. Algorithms can be classified by their time and space complexity, by the technique used to solve the problem, and by the type of problem they solve. Examples of algorithms include sorting, searching, graph traversals, string manipulations, mathematical operations, and many more.
Algorithms we will be talking about:
Sorting algorithms: Sorting is a fundamental operation in computer science and there are several efficient algorithms for it, such as quicksort, merge sort and heap sort.
Search algorithms: Searching for an element in a large dataset is a common task and there are several efficient algorithms for it, such as binary search and hash tables.
Graph algorithms: Graph algorithms are used to solve problems related to graphs, such as finding the shortest path between two nodes or determining if a graph is connected.
Dynamic programming: Dynamic programming is a technique for solving problems by breaking them down into smaller subproblems and storing the solutions to these subproblems to avoid redundant computation.
Greedy algorithms: Greedy algorithms are used to solve optimization problems by making the locally optimal choice at each step with the hope of finding a global optimum.
Divide and Conquer: Divide and Conquer is an algorithm design paradigm based on multi-branched recursion. A divide and conquer algorithm breaks down a problem into sub-problems of the same or related type, until these become simple enough to be solved directly.
Backtracking: Backtracking is a general algorithmic technique that considers searching every possible combination in a systematic manner, and abandons a particular path as soon as it determines that it cannot be part of the solution.
Randomized algorithms: Randomized algorithms use randomness as part of their logic. They can be useful for problems that are hard to solve deterministically, or for improving the average-case complexity of a problem.
These algorithms are widely used in many applications, and it’s important for a programmer to have a strong understanding of them, so I will try my best to explain them.
- Sorting algorithms:
- Quicksort: Quicksort is a divide-and-conquer algorithm that chooses a “pivot” element from the array and partitions the other elements into two sub-arrays, according to whether they are less than or greater than the pivot. The sub-arrays are then sorted recursively.
```python
def quicksort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quicksort(left) + middle + quicksort(right)

print(quicksort([3, 6, 8, 10, 1, 2, 1]))
```
- Merge Sort: The merge sort algorithm is a divide-and-conquer algorithm that divides an array in two, sorts the two halves, and then merges them back together.

```python
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    result = []
    i = 0
    j = 0
    while i < len(left) and j < len(right):
        if left[i] < right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # Append whatever remains in either half
    result += left[i:]
    result += right[j:]
    return result

print(merge_sort([3, 6, 8, 10, 1, 2, 1]))
```
- Heap Sort: The heap sort algorithm is a comparison-based sorting algorithm that builds a heap from the input elements and then repeatedly extracts the maximum element from the heap and places it at the end of the sorted output array.
```python
def heap_sort(arr):
    n = len(arr)
    # Build a max-heap from the bottom up
    for i in range(n // 2 - 1, -1, -1):
        heapify(arr, n, i)
    # Repeatedly move the current maximum to the end of the array
    for i in range(n - 1, 0, -1):
        arr[i], arr[0] = arr[0], arr[i]
        heapify(arr, i, 0)
    return arr  # return the sorted array so the print below shows it

def heapify(arr, n, i):
    largest = i
    l = 2 * i + 1
    r = 2 * i + 2
    if l < n and arr[i] < arr[l]:
        largest = l
    if r < n and arr[largest] < arr[r]:
        largest = r
    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]
        heapify(arr, n, largest)

print(heap_sort([3, 6, 8, 10, 1, 2, 1]))
```
- Search algorithms:
- Binary Search: Binary search is an efficient algorithm for finding an item from a sorted list of items. It works by repeatedly dividing in half the portion of the array being searched, until the target value is found.
```python
def binary_search(arr, x):
    low = 0
    high = len(arr) - 1
    while low <= high:
        mid = (high + low) // 2
        if arr[mid] < x:
            low = mid + 1
        elif arr[mid] > x:
            high = mid - 1
        else:
            return mid
    return -1

print(binary_search([1, 2, 3, 4, 5, 6, 7], 4))
```
- Hash Tables: A hash table is a data structure that maps keys to values, using a hash function to compute an index into an array of buckets or slots, from which the desired value can be found. This version resolves collisions with linear probing.

```python
class HashTable:
    def __init__(self):
        self.size = 10
        self.keys = [None] * self.size
        self.values = [None] * self.size

    def put(self, key, data):
        index = self.hash_function(key)
        # Linear probing: walk forward until we find the key or an empty slot
        while self.keys[index] is not None:
            if self.keys[index] == key:
                self.values[index] = data  # update existing key
                return
            index = (index + 1) % self.size
        self.keys[index] = key
        self.values[index] = data

    def get(self, key):
        index = self.hash_function(key)
        while self.keys[index] is not None:
            if self.keys[index] == key:
                return self.values[index]
            index = (index + 1) % self.size
        return None

    def hash_function(self, key):
        # Sum the character codes of the key, modulo the table size
        total = 0
        for char in key:
            total += ord(char)
        return total % self.size

t = HashTable()
t.put("apple", 10)
t.put("orange", 20)
t.put("banana", 30)
print(t.get("orange"))
```
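In everyday Python, the built-in dict is already a production-grade hash table, so a hand-rolled class like the one above is mainly pedagogical. The same example with a plain dict (variable names here are illustrative):

```python
# Python's built-in dict is itself a hash table
prices = {"apple": 10, "orange": 20, "banana": 30}
print(prices.get("orange"))  # 20
print(prices.get("mango"))   # None -- .get returns None for missing keys
```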
- Graph algorithms:
- Dijkstra’s shortest path algorithm: Dijkstra’s algorithm finds the shortest path from a start node to the other nodes in a weighted graph. The version below returns the shortest distance from the start to every reachable node.

```python
import heapq

def dijkstra(graph, start):
    heap = [(0, start)]
    distances = {}
    while heap:
        cost, v = heapq.heappop(heap)
        if v not in distances:
            # The first time a node is popped, cost is its shortest distance
            distances[v] = cost
            for u, c in graph[v].items():
                if u not in distances:
                    heapq.heappush(heap, (cost + c, u))
    return distances

graph = {
    'A': {'B': 2, 'C': 3},
    'B': {'D': 4, 'E': 5},
    'C': {'F': 6},
    'D': {'G': 7},
    'E': {'G': 8, 'H': 9},
    'F': {'H': 10},
    'G': {},
    'H': {}
}
print(dijkstra(graph, 'A'))
```
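The overview above also mentioned determining whether a graph is connected. One way to check this is a breadth-first search: an undirected graph is connected exactly when a BFS from any node visits every node. A minimal sketch (the function name and sample graph are my own):

```python
from collections import deque

def is_connected(graph):
    """Return True if every node is reachable from an arbitrary start node.
    Assumes `graph` is an adjacency dict of an undirected graph."""
    if not graph:
        return True
    start = next(iter(graph))
    visited = {start}
    queue = deque([start])
    while queue:
        v = queue.popleft()
        for u in graph[v]:
            if u not in visited:
                visited.add(u)
                queue.append(u)
    return len(visited) == len(graph)

undirected = {
    'A': ['B', 'C'],
    'B': ['A', 'D'],
    'C': ['A'],
    'D': ['B'],
}
print(is_connected(undirected))  # True
```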
- Dynamic Programming:
- Fibonacci sequence: A classic example of a problem that can be solved using dynamic programming is the Fibonacci sequence. The naive recursion recomputes the same subproblems exponentially many times; storing (memoizing) each result makes it linear.

```python
def fibonacci(n, memo={}):
    # memo caches results across calls, so each value is computed once
    if n <= 0:
        return 0
    elif n == 1:
        return 1
    if n not in memo:
        memo[n] = fibonacci(n - 1, memo) + fibonacci(n - 2, memo)
    return memo[n]

print(fibonacci(10))
```
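The same subproblem-reuse idea can also be written bottom-up (tabulation), building each Fibonacci number from the two before it. A sketch, with an illustrative name to avoid clashing with the recursive version:

```python
def fibonacci_tab(n):
    # Build solutions from the smallest subproblems upward,
    # keeping only the last two values
    if n <= 0:
        return 0
    prev, curr = 0, 1
    for _ in range(n - 1):
        prev, curr = curr, prev + curr
    return curr

print(fibonacci_tab(10))  # 55
```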
- Greedy Algorithms:
- Huffman coding: Huffman coding is a lossless data compression algorithm that uses a greedy strategy to construct a prefix code for a given set of symbols: it repeatedly merges the two least-frequent nodes, so rarer symbols end up with longer codes.

```python
import heapq
from collections import Counter

def huffman_encoding(data):
    """Generates a Huffman encoded string of the input data."""
    # Count the frequency of each character
    freq_counter = Counter(data)
    # Each heap entry: [frequency, [char, code], [char, code], ...]
    heap = [[freq, [char, ""]] for char, freq in freq_counter.items()]
    heapq.heapify(heap)
    # Greedily combine the two least-frequent nodes until one tree remains
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]  # left branch
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]  # right branch
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    # Collect the finished code table and encode the input
    huffman_code = dict(heap[0][1:])
    encoded_data = "".join(huffman_code[char] for char in data)
    return encoded_data, huffman_code

print(huffman_encoding("aaaaabbbcccc"))
```
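Decoding reverses the process: because a Huffman code is prefix-free, reading bits until they match a code always recovers exactly one character. A minimal sketch, with a hypothetical hand-picked code table for illustration:

```python
def huffman_decoding(encoded_data, huffman_code):
    # Invert the code table: bit string -> character
    reverse_code = {code: char for char, code in huffman_code.items()}
    decoded = []
    buffer = ""
    for bit in encoded_data:
        buffer += bit
        if buffer in reverse_code:  # prefix codes guarantee a unique match
            decoded.append(reverse_code[buffer])
            buffer = ""
    return "".join(decoded)

# Example with a hypothetical prefix code table
code = {"a": "0", "b": "10", "c": "11"}
print(huffman_decoding("010110", code))  # abca
```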
- Divide and Conquer:
- Merge Sort: already covered above under sorting algorithms.
- Backtracking:
The N-Queens Problem: The N-Queens problem is a classic problem that can be solved using backtracking. The goal is to place N queens on an NxN chessboard in such a way that no queen can attack any other queen.
```python
def solveNQueens(n):
    def could_place(row, col):
        # Check that no queen in an earlier row attacks board[row][col],
        # either in the same column or on a diagonal
        for i in range(row):
            if board[i] == col or abs(board[i] - col) == abs(i - row):
                return False
        return True

    def backtrack(row=0, count=0):
        for col in range(n):
            if could_place(row, col):
                board[row] = col
                if row + 1 == n:
                    count += 1
                else:
                    count = backtrack(row + 1, count)
        return count

    board = [-1 for _ in range(n)]
    return backtrack()

print(solveNQueens(4))
```
This algorithm starts placing queens in the first row, and for every candidate square it checks whether the square is attacked by any previously placed queen. If it is safe, the algorithm proceeds to the next row and repeats the process. If no safe column exists in a row, the algorithm backtracks to the previous row and tries a different position. This continues until it has counted every way to place all the queens on the board without any attacking each other.
- Randomized Algorithms:
- Randomized QuickSort: A variation of the quicksort algorithm in which the pivot is chosen at random, which makes the worst-case behavior unlikely for any particular input.

```python
import random

def randomized_quicksort(arr):
    if len(arr) <= 1:
        return arr
    pivot = random.choice(arr)  # random pivot instead of a fixed position
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return randomized_quicksort(left) + middle + randomized_quicksort(right)

print(randomized_quicksort([3, 6, 8, 10, 1, 2, 1]))
```
These are some of the most commonly used algorithms that every programmer should be familiar with. Understanding these algorithms and their implementations can help a programmer make better decisions when designing and implementing efficient solutions.