My assignments, labs, and code snippets from INFO135 (Advanced Programming) at UiB (University of Bergen)

simsam8/info135

Recap notes

Data Structures

  • Linear Data Structures
    • Array
    • Linked List
    • Stack
    • Queue
  • Non-Linear Data Structures
    • Graph
    • Tree

Search

| Search        | Best case | Average case  | Worst case    |
|---------------|-----------|---------------|---------------|
| Linear search | $O(1)$    | $O(n)$        | $O(n)$        |
| Binary search | $O(1)$    | $O(\log n)$   | $O(\log n)$   |
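
Binary search reaches its $O(\log n)$ bound by halving the search range on each comparison; a minimal sketch on a sorted list:

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid        # best case O(1): the middle element matches immediately
        elif items[mid] < target:
            low = mid + 1     # discard the lower half
        else:
            high = mid - 1    # discard the upper half
    return -1                 # worst case: O(log n) halvings before the range is empty
```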

Sorting algorithms

Explanation and code examples can be found here

Hashing

Hashing is the process of mapping a search key to a limited range of available array indices (positions), referred to as slots (buckets), with the goal of providing direct access to the keys.

Hash Function: a function that maps a key to the slot where that item belongs in the hash table. Putting a hash function and an array together, we can build a data structure called a Hash Table.

Hash Table: a data collection that stores pairs of keys and values, designed to enable quick retrieval of values by their associated keys.

Slot (Bucket): a unique position in a hash table, capable of holding one or more items, which are mapped to it via their keys.

Types of hash functions

  • Division: key % table_size
  • Multiply-Add-and-Divide(MAD): ((a*key + b) % p) % m
  • Truncation: ignore part of the key (e.g. some of its digits or characters) and use the rest to compute the hash value.
  • Folding: split a key into multiple parts and combine them into a single integer by adding them.
  • Hashing Strings: summing the ASCII values of the individual characters.
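
The division, MAD, and string-summing methods can be sketched as follows (the constants `a`, `b`, and the prime `p` in the MAD function are arbitrary choices for illustration):

```python
def hash_division(key, table_size):
    # Division method: reduce the key modulo the table size.
    return key % table_size

def hash_mad(key, a=31, b=7, p=109345121, m=64):
    # Multiply-Add-and-Divide: ((a*key + b) % p) % m,
    # where p is a prime larger than the key range and m is the table size.
    return ((a * key + b) % p) % m

def hash_string(s, table_size):
    # Sum the ordinal (ASCII) values of the characters, then reduce by division.
    return sum(ord(ch) for ch in s) % table_size
```

Note that summing ordinals makes anagrams ("cat", "act") collide, which is why real string hashes usually also weight each character by its position.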

Problem solving

Greedy algorithms

  1. Optimizes by making the best choice available at each step.
  2. Does not always find the optimal solution, but it is fast.
  3. May require almost no memory.
  4. Example: Dijkstra's Algorithm
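
A minimal illustration of the greedy idea (coin change rather than Dijkstra's, to keep it short; the coin denominations are assumptions for the example):

```python
def greedy_change(amount, coins=(25, 10, 5, 1)):
    """Make change by always taking the largest coin that still fits.
    This is optimal for these denominations, but greedy can fail for
    others: with coins (4, 3, 1) and amount 6 it returns 3 coins
    even though 3 + 3 uses only 2."""
    result = []
    for coin in coins:           # coins are assumed sorted, largest first
        while amount >= coin:
            amount -= coin
            result.append(coin)
    return result
```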

Divide and Conquer

  1. Optimizes by breaking the problem into subproblems, typically solving them recursively and combining the results.
  2. Finds the optimal solution, but is slower than greedy algorithms.
  3. May require some memory.
  4. Example: Merge sort
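
The divide-and-conquer pattern can be sketched with a minimal merge sort (the linked sorting notes may present a different version):

```python
def merge_sort(items):
    """Divide: split in half; conquer: sort each half recursively; combine: merge."""
    if len(items) <= 1:
        return items                      # a list of 0 or 1 elements is already sorted
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Combine: merge the two sorted halves into one sorted list.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```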

Dynamic Programming

  1. Optimizes by caching the answer to each subproblem so as not to repeat the calculation.
  2. Finds the optimal solution, but may be unnecessary on small inputs.
  3. May require a lot of memory.
  4. Example: memoized Fibonacci.
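
Memoized Fibonacci as a minimal sketch, using `functools.lru_cache` to store each subproblem's answer:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """Memoized Fibonacci: each fib(k) is computed once and cached,
    so the call runs in O(n) instead of the naive exponential recursion."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)
```

The cache is the "may require a lot of memory" trade-off: every computed subproblem stays stored for the lifetime of the function.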

Class relationships

Class relationship examples can be found here

Big O

Common big-O functions listed from smallest to largest order of magnitude

| $f(\cdot)$   | Common name  |
|--------------|--------------|
| $1$          | constant     |
| $\log(n)$    | logarithmic  |
| $n$          | linear       |
| $n \log(n)$  | log linear   |
| $n^2$        | quadratic    |
| $n^3$        | cubic        |
| $a^n$        | exponential  |

Concurrency

I/O-Bound Process

  • The program spends most of its time waiting on slow devices, like a network connection, a hard drive, or a printer.
  • Speeding it up involves overlapping the time spent waiting for these devices.
  • Multithreading recommended.

CPU-Bound Process

  • The program spends most of its time doing CPU operations.
  • Speeding it up involves finding ways to do more computation in the same amount of time.
  • Multiprocessing recommended.
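
A minimal sketch of the I/O-bound case with Python's `threading` module (`time.sleep` stands in for a slow device; for CPU-bound work one would use `multiprocessing.Pool` instead, since Python threads share one interpreter for pure computation):

```python
import threading
import time

def slow_io_task(results, index):
    # Stands in for a slow device: a network call, disk read, etc.
    time.sleep(0.1)
    results[index] = index * 2

def run_threaded(n=4):
    """Run n I/O-bound tasks in parallel threads so their waiting overlaps:
    total time is roughly one sleep, not n sleeps."""
    results = [None] * n
    threads = [threading.Thread(target=slow_io_task, args=(results, i))
               for i in range(n)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()   # wait for all tasks to finish
    return results
```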
