Tuesday, July 12, 2016

Big O




We only care about the most significant term of the complexity.
n^2 + 2n ≈ n^2    // for n = 1,000,000, the 2n term contributes only about 0.0002%
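To check that 0.0002% figure concretely, a short sketch (the class and method names here are mine, not from the notes):

```java
public class DominantTerm {
    // Fraction of n^2 + 2n contributed by the lower-order 2n term.
    static double lowerOrderShare(long n) {
        double total = (double) n * n + 2.0 * n;
        return (2.0 * n) / total;
    }

    public static void main(String[] args) {
        // For n = 1,000,000 the 2n term is about 0.0002% of the total,
        // which is why Big O keeps only the dominant n^2 term.
        System.out.printf("2n is %.6f%% of n^2 + 2n%n",
                lowerOrderShare(1_000_000L) * 100);
    }
}
```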


·       O(1) - Constant complexity - Time to complete is the same regardless of the size of the input set. An example is accessing an array element by index.
  • 1 item: 1 second
  • 10 items: 1 second
  • 100 items: 1 second
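A minimal sketch of the array-indexing example (names are mine): indexing computes the element's address directly from the index, so the cost does not grow with the array's length.

```java
public class ConstantTime {
    // Array indexing is O(1): the JVM computes the element's location
    // from the base address and the index, regardless of array length.
    static int elementAt(int[] items, int index) {
        return items[index];
    }

    public static void main(String[] args) {
        int[] small = {10, 20, 30};
        int[] large = new int[1_000_000];
        large[999_999] = 42;
        // Both lookups cost the same constant time despite the size gap.
        System.out.println(elementAt(small, 1));       // 20
        System.out.println(elementAt(large, 999_999)); // 42
    }
}
```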
·       O(Log N) - Logarithmic complexity (divide and conquer) - Time to complete increases roughly in line with log2(n). For example, 1024 items take roughly twice as long as 32 items, because log2(1024) = 10 and log2(32) = 5. An example is finding an item in a binary search tree (BST). (Imagine a classroom of 100 students in which you gave your pen to one person: instead of asking each student in turn, you ask which half of the room has the pen and repeat, finding it in about log2(100) ≈ 7 questions instead of 100.)
  • 1 item: 1 second
  • 10 items: 2 seconds
  • 100 items: 3 seconds
  • 1000 items: 4 seconds
  • 10000 items: 5 seconds
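The same halving idea can be sketched with binary search over a sorted array (used here instead of a full BST for brevity; the class and method names are mine). Each step discards half of the remaining range, so the number of steps is O(log n).

```java
public class LogarithmicTime {
    // Binary search: halve the candidate range each step -> O(log n).
    static int indexOf(int[] sorted, int target) {
        int lo = 0, hi = sorted.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;
            if (sorted[mid] == target) return mid;
            if (sorted[mid] < target) lo = mid + 1;  // discard left half
            else hi = mid - 1;                       // discard right half
        }
        return -1; // not found
    }

    public static void main(String[] args) {
        int[] sorted = {1, 3, 5, 7, 9, 11};
        System.out.println(indexOf(sorted, 7));  // 3
        System.out.println(indexOf(sorted, 4));  // -1
    }
}
```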
·       O(N) - Linear complexity - Time to complete scales linearly with the size of the input set. In other words, if you double the number of items in the input set, the algorithm takes roughly twice as long. An example is counting the number of items in a linked list.
  • 1 item: 1 second
  • 10 items: 10 seconds
  • 100 items: 100 seconds
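A minimal sketch of the linked-list example (the `Node` class here is mine): counting requires visiting every node exactly once, so the work grows in direct proportion to the list's length.

```java
public class LinearTime {
    static class Node {
        int value;
        Node next;
        Node(int value, Node next) { this.value = value; this.next = next; }
    }

    // Every node must be visited once -> O(n).
    static int count(Node head) {
        int n = 0;
        for (Node cur = head; cur != null; cur = cur.next) n++;
        return n;
    }

    public static void main(String[] args) {
        Node list = new Node(1, new Node(2, new Node(3, null)));
        System.out.println(count(list)); // 3
    }
}
```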
Sorting ----
·       O(N Log N) - Time to complete increases in proportion to N times log2(N). Examples of this are heap sort and quicksort (on average).
·       O(N^2) - Quadratic complexity - Time to complete is roughly proportional to the square of the number of items. An example of this is bubble sort.
  • 1 item: 1 second
  • 10 items: 100 seconds
  • 100 items: 10000 seconds
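A sketch of the bubble sort example (my own minimal version): it makes up to n passes, each doing up to n adjacent comparisons, which is where the roughly n * n = n^2 cost comes from.

```java
import java.util.Arrays;

public class QuadraticTime {
    // Bubble sort: up to n passes of up to n adjacent comparisons -> O(n^2).
    static void bubbleSort(int[] a) {
        for (int pass = 0; pass < a.length - 1; pass++) {
            // After each pass the largest remaining value has "bubbled"
            // to the end, so the inner bound shrinks by one.
            for (int i = 0; i < a.length - 1 - pass; i++) {
                if (a[i] > a[i + 1]) {
                    int tmp = a[i];
                    a[i] = a[i + 1];
                    a[i + 1] = tmp;
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] data = {5, 1, 4, 2, 8};
        bubbleSort(data);
        System.out.println(Arrays.toString(data)); // [1, 2, 4, 5, 8]
    }
}
```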
·       O(N!) - Factorial complexity - Time to complete grows with the factorial of the input set size. An example of this is the brute-force solution to the traveling salesman problem.
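A sketch of that brute force (the distance matrix and names are mine, for illustration): from a fixed starting city it recursively tries every ordering of the remaining cities, which is (n-1)! tours, hence the factorial growth.

```java
public class FactorialTime {
    // Brute-force TSP: try every permutation of the unvisited cities.
    // With n cities there are (n-1)! tours from a fixed start -> O(n!).
    static int shortestTour(int[][] dist, boolean[] visited, int current, int count) {
        if (count == dist.length) return dist[current][0]; // close the tour
        int best = Integer.MAX_VALUE;
        for (int next = 0; next < dist.length; next++) {
            if (!visited[next]) {
                visited[next] = true;
                int rest = shortestTour(dist, visited, next, count + 1);
                if (rest != Integer.MAX_VALUE) {
                    best = Math.min(best, dist[current][next] + rest);
                }
                visited[next] = false; // backtrack
            }
        }
        return best;
    }

    public static void main(String[] args) {
        int[][] dist = {
            {0, 10, 15, 20},
            {10, 0, 35, 25},
            {15, 35, 0, 30},
            {20, 25, 30, 0},
        };
        boolean[] visited = new boolean[dist.length];
        visited[0] = true; // start at city 0
        System.out.println(shortestTour(dist, visited, 0, 1)); // 80
    }
}
```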



Big Oh for (n log n)


int n = 100;
for (int i = 0; i < n; i++)     // this loop executes n times, so O(n)
{
    for (int j = n; j > 0; j /= 2)  // this loop executes O(log n) times
    {
    }
}
Explanation: The outer for loop should be clear: it executes n times. Now to the inner loop. In the inner loop you take n and repeatedly divide it by 2, so ask yourself: how many times can n be divided by 2? It turns out that this is O(log n). (In fact the base of the log is 2, but in Big-O notation we drop the base, since changing it only multiplies the log by a constant factor we are not interested in.) So you are executing a loop n times, and within that loop you are executing another loop log(n) times, giving O(n) * O(log n) -> O(n log n).
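To see that count concretely, a sketch (mine) that wraps the nested loops above and tallies how often the inner body runs:

```java
public class NLogNCount {
    // Count inner-loop executions for the nested loops discussed above:
    // the outer loop runs n times, the inner loop floor(log2(n)) + 1 times.
    static long operations(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++) {
            for (int j = n; j > 0; j /= 2) ops++;
        }
        return ops;
    }

    public static void main(String[] args) {
        // For n = 100 the inner loop runs 7 times (j = 100, 50, 25, 12, 6, 3, 1),
        // so the total is 100 * 7 = 700 ~ n * log2(n).
        System.out.println(operations(100)); // 700
    }
}
```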

