Question: What is complexity?
Solution:
Complexity measures the efficiency of an algorithm: how much time (and memory) a piece of code needs as the size of its input grows. In other words, the running time and/or memory an algorithm requires, expressed as a function of the input size, is called the complexity of the algorithm.
We express complexity using big-O notation. For a problem of size N:
- a constant-time method is "order 1": O(1)
- a linear-time method is "order N": O(N)
- a quadratic-time method is "order N squared": O(N²)
Bubble sort complexity:
Bubble sort makes N-1 passes over the array, and the passes make about N/2 comparisons each on average (the first makes N-1, the last makes 1). Therefore the complexity is:
O((N-1) * (N/2)) = O(N²/2 - N/2) = O(N²)
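A sketch of bubble sort in Java showing where those counts come from (assuming an int array; an illustration, not a tuned implementation):

void bubbleSort(int[] a) {
    int n = a.length;
    // The outer loop makes N-1 passes.
    for (int i = 0; i < n - 1; i++) {
        // Pass i makes n-1-i comparisons, about N/2 on average.
        for (int j = 0; j < n - 1 - i; j++) {
            if (a[j] > a[j + 1]) {
                int tmp = a[j];       // swap out-of-order neighbors
                a[j] = a[j + 1];
                a[j + 1] = tmp;
            }
        }
    }
}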
Binary search trees (BST) complexity:
A binary search tree (BST) keeps its keys in sorted order: for every node, all keys in its left subtree are smaller than the node's key and all keys in its right subtree are larger. This is unlike a heap, which maintains only a partial order between parents and children. (Pre-order, in-order, and post-order are traversals of a tree; an in-order traversal of a BST visits the keys in sorted order.)
A binary tree is balanced if each node has (roughly) the same number of descendants in its left subtree as it has in its right subtree.
Important fact: For balanced binary trees, the height is proportional to the base-two logarithm of the number of nodes in the tree: h = O(lg(n)). For example, a balanced tree with about a million nodes has height around 20, since 2^20 ≈ 1,000,000.
BSTs have average-case O(log n) insertion, deletion, and search.
Binary heaps have average-case O(1) findMin/findMax and O(log n) insertion and deletion.
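A minimal sketch of why BST search is O(log n) on a balanced tree (the Node class and contains method here are just for illustration, not any particular library's API):

class Node {
    int key;
    Node left, right;
    Node(int key) { this.key = key; }
}

// Each comparison discards one whole subtree, so on a balanced
// tree the loop runs at most h = O(log n) times.
static boolean contains(Node root, int key) {
    Node cur = root;
    while (cur != null) {
        if (key == cur.key) return true;
        cur = (key < cur.key) ? cur.left : cur.right;
    }
    return false;
}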
Complexity for a simple for loop:
for (i = 0; i < N; i++) {
    sequence of statements
}
The loop executes N
times, so the sequence of statements also executes N times. Since we assume the
statements are O(1), the total time for the for loop is N * O(1), which is O(N)
overall.
Complexity for nested for loops:
First we'll consider
loops where the number of iterations of the inner loop is independent of the
value of the outer loop's index. For example:
for (i = 0; i < N; i++) {
    for (j = 0; j < M; j++) {
        sequence of statements
    }
}
The outer loop executes N times. Every
time the outer loop executes, the inner loop executes M times. As a result, the
statements in the inner loop execute a total of N * M times. Thus, the
complexity is O(N * M). In a common special case where the stopping condition
of the inner loop is j < N instead of j < M (i.e.,
the inner loop also executes N times), the total complexity for the two loops
is O(N²).
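One quick way to convince yourself of this is to count the iterations directly (a toy, self-contained example; the values of N and M are arbitrary):

class LoopCount {
    public static void main(String[] args) {
        int N = 4, M = 3;
        int count = 0;
        for (int i = 0; i < N; i++) {
            for (int j = 0; j < M; j++) {
                count++;   // stands in for the "sequence of statements"
            }
        }
        System.out.println(count);   // prints 12, i.e. N * M
    }
}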
Some methods may require
different amounts of time on different calls, even when the problem size is the
same for both calls. For example, consider the add method that
adds an item to the end of the list. In the worst case (the array is full),
that method requires time proportional to the number of items in the list
(because it has to copy all of them into the new, larger array). However, when
the array is not full, add will only have to copy one value
into the array, so in that case its time is independent of the length of the
list; i.e., constant time.
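A minimal sketch of such an add method over a plain array (the class, field, and method names are assumptions for this example, not any particular library's API):

class IntList {
    private int[] items = new int[10];
    private int size = 0;

    public void add(int value) {
        if (size == items.length) {
            // Worst case: the array is full, so copy all N items
            // into a new array twice as large -- O(N).
            int[] bigger = new int[items.length * 2];
            for (int i = 0; i < size; i++) {
                bigger[i] = items[i];
            }
            items = bigger;
        }
        items[size] = value;   // usual case: one store -- O(1)
        size++;
    }
}

Because doubling keeps the expensive copies rare, the average (amortized) cost per add stays constant even though the worst case is O(N).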
In general, we may want
to consider the best and average time
requirements of a method as well as its worst-case time requirements. Which is
considered the most important will depend on several factors. For example, if a
method is part of a time-critical system like one that controls an airplane,
the worst-case times are probably the most important (if the plane is flying
towards a mountain and the controlling program can't make the next course
correction until it has performed a computation, then the best-case and
average-case times for that computation are not relevant -- the computation
needs to be guaranteed to be fast enough to finish before the plane hits the
mountain).
On the other hand, if
occasionally waiting a long time for an answer is merely inconvenient (as
opposed to life-threatening), it may be better to use an algorithm with a slow
worst-case time and a fast average-case time, rather than one with so-so times
in both the average and worst cases.
Note that calculating
the average-case time for a method can be tricky. You need to consider all
possible values for the important factors, and whether they will be distributed
evenly.
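For instance, consider sequential search for a key that is known to be in an array, assuming each of the N positions is equally likely to hold it: the average number of comparisons is (1 + 2 + ... + N) / N = (N + 1) / 2, which is O(N). If the key may be absent, or some positions are more likely than others, the average changes, and that is exactly what makes average-case analysis tricky.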