| CPSC 320: Analysis of Algorithms | Spring 2026 |
(a) If $f(n)$ is $O(g(n))$, then $g(n)$ is $O(f(n))$.
(b) If $f(n)$ is $\Theta(g(n))$, then $g(n)$ is $\Theta(f(n))$.
(b) We say $f(n)$ is $o(g(n))$ if, for every constant $c > 0$, there exists a constant $n_0 \ge 0$ such that $f(n) < cg(n)$ for all $n \ge n_0$. Equivalently, $f(n)$ is $o(g(n))$ if $\lim_{n \to \infty} f(n)/g(n) = 0$.

In mathematical analysis, the $o$-notation is used when $f(n)$ becomes insignificant relative to $g(n)$ as $n$ approaches infinity. In algorithm analysis, if algorithms $A$ and $B$ run in $f(n)$ and $g(n)$ time, respectively, where $f(n)$ is $o(g(n))$, then we say $A$ is asymptotically faster than $B$. A simple example of a function that is $o(n)$ is $\log n$.

Give another example of a nonnegative, nonconstant function that is $o(n)$. Give an example of a function that is $O(n)$ but not $o(n)$. Justify your claims.
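(As a worked illustration of the limit characterization, applied to the $\log n$ example above: by L'Hôpital's rule,
\[
\lim_{n \to \infty} \frac{\log n}{n} = \lim_{n \to \infty} \frac{1/n}{1} = 0,
\]
which confirms that $\log n$ is $o(n)$.)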
for $i \leftarrow 1$ until $n$ do
begin
    $s \leftarrow 0$;
    for $j \leftarrow 1$ until $i$ do
        $s \leftarrow s + A[j]$;
    $B[i] \leftarrow s / i$
end;
return $B$

(a) What is the asymptotically tight running time of this algorithm? More precisely, find a function $f(n)$ such that, for an input of size $n$, the number of operations performed by the above algorithm is $\Theta(f(n))$. Prove your claim; in particular, show that the number of operations is $O(f(n))$ and $\Omega(f(n))$.

(b) Design another algorithm that is asymptotically faster than the above method to solve this problem. Prove its correctness and analyze its efficiency.
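(For concreteness, the pseudocode above transcribes directly into Python. This is only a sketch of the given $\Theta(n^2)$ method, not the faster algorithm asked for in part (b); the function name is ours, and Python lists are 0-based where the pseudocode is 1-based.)

```python
def prefix_averages_quadratic(A):
    """Compute B where B[i] is the average of A[0..i], mirroring the
    pseudocode above (shifted to 0-based indexing)."""
    n = len(A)
    B = [0.0] * n
    for i in range(n):          # outer loop: i = 1 .. n in the pseudocode
        s = 0.0                 # s <- 0
        for j in range(i + 1):  # inner loop: j = 1 .. i
            s += A[j]           # s <- s + A[j]
        B[i] = s / (i + 1)      # B[i] <- s / i
    return B
```

For example, `prefix_averages_quadratic([1, 2, 3, 4])` returns `[1.0, 1.5, 2.0, 2.5]`. The inner loop does $i$ additions on iteration $i$, which is the source of the quadratic operation count analyzed in part (a).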