Write your function out using the formal definitions and simplify until only axioms remain; then it is proven. For big Omega you have the formal definition: f(n) is in Omega(g(n)) if there exist constants c > 0 and n0 such that f(n) >= c*g(n) for all n >= n0. For small omega you have the formal definition: f(n) is in omega(g(n)) if for every constant c > 0 there exists an n0 such that f(n) > c*g(n) for all n >= n0. And for big Theta you have the formal definition: f(n) is in Theta(g(n)) if there exist constants c1, c2 > 0 and n0 such that c1*g(n) <= f(n) <= c2*g(n) for all n >= n0. More information here...
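The Theta definition can at least be sanity-checked numerically on a finite range before you attempt the proof. A minimal Python sketch; the functions f, g and the constants c1 = 3, c2 = 4, n0 = 5 are my own illustrative choices, not from the answer:

```python
# Numerically check the big-Theta definition c1*g(n) <= f(n) <= c2*g(n) for n >= n0.
# A finite range is only evidence, not a proof -- the proof still needs the definitions.

def is_theta_on_range(f, g, c1, c2, n0, n_max=10_000):
    """Return True if c1*g(n) <= f(n) <= c2*g(n) holds for all n0 <= n <= n_max."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max + 1))

# Example: f(n) = 3n^2 + 5n is Theta(n^2), witnessed by c1 = 3, c2 = 4, n0 = 5
# (the upper bound 3n^2 + 5n <= 4n^2 holds exactly when n >= 5).
f = lambda n: 3 * n**2 + 5 * n
g = lambda n: n**2
print(is_theta_on_range(f, g, 3, 4, 5))  # True
```

A failing pair of constants (e.g. c1 = 4) makes the check return False, which is often a hint that your guessed witnesses need adjusting before the induction.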

algorithm,big-o,computer-science,big-theta

Wouldn't it suffice to give a special input and show that the running time is at least f(n)? Yes, assuming you are talking about worst-case complexity: if you have already proved the algorithm runs in O(f(n)), and you then find...
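To make the "special input" idea concrete, here is a sketch using insertion sort (my own example, not from the answer): a reversed array forces exactly n(n-1)/2 comparisons, which witnesses the Omega(n^2) worst-case lower bound and, combined with the O(n^2) upper bound, gives Theta(n^2):

```python
def insertion_sort_comparisons(a):
    """Insertion sort that returns the number of element comparisons made."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # one comparison of key against a[j]
            if a[j] > key:
                a[j + 1] = a[j]       # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

# A reversed array is the "special input": every pair is out of order,
# so the count is exactly n(n-1)/2, matching the Omega(n^2) lower bound.
n = 100
assert insertion_sort_comparisons(range(n, 0, -1)) == n * (n - 1) // 2
```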

algorithm,time-complexity,complexity-theory,asymptotic-complexity,big-theta

The outermost loop runs ceil(log n) times. The middle loop depends on the value of i, so its behaviour is:

1st iteration of outermost loop - 1
2nd iteration of outermost loop - 2
...
ceil(log n)-th iteration of outermost loop - ceil(log n)

The innermost loop is independent of...
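The middle-loop total is then 1 + 2 + ... + ceil(log n) = Theta(log^2 n). A hedged Python sketch of the counting (the snippet does not show the actual loops, so the loop bodies below are assumptions that reproduce the pattern described):

```python
import math

def count_middle_iterations(n):
    """Count middle-loop iterations for the pattern described above:
    the outer loop makes ceil(log2 n) passes, and on its i-th pass the
    middle loop runs i times (assumed structure, for illustration)."""
    total = 0
    outer = math.ceil(math.log2(n))
    for i in range(1, outer + 1):   # outer loop: ceil(log n) passes
        for _ in range(i):          # middle loop: i passes on pass i
            total += 1
    return total

# Sum 1 + 2 + ... + L with L = ceil(log2 n) is L(L + 1)/2 = Theta(log^2 n).
n = 1024
L = math.ceil(math.log2(n))  # 10
assert count_middle_iterations(n) == L * (L + 1) // 2  # 55
```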

algorithm,time-complexity,big-theta

Well, I'm not sure, but I'll give it a shot. The factorial is, in essence, a gamma function: n! = Gamma(n + 1), and the gamma function is defined not only for natural numbers but also for real numbers. So there is, in theory, an inverse gamma function, which is defined for cases where the factorial is...
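For integer inputs you do not need the full inverse gamma function: a short sketch (my own illustration) inverts the factorial by simply walking up the factorials, which is enough when the goal is solving n! >= x for an integer n:

```python
def inverse_factorial(x):
    """Return the smallest n with n! >= x, by walking up the factorials.
    Runs in O(n) multiplications, which is fine for integer inputs."""
    n, fact = 0, 1
    while fact < x:
        n += 1
        fact *= n
    return n

# 5! = 120, so the smallest n with n! >= 120 is 5, but for 121 it is 6.
assert inverse_factorial(120) == 5
assert inverse_factorial(121) == 6
```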

algorithm,big-o,complexity-theory,big-theta

The Master theorem doesn't even apply, so not being able to use it isn't much of a restriction. An approach which works here is to guess upper and lower bounds, and then prove these guesses by induction if the guesses are good. a(0) = 0 a(1) = 1 a(2) =...

algorithm,runtime,big-o,big-theta

A way to get rid of sums is to compute differences, after multiplying by $n$ (allow me to write LaTeX, even if this site doesn't use it; I hope the formulas are understandable): $$ (n + 1) a_{n + 1} - n a_n = 2 a_n + 5 $$ $$...

As explained in the Wikipedia article about big O notation. Big O: f(n) = O(n^3), because f(n) = 2n^3 + (7n^2)*log(n^4) <= 2n^3 + (7n^2)*n <= 9n^3, for big enough n. As explained below, log(n^4) <= n for big enough n. Big Omega: f(n) = Omega(n^3), because f(n) = 2n^3 +...
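The chain of inequalities can be checked numerically. A small sketch; I use the natural log as an assumption (any fixed base only changes the constants, not the asymptotics):

```python
import math

def f(n):
    """f(n) = 2n^3 + 7n^2 * log(n^4), with log taken as natural log."""
    return 2 * n**3 + 7 * n**2 * math.log(n**4)

# log(n^4) = 4 ln n <= n holds from n = 9 onwards, so the chain
# f(n) <= 2n^3 + 7n^3 = 9n^3 holds there too.
assert all(math.log(n**4) <= n for n in range(9, 10_000))
assert all(f(n) <= 9 * n**3 for n in range(9, 10_000))
```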

See, f(n) = n^10 and g(n) = (2n)^10. Now f(n) >= ((1/4)^10)*(2n)^10, because ((1/4)^10)*(2n)^10 = (n/2)^10 <= n^10. So f(n) >= c1*g(n) for c1 = (1/4)^10. Similarly, f(n) <= c2*(2n)^10 for any c2 greater than or equal to (1/2)^10, since n^10 = ((1/2)^10)*(2n)^10. So f(n) <= c2*g(n). Hence c1*g(n) <= f(n) <= c2*g(n), where c1 < (1/2)^10 and c2 >= (1/2)^10. Hence f(n) = Theta(g(n))....
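A quick exact check of the witnesses c1 = (1/4)^10 and c2 = (1/2)^10 (my own verification sketch; exact rationals avoid floating-point rounding on the large powers):

```python
from fractions import Fraction

# Verify c1*g(n) <= f(n) <= c2*g(n) for f(n) = n^10, g(n) = (2n)^10.
# These constants work for every n >= 1, since f(n) is exactly (1/2)^10 * g(n).
c1 = Fraction(1, 4) ** 10
c2 = Fraction(1, 2) ** 10

def f(n): return n**10
def g(n): return (2 * n) ** 10

assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(1, 1000))
```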

algorithm,big-o,analysis,recurrence-relation,big-theta

A recurrence relation is a way of recursively defining a function. For example, the recurrence relation T(n) = 4T(n / 3) + O(1) says that the function is defined so that if you want to determine T(n), you can evaluate T(n / 3), multiply it by 4, and add in...
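The definition translates directly into code. A minimal sketch; modelling the O(1) term as the constant 1 and using integer division for n/3 are my own assumptions for illustration:

```python
import functools

@functools.lru_cache(maxsize=None)
def T(n):
    """Evaluate T(n) = 4*T(n/3) + 1 with T(1) = 1, modelling the O(1)
    term as 1 and n/3 as integer division (assumed base case and rounding)."""
    if n <= 1:
        return 1
    return 4 * T(n // 3) + 1

# By the Master theorem this recurrence grows as Theta(n^(log_3 4)) ~ n^1.26.
print(T(1), T(3), T(9), T(27))  # 1 5 21 85
```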

algorithm,loops,runtime,analysis,big-theta

n!/(3!(n-3)!) = n(n-1)(n-2)/3! = (n^2-n)(n-2)/6 = (n^3 - 2n^2 - n^2 + 2n)/6 = (n^3 - 3n^2 + 2n)/6. You can show easily (1) that for large enough values of n: (1/12) n^3 < (n^3 - 3n^2 + 2n)/6 < n^3. So when it comes to asymptotic notation, it is in Theta(n^3), and NOT in o(n^3). (1) One way...
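A quick numeric sanity check of this closed form (my own verification sketch): it agrees with the binomial coefficient C(n, 3) exactly, and its ratio to n^3 settles near 1/6, a nonzero constant, which is exactly what places it in Theta(n^3) and rules out o(n^3):

```python
import math

def cubic_formula(n):
    """(n^3 - 3n^2 + 2n)/6, the closed form derived above.
    The numerator n(n-1)(n-2) is always divisible by 6, so // is exact."""
    return (n**3 - 3 * n**2 + 2 * n) // 6

# The closed form really is C(n, 3) ...
assert all(cubic_formula(n) == math.comb(n, 3) for n in range(3, 1000))
# ... and the ratio to n^3 approaches the constant 1/6.
assert abs(cubic_formula(10**6) / (10**6) ** 3 - 1 / 6) < 1e-5
```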