Problem: You have an algorithm that divides a problem of size `n` into `6` subproblems, each of size `n/4` (one quarter of the original). Dividing takes `100` steps and merging takes `75n` steps. What is the asymptotic time complexity of the algorithm?

So the Master Theorem applies to recurrences of the form `T(n) = a*T(n/b) + f(n)`, and for this problem `a = 6` and `b = 4`, but I don't know where to fit the division and merge info.

The acceptable results are: `O(n^1.2924)`, `Ω(n^1.2)`, and `O(1.001^n)`.

# Best How To :

Every time you solve a subproblem, you have to divide the current instance into more subproblems (cost = `100` steps). Then, you have to merge the results of the subproblems (cost = `75n` steps).

That means `f(n) = 75n + 100`, because `f(n)` represents the cost of dividing and merging at a single level of the recursion (excluding the cost of the recursive calls themselves). `f(n) = 75n + 100` is `O(n)`.

Therefore, you're looking at: `T(n) = 6 * T(n/4) + O(n)`

And we know that:

```
a = 6
b = 4
f(n) = 75n + 100
```
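To sanity-check the recurrence, you can evaluate it directly. This is a sketch assuming a base-case cost of `T(n) = 1` for `n < 4`, which the problem does not specify:

```python
def T(n):
    # T(n) = 6*T(n/4) + 75n + 100, with an assumed base case T(n) = 1 for n < 4
    if n < 4:
        return 1
    return 6 * T(n // 4) + 75 * n + 100

print(T(1024))  # 1176076
```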

Next, we calculate `log_b(a) = log_4(6) = log(6)/log(4) = 1.2924...`
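You can verify this value with any calculator or, for example, Python's `math.log`, which accepts an optional base argument:

```python
import math

print(math.log(6, 4))             # log base 4 of 6 ≈ 1.2924...
print(math.log(6) / math.log(4))  # same value via the change-of-base formula
```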

Let's consider case I of the Master Theorem:

If `f(n) = O(n^c)` where `c < log_b(a)`, then `T(n) = Θ(n^(log_b(a)))`.

We know that `f(n) = O(n^1)`, so let's try `c = 1`. Is `c < log_b(a)`? Well, `1 < 1.2924...`, so yes.

Thus, `T(n) = Θ(n^(log_b(a)))` => `T(n) = Θ(n^1.2924...)`.
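As an empirical check (again assuming the base case `T(n) = 1` for `n < 4`): if `T(n) = Θ(n^1.2924...)`, then quadrupling `n` should multiply the cost by roughly `4^(log_4(6)) = 6`, so the ratio `T(4n)/T(n)` should approach `6` as `n` grows:

```python
def T(n):
    # the recurrence from above, with an assumed base case T(n) = 1 for n < 4
    if n < 4:
        return 1
    return 6 * T(n // 4) + 75 * n + 100

for k in range(5, 11):
    n = 4 ** k
    print(n, T(4 * n) / T(n))  # ratios decrease toward 6
```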