
I've seen many similar questions, but not quite what I'm looking for. I'm supposed to find the complexity of the code below. What makes this code different from what I've already seen here is that the function whose complexity I have to find calls another function with a given complexity.

I think I'm close, but I can't get to the correct answer. Any detailed explanation would be very welcome, also to help me better understand how to work out the complexity of these kinds of functions. The code is in C.

void f(int v[], int n, int i, int j){
    int a = v[i];
    int b = v[j];
    int m = (i + j) / 2;   /* midpoint of the current range */
    g(v, n);               /* g has a given complexity of O(n) */
    if (i < j){
        if (a < b) f(v, n, i, m);
        else       f(v, n, m, j);
    }
    return;
}

The f function is called from main, where v is an array: f(v, n, 0, n-1). The complexity of the g function is O(n).
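If I trace it for n = 8, the call starts as f(v, 8, 0, 7) and the width j - i of the range shrinks roughly by half on every recursive call (7, then 3 or 4, then about 2, then about 1), so f recurses on the order of log2(n) times and g runs once on every call.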

Now, I really can't decide between O(log n) and O(n log n). Since the range is halved each time through m, I can see the recursion itself is logarithmic, but does the cost of g add up and turn the whole thing into O(n log n)?

Thank you.

PS: if a question like this has already been asked, I couldn't find it, and a pointer to it would be great in case anyone else stumbles on the same problem.

  • n is not changing, right? And if g is O(n), there is no way in the world the overall complexity can be O(log n), which is lower than O(n). Commented Jul 16, 2020 at 14:31
  • More importantly, a compiler might not be able to unroll this recursion, since it isn't strictly a tail call. Then you can forget all about "Big O", since the function call overhead and lost caching opportunities will dominate the execution time, no matter what algorithm you use. Commented Jul 16, 2020 at 14:53

1 Answer


Your f function executes O(log n) times: the width of the range between i and j is halved on every recursive call, so the recursion goes about log2(n) levels deep. Each of those calls also executes g, at a cost of O(n) (g always works on the full array of n elements, regardless of i and j). Therefore the total complexity is O(n * log(n)), which is also the total number of times the inner loop* of g runs.
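One way to write the same argument down as a recurrence, glossing over constant factors: let d = j - i be the width of the current range. Each call does n units of work in g plus one recursive call on a range of about half the width, so T(d) = n + T(d/2) with T(0) = n. Unrolling gives T(d) = n * (log2(d) + O(1)) = O(n log d), and the initial call f(v, n, 0, n-1) has d = n - 1, hence O(n log n).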

(* I am assuming that there is an inner loop in g for explanation purposes, because that is what you find in many, but certainly not all, O(n) functions).
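If you want to sanity-check this empirically, here is a self-contained sketch. The body of g below is just a placeholder that I made up (the question only gives its O(n) bound), instrumented with a counter, and the array contents in main are arbitrary increasing values; f is the code from the question. It prints the total number of loop iterations performed inside g and compares it with n * log2(n):

    #include <stdio.h>
    #include <math.h>

    static long long g_work = 0;   /* counts every iteration of g's loop */

    /* Placeholder for the real g: one pass over all n elements, so O(n). */
    void g(int v[], int n){
        for (int k = 0; k < n; k++){
            g_work++;
            (void)v[k];            /* what g actually computes doesn't matter, only its cost */
        }
    }

    /* f as posted in the question */
    void f(int v[], int n, int i, int j){
        int a = v[i];
        int b = v[j];
        int m = (i + j) / 2;
        g(v, n);                   /* O(n) on every call, independent of i and j */
        if (i < j){
            if (a < b) f(v, n, i, m);
            else       f(v, n, m, j);
        }
    }

    int main(void){
        enum { N = 1 << 20 };                    /* about a million elements */
        static int v[N];
        for (int k = 0; k < N; k++) v[k] = k;    /* increasing values, so the left branch is always taken */

        f(v, N, 0, N - 1);

        printf("total iterations inside g: %lld\n", g_work);
        printf("n * log2(n)              : %.0f\n", (double)N * log2((double)N));
        return 0;
    }

With these inputs the printed count lands within a small constant factor of n * log2(n); the exact constant depends on the data (which branch gets taken), but the growth rate is the O(n log n) described above.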


Comments

Time complexity is not simply a matter of multiplying counts of repetitions when the complexity of the inner function depends on the outer variables; you need to sum the total number of times each operation is actually executed. In this case, it is linear (see the short worked sum after these comments).
(The comment above assumes that g's complexity actually depends on the range [i, j], which I am not sure is the case.)
The complexity of the inner function does not depend on the outer variables, @amit; it is always the exact same invocation with the same parameters (the initial v and n).
Yeah, I added the caveat because of this; it seems n is indeed independent of i and j.
In some cases, simply multiplying the number of repeats does yield the correct answer :-). There is no data dependency in the code: it always executes exactly n log n times (again assuming that g is a simple loop, for ease of explanation).
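To spell out the contrast in this thread with a quick hypothetical sum: if g were instead called on the current range only, say something like g(v, j - i) whose cost shrinks with the range (not what the posted code does), the per-level costs would form a geometric series, roughly n + n/2 + n/4 + ... <= 2n, i.e. O(n). In the code as posted, g(v, n) is always called on the full array, so each of the roughly log2(n) levels costs O(n), and the total is O(n log n).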