Microsoft Interview Question for Software Engineer / Developers


Team: Autopilot
Country: United States
Interview Type: Phone Interview





The time complexity of evaluating the Fibonacci function is the Fibonacci function itself.

- Flux November 05, 2013

Does the running time of the programmed function grow like the mathematical function? Can you prove it?

- urik on bb November 06, 2013

As ghost89413 noted elsewhere in this thread, there is a recurrence for the running time:

T(n) = T(n-1) + T(n-2)
For n < 2, T(n) = 1

Which is Fib(n).

Another intuitive approach is to notice that we don't have any memoisation or caching of results across calls, so the final result is a sum of 1s (if we "flatten" the recursion tree). Fib(n) consists of Fib(n) ones, so the complexity is Fib(n).

- Flux November 06, 2013

The complexity of a naive recursive Fibonacci is indeed O(2^n).

T(n <= 1) = O(1) // base case
T(n) = T(n-1) + T(n-2) + O(1)

Assume T(n-1) = O(2^(n-1)); therefore
T(n) = T(n-1) + T(n-2) + O(1), which is equal to
T(n) = O(2^(n-1)) + O(2^(n-2)) + O(1) = O(2^n)

Here is another approach ...

T(n) = T(n-1) + T(n-2) = T(n-2) + T(n-3) + T(n-3) + T(n-4) = 
= T(n-3) + T(n-4) + T(n-4) + T(n-5) + T(n-4) + T(n-5) + T(n-5) + T(n-6) = ...

In each step the number of calls to T doubles, which eventually gives an asymptotic upper bound of T(n) <= 2 * 2 * ... * 2 = O(2^n).

- Ajeet November 06, 2013

But you ignored the boundary, in fact.

Let's use an example to show what's wrong with your approach.
For example T(4), using the method you mentioned:
T(4) = T(3) + T(2) =
T(2) + T(1) + T(1) + T(0) // In this step, the number of T terms doubled, of course
= T(1) + T(0) + T(1) + T(1) + T(0) // But in this step, you cannot split T(1) and T(0) into two parts each... so the number of T terms does not double. That is why your method fails.

- ghost89413 November 06, 2013

First of all, we have the boundary T(n < 2) = O(1) (same as T(n <= 1) = O(1)).

Second, here I am calculating big-O notation, not the exact number of iterations.
So if in some case we have one step less or more, we will still consider it O(2^n).

Like in your test case: it is 6 - 1 = 5 terms instead of the doubled 6. Still, its complexity will be O(2^n).

- Ajeet November 06, 2013

Yes, I agree your big-O notation is right.
But we know the solution of the recurrence T(n) = T(n-1) + T(n-2)
is T(n) = Theta(phi^n).

And I think your big-O bound is too loose.
In fact 2^n grows much, much faster than phi^n... the two functions are not in the same order of growth.

Anyway, there's no problem with saying T(n) = O(f(n)) for any f(n) growing faster than phi^n... but in algorithm analysis what we want is a tight bound.

In addition, your analysis is doubtful, although your conclusion is correct mathematically.

- ghost89413 November 06, 2013

Where is the proof that T(n) = T(n-1) + T(n-2) is a recurrence for the running time of the naïve fib implementation?

Let's start there. Just because the math formula looks like a recurrence doesn't necessarily mean we can change f to T and be done with it.

- urik on bb November 07, 2013

This is the right answer.

@urik: Pretty simple answer.

if T(n) is the time taken, then T(n) = T(n-1) + T(n-2) , with T(1), T(2) some non-zero constants.

This gives T(n) = Theta(phi^n), phi being the golden ratio.

- Anonymous November 10, 2013

OK, I trust that you can prove it.

But what you just said has some subtle, significant holes... not important for CC.

- urik on bb November 10, 2013

@urik.

Instead of just stating that there are holes, please just state them. I am curious what subtle and significant holes you found and would like to improve my understanding, for, if the holes are significant, my statements could be wrong. Thanks.

Note: The statement that the runtime is Fibonacci itself is not accurate.

More accurate is Fibonacci like, i.e. satisfies same recurrences, but might have different exact values. Asymptotically they are the same: Theta(phi^n).

The proof is standard.

The recurrence T(n) = T(n-1) + T(n-2) satisfies T(n) = Ax^n + By^n where x and y are roots of x^2 = x + 1, A and B are constants depending on T(1) and T(2).

x = phi and y = -1/phi. Thus T(n) = Theta(x^n) (as the y^n term is negligible).

- Anonymous November 10, 2013

@urik:

To be completely accurate, the runtime recurrence is

T(n) = T(n-1) + T(n-2) + O(1)

The solution to this is still Theta(phi^n).

To see this

in the recurrence H(n) =H(n-1) +H(n-2) + C, substitute G(n) = H(n) + C, we get

G(n) - C = G(n-1) -C + G(n-2) -C + C i.e. G(n) = G(n-1) + G(n-2), and thus G(n) = Theta(phi^n) and so H(n) = Theta(phi^n).

This can be used to show that T(n) = O(phi^n) and Omega(phi^n).

I would say this hole is neither subtle nor significant.

- Anonymous November 10, 2013

More intuitively: adding a constant time overhead will not change the asymptotic complexity of a recursive method...

- Anonymous November 10, 2013

Haha. You just popped up in this thread (well, the other characters went quiet and now an Anonymous handle is trying to prove a point).
Good troll.

Building turtles on turtles to make something more accurate does not justify the accuracy of the initial reasoning. I'm stranded without computer access, so for now I'll just respond to your very last statement just above...

"More intuitively..." What do you mean here?

Assume n is a power of 2 for convenience.
F(n) = F(n/2) for n > 1
F(1) = C
=> ?
Vs.

F(n) = F(n/2) + C_0
F(1) = C
=> ?

Hmmm, it seems intuition has failed: the first solves to a constant, the second to Theta(log n). We can go back and change "what we meant" a few times though, right?

- S O U N D W A V E November 10, 2013

@anon
I just read some of your recent replies carefully and upvoted one (the one with the best iteration of the mathematics, with G(n)).

Do note you are improving the reasoning, not simply "explaining things that were obvious at first".

And who has been mass-downvoting ghost, Ajeet, Yoda, anon, etc. in this thread? A good debate is ongoing; stop downvoting for no reason.

- S O U N D W A V E November 11, 2013

@Brainless:

Yes, the O(1) might be relevant in some cases. Intuition != proof, which is the reason I even provided one in the first place. Besides, the statement was more about the constant in O(1).

To apply the intuition and the earlier statements with the O(1) term, try this unary version of Fibonacci:

#include <stdio.h>

/* unary fibonacci: prints fib(n) ones, so the running time is
   proportional to fib(n) itself (assumes n >= 1) */
void fib(int n) {
    if (n == 1) { printf("1"); return; }
    if (n == 2) { printf("11"); return; }
    fib(n - 1); fib(n - 2);
}

Anyway, I found it obvious, you didn't. Happens. I will probably be wrong more often, so be it.

- Anonymous November 11, 2013

Lol, so wait... we end with you giving me advice?
Your fancy and "obvious" conjecture (which came after two mathematical-symbol-chugging replies by you) was easily proven wrong in my last reply. And I just picked the very last thing you said.

You gave up too easily, man...

- S O U N D W A V E November 11, 2013

So my main points scattered throughout this thread were:
1) Ajeet and Yoda were correct, and their upper bound is the easiest to prove.
2) Using the Fibonacci math recurrence (whether you keep the base cases from the math definition directly, or use C or O(1), matters little; this is what you call the "Fibonacci-like" recurrence in one of the intermediate improvements of your argument) DOES NOT automatically become the running-time recurrence for the naive math-translated-to-programming recursive algorithm.
(In fact, under normal assumptions, the "math-recurrence-like" running-time recurrence only gathers the running times of the base cases.)
This SHOULD be obvious (not the bits you claim to be obvious).
Changing f to T (whether you also change the base case to C or not is of little relevance here, since the base cases will be the only costs) gives neither a tight bound nor an upper bound on the running time. Should be obvious.
So, in general, it will give an Omega lower bound on the total running time only, as it only gathers cost from base-case invocations.

But in the case of fib, the recursion tree spreads out exponentially (you can see this in one of many ways), in such a way that there are Theta(phi^n) base-case invocations themselves.
So simply counting the runtimes of the base-case code gives us an Omega(phi^n) runtime for naivefib(n). This is good enough for CC and interviews.

(Incidentally, to count all the costs, you can prove that the recursion tree has at most O(number of base-case nodes) internal nodes, as Ivan "kind of" did to prove the big-O 2^n bound.
He did it (or meant to) by showing that a balanced binary recursion tree has basically the same number of internal nodes as base-case nodes.)

(Warning: everything above needs adjustment if the base-case and internal function bodies aren't both constant running time.)

Nothing is as obvious as changing f to T.

- S O U N D W A V E November 11, 2013

Some typos above, including:
*the number of internal nodes is O(number of base-case nodes)

Having no computer access sucks.

- S O U N D W A V E November 11, 2013

LOL @brainless.

You are an idiot. Goodbye.

- Anonymous November 11, 2013

@BU.

There is no "change f to T" going on.

It is based on the single recursive calls to f(n-1) and f(n-2).

If we were computing the recurrence F(n) = 1000*F(n-1) + F(n-2), the runtime recurrence would still be T(n) = T(n-1) + T(n-2) + O(1).

Don't be such an idiot.

- *anon* November 11, 2013

^^^^ The fact that you said that nonsense proves that you are responsible for many of the handles/names that posted earlier. Multiple-personality syndrome?

Idiot, we're replying back and forth to "Time complexity of Fibonacci function evaluation is Fibonacci function itself."

You keep talking about slightly varying, unrelated things to troll on. F*ck off.

- Urik November 12, 2013

It is T(n) = T(n-1) + T(n-2) because it gives the correct count for fib. @BU

- *regex* November 12, 2013

@Urik. Fucking Idiot, that statement is correct. If F_n is the nth Fibonacci number, then Theta(phi^n) = Theta(F_n), and so that statement is completely accurate.

Go back to your textbook and try to understand something this time.

Your inability to get a better bound than O(2^n), while the tight bounds being obvious to many is just irking you, isn't it?

Deal with it, dodo.

btw, your attempts to save your face were hilarious. Perhaps you can have a career in comedy (it is quite clear computer science/mathematics is not for you).

- *anon* November 12, 2013

I don't understand Urik's/Trollcity's objections, TBH. The proof given by anonymous looks perfect to me.

What fallacy do you see Urik/Trollcity? Can you please explain?

- Anonymous November 13, 2013

Anonymous has no objections to his own final version of the proof. The final proof looks good, except for calling some of the steps obvious; the "obvious" part was proven wrong with an example.

Already upvoted by me now. Stop talking to yourself over and over. I know it is one person always talking to himself in most of the postings these days.

- Guest DS

- algos November 13, 2013

Algos. Utterly ignorantos. Fuck off. You are disturbing my conversation with myself.

Hehe.

- Anonymous November 13, 2013

The proof by Anonymous beats the urik, ajeet, flux, and ghost proofs. Good proof, Anon.

- Expert November 13, 2013

Funny thread.

- Guest BS.

- BS November 14, 2013

As mentioned earlier, the complexity is exponential. The reason is that the implementation is memoryless, so you keep evaluating the same terms over and over.
Let's assume the cost of a comparison or an addition is 1 unit.
Then if T(n) is the computation cost, we have

T(n) = T(n-1) + T(n-2) + 2 = 2*T(n-2) + T(n-3) + 4 > 2*T(n-2) > 4*T(n-4) > 8*T(n-6) > ... > 2^k * T(n-2k), which gives T(n) = Omega(2^{n/2}).

- Ehsan November 19, 2013

O(2^n) time (space is only O(n), the maximum recursion depth).

- Mr. Yoda November 05, 2013

Look at the function and you can see why this is right. For each number you are recursively calling the function two more times.

So the first call to the function turns into 2 additional calls, then each of those 2 calls spawns another 2 calls.

1 -> 2 -> 4 -> 8 -> 16

So, by this argument, if you put in 4 it gives you on the order of 2^4 = 16 calls. Therefore, as stated above, O(2^n).

- Anonymous November 05, 2013

It's correct that the program runs in exponential time. But in fact you ignore the fact that the recursion tree is not balanced.
If you compute f(n), the recursion tree has f(n-1) on the left and f(n-2) on the right. However, the heights of these two subtrees are not equal, and that is why your answer is wrong.

The correct approach is to define the running time of f(n) as T(n).
Then we have T(n) = T(n-1) + T(n-2)
For n < 2, T(n) = 1

So Flux's answer is more accurate than yours. In fact the running time grows like the Lucas numbers (same recurrence, different base cases).

The Lucas numbers are also exponential, but much smaller than 2^n when n gets large.

- ghost89413 November 05, 2013

In addition, in big O notation, the running time is about O(1.618^n)

- ghost89413 November 05, 2013

His answer was correct: it is O(2^n).

It is also Omega((sqrt 2)^n).

Both of the above come from assuming that either the n-1 or the n-2 branch dominates the recursion tree.

A better, tight bound is Theta of the number you posted just above.
- urik on bb November 06, 2013

O(\theta^n), where \theta = (\sqrt(5)+1)/2

- Jason November 06, 2013

Theta deserves to be on the outside :(

There will be Theta(goldenratio^n) invocations of the base case itself.

So it's actually more correct to have Omega, not O, of the thing inside your big-O answer.

- urik on bb November 06, 2013

The number of function calls for the n-th Fibonacci number is at most
1 + 2 + 2^2 + 2^3 + ... + 2^(n-2), which is actually 2^(n-1) - 1;
the asymptotic behavior of this expression as n -> infinity
is O(2^n), obviously.

- Ivan Drago November 07, 2013

Two assumptions are implicit in your proof.
First, you're proving the growth of the fib sequence itself.
Second, you proved an upper bound on said fib.

But the upper bound is fairly loose, so it should be a good upper bound for the naïve fib implementation itself.

I have not seen a proof that the programmed fib function (naïve recursion) is O(phi^n) in any of the other replies.

It would be quite a neat proof if someone could provide it.

- urik on bb November 07, 2013

Actually... your proof, with some clean-up, will prove the 2^n upper bound on the recursive programmed fib function directly (or you can use it to prove that upper bound on the fib sequence, and modify it to cover the recursive programmed function).

So this reiterates the earlier fellows' correct answer of O(2^n).

One can prove a phi^n lower bound also.

Still open is a tight Theta bound.

No proof in this thread of O(phi^n) yet.

- urik on bb November 07, 2013

