Amazon Interview Question
Country: United States
Use recursion similar to binary search. In this case, every recursion excludes 1/8 of the search space. What's the complexity?
bool f(int A[N][N][N], int xi, int xj, int yi, int yj, int zi, int zj, int v);

bool mf(int A[N][N][N], int v)
{
    return f(A, 0, N-1, 0, N-1, 0, N-1, v);
}

bool f(int A[N][N][N], int xi, int xj, int yi, int yj, int zi, int zj, int v)
{
    if (xi > xj || yi > yj || zi > zj)
        return false;
    int mx = (xi + xj) / 2;
    int my = (yi + yj) / 2;
    int mz = (zi + zj) / 2;
    if (v == A[mx][my][mz])
        return true;
    if (v > A[mx][my][mz])
    {
        /* v cannot lie in the octant [xi..mx][yi..my][zi..mz] */
        return f(A, xi, xj, yi, yj, mz + 1, zj, v) ||
               f(A, xi, xj, my + 1, yj, zi, mz, v) ||
               f(A, mx + 1, xj, yi, my, zi, mz, v);
    }
    else
    {
        /* v cannot lie in the octant [mx..xj][my..yj][mz..zj] */
        return f(A, xi, xj, yi, yj, zi, mz - 1, v) ||
               f(A, xi, xj, yi, my - 1, mz, zj, v) ||
               f(A, xi, mx - 1, my, yj, mz, zj, v);
    }
}
Well, if X is the input size, F(X) = 3F(X/2) + C, which unrolls to 9F(X/4) + ..., giving O(3^(log base 2 of X)) = O(X^(log base 2 of 3)). But X is O(n^3), so this is actually approximately O(n^4.75), a huge complexity. Compare to the O(n^2) solution.
The brute force is O(n^3). This divide-and-conquer algorithm must at least be better than that, right? Any other ideas?
It's not better than that. Sorry. It's O(n^4.75) or thereabouts, as explained by the analysis above. Not every divide and conquer algorithm is guaranteed to be better than the naive algorithm. That's why not every problem is solved by divide and conquer.
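For reference, here's a minimal Python sketch of the octant-exclusion recursion discussed above, written in a non-overlapping form so each call spawns three disjoint sub-boxes; the names `search3`, `mf`, and the test array are mine, not from the thread:

```python
def search3(A, v, xi, xj, yi, yj, zi, zj):
    """Search a 3D array sorted along all three axes for v."""
    if xi > xj or yi > yj or zi > zj:
        return False
    mx, my, mz = (xi + xj) // 2, (yi + yj) // 2, (zi + zj) // 2
    m = A[mx][my][mz]
    if v == m:
        return True
    if v > m:
        # v cannot be in the octant [xi..mx][yi..my][zi..mz]
        return (search3(A, v, xi, xj, yi, yj, mz + 1, zj) or
                search3(A, v, xi, xj, my + 1, yj, zi, mz) or
                search3(A, v, mx + 1, xj, yi, my, zi, mz))
    # v cannot be in the octant [mx..xj][my..yj][mz..zj]
    return (search3(A, v, xi, xj, yi, yj, zi, mz - 1) or
            search3(A, v, xi, xj, yi, my - 1, mz, zj) or
            search3(A, v, xi, mx - 1, my, yj, mz, zj))

def mf(A, v):
    n = len(A)
    return search3(A, v, 0, n - 1, 0, n - 1, 0, n - 1)

A = [[[1, 2], [3, 4]], [[5, 6], [7, 8]]]   # sorted along i, j, and k
print(all(mf(A, v) for v in range(1, 9)))  # True
print(mf(A, 0) or mf(A, 9))                # False
```

The same recursion shape gives the F(X) = 3F(X/2) + C recurrence analyzed above.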
@eugene.yarovoi, there seem to be issues with the binary search as well.
Suppose, we want to perform binary search in a 2D array which is sorted row-wise as well as column wise.
------------
|  1  |  2  |
|     |     |
|-----X-----|
|  3  |  4  |
|     |     |
------------
Suppose the middle element is X. If the item to be found is less than X, only part 1 needs to be searched. However, if the item is larger than X, all the parts 2,3,& 4 need to be searched.
So, the array is not equally distributed in two halves as mentioned in the approach.
What do you think?
If the item to be found is less than X, parts 1, 2, and 3 need to be searched. What's happening here is that we're searching the left half (1 & 2) and the upper half (1 & 3) in this case. So the approach is valid. Had the overlap been avoided here, the complexity would have been n^(log base 2 of 7), a small improvement over the naive algorithm, but not better than the O(n^2) approach mentioned before.
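To make the quadrant picture concrete, here's a small Python sketch of the quadrant-exclusion search on a row- and column-sorted matrix, written in the non-overlapping form (function and variable names are my own):

```python
def search_2d(M, v, r1, r2, c1, c2):
    """Search a matrix with sorted rows and columns by ruling out one quadrant per step."""
    if r1 > r2 or c1 > c2:
        return False
    rm, cm = (r1 + r2) // 2, (c1 + c2) // 2
    if M[rm][cm] == v:
        return True
    if v > M[rm][cm]:
        # v cannot be in quadrant 1 ([r1..rm] x [c1..cm])
        return (search_2d(M, v, r1, r2, cm + 1, c2) or   # right part: quadrants 2 & 4
                search_2d(M, v, rm + 1, r2, c1, cm))     # lower-left part: quadrant 3
    # v cannot be in quadrant 4 ([rm..r2] x [cm..c2])
    return (search_2d(M, v, r1, r2, c1, cm - 1) or       # left part: quadrants 1 & 3
            search_2d(M, v, r1, rm - 1, cm, c2))         # upper-right part: quadrant 2

M = [[1, 4, 7],
     [2, 5, 8],
     [3, 6, 9]]
print(search_2d(M, 6, 0, 2, 0, 2))   # True
print(search_2d(M, 10, 0, 2, 0, 2))  # False
```

Because the two sub-rectangles here are disjoint, the excluded quadrant shrinks the work at every step, which is the overlap-avoidance idea alluded to above.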
@eugene.yarovoi, you might be interested,
follow this link:
leetcode.com/2010/10/searching-2d-sorted-matrix-part-ii.html
That was interesting. Doesn't change my analysis of this specific approach, though. The approaches discussed in the article are different.
We came up with the same algo.
F(N) = F(7N/8) + C
Time complexity is O(log base 8/7 of N), definitely better than O(N).
I need to explain here:
F(N) = F(7N/8) + C, with N = n^3.
O(log base 8/7 of N) = O(3 log base 8/7 of n) = O(log base 8/7 of n), better than O(n).
@LoocalVinci: That's not correct. The correct recurrence for this solution is F(N) = 3*F(N/2) + C, N = n^3. This evaluates, like I said above, to approximately O(n^4.75).
If you modified this solution to get a better recurrence, you could possibly get something like F(N) = 7*F(N/8) + C. But this is very different from F(N) = F(7N/8) + C! The former has a performance of N^(log base 8 of 7). Since N = n^3, this evaluates to about O(n^2.81), a little better than the brute-force O(n^3) search and considerably worse than the O(n^2) algo discussed in the top-voted answer.
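The exponent arithmetic here can be sanity-checked numerically (plain arithmetic, under the thread's assumption N = n^3):

```python
import math

# 3-subproblem, half-size recurrence: X^(log2 3); with X = n^3 the exponent in n is
print(3 * math.log2(3))    # about 4.75
# 7-subproblem, eighth-size recurrence: N^(log base 8 of 7); exponent in n is
print(3 * math.log(7, 8))  # equals log2(7), about 2.81
```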
I think the solution can be found in O(logN) operations. Please correct me if I am wrong.
The 3D array is an array of 2D arrays, each of which in turn is an array of 1D arrays.
Consider an example
int a[2][2][2] = {
    {
        {1,2},
        {3,4}
    },
    {
        {5,6},
        {7,8}
    }
};
In memory, all of the elements of the 3D array are stored in consecutive memory locations.
| 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
We can exploit this property to find the element in O(log(N)) operations. As the array is sorted, we can use a divide-and-conquer algorithm to achieve this. Below is a simple program to demonstrate the concept. The program works fine.
#include <stdio.h>

int main(void)
{
    /* A larger test case:
       int a[][2][4] = {
           { {1,2,3,4},    {5,6,7,8}     },
           { {9,10,11,12}, {13,14,15,16} },
           { {17,18,19,20},{21,22,23,24} }
       };
    */
    int a[][2][2] = {
        {
            {1,2},
            {3,4}
        },
        {
            {5,6},
            {7,8}
        }
    };
    int n = sizeof(a) / sizeof(int);  /* total number of elements */
    int *base = **a;                  /* first element of the flattened array */
    int val = 8;                      /* value to be searched in the array */
    int i = 0;                        /* counter for the number of iterations */
    int lo = 0, hi = n - 1;
    /* standard binary search over the flattened array */
    while (lo <= hi)
    {
        int mid = lo + (hi - lo) / 2;
        i++;
        printf("Start-%d End-%d Pivot-%d\n", base[lo], base[hi], base[mid]);
        if (val < base[mid])
            hi = mid - 1;
        else if (val > base[mid])
            lo = mid + 1;
        else
        {
            printf("Value found in %d loop runs\n", i);
            return 0;
        }
    }
    printf("Value not found\n");
    return 0;
}
I think you misunderstand the question. A[0][0][N-1] is not guaranteed to be smaller than A[0][1][0]. The only sortedness property here is that if you have two positions whose coordinates differ in only one dimension, the position with the greater value of the differing dimension holds a value >= to the position with the lesser value of the differing dimension.
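A tiny counterexample (my own, not from the thread) shows why the flattened array need not be sorted even though every axis is:

```python
# Sorted along i, j, and k individually:
A = [[[1, 5],
      [2, 6]],
     [[3, 7],
      [4, 8]]]

# ...yet A[0][0][1] = 5 > A[0][1][0] = 2,
# so the flat memory order 1, 5, 2, 6, 3, 7, 4, 8 is not sorted.
print(A[0][0][1] > A[0][1][0])  # True
```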
I think O(n) will do this task. Correct me if wrong.
Assume the array is arr[m][n][r]. We can view it as rectangles (2D slices) stacked on top of each other. Since the array is sorted, as given in the question, the last element of each rectangle is the biggest element of that rectangle, so we can go up and stop at the rectangle whose last element is greater than the required element. This takes O(m) time.
Now we can search for that element in the 2D array in O(n) or O(r), whichever is greater.
So the total time is linear.
I think it's O(n^2). There is a flaw in your algorithm:
we have to check more than one rectangle.
Yep. While you may be able to rule out some of the rectangles, you still can't narrow it down to fewer than O(n) of them.
Sure. Why would the rectangle you stop on necessarily contain the value? Any rectangle after that one could contain it too.
for (i = 0 to maximum length)
    for (j = 0 to maximum width)
        do binary search in array[i][j][k = 0 to maximum height]
maximum height = c
maximum width = b
maximum length = a
then complexity = a*b*log(c), where n = a*b*c
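A straightforward rendering of this pseudocode in Python, using `bisect` for the inner binary search (function name and test data are mine):

```python
from bisect import bisect_left

def search_by_rows(A, v):
    """Binary-search every 1D row A[i][j]: O(a*b*log(c)) total."""
    for plane in A:                  # a planes
        for row in plane:            # b rows each
            k = bisect_left(row, v)  # log(c) per row
            if k < len(row) and row[k] == v:
                return True
    return False

A = [[[1, 2], [3, 4]], [[5, 6], [7, 8]]]
print(search_by_rows(A, 7))   # True
print(search_by_rows(A, 10))  # False
```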
A small correction to your algo: you should find the rectangle for which your search number is greater than the last element of the current rectangle and less than the last element of the next rectangle. In this case, the next rectangle is the one where our number may lie. If you cannot find such a rectangle, then the number doesn't exist in the 3D array.
We can apply binary search to reduce the complexity. If we apply binary search, the complexity will reduce to n·lg n.
I can't think of an efficient solution. Can anyone provide an easy solution, or do we have to perform the search by eliminating slices of rectangles, as in the rectangle-based search explained above?
I don't know whether this is correct; I think this method takes less than O(n):
for (i = 0 to maximum length)
    for (j = 0 to maximum width)
        do binary search in array[i][j][k = 0 to maximum height]
maximum height = c
maximum width = b
maximum length = a
then complexity = a*b*log(c), where n = a*b*c
@Njan: your logic will work with a little change, but I think you miscalculated the complexity.
Binary search in a sorted n×n matrix takes O(n lg n), so n such n×n matrices will take O(n · n lg n), i.e. n^2 log n.
Even if I grant that searching an n×n matrix takes O(n), it still turns out to be n^2.
Consider the 3D matrix [i][n][n] as i separate [n]×[n] matrices.
It takes O(n) time to search a 2D matrix (there are algorithms that do better), so in the worst case it should take i * O(n), which is O(n^2).
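The slice-by-slice idea can be sketched as follows, assuming an O(n) staircase search per 2D slice (this is my own sketch of the approach, not code from the thread):

```python
def search_slices(A, v):
    """For each 2D slice, run the O(n) staircase search: O(n^2) overall."""
    for plane in A:               # n slices
        r, c = len(plane) - 1, 0  # start at the bottom-left corner
        while r >= 0 and c < len(plane[0]):
            if plane[r][c] == v:
                return True
            if plane[r][c] < v:
                c += 1            # remaining entries of column c are all smaller
            else:
                r -= 1            # entries of row r from column c on are all larger
    return False

A = [[[1, 2], [3, 4]], [[5, 6], [7, 8]]]
print(search_slices(A, 6))  # True
print(search_slices(A, 0))  # False
```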
Searching for a specific element in an n-element array won't take more than O(n) (brute force, comparing one by one). So the question is presumably asking for less than O(n).
To explain this clearly, think of a cube in which element array[i][j][k] has coordinates (i, j, k).
The key is that you always do binary search on a diagonal.
First, search on the cube's diagonal. You end up with two surfaces that are both incomplete.
Second, search on the diagonal of each surface. You end up with two incomplete lines on each surface.
Third, search the four lines in total.
Each step above takes O(log(n)) for the binary search, so the total is O(log(n)) as well.
def search3d(array, element):
    if len(array) < 2:
        return search2d(array[0], element)
    mid = len(array) // 2
    if array[mid][0][0] > element:
        return search3d(array[:mid], element)
    else:
        return search3d(array[mid:], element)

def search2d(array, element):
    if len(array) < 2:
        return bin_search(array[0], element)
    mid = len(array) // 2
    if array[mid][0] > element:
        return search2d(array[:mid], element)
    else:
        return search2d(array[mid:], element)

def bin_search(array, element):
    if len(array) < 1:
        return False
    mid = len(array) // 2
    if array[mid] == element:
        return element
    elif array[mid] < element:
        return bin_search(array[mid + 1:], element)
    else:
        return bin_search(array[:mid], element)

print(search3d(array, element))
In Python I used 3 binary searches, one for each level.
Complexity: for a list x[a][b][c],
complexity = log(a) + log(b) + log(c)
I could be wrong about the complexity, though it should be less than n^2; it is not nested, that's for sure. The first two are not pure binary searches.
This approach identifies the sub-array in the first dimension that may have the value, then gives that as input to the 2D method; finally, once we can trace the 1D array that will have the value, we put it through binary search.
Axes X, Y & Z.
We need to find a.
y = 0, z = 0.
(1) Binary search for a from arr[0,y,z] to arr[X,y,z], or at least narrow down to the maximum value less than a. The resulting index in X is x. (This is O(log(Nx)).)
(2) If a is still not found, search for a from arr[x,0,z] to arr[x,y,z]. (This is O(log(Ny)).)
Repeat from (1) for every value of arr[0,0,0] to arr[0,0,Z] which is less than a. This is O(Nz).
If we have an equal number of elements on each axis, Nx = Ny = Nz = N.
Total # of elements = M = N^3.
Total complexity is O(Nz) x (O(log(Nx)) + O(log(Ny))) = O(N x log(N^2)) = O(N x log(N)),
which is always less than O(N^2).
If the 3D array is sorted in all 3 directions, its 1D equivalent array would also be sorted. If A[L][M][N] is a sorted 3D array, then A1[L*M*N], where A1[i*M*N + j*N + k] = A[i][j][k], will be a sorted 1D array of L*M*N elements. The complexity of binary search on this would be log(L*M*N).
Please correct me if wrong.
I think you misunderstand the question. A[0][0][N-1] is not guaranteed to be smaller than A[0][1][0]. The only sortedness property here is that if you have two positions whose coordinates differ in only one dimension, the position with the greater value of the differing dimension holds a value >= to the position with the lesser value of the differing dimension.
Can we just perform a binary search??
A 3D array is represented linearly in memory, so this 3D array can be viewed as a sorted 1D array of size N^3. If we perform binary search on this, the complexity would be just log(N^3).
A[0][0][N-1] is not guaranteed to be smaller than A[0][1][0]. The only sortedness property here is that if you have two positions whose coordinates differ in only one dimension, the position with the greater value of the differing dimension holds a value >= to the position with the lesser value of the differing dimension.
O(N^2) approach:
For each 2D array, search for the item in O(N).
How?
Let the matrix be ABCD, sorted from A to B, from A to C, from C to D, and from B to D.
Start searching for the item at point C. If the item is greater, we can ignore column AC. If it's less, we can ignore row CD.
In this way, based on each comparison, one row or column gets rejected. So, the time complexity will be O(N+N) = O(N).
- Aashish July 22, 2012
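Here's a minimal sketch of that staircase walk in Python, starting at the bottom-left corner, which plays the role of point C (function name and test matrix are my own):

```python
def staircase_search(M, v):
    """O(N) search in a row- and column-sorted matrix: discard one row or column per step."""
    r, c = len(M) - 1, 0              # bottom-left corner, i.e. point C
    while r >= 0 and c < len(M[0]):
        if M[r][c] == v:
            return True
        if M[r][c] < v:
            c += 1                    # remaining entries of column c are all smaller
        else:
            r -= 1                    # entries of row r from column c on are all larger
    return False

M = [[1, 4, 7],
     [2, 5, 8],
     [3, 6, 9]]
print(staircase_search(M, 5))   # True
print(staircase_search(M, 10))  # False
```

Each iteration eliminates a full row or column, so at most 2N steps are taken, matching the O(N+N) bound above.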