## Google Interview Question

**Position:** Software Engineer
**Country:** United States
**Interview Type:** In-Person

@iwaow: No, X is a new array that I use as intermediate storage. Maybe a better idea is to use X as a new max-heap. Sorry for not being clear enough.

If we use X.pop in step 3, or heap-maxify, it introduces O(log k) extra work for each element we insert into X.

If k is n in the worst case, that comes to more than O(n), i.e. O(n log n).

Am I understanding this right?

Also, to use median of medians or randomized quickselect we would need to partition and partially sort elements in the new array, which doesn't feel right when the interviewer says not to modify the original heap.

@um01: For solution 2, as I estimate it, the max for k is on the order of O(log n), probably not up to O(n).

I also don't like solution 1, because since we are not allowed to modify the heap, we have to make a copy of it, which doesn't make much sense. But that's the only way I know to achieve O(n) time complexity, at the cost of O(n) space.

And the complexity of this is not O(n); it is O(k log k) time plus O(k) space, where the kth max number is what is asked for.

And your first solution modifies the input, so we have to copy the input into another array and then apply median of medians; the complexity then comes out to O(n) space plus O(n) time.
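A minimal sketch of that copy-then-select idea (hedged: this uses randomized quickselect rather than median of medians, so it is expected O(n) rather than worst-case O(n); `kth_largest` and the other names are illustrative, not from the thread):

```python
import random

def kth_largest(heap, k):
    """Return the kth largest element (k is 1-based) of an array-backed
    max-heap by copying it and running quickselect on the copy, so the
    original heap is never modified. O(n) extra space."""
    a = list(heap)          # work on a copy; the heap stays untouched
    lo, hi = 0, len(a) - 1
    target = k - 1          # index the element would have in descending order
    while True:
        pivot = a[random.randint(lo, hi)]
        # three-way partition of a[lo..hi]: > pivot | == pivot | < pivot
        i, j, p = lo, hi, lo
        while p <= j:
            if a[p] > pivot:
                a[i], a[p] = a[p], a[i]
                i += 1
                p += 1
            elif a[p] < pivot:
                a[p], a[j] = a[j], a[p]
                j -= 1
            else:
                p += 1
        if target < i:
            hi = i - 1      # answer is in the "> pivot" block
        elif target > j:
            lo = j + 1      # answer is in the "< pivot" block
        else:
            return pivot    # target falls in the "== pivot" block
```

For the heap used elsewhere in this thread, `kth_largest([15, 12, 10, 8, 9, 5, 6, 6, 7, 8, 5], 7)` returns 7.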

```
import java.util.Comparator;
import java.util.PriorityQueue;

public class KthFromHeap {

    // Walks the heap in descending order using a priority queue of
    // indices, ordered by the heap values they point to.
    public static void findKthElementFromHeap(int[] heap, int k) {
        PriorityQueue<Integer> q =
                new PriorityQueue<Integer>(new HeapComparator(heap));
        int val = -1;
        q.add(0);
        int i = 0;
        while (!q.isEmpty()) {
            i++;
            int temp = q.poll();
            if (i == k) {
                val = heap[temp];
                break;
            }
            int n = (2 * temp) + 1;   // left child index
            if (n < heap.length) {
                q.add(n);
            }
            n = (2 * temp) + 2;       // right child index
            if (n < heap.length) {
                q.add(n);
            }
        }
        System.out.println(val);
    }

    public static void main(String[] args) {
        int[] heap = {15, 12, 10, 8, 9, 5, 6, 6, 7, 8, 5};
        findKthElementFromHeap(heap, 7);
    }

    // Orders indices by the heap values they refer to, descending.
    static class HeapComparator implements Comparator<Integer> {
        int[] heap;
        public HeapComparator(int[] heap) {
            this.heap = heap;
        }
        @Override
        public int compare(Integer o1, Integer o2) {
            return Integer.compare(heap[o2], heap[o1]);
        }
    }
}
```

Time O(n), memory O(n), no heap modification.

It works by traversing the heap in sorted order, exploiting the heap property. The mechanism is similar to the one used in Dijkstra's shortest-path algorithm:

1. Start from the root node (the heap max), adding it to the list of "edge nodes" - the nodes the current max is chosen from. This max is the next element in the ordered heap.

2. In a loop, remove the max element from the edge-node list (it is the next element in the order), then take its children and add them to the edge-node list. The edge-node list is kept sorted by merging in O(number of edge nodes), so the max in the next iteration is available in O(1). Repeat.

3. Finish when the k-th maximum is returned from the edge-node list, which is the k-th element in the order.

Memory complexity is O(n/2) == O(n) (in the worst case all heap leaves are in the edge-node list).

Time complexity is O(2n) == O(n): for every processed edge node, adding its children means merging a two-element child list with the existing edge-node list, which has between 1 and log n elements. At first glance this looks like O(n log n), but the ratio of the expensive log n merges to the cheap O(1) ones brings the total to about 2n. This is the same kind of amortized argument that shows heap construction, which naively looks like O(n log n), can in fact be done in O(2n). Thank you Skiena, the best $80 I've spent in my programming life :)

```
// A node on the "edge" between visited and unvisited parts of the heap.
class EdgeNode {
    int idx, value;
    public EdgeNode(int idx, int value) {
        this.idx = idx;
        this.value = value;
    }
}

// k == 0, 1, 2 ...
public int getHeapKthElement(int a[], int k) {
    if (a == null) { throw new IllegalArgumentException(); }
    if (k < 0 || k >= a.length) { throw new NoSuchElementException(); }
    LinkedList<EdgeNode> edgeNodes = new LinkedList<EdgeNode>();
    edgeNodes.add(new EdgeNode(0, a[0]));
    int count = -1;
    for (;;) {
        EdgeNode maxNode = edgeNodes.removeFirst();
        if (++count == k) {
            return maxNode.value;
        }
        LinkedList<EdgeNode> childNodes = getChildNodesSorted(a, maxNode.idx);
        edgeNodes = merge(edgeNodes, childNodes);
    }
}

// Returns the (up to two) children of node i, larger value first.
private LinkedList<EdgeNode> getChildNodesSorted(int a[], int i) {
    LinkedList<EdgeNode> childNodes = new LinkedList<EdgeNode>();
    int left = i * 2 + 1;
    int right = i * 2 + 2;
    EdgeNode child1 = left < a.length ? new EdgeNode(left, a[left]) : null;
    EdgeNode child2 = right < a.length ? new EdgeNode(right, a[right]) : null;
    // two-element sort, if needed
    if (child1 != null && child2 != null && child2.value > child1.value) {
        EdgeNode tmp = child2;
        child2 = child1;
        child1 = tmp;
    }
    if (child1 != null) {
        childNodes.add(child1);
    }
    if (child2 != null) {
        childNodes.add(child2);
    }
    return childNodes;
}

// Standard merge of two lists sorted in descending order of value.
private LinkedList<EdgeNode> merge(LinkedList<EdgeNode> edgeNodes,
                                   LinkedList<EdgeNode> childNodes) {
    LinkedList<EdgeNode> merged = new LinkedList<EdgeNode>();
    Iterator<EdgeNode> edgeNodeIt = edgeNodes.iterator();
    Iterator<EdgeNode> childNodeIt = childNodes.iterator();
    EdgeNode edgeNode = edgeNodeIt.hasNext() ? edgeNodeIt.next() : null;
    EdgeNode childNode = childNodeIt.hasNext() ? childNodeIt.next() : null;
    for (;;) {
        if (edgeNode == null) {
            if (childNode != null) {
                merged.add(childNode);
            }
            while (childNodeIt.hasNext()) {
                merged.add(childNodeIt.next());
            }
            return merged;
        }
        if (childNode == null) {
            merged.add(edgeNode);
            while (edgeNodeIt.hasNext()) {
                merged.add(edgeNodeIt.next());
            }
            return merged;
        }
        if (edgeNode.value < childNode.value) {
            merged.add(childNode);
            childNode = childNodeIt.hasNext() ? childNodeIt.next() : null;
        } else {
            merged.add(edgeNode);
            edgeNode = edgeNodeIt.hasNext() ? edgeNodeIt.next() : null;
        }
    }
}
```


a) If we have to do it in O(n) time and O(n) space, then the quicksort partition method can be used on a copied array.

Approach b) All levels except the last level in a heap are guaranteed to form a complete binary tree.

Let i be the level of the kth element, i = log2(k+1) - 1, and copy all elements from levels i-1 and i into a temp array. Now apply the procedure from step a.

Since we copied from just two levels and used a log, this gives log complexity. Also, time = log(n); we are taking linear time on log-sized space.

Please critique.

Why can't you flip the question and ask for the (n-k)th smallest number in the heap?

The leaves are the smallest numbers in the tree.

Find the smallest number in the set of leaves.

Replace that number with its parent.

Repeat, finding the smallest number in the new set.

Repeat (n-k) times to get the kth largest number.

I don't see why this doesn't work.

```
algorithm:
1. Let A be the original max-heap array.
2. Initially the queue is empty and rank = 0.
3. Insert the root (max element of A) into the queue.
4. while (queue != empty) {
       // add the current node's left and right child to the queue
       enqueue(leftChild);
       enqueue(rightChild);
       // heapify the queue
       heapify(queue);

       // delete a node from the queue (deletion is done from the front
       // and insertion from the tail, like BFS)
       dequeue(queue);

       rank++;

       if (rank == ith order statistic)
           return the dequeued element;
   }

complexity: to find the kth largest we need to dequeue at least k elements,
and for each element we heapify the queue, which takes log(n) time
(n = number of elements in the queue at a given time),
so total time = K log(n)
```

thoughts/critiques?

Idea:

Use a modified Dijkstra's algorithm to get an O(k lg k) algorithm.

Use the existing selection algorithm to get an O(n) algorithm. [1]

`Estimate and choose between them, yielding an O(min(k lg k, n)) algorithm.`

```
SelectKth(MaxHeap h, k)
    n = h.size
    if (k lg k < n)
        return DijkstraKth(h, k)
    return SelectKthByDC(h, k)
```

A sample for the modified Dijkstra's algorithm.

```
Sample: n = 7, k = 3

Heap (as [index]value):          Values entering the priority queue:
            [0]100               100, 95, 90, 85, 80
      [1]95       [2]80          (3rd largest = 90)
  [3]90  [4]85  [5]75  [6]70
```

Sample code for the modified Dijkstra's algorithm

```
// 1. Memorize the index, so that we know a node's children.
// 2. Using a pointer could avoid spending additional space to memorize
//    the index; however, the space complexity remains O(k).
class HeapNode<Type> : IComparable<HeapNode<Type>> where Type : IComparable<Type>
{
    public int i;    // index in the heap array
    public Type v;   // value
    public int CompareTo(HeapNode<Type> y) { return y.v.CompareTo(v); }
}

Type DijkstraKth<Type>(Heap<Type> h, int k) where Type : IComparable<Type>
{
    if (k > h.size)
        throw new Exception("k must be <= h.size");
    if (k <= 0)
        throw new Exception("k must be greater than 0");
    PriorityQueue queue = new PriorityQueue();
    queue.Add(h[0]);             // add the root node
    while (k > 1)                // k == 1: return the root
    {
        HeapNode<Type> x = queue.ExtractMax();
        if (x.i * 2 + 1 < h.size) queue.Add(h[x.i * 2 + 1]); // left child
        if (x.i * 2 + 2 < h.size) queue.Add(h[x.i * 2 + 2]); // right child
        k--;
    }
    return queue.Max().v;
}
```

```
Time complexity:
Use a Fibonacci heap as the priority queue.
2k Add() calls, k ExtractMax() calls => O(k) * O(lg k) = O(k lg k) time
Space complexity:
2k Add() calls => O(k) space
```

Lower Bound:

I think it is impossible to get the kth element in O(lg n) worst-case time.

`The lower bound of the worst-case time for large k is O(n).`

The proof is by contradiction:

Assume the worst-case time is lower than O(n).

Let k = n, i.e. find the smallest element.

The smallest element must be on the leaves.

The number of leaves of a balanced tree is O(n).

The elements in the leaves are unorganized.

The lower bound to find the smallest among n unorganized elements is n - 1 comparisons, that is, O(n) worst-case time. [2]

This contradicts the assumption!

[1] Cormen, Leiserson, Rivest and Stein, Introduction to Algorithms, third edition, pp. 220-229.

[2] Cormen, Leiserson, Rivest and Stein, Introduction to Algorithms, third edition, pp. 214-215.

Finding the kth element in a BST and the like is called selection; we can do it by traversing the tree, since BST ordering is a tree-wide property. In a binary heap, however, one cannot assume that all of the elements on the right side are larger than those on the left.

The only correct way to solve this is to read the heap, O(n) (as we are not supposed to destroy the heap),

then insert everything into a priority queue and deleteMax k times to get the kth element.

There is another very interesting way to do this: deleteMax k times from the heap itself,

insert the values at the end of the queue but do not reheapify,

and after reinserting the kth one, run reheapify and restore the order. This is a log N operation, assuming k is small.

If you do not have access to PQ utilities, or do not want to touch them, you can store the k values in a queue, then reinsert them into the PQ; it can be done in K * log N.

In short, this question is not correctly targeted or asked, and it seems the interviewer did not fully understand heap properties. When such questions come up in interviews, I normally ask clarifying questions and explain the inner workings in more detail.

O(logn) time, O(logn) memory

1. Find the level of the max heap that the kth largest element is on; assume the root is level 0.

2. Create another array consisting of the elements from that level.

3. Keep finding the max of that array until you've found the kth largest element.

Ex. k = 6, A = [9, 6, 8, 5, 3, 4, 1]

1. int(log(6, 2)) = 2, so level 2

2. elems = [5, 3, 4, 1]

3. We want the (6 % 2^2 + 1) = 3rd largest of elems

a. [5, 3, 4, 1] -> pop 5

b. [3, 4, 1] -> pop 4

c. [3, 1] -> return 3

```
from math import log

def kthLargest(k, A):
    # first index of the level that (supposedly) holds the kth largest
    start = 2**int(log(k, 2)) - 1
    end = start + 2**int(log(k, 2))
    numMaxes = k % 2**int(log(k, 2)) + 1
    elems = A[start:end]
    for i in range(numMaxes):
        maxElem = max(elems)
        elems.pop(elems.index(maxElem))
    return maxElem

print(kthLargest(6, [9, 6, 8, 5, 3, 4, 1]))
```


UPD: put all elements of the heap into an array and find the k-th max. Use a divide and conquer technique to achieve O(n).

We know that every level of the heap contains 2^i elements. Taking this into account, we can find the level where the k-th max is located. We then add all elements on this level to an array; note that the number of elements on this level will be at most 2^i. Now we need to find the k-th max in the array, which can be done using divide and conquer in linear time.

Won't work.

Take this heap as an example:

15, 12, 10, 8, 9, 5, 6, 6, 7, 8, 5

For k = 7 (counting from 1), the k-th max is in level 3 (counting from 0), but your algorithm would assume it is in level 2, because levels 0-2 contain 7 elements.
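This counterexample can be checked mechanically; a small Python snippet (sorting is used only for verification here, not as a solution, and the variable names are illustrative):

```python
import math

heap = [15, 12, 10, 8, 9, 5, 6, 6, 7, 8, 5]
k = 7

kth_max = sorted(heap, reverse=True)[k - 1]  # the true 7th largest value
idx = heap.index(kth_max)                    # its index in the heap array
level = int(math.log2(idx + 1))              # level of a 0-based heap index

print(kth_max, idx, level)  # -> 7 8 3: the 7th max sits on level 3
```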

O(n) solution

1. In a max heap, elements 1 to n/2 are non-leaf nodes and elements n/2 + 1 to n are leaf nodes.

2. If k > n/2 the target is in positions n/2 + 1 to n, else the target is in 1 to n/2.

3. For k > n/2, we now need to find the (k - n/2)th max element in the n/2 + 1 to n subarray.

4. Create a new max heap of the n/2 + 1 to n subarray and repeat the same process.

I think it is not correct:

it is not guaranteed that the elements in the nth level are all larger than the elements in the (n + 1)th level.

The k-th largest element can be found in O(n) regardless of the fact that this is a max-heap: you can use a selection algorithm (quickselect).

But I cannot think of a method to do it in O(log n), and was not lucky enough to find anything other than some proof that this cannot be done faster.

Here is a basic algorithm that assumes all the needed nodes exist (checking for null nodes or "out of bounds" indices only adds clutter). The basic idea is to use a secondary heap to store the nodes you have yet to traverse. I'm not sure whether it satisfies the O(n) bound. Getting this done in O(log n) would require some clever binary search.

```
public class HeapStatistics {
    // k counts from 0; assumes all referenced children exist.
    public Node<T> kthElem(int k, Heap<T> hp) {
        if (k == 0) return hp.max();
        if (k == 1) return hp.max().maxChild();
        Heap<T> candidates = new Heap<T>(hp.max().minChild());
        Node<T> curr = hp.max().maxChild();
        k--;
        while (k > 0) {
            if (curr.maxChild().val >= candidates.max().val) {
                candidates.add(curr.minChild());
                curr = curr.maxChild();
            } else {
                candidates.add(curr.maxChild());
                candidates.add(curr.minChild());
                curr = candidates.extractMax();
            }
            k--;
        }
        return curr;
    }
}

public interface IHeap<T> {
    public Node<T> max();
    public void add(Node<T> n);
    public Node<T> extractMax();
}

public class Node<T> {
    int key;
    T val;
    public Node<T> parent();
    public Node<T> left();
    public Node<T> right();
    public Node<T> maxChild() {
        return left().val >= right().val ? left() : right();
    }
    public Node<T> minChild() {
        return left().val >= right().val ? right() : left();
    }
}
```

You are using additional O(n) space.

I am not sure about the correctness of your algorithm. Can you explain why you want to move towards minChild or maxChild?

Every node N of a max heap satisfies the following property:

second largest value in subtree rooted at N = max(N.left, N.right)

Note that it is not true (in general) that the third largest value (of the subtree rooted at N) is min(N.left, N.right). We only know that if min(N.left, N.right) is not the third largest value, then the third largest lies within the subtree rooted at max(N.left, N.right).

Explaining this at the root should make the algorithm clear. Let H1 be the original heap and H2 the secondary one. After the root, the second largest element of H1 is max(root.left, root.right) == root.maxChild(). As per the note above, we do not know whether min(root.left, root.right) == root.minChild() is the third largest element of H1; we only know that the subtree rooted at root.maxChild() may contain values larger than root.minChild(). Therefore, we store root.minChild() in H2.

Now the loop begins. Basically, H2 keeps the nodes/indices of H1 that haven't yet been counted among the largest k elements, and every time we pass over a node's minChild to check the maxChild subtree for larger elements, we store that minChild in H2 (which keeps the max yet-to-be-counted value at its root). If it is ever the case that the node stored at H2's root is larger, we must add both of curr's children to H2, since we have traversed neither.

Sorry for the spam, but after reading hieu.pham's answer, mine seems to be similar, except that I use a secondary heap to store unused values, while hieu.pham uses an array with insertion sort.

Solution 1: Using median of medians you can find the kth largest element in worst case O(n).

- hieu.pham June 11, 2015

Solution 2: Using the max-heap property, I came up with the following solution, at least O(n):

Denote by A the max-heap.

0. X <- {new array}, rank r = 1

1. Element at rank 1: e_1, the largest element, is the first element in the max-heap.

2. Element at rank r+1 = max( X.pop(), leftchild(e_r), rightchild(e_r) ), among those that exist.

3. Insert the non-maximum element(s) from step 2 into X at the right place (insertion sort).

4. r++

5. If r < k go to 2, else return e_r.

There might be room for improvement here, but I haven't found it yet.

Edit: An idea for improvement is to maintain X as a new max-heap, and in step 3 use heap-maxify to insert elements.
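The improvement suggested in that edit - keeping X as a max-heap of candidate children - is essentially the standard O(k log k) frontier traversal. A hedged Python sketch (heapq is a min-heap, so values are negated; the function name and the 0-based array-heap layout are my assumptions, not the poster's code):

```python
import heapq

def kth_largest_in_maxheap(a, k):
    """kth largest (1-based) in an array-backed max-heap `a`, without
    modifying it. A candidate heap holds (negated value, index) pairs;
    each pop yields the next element in descending order.
    Time O(k log k), extra space O(k)."""
    cand = [(-a[0], 0)]                 # start from the root (the max)
    for _ in range(k - 1):
        _, i = heapq.heappop(cand)      # next largest found so far
        for c in (2 * i + 1, 2 * i + 2):
            if c < len(a):              # push only children that exist
                heapq.heappush(cand, (-a[c], c))
    return -cand[0][0]
```

On the heap discussed above, `kth_largest_in_maxheap([15, 12, 10, 8, 9, 5, 6, 6, 7, 8, 5], 7)` returns 7.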