Bloomberg LP Interview Question for Software Engineer / Developers


Country: United States




4 of 4 votes

There are a couple of questions you want to ask in this case: whether memory efficiency is an issue, whether you need results in real time, and whether the data has a lot of duplicates.

If memory efficiency isn't an issue, then you should make the hash table much larger than 1000 entries to reduce collisions, right? (It depends on how you're hashing your data and how likely collisions are...)

If you don't need results in real time, then buckets can be implemented as linked lists to deal with collisions.

If you need results in real time and there are a lot of duplicates, would it be better to implement buckets as BSTs to improve access time? Not sure the extra complexity would be worth it if you're only dealing with 1000 items...
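
A minimal sketch of the chaining idea, assuming integer keys and a caller-chosen bucket count (both are assumptions for illustration, not part of the original question):

#include <cstddef>
#include <list>
#include <vector>

// Separate chaining: each bucket is a linked list of the keys that hash to it.
class ChainedHashSet {
public:
    explicit ChainedHashSet(std::size_t bucketCount) : buckets(bucketCount) {}

    void insert(int key) {
        std::list<int>& bucket = buckets[index(key)];
        for (int k : bucket)
            if (k == key) return;              // already stored
        bucket.push_back(key);
    }

    bool contains(int key) const {
        const std::list<int>& bucket = buckets[index(key)];
        for (int k : bucket)
            if (k == key) return true;
        return false;
    }

private:
    std::size_t index(int key) const {
        return static_cast<std::size_t>(key) % buckets.size();
    }

    std::vector<std::list<int>> buckets;
};

With the table sized so the load factor stays around 1 or below, each chain holds roughly one entry on average, so lookups stay O(1) in the expected case.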

- Anonymous February 02, 2012 | Flag Reply
3 of 3 votes

It depends on the situation; as I reckon, there is no single exact answer.
If your data is distributed evenly, so that there are very few collisions, then take a size greater than 1000 that is a prime number; that improves the performance of searching and insertion.

If your data has a lot of collisions, then go for a smaller table size and use chaining to link the colliding records into buckets.
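
For example, the smallest prime above 1000 is 1009, which gives a load factor of about 1000/1009, roughly 0.99; with a good hash and chaining, the average chain then holds about one entry.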

Hope this helps.

- Hitesh September 22, 2011 | Flag Reply
0 of 0 votes

#include <stdbool.h>   // for bool outside C++

// Returns true if n is prime, false otherwise (trial division up to sqrt(n)).
bool isPrime(int n) {
   if (n < 2) return false;
   for (int d = 2; d * d <= n; ++d)
        if (n % d == 0) return false;
   return true;
}

// Pick a prime table size a bit larger than twice the item count,
// so the load factor stays around 0.5.
int HashSize(int item_num)
{
   int hashSize;
   for(hashSize = 2 * item_num + 10; !isPrime(hashSize); ++hashSize)
        ;
   return hashSize;
}
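
For 1,000 items this starts at 2 * 1000 + 10 = 2010 and returns 2011, the first prime at or above that, so the load factor ends up just under 0.5.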

- nim September 11, 2011 | Flag Reply
0 of 0 votes

Seems the answer should be smaller than the number of items; otherwise, extra space will be allocated and never used (such waste is never totally avoidable, but should be minimized).

- Tony September 12, 2011 | Flag
-1 of 1 vote

On the contrary, you should have more slots than items. Some items' hashes will collide even with the best hash functions, and you want to avoid collisions as much as possible. I'd say something like the smallest prime greater than 1.5 x |items|.
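
For 1,000 items that rule gives 1.5 x 1000 = 1500, and the smallest prime above 1500 is 1511, for a load factor of about 0.66.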

- JeffD January 12, 2012 | Flag
0 of 0 votes

I was curious to see how nim's suggested hash size holds up. For 1,000 items it generates 2011 buckets. Simulating 1,000 random elements, I produced 379 collisions (37.9%); running the same simulation with a purely deterministic random source, the outcome was 212 collisions (21.2%).
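
A rough sketch of that experiment (my own reconstruction; the exact hash function and random source Michael used aren't given, so the counts will differ):

#include <cstdio>
#include <random>
#include <vector>

int main() {
    const int items = 1000;
    const int buckets = 2011;                    // HashSize(1000) from nim's code above
    std::mt19937 rng(12345);                     // fixed seed so the run is repeatable
    std::uniform_int_distribution<unsigned> key; // stands in for "random elements"

    std::vector<int> count(buckets, 0);
    int collisions = 0;
    for (int i = 0; i < items; ++i) {
        unsigned slot = key(rng) % buckets;      // hash = key mod table size
        if (count[slot]++ > 0)                   // bucket already occupied
            ++collisions;
    }
    std::printf("%d collisions out of %d items (%.1f%%)\n",
                collisions, items, 100.0 * collisions / items);
    return 0;
}

With uniformly random keys the expected count is about 212 items landing in an already-occupied bucket, which matches the 21.2% figure above.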

- Michael Kofman February 09, 2012 | Flag
0 of 0 votes

@Leo,
Not correct. The proper answer should take the load factor into account (call it "a" = n/m): for the lookup time to be constant on average you need n = THETA(m), where m is the number of slots. But then again, the constant hidden in the THETA is determined by memory constraints. The real objective of the interviewer, I reckon, is to check that the interviewee understands this.
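
For reference: with chaining and load factor a = n/m, an unsuccessful search examines about 1 + a entries on average and a successful one about 1 + a/2, so keeping m proportional to n (say m = 2n, i.e. a = 0.5) keeps lookups O(1).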

- Anonymous March 13, 2012 | Flag Reply
-1 of 1 vote

The answer is simple.

You have 1000 items and you want to access them as fast as possible. Nothing says that other items will ever be added. So of course you need 1000 slots!

Memory can't be an issue here, just 1000 items! (Well, unless you are programming a washing machine with very little memory... but the computers at Bloomberg have enough memory to hold a hash table with 1000 slots.)

- Leo March 08, 2012 | Flag Reply

