Bloomberg LP Interview Question for Software Engineer / Developers







Use 26 buckets, one for each letter of the alphabet, and insert names in sorted order: a name starting with A goes into the first bucket, a name starting with B into the second, and so on. Care must be taken during insertion so that the names within each bucket stay sorted. Obviously there will be many collisions, and those can be handled by chaining. If the maximum chain length is K, then searching a sorted bucket takes O(log K); insertion still costs O(K) to keep the chain in order. This is the best I know; if anyone knows a better answer, please comment.
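A rough sketch of this bucketed approach in Java, assuming each chain is kept as a sorted list so lookups can binary-search it; the class and method names are my own illustration, not from the original post:

{{{
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Illustrative sketch: 26 buckets, one per first letter, each kept sorted
// so a lookup can binary-search its bucket in O(log k).
class BucketedPhoneBook {
    private final List<List<String>> buckets = new ArrayList<>();

    BucketedPhoneBook() {
        for (int i = 0; i < 26; i++) buckets.add(new ArrayList<>());
    }

    private List<String> bucketFor(String name) {
        return buckets.get(Character.toUpperCase(name.charAt(0)) - 'A');
    }

    void insert(String name) {
        List<String> bucket = bucketFor(name);
        int pos = Collections.binarySearch(bucket, name);
        if (pos < 0) bucket.add(-pos - 1, name);   // insert at the position that keeps the bucket sorted
    }

    boolean contains(String name) {
        return Collections.binarySearch(bucketFor(name), name) >= 0;
    }
}
}}}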

- gauravk.18 April 04, 2008

I think a hash table with 26 buckets will be very inefficient for large data sets like a phone book, because there will be many collisions. Nowhere does it say that the list must be ordered. If you really want it ordered, use a binary tree.
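If ordering does matter, a balanced binary search tree gives O(log n) insert and lookup while keeping names sorted. A minimal sketch using Java's TreeMap (a red-black tree); the phone-number value type and sample entries are just assumptions for illustration:

{{{
import java.util.TreeMap;

public class TreePhoneBook {
    // TreeMap is a red-black tree: O(log n) put/get, keys iterated in sorted order.
    private final TreeMap<String, String> book = new TreeMap<>();

    public void add(String name, String phone) { book.put(name, phone); }

    public String lookup(String name) { return book.get(name); }

    public static void main(String[] args) {
        TreePhoneBook pb = new TreePhoneBook();
        pb.add("Bob", "555-0102");
        pb.add("Alice", "555-0101");
        System.out.println(pb.lookup("Alice"));   // 555-0101
        System.out.println(pb.book.firstKey());   // Alice -- sorted order comes for free
    }
}
}}}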

- Sascha March 13, 2010

Maybe the garbage collector is not working, so entries that go out of scope are never deleted.
When a hash table grows large enough to cause a significant number of collisions, it gets resized. So maybe it crossed its threshold at around 15 GB and was doubled to 30 GB.

It seems unlikely that such a large table could be stored on a 32-bit system, so it would have to be on a distributed network.
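A minimal sketch of the doubling behaviour being described here, assuming a chained hash table with an illustrative 0.75 load factor: once the threshold is crossed, the bucket array is doubled and everything is rehashed, which is the kind of single step that could take a table from roughly 15 GB to 30 GB.

{{{
import java.util.LinkedList;

// Chained hash table that doubles its bucket array when the load factor is exceeded.
class DoublingHashTable {
    private LinkedList<String>[] buckets;
    private int size = 0;
    private static final double LOAD_FACTOR = 0.75;   // illustrative threshold

    @SuppressWarnings("unchecked")
    DoublingHashTable() { buckets = new LinkedList[16]; }

    private int indexFor(String key, int capacity) {
        return (key.hashCode() & 0x7fffffff) % capacity;
    }

    void add(String key) {
        if (size + 1 > buckets.length * LOAD_FACTOR) resize();
        int i = indexFor(key, buckets.length);
        if (buckets[i] == null) buckets[i] = new LinkedList<>();
        buckets[i].add(key);
        size++;
    }

    @SuppressWarnings("unchecked")
    private void resize() {
        // Twice the buckets, everything rehashed: table memory roughly doubles in one step.
        LinkedList<String>[] old = buckets;
        buckets = new LinkedList[old.length * 2];
        for (LinkedList<String> chain : old) {
            if (chain == null) continue;
            for (String key : chain) {
                int i = indexFor(key, buckets.length);
                if (buckets[i] == null) buckets[i] = new LinkedList<>();
                buckets[i].add(key);
            }
        }
    }
}
}}}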

- vaisingh3 January 05, 2012

How about a trie?
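A minimal sketch of what a trie over names could look like in Java; insert and lookup cost O(L) for a name of length L, independent of how many entries the phone book holds (class and method names are my own):

{{{
import java.util.HashMap;
import java.util.Map;

class NameTrie {
    private static class Node {
        Map<Character, Node> children = new HashMap<>();
        boolean isName = false;   // true if a name ends at this node
    }

    private final Node root = new Node();

    void insert(String name) {
        Node cur = root;
        for (char c : name.toLowerCase().toCharArray()) {
            cur = cur.children.computeIfAbsent(c, k -> new Node());
        }
        cur.isName = true;
    }

    boolean contains(String name) {
        Node cur = root;
        for (char c : name.toLowerCase().toCharArray()) {
            cur = cur.children.get(c);
            if (cur == null) return false;
        }
        return cur.isName;
    }
}
}}}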

- Eric.ma1990 September 21, 2012

