A favorite category of quant interview questions is sorting and complexity. If you're doing quant work, you'll spend most of your time coding, and a good understanding of computer science (algorithms in particular) is crucial.
Here's one that's more computer science oriented than most, but fair game nonetheless:
You're given 1 billion integers to sort. You have 1 GB of memory. How would you achieve the task? What's the complexity?
Would you do it differently if the integers are guaranteed to be unique? Complexity?
Solution:
Basically, some kind of external merge sort. Say memory holds M integers and there are N integers total. Step 1: sort N/M chunks of M integers each. Step 2: merge the N/M sorted runs.
Yeah, for non-CS people, it's time to review the various ways to sort. For step 2, review heaps: the merge is straightforward if you remember what a heap is — keep a min-heap with the current smallest element of each of the N/M runs, and repeatedly pop the minimum. Step 1 costs O(N log M) total; step 2 costs O(N log(N/M)); together that's O(N log M + N log(N/M)) = O(N log N).
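The two steps above can be sketched in-memory (a real external sort would write each sorted run to disk; here the list of runs is a hypothetical stand-in for those files):

```python
import heapq

def external_sort(values, chunk_size):
    """Sketch of external merge sort: sort fixed-size chunks, then k-way merge."""
    # Step 1: sort N/M chunks of M integers each -- O((N/M) * M log M) = O(N log M).
    runs = [sorted(values[i:i + chunk_size])
            for i in range(0, len(values), chunk_size)]
    # Step 2: merge the sorted runs with a min-heap over the run heads,
    # O(N log(N/M)). heapq.merge does exactly this k-way merge lazily.
    return list(heapq.merge(*runs))

print(external_sort([5, 2, 9, 1, 7, 3, 8], chunk_size=3))
# → [1, 2, 3, 5, 7, 8, 9]
```

`heapq.merge` never materializes more than one element per run at a time, which is what makes the disk-backed version feasible with memory M.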
If the integers are guaranteed to be unique... this is kinda neat. 1 GB of memory is 2^33 bits, more than enough to dedicate one bit per possible 32-bit integer value (2^32 bits = 512 MB). So simply go through your list of N integers and set the corresponding flag bit for each one, then scan the bitmap in order to read off the sorted output.
This yields an O(N) solution (plus a fixed-size scan of the bitmap, which doesn't depend on N). Essentially a degenerate bucket sort with one-bit buckets — a bit-vector sort.
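A minimal sketch of the bit-vector idea, using a small hypothetical `universe` parameter so the demo runs instantly (the interview answer would use `universe = 2**32`, i.e. a 512 MB `bytearray`):

```python
def bitmap_sort(values, universe):
    """Sketch of bit-vector sort for unique non-negative integers < universe."""
    bits = bytearray(universe // 8)      # one flag bit per possible value
    for v in values:
        bits[v >> 3] |= 1 << (v & 7)     # set the bit for v
    # Scanning the bitmap in order emits the values sorted.
    return [v for v in range(universe) if bits[v >> 3] & (1 << (v & 7))]

print(bitmap_sort([9, 3, 7, 1], universe=16))
# → [1, 3, 7, 9]
```

Uniqueness matters because one bit can't count duplicates; with duplicates you'd need a counter per value instead, which is counting sort and needs more memory.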