I must be sleep deprived. 1024^3 bytes is a gigabyte, not a terabyte. Wow, then they've definitely over-engineered it, and disk isn't even necessary in the event of a 10-fold increase in the database size. Tries, search trees, or properly sized hash tables on phone number prefixes should be plenty fast, and the whole thing can fit in RAM on a modern laptop...
Our solution maintains the data in a simpler scheme, because keeping the data in a maintainable format is an important aspect when the data is ever-changing: every couple of days, millions of people update their preferences, and we need to be able to apply those changes. When choosing a single data structure for all the numbers, we have to optimize either for space or for faster inserts/updates. Our solution is a trade-off between these options.
A hash table is more than compact enough. It should work fine even at high load factors (80%/90%) given a decent hash scheme (e.g. Robin Hood hashing), which means you'd need only about 4 GB of memory. Lookups are random access and on average should cost one cache miss each. Updating is rather trivial.
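A minimal sketch of the Robin Hood scheme mentioned above, storing phone numbers as 64-bit integers in an open-addressing table. The table size, hash mix constant, and `EMPTY` sentinel are illustrative choices, not anything from the thread:

```python
# Robin Hood hashing: on a collision, the entry that is further from its
# "home" slot keeps the spot, which keeps probe lengths short and nearly
# uniform even at high load factors.

EMPTY = -1  # sentinel; real phone numbers are non-negative

class RobinHoodSet:
    def __init__(self, capacity):
        self.cap = capacity              # ideally a power of two
        self.slots = [EMPTY] * capacity

    def _home(self, key):
        # cheap multiplicative mix; any decent integer hash works here
        h = (key * 0x9E3779B97F4A7C15) & 0xFFFFFFFFFFFFFFFF
        return h % self.cap

    def _dist(self, key, slot):
        # probe distance of `key` if it were sitting at `slot`
        return (slot - self._home(key)) % self.cap

    def add(self, key):
        slot, dist = self._home(key), 0
        while True:
            resident = self.slots[slot]
            if resident == EMPTY:
                self.slots[slot] = key
                return
            if resident == key:
                return                   # already present
            # Robin Hood rule: displace the resident if it is "richer"
            # (closer to its home) than the key we're inserting.
            rdist = self._dist(resident, slot)
            if rdist < dist:
                self.slots[slot] = key
                key, dist = resident, rdist
            slot = (slot + 1) % self.cap
            dist += 1

    def __contains__(self, key):
        slot, dist = self._home(key), 0
        while True:
            resident = self.slots[slot]
            if resident == EMPTY:
                return False
            if resident == key:
                return True
            # Early exit: a present key would have displaced this resident.
            if self._dist(resident, slot) < dist:
                return False
            slot = (slot + 1) % self.cap
            dist += 1
```

The early-exit rule in `__contains__` is what makes misses cheap at 80-90% load: you stop as soon as you meet an entry closer to its home than you are, instead of scanning to the next empty slot.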
I get 3.2GB, which fits in RAM.
So yes, I guess, but why hit the disk at all? I'd probably just sort the data and do a binary search.
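The sort-then-binary-search approach above fits in a few lines with Python's stdlib `bisect`; the sample numbers here are made up for illustration (in practice you'd pack the sorted numbers into a flat 64-bit array to hit the in-RAM sizes discussed above):

```python
from bisect import bisect_left

def contains(sorted_numbers, number):
    # Binary search: find the leftmost insertion point, then check
    # whether the number actually sits there.
    i = bisect_left(sorted_numbers, number)
    return i < len(sorted_numbers) and sorted_numbers[i] == number

numbers = sorted([4155550123, 2125550199, 3035550142])
assert contains(numbers, 2125550199)
assert not contains(numbers, 9995550000)
```

Sorting is a one-time cost, and each lookup is O(log n) with no hashing and no per-entry overhead, at the price of slower inserts than a hash table.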