Too Many Pointers

Speedy Solutions, Speedy Fast!

It should be no surprise that very often there is more than one solution to a given problem. In my previous post, I explained how the two pointer technique can get you some major wins on white board-style interview questions. However, the one downside of the two pointer technique is that you need to have seen it before in order to know how and when to use it. If you haven't, it's unlikely you'll be able to piece the solution together in the heat of an interview (unless you're a coding genius, which I'm not doubting that some of you are).

So what do you do if you're not a black belt at two pointer-fu? Do you just throw your hands up in the air, embarrass yourself with the brute force solution, and move on to the next interview you have lined up?

Not so fast! As I alluded to before, there are often multiple solutions to the same question, and this problem is no exception. I personally love the simplicity of this solution, so buckle up while we go on this speedy ride!

Blazing Fast Look-Ups with Hash Tables

If you're not already aware, HashTables and HashSets (and any of their derivatives) have an incredible superpower when it comes to looking up values. In order to appreciate this superpower (because with great power comes great responsibility), let's think about one of the most common alternatives to hashed data structures: arrays. To see whether a value exists in an array, you potentially need to iterate through the entire array, which happens when the value is either at the end of the array or not present at all. That's a real downside of the data structure, since a lookup costs O(n) time in the worst case. Now, what if we could bring that time complexity all the way down to O(1)? Sounds crazy, right?
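
To make the difference concrete, here's a minimal Kotlin sketch of the two kinds of lookup. The values are made up purely for illustration:

fun main() {
    val numbers = intArrayOf(4, 8, 15, 16, 23, 42)   // purely illustrative values

    // Array lookup: contains() scans element by element, O(n) in the worst case.
    println(numbers.contains(42))      // true, but only after walking to the end

    // Hashed lookup: the value's hash code leads straight to its bucket, O(1) on average.
    val numberSet = numbers.toHashSet()
    println(42 in numberSet)           // true, without scanning every element
}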

That's not crazy at all, that's our superpower! Any hashed data structure can find a value in O(1) time on average, as long as you have the value's key. This assumes that you've correctly overridden hashCode() (along with equals()) for the class you're using as a key (otherwise bad things will happen), which you should be doing anyway. Most language-provided classes already ship with their own hashCode() implementation, and Kotlin's data class gives you both hashCode() and equals() out of the box. If you're intimidated by the thought of writing your own hashCode(), the vast majority of IDEs will generate this function for you.
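
As a quick illustration of that "out of the box" behavior, here's a small sketch; the Point class and its label map are made-up examples, not part of the two sum solution:

data class Point(val x: Int, val y: Int)   // equals() and hashCode() generated by the compiler

fun main() {
    val labels = HashMap<Point, String>()
    labels[Point(3, 4)] = "a labeled point"

    // A structurally equal key hashes to the same bucket, so the lookup succeeds.
    println(labels[Point(3, 4)])   // prints "a labeled point"
}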

Now that you’re convinced of the speedy fast lookups of hash tables, let’s solve this problem speedy fast!

The Speedy Solution

As a refresher, here’s the two sum problem statement that we had previously solved:

Given an array nums and an integer target, find two indices within nums whose elements sum up to target. You may assume that each input will have a unique solution and you cannot use the same element twice. For example:

nums = [1, 2, 3, 4, 5]

target = 5

output = [0, 3]

Before we do any sort of iterating, we'll first create a hashed data structure (a hash map will work just fine). As you'd suspect, we'll gradually insert the elements of nums into the map. As we insert values, we'll also calculate each element's complement: the difference between the target and the value we're currently iterating over. For example, if the target is 5 and the current value in nums is 2, then the complement is 5 - 2 = 3.

If the complement already exists in the map then… 💡! We know that the two sum indeed exists. This idea breaks down into a joyfully simple strategy:

  1. Create a hash map, where the keys will be the elements of the array and the values will be their indices.
  2. Iterate over nums.
  3. For each element in nums, calculate the complement.
  4. If the complement already exists in the map, then we've found the two sum! Otherwise, simply add the element and its index to the map.
  5. If we finish iterating over the entire array, return an empty array since the two sum does not exist.

Now, on to the code solution:

fun twoSumMap(data: IntArray, target: Int): IntArray {
    // Maps each element we've already seen to its index in `data`.
    val map = HashMap<Int, Int>()

    for ((index, value) in data.withIndex()) {
        // The value that, added to the current element, would hit the target.
        val complement = target - value

        // Checking before inserting guarantees we never pair an element with itself.
        val complementIndex = map[complement]
        if (complementIndex != null) {
            return intArrayOf(complementIndex, index)
        }
        map[value] = index
    }

    // No pair of elements sums to the target.
    return intArrayOf()
}


fun main() {
    val result = twoSumMap(intArrayOf(1, 2, 3, 4, 5), 5)
    // Prints [1, 2]: nums[1] + nums[2] = 2 + 3 = 5. For this particular input, [0, 3]
    // from the two pointer post is also a valid pair; this approach simply finds [1, 2] first.
    println("Result of using hashed data structure: ${result.contentToString()}")
}
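
If you want to convince yourself of the edge cases, here's a quick, hypothetical sanity check (the sanityChecks name is mine, not from the original post) that exercises both the found and not-found paths of the twoSumMap function above:

fun sanityChecks() {
    // A pair exists: nums[1] + nums[2] = 2 + 3 = 5.
    check(twoSumMap(intArrayOf(1, 2, 3, 4, 5), 5).contentToString() == "[1, 2]")

    // No pair sums to 100, so we get an empty array back.
    check(twoSumMap(intArrayOf(1, 2, 3, 4, 5), 100).isEmpty())
}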


Oh, the Complexities

Let's look at what we gain and what we potentially lose with this solution in terms of time and space complexity. Since we iterate over the array at most once, the time complexity comes out to O(n), where n is the number of elements in nums. If the two sum doesn't exist, we won't know that until we've checked every element, and that worst case is what gives us O(n). The space complexity is also O(n), again in the worst case where the two sum doesn't exist and every element ends up stored in the map.

In comparison to our two pointer solution, the time complexity is equivalent but the space complexity is worse. That is certainly a downside to this strategy, and if you use it during a coding interview, you likely won't earn as much credit, since the two pointer solution is much more space efficient:

Solution     | Time Complexity | Space Complexity
Two Pointer  | O(n)            | O(1)
HashMap      | O(n)            | O(n)

However, let's not shoot this solution down completely. I, for one, still believe it's very powerful in its simplicity. Once you spot the complement idea, it's much easier to see how the solution works, and it doesn't require you to have studied and practiced the two pointer pattern. One could therefore argue that in production code this solution carries less technical debt: the cognitive load is lighter, and developers who are less familiar with the pattern can still understand how it works. Also, if you know your data set will always fall within some limited range of sizes, and/or you aren't overly concerned about memory constraints, then you can likely live with the extra space this solution uses.

Speeding Across the Finish Line

In this post, we've explored an alternative solution to the two sum problem by taking advantage of speedy fast look-ups with hash tables. Although not as space efficient as the two pointer approach, it solves the problem in a beautifully simple way. Stay tuned for my next post on coding interview questions and problem solving patterns. Happy coding everyone!
