I didn't really have fun today. My final solution in Ruby took way too long to get to and doesn't feel like an accomplishment. Rather, it's a Dynamic Programming trick you just have to know or see, not something you arrive at logically or through interesting reasoning.
The grunt work is O(n), the sort is O(n log n), so sorting the input becomes the "limiting" factor. YMMV.
require 'benchmark'

adapters = File.readlines('input.txt').map(&:to_i).sort
maximum_jolt = adapters.last + 3
adapters.push(maximum_jolt)

=begin
# For the example input, here is what is being calculated:
target: options => total options
 1: only 1 option (0 + 1) => 1
 4: only 1 option (1 + 3) => 1
 5: only 1 option (4 + 1) => 1
 6: either 4 + 2 or 5 + 1 => 2
 7: 6 + 1 or 5 + 2 or 4 + 3 => 2 (6) + 1 (5) + 1 (4)
10: only 1 option (7 + 3) => 4
11: only 1 option (10 + 1) => 4
12: either 10 + 2 or 11 + 1 => 4 (10) + 4 (11)
15: only 1 option (12 + 3) => 8
16: only 1 option (15 + 1) => 8
19: only 1 option (16 + 3) => 8
22: only 1 option (19 + 3) => 8
=end

Benchmark.bm do |x|
  x.report(:part_1) do
    jolt_groups = adapters.each_with_index.map do |adapter, index|
      adapter - (index > 0 ? adapters[index - 1] : 0)
    end.group_by(&:itself)

    puts jolt_groups[1].length * jolt_groups[3].length
  end

  x.report(:part_2) do
    # Track the number of options to get to a target jolt value
    # and default to 0. The first jolt value is 0, and can only
    # be reached in one way.
    options = Hash.new(0)
    options[0] = 1

    adapters.each do |target_jolts|
      options[target_jolts] =
        [1, 2, 3].sum { |difference| options[target_jolts - difference] }
    end

    puts options[maximum_jolt]
  end
end
Hey, thank you very much! Could you give me a hint on how this "trick" is called?
Certainly. It's dynamic programming!
In this case, the problem can be broken down and simplified. We're not looking for the list of combinations. We are looking for the total number of options to build a valid connection between start (0) and end (max + 3).
For each adapter the following is true:
The number of paths to get to this adapter from the start is equal to the sum of the number of paths to each of the adapters that can reach it, that is, the adapters 1, 2, or 3 jolts below it.
That means that this problem can be reduced. You can walk the list from start to end or end to start and only answer the question: paths to / paths from this adapter.
Storing the intermediate outcome is a typical dynamic programming trick.
Determine the number of paths between the start and the first adapter (always 1). Store it.
1: only 1 option (0 + 1) => 1
Stored is the value 1 (ways to get to this index) at index 1 (the joltage of the first adapter), next to the 1 at index 0 (the outlet):
[1, 1]
Determine the number of paths from the adapters 1, 2, and 3 jolts below the second adapter. The sum is the total number of paths from start to the second adapter. Store it.
Stored at index 4 (joltage) is the number of paths to this index: ways to joltage 4 - 3 plus ways to joltage 4 - 2 plus ways to joltage 4 - 1.
[1, 1, 0, 0, 1]
Determine the number of paths from the adapters 1, 2, and 3 jolts below the third adapter. The sum is the total number of paths from start to the third adapter. Store it.
Stored at index 5 (joltage) is the number of paths to this index: ways to joltage 5 - 3 plus ways to joltage 5 - 2 plus ways to joltage 5 - 1.
[1, 1, 0, 0, 1, 1]
Determine the number of paths from the adapters 1, 2, and 3 jolts below the fourth adapter. The sum is the total number of paths from start to the fourth adapter. Store it.
Stored at index 6 (joltage) is the number of paths to this index: ways to joltage 6 - 3 plus ways to joltage 6 - 2 plus ways to joltage 6 - 1, which is 0 + 1 + 1 = 2.
[1, 1, 0, 0, 1, 1, 2]
As you can see, this is where it becomes interesting. By storing the ways to get to a joltage in the array, when we need a value again we don't have to calculate or, worse, recalculate it.
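The walk above can be run end-to-end with a short Ruby sketch. It assumes the example adapter list from the table (joltages 1 through 19, with the device at 22 appended); the `options` hash plays the role of the array being built up above, keyed by joltage:

```ruby
# The sorted example adapters, with the device (max + 3 = 22) appended.
adapters = [1, 4, 5, 6, 7, 10, 11, 12, 15, 16, 19, 22]

# options[jolts] = number of ways to reach that joltage from the outlet (0).
options = Hash.new(0)
options[0] = 1 # the outlet itself can be reached in exactly one way

adapters.each do |jolts|
  # Sum the ways to reach the three joltages this adapter can connect from.
  options[jolts] = options[jolts - 1] + options[jolts - 2] + options[jolts - 3]
end

puts options[22] # => 8, matching the last row of the table
```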
This is what is key in Dynamic Programming. You store intermediate results so that you don't need to do the work again. A classic example is solving Fibonacci.
You may want to solve it recursively, but the problem with doing that without a memoisation step is that you'll be calculating the same values over and over. Instead, because each fib(n) equals fib(n - 1) + fib(n - 2), caching each input/output pair reduces the recursive complexity from O(2^n) to O(n). Each term is only computed once, instead of many, many times.
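That Fibonacci speed-up can be sketched in Ruby. The `FIB` constant is a hypothetical name; the memo is a Hash whose default block computes each missing term exactly once and caches it:

```ruby
# Naive recursion: fib(n - 1) and fib(n - 2) are recomputed over and
# over again, giving O(2^n) calls.
def slow_fib(n)
  n < 2 ? n : slow_fib(n - 1) + slow_fib(n - 2)
end

# Memoised: the Hash default block fills in each missing term once and
# caches the input/output pair, so computing fib(n) is O(n) overall.
FIB = Hash.new { |cache, n| cache[n] = n < 2 ? n : cache[n - 1] + cache[n - 2] }

puts FIB[40] # => 102334155
```

The Hash-with-default-block idiom is doing the same job as the `options` hash in the puzzle solution: look up first, compute only on a miss.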
Oh wow, thank you very much! Really didn't expect such a lengthy, comprehensible answer. So, thank you very much again! Just thought it's a special technique one could look up and practice, but I guess I just have to look into dynamic programming again.
Yeah. Once you do a series of solutions using Dynamic Programming, you'll get a feel for it!
Thanks for the clear explanation Derk-Jan Karrenbeld!
Here is my implementation in Python: