DEV Community

Discussion on: Python Programming Challenge

Idan Arye

The trick is that you can deduce the products for the next X from the ones for the current X without multiplying everything again:

prod(A[0]..A[X+1]) = prod(A[0]..A[X]) * A[X+1]
prod(A[X+2]..A[N-1]) = prod(A[X+1]..A[N-1]) / A[X+1]
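A tiny sanity check of the two identities above (the sample array and helper are my own, not from the challenge; `//` is used since integer division is exact here):

```python
from functools import reduce
from operator import mul

A = [3, 7, 2, 5]
X = 1
prod = lambda xs: reduce(mul, xs, 1)  # product of a sequence, 1 if empty

# prod(A[0]..A[X+1]) == prod(A[0]..A[X]) * A[X+1]
assert prod(A[: X + 2]) == prod(A[: X + 1]) * A[X + 1]
# prod(A[X+2]..A[N-1]) == prod(A[X+1]..A[N-1]) // A[X+1]
assert prod(A[X + 2 :]) == prod(A[X + 1 :]) // A[X + 1]
```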

Using that, you need only two passes over the array: one to get the initial product of the second half (the entire array), and one to update both products for each element and find the minimal difference:

from functools import reduce
from operator import mul

def split_products(array):
    """Yield (index, prefix product, suffix product) for every split point."""
    first_half_prod = 1
    second_half_prod = reduce(mul, array, 1)  # product of the entire array
    for i, elem in enumerate(array):
        first_half_prod *= elem    # prod(A[0]..A[i])
        second_half_prod //= elem  # prod(A[i+1]..A[N-1]); exact for integers
        yield i, first_half_prod, second_half_prod


index = min(split_products(array), key=lambda t: abs(t[1] - t[2]))[0]
print(index)
Subramanian 😎

How is this different from my second approach?

Idan Arye

Now that I look at it, it isn't. I probably wasn't paying enough attention when I read your original post, and thought you were recalculating the entire products from scratch for each index...

The time complexity should be linear, and since every element can change the result, you can't do better than linear. So how did you exceed the time limit? Are they timing the actual run instead of doing complexity analysis?
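To make the distinction concrete, here's a sketch (the function names and sample data are mine) contrasting the naive approach, which recomputes both products from scratch at every index, with the single-pass approach. Both return the same answer; only the amount of work differs:

```python
from functools import reduce
from operator import mul

def best_split_naive(array):
    # O(n^2): recompute prefix and suffix products for every index.
    return min(
        range(len(array)),
        key=lambda i: abs(
            reduce(mul, array[: i + 1], 1) - reduce(mul, array[i + 1 :], 1)
        ),
    )

def best_split_linear(array):
    # O(n): maintain running prefix/suffix products instead.
    suffix = reduce(mul, array, 1)  # product of the whole array
    prefix = 1
    best_i, best_diff = 0, None
    for i, elem in enumerate(array):
        prefix *= elem
        suffix //= elem  # exact for integer inputs
        diff = abs(prefix - suffix)
        if best_diff is None or diff < best_diff:
            best_i, best_diff = i, diff
    return best_i

assert best_split_naive([3, 7, 2, 5]) == best_split_linear([3, 7, 2, 5]) == 1
```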

Subramanian 😎

Yes, I think the actual run time is what's considered for the timeout.

I'm not sure what went wrong. That's why I wanted to discuss this challenge. :)