Here from Odin, just wanted to point something out.
Your explanation of the space complexity is mostly accurate. However, in the given code snippet (example 2), the space complexity is actually O(1), meaning it is constant and not directly proportional to the input size (n).
Space complexity measures the additional space or memory required by an algorithm based on the input size. In this code, the space used remains constant, regardless of the size of the input array.
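For reference, example 2 is essentially an iterative sum along these lines (my own Ruby sketch; the article's exact code and names may differ):

```ruby
# Iteratively sum the elements of an array.
# Only two extra variables are allocated (`sum` and the index `i`),
# no matter how long the array is, so the auxiliary space is O(1).
def sum_array(array)
  sum = 0
  i = 0
  while i < array.length
    sum += array[i]
    i += 1
  end
  sum
end

puts sum_array([1, 2, 3, 4, 5]) #=> 15
```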
The only additional space used in this code is for the variable `sum`, which is a single integer. It doesn't depend on the size of the array or the number of elements in it. Consequently, the space complexity is constant, denoted as O(1).
It's important to distinguish space complexity from the space required to store the input data. In this case, the space complexity of the code is independent of the input array's size.
However, if you were to introduce a copy of the array and utilize that copy in the code, it would indeed increase the space complexity. Let's examine the modified code:
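Something along these lines (a sketch of the idea; the method name and the use of `dup` here are my own, not necessarily the article's exact code):

```ruby
def sum_array_with_copy(array)
  array_copy = array.dup # allocates a new array holding the same n elements: O(n) extra space
  sum = 0
  array_copy.each { |value| sum += value }
  sum
end
```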
With this modification, the `array_copy` variable is introduced to hold a copy of the original array. Since we are now allocating additional memory to store the copied array, the space complexity becomes dependent on the size of the input array. Consequently, the space complexity would be O(n), where n represents the number of elements in the array.
Here from TOP too, and the lesson on memory complexity states:
'When a data structure is passed in as the argument, especially for languages that pass arrays by reference rather than value, it can be a bit unclear if that method considers the space used by that data structure when calculating its space complexity. If we didn’t count it, then it would be easy for all our methods to have great space usage on paper because we put the onus on the caller to allocate that space. If we did count it, but the data structure was created for use by many different methods, then the space complexity for all those methods is O(N) when they aren’t utilizing additional space. Then consider that if your method receives an array as an input and loops it, an index must be created for the loop which uses additional space.
...Ultimately when you consider Big O measures the worst-case scenario, it would be easier to err on the side of caution and do consider the space of arguments passed to your method.'
So wouldn't the space complexity be O(n), and the auxiliary space be what's O(1)? Let me know if I'm misunderstanding something.
Many sources seem to define space complexity as input space plus auxiliary space. Since the input array can have n elements, the space complexity is considered O(n), I think.
Refer to the second example here: courses.cs.northwestern.edu/311/ht...
I always thought of space complexity as just auxiliary storage, which makes more sense in my opinion...
Here from Odin also. I think the space complexity depends on whether the argument is a reference to the original variable or just a copy. For instance, if the function is recursive and every call creates a copy of the input, it will definitely use more memory; on the other hand, if it's a reference, that's just O(1) times the number of recursive calls.
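A quick sketch of the reference case (my own example, assuming Ruby, where the array argument is a reference to the same object, so no per-call copy of the data is made):

```ruby
# The same array object is handed down on every call, so the data is never copied.
# Each call still adds a stack frame, though, so the call stack grows to O(n) depth.
def sum_from(array, index = 0)
  return 0 if index == array.length
  array[index] + sum_from(array, index + 1)
end

puts sum_from([1, 2, 3, 4]) #=> 10
```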
Let me know if I am wrong...
mosmn's correction would also solve the contradiction arising in example 3: "Compared to its iterative counterpart, this takes up much more space where an iterative approach would use the same variable space over and over again, thus being only O(1)." But the iterative example above, number 2, had O(n) as space complexity. So which is it?
Hi, the space complexity for the second example is O(1), since every time we iterate over the loop we are reassigning values to the same variables i and sum.
Came here from The Odin Project coursework... great piece, thanks!
Hi, thanks for the great article. It explains space complexity very simply.
For the third example using recursion, I thought that array[1...size] would create a new array for each method call, which would add to the memory like: n + (n-1) + (n-2) + ... + 1 = n(n + 1)/2 (using the sum of the first n natural numbers formula) = O(n^2).
I suppose it depends on the implementation of the slice method, but I think it returns a new array. If each method call actually uses O(1) space, then overall O(n) is used, as stated in the article. I am not sure that is the case here, though...
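For context, the recursive version presumably looks something like this (my reconstruction, so details may differ from the article):

```ruby
# Recursive sum that slices off the head of the array on each call.
# If every slice allocated a genuinely new copy, the extra memory across all calls
# would be (n-1) + (n-2) + ... + 1, i.e. O(n^2), which is the concern raised above.
def sum_array(array)
  return 0 if array.empty?
  array[0] + sum_array(array[1...array.size])
end
```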
Having continued with TOP, I found this article with a section 'Copy on Write Optimization in Ruby Arrays'. For arrays of length 16 or larger, creating sub-arrays initially does not create a copy in memory! It only goes ahead and does that when it really has to (when writing to the new sub-array), maintaining the appearance of creating a new array for the user.
For the third example using recursion in this article, this means each function call uses O(1) auxiliary memory and thus O(n) for n recursive calls. So the space complexity (including the input array of length n) would be O(n) as stated in the article.
It just goes to show how you can only be really sure about space complexity when you know the programming language implementation.
This post is part of an assignment from The Odin Project coursework... great piece, thanks!
It seems there is a bit of confusion when it comes to calculating space complexity, as compared to time complexity, regarding whether the input argument should be taken into account or not.
Great article. It does supplement my understanding from The Odin Project.
Great read, thank you for putting this together! I'm currently working through "The Odin Project" and your code examples/explanations really helped make these concepts stick for me.
From the Odin
I also found this via The Odin Project. Excellent article. Inspired me to join the dev.to community.
Brilliant article, thank you! Cleared up so many doubts I've had