If you're a JavaScript developer you've likely used arrays quite a bit. They're an essential data structure within the language.
In fact, they're so essential that the array prototype has seen rapid expansion in the past few years, with additions like flat and flatMap. And we're not done yet.
Accessor
In order to access an element in an array, you need to know its index. Indices in JavaScript are zero-based, so the first element is at index 0.
const arr = ["a","b","c","d"]
arr[0] // this is "a"
arr[2] // this is "c"
As you can see in the example above, you can access the first element, or the third element. What about the last element? In other languages you might be able to do something like this.
const arr = ["a","b","c","d"]
arr[-1] // This is NOT "d"
But not in JavaScript! Why not? Well, as it turns out, -1 is already a valid key. Arrays are really objects with indices as keys. So arr[-1] is looking at the arr object and the value of its "-1" key, which is undefined.
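To see that bracket access is really just a property lookup, here's a quick illustration (the variable name is only for the example):
const letters = ["a", "b", "c", "d"]
letters[-1]        // undefined: there is no "-1" property yet
letters[-1] = "z"  // adds a "-1" property; it does not touch the array elements
letters[-1]        // "z"
letters.length     // still 4: the "-1" key is not counted as an element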
The last element
Then how do we access the last element without knowing its index? There are ways to do it, but it's certainly more verbose. You can use the length lookup.
arr[arr.length - 1] // this is "d"
Or you have the slice option.
arr.slice(-1)[0] // this is "d"
Introducing at
That's why the at function is under consideration. Instead of those options, you'd be able to write this.
arr.at(-1) // this is "d"
Note that this works for all valid negative indices, up until you pass the first element.
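For instance, with the four-element array above, the valid negative range runs from -1 back to -4; going any further returns undefined:
arr.at(-4) // "a", the first element
arr.at(-5) // undefined, past the start of the array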
The great thing about at is that it can replace square brackets altogether.
arr.at(0) // this is still "a"
And what about an invalid index?
arr.at(5) // this is undefined
Seems pretty all-encompassing.
An aside on history
As it turns out, this was attempted before using item. However, it wasn't web compatible, as it clashed with major libraries. So at is the current proposal.
Would you use it?
Hopefully this will move forward to Stage 4 soon and be officially adopted. I can see this being nice syntactic sugar for accessing array elements.
Top comments (77)
Would I use it?
const response = ['No way', "I don't know", 'Awww yeah'].at(-1)
Everybody gangsta until .at(-1) returns undefined.
Would [].at(-1) be like dividing by zero?
[].at(-1) == [][[].length - 1] == [][-1] == undefined
What if you need the second-to-last item?
.at(-2) is as reasonable as .at(-1).
Yup, that works fine.
I've used var last = arr[arr.length - 1] in the past, but now with destructuring, I typically do const [, last] = someArray.
I'm not sure I'd use at aside from getting the last item in an array, since currently I can do e.g. someArray[1], which is less typing than someArray.at(1) for elements that are not the last item. I probably would have opted for an Array.prototype.last. Maybe there are use cases for it that I'm missing, like composing a bunch of functions.
Not everyone is seeing my addendum to this comment, so here it is.
Just an update to my initial comment, as I typed it out pretty quickly yesterday. const [, last] = someArray will only work if the array has exactly two items. For example, if it's 4 items, this won't work: you'll end up with the second element instead. If I wanted to get the last element in the above array, I'd have to do it differently.
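(The original inline snippets aren't reproduced here; a minimal sketch of what this looks like, with sample data:)
const someArray = ["a", "b", "c", "d"]
// Destructuring skips index 0 and grabs index 1, not the end of the array.
const [, last] = someArray
last // "b", the second element, not "d"
// To actually get the last element you still need the length.
const actualLast = someArray[someArray.length - 1]
actualLast // "d"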
This const [, last] = someArray is equal to const last = someArray[1].
To use destructuring, one might want something like const [l, [l - 1]: last] = someArray, but it's hardly more readable 😁
@milichev I get the idea from your snippet, but I think the syntax is incorrect; it should look something like this:
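(The snippet that followed isn't shown here; presumably a pattern along these lines, using a computed property name in an object destructuring — a sketch, not the commenter's exact code:)
const someArray = ["a", "b", "c", "d"]
// Destructure the array as an object: bind length first, then use it as a computed key.
const { length, [length - 1]: last } = someArray
last // "d"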
Anyways, that's a funny thing I never thought of :)
Oooh I knew this was possible, but never really put 2 and 2 together with that destructuring for the last item! Thanks for pointing that out to me.
I'd even propose extending your idea to include all the quick collection methods that LINQ (from C#) has.
As described here, it looks like .at() is barely useful beyond getting the last item.
It's unclear if it can be used to loop through the array in reverse (e.g. .at(-2)), because we could have a hard time figuring out when we reach the first element and stop the loop. So we still have to use .length like in the classic way; we just shifted the 1 from the length - 1 to the starting point.
I'm struggling to see why we need .at().
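(The loop examples from this thread aren't reproduced here; a plausible sketch of the two versions being compared:)
const arr = ["a", "b", "c", "d"]

// Classic reverse loop: the length shows up in the starting index.
for (let i = arr.length - 1; i >= 0; i--) {
  console.log(arr[i])
}

// Reverse loop with at(): the length moves into the stop condition instead.
for (let i = -1; i >= -arr.length; i--) {
  console.log(arr.at(i))
}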
BTW, you should change const to let in your example.
You're right. I never use classic for loops anymore really. Too used to writing:
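(The snippet that followed is missing; presumably an iterator-based loop along these lines, where const is fine because nothing is reassigned:)
const arr = ["a", "b", "c", "d"];
arr.forEach(item => console.log(item));
// or
for (const item of arr) console.log(item);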
I agree though. Don't see the point of at. Equally, if you wanted to iterate the loop in reverse:
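(Again, the original snippet isn't shown; one plausible version of iterating in reverse without index bookkeeping:)
const arr = ["a", "b", "c", "d"];
// Copy first so the original array isn't mutated, then reverse and iterate.
[...arr].reverse().forEach(item => console.log(item));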
They gave you an example. You really don’t need to “correct” it.
What's wrong with that? I saw that he already edited it because he used "0" as a starting point instead of "1", so I thought he would like a fix.
Because they're not wrong. Let is just as valid as const from the perspective of the code. Without larger context it's just a preference. And you understood the example they were kind enough to provide just fine. Your response was to make yourself feel smart.
Maybe I'm missing something as English isn't my first language, but using const is wrong. JavaScript prevents you from reassigning a value to a const, throwing an error:
TypeError: Assignment to constant variable.
I was the first one asking a question, because I know that I don't know everything and I want to learn, so I think others can appreciate it if I reciprocate.
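(A minimal illustration of the error being described, assuming the example was an index-based for loop:)
for (const i = 0; i < 3; i++) {  // i++ tries to reassign a const binding
  console.log(i)                 // logs 0 once, then throws
}                                // TypeError: Assignment to constant variable.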
Yes, you could loop through in reverse. The first non-element would return undefined.
Sadly you can't just check for an undefined element, because an array with a "hole", like const arr = [1,,3,4];, can create a problem with that logic.
Yes, if you have holes then you'll need to check relative to the length of your array.
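A quick sketch of why stopping at the first undefined is fragile with sparse arrays:
const sparse = [1, , 3, 4]   // index 1 is a "hole"

sparse.at(1)   // undefined, even though we're still inside the array
sparse.at(-5)  // undefined, because we've walked past the start

// A reverse loop that stops at the first undefined would bail out early here,
// so checking against sparse.length (or -sparse.length) is still required.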
Sure, or you should do as suggested here: dev.to/hisuwh/comment/1fcp7
I still miss the value of .at() besides being syntactic sugar to get the last item.
It seems like you're pretty passionate about having a last(), but I don't really see the issue. .at(-1) uses the same number of characters as last(), so it's no more verbose; plus it has the added benefit of providing a uniform way of accessing array items from the end of the array, regardless of how many positions back from the end you need it. A multi-purpose solution seems preferable to a method that can only do one thing, no?
That aside, if people start using .at(positiveIndex), that's on them. We already have people swearing off for loops and thinking reduce is the catch-all solution for everything, and I think those issues are far more detrimental than this.
Whatever happened to prototypes?
You can read more about that here: github.com/tc39/proposal-array-last
The TLDR is that at was progressing faster and solved the same problem. So last was no longer pursued.
I see responses saying why not arr.last.
I think they are completely missing the point.
True, the article used -1 as the example value to get the last element, but the point of .at() is that negative indices start at the end, so -2 is the next-to-last element and -3 the one before that, etc.
I'd assume that an index of -n of an n-element array would be the first element and any index less than -n is undefined.
In addition, .at() lets you use an index value stored in a variable, whereas .last does not.
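A small sketch of the variable-index case (the values are just for illustration):
const arr = ["a", "b", "c", "d"]
const fromEnd = -2      // could come from user input or a calculation

arr.at(fromEnd)         // "c": negative indices count back from the end
// A hypothetical arr.last would only give the final element, with no way to parameterize it.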
Honestly, I don't think getting the last is the real reason for "at"
Well then... What is? The only benefits over indices seem to be that it throws on non-array and that you can use valid negative indices.
For negative indices, the main purpose is most definitely .at(-1), since anything dynamic still requires knowing the .length, and anything less than -1 implies a data structure that should probably be an object in the first place.
And if I want to throw on non-array, I'll just use TypeScript.
Throws on non-array seems to me a real reason for using "at". TypeScript is not everywhere yet, and even where it is, there's still "any".
You would implement a stack as an object? Trippy ...
Another point - since everything in JS is an object, everything can be indexed using square brackets, not only array objects. So at() gives you a safer way to get an element by index, one that surfaces an error when you're accidentally working with something that is not an array.
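A small sketch of the difference (the object here is just an illustration):
const notAnArray = { name: "config" }

notAnArray[0]      // undefined: bracket access silently looks up a "0" property
notAnArray.at(0)   // throws TypeError (".at is not a function"), so the mistake surfaces immediately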
Okay, fair point! Clarity is certainly an important factor in code (hence my jab at reduce!), and you are right, the whole Unix mantra is built on "do one thing well" and it seems to do okay! I suppose at this point, just having something better than [arr.length - 1] would be a massive improvement, even if it were a little obscure. If at makes people happy over last, great! My bet is now on the two specs opposing one another, then they'll both sit uselessly unadopted 🤦♂️
I wonder, will it work and do we really need it 🤔😆
Break the internet! haha love it! Sometimes they need to take control and make the tough decisions for the betterment of the web.
Correct me if I am wrong, but if the official name is called .last(), the code above would simply overwrite it and not cause anything to break? However, if a library also creates a similar .last function, depending on the order of code execution, it could cause some unexpected results.
Huh, TIL! I wonder why they wouldn't just extend the indexing operator to support negative indices. Seems easier than introducing a method, and now there's the question of which to use.
The article covers this really:
There can be a "-1" key present in the object, so the behaviour for "negative indices" is already defined by the language.
Because it is already a valid index and would break existing code.
This is the part that I don't get, though. If arr[-1] returns undefined, then that means one of two things: either (1) our code never passes a negative index in the first place, or (2) our code checks whether arr[index] is truthy (or explicitly checks if it's undefined).
Either way, I don't see it breaking existing code if arr[-1] all of a sudden starts returning the last entry of an array. If we're doing #1, the API doesn't change for us, since our code explicitly prohibits negative indexing. If we're doing #2, it still doesn't change, because now negative indexing doesn't return undefined (unless the last element is, literally, undefined).
I only see this being an issue for actual objects. Can't arrays override the behavior of the indexing operator? (I'm not familiar with those kinds of low-level implementation details for JavaScript, so maybe not.)
"Unfortunately, JS's language design makes this impossible. The [] syntax is not specific to Arrays and Strings; it applies to all objects. Referring to a value by index, like arr[1], actually just refers to the property of the object with the key "1", which is something that any object can have. So arr[-1] already "works" in today's code, but it returns the value of the "-1" property of the object, rather than returning an index counting back from the end."
github.com/tc39/proposal-relative-...
Oh, gotcha! So it would break existing code. Thanks for the reference!
Coming from C++, my first feeling is that .at(nonexistent) would throw an exception, but apparently this isn't the case.
I would also expect an exception to happen instead. Passing in an out-of-bounds index argument should be considered a misuse of the method and should raise an error. The return value of undefined is not sufficient for detecting whether the index is out of bounds, since the accessed element could itself be undefined.
In programming it's better to fail early, so that when a crash happens and it's time to debug, the failing line of code in the stack trace is as close as possible to the root of the problem. Ideally the language and API should make misuse harder, but in JavaScript a lot of nonsensical things are possible that can cause hard-to-debug problems, because cause and effect can be delayed. That forces us to come up with ways to make it stricter, such as using static type checkers or switching to a different cross-compiled language altogether.
This could have been an opportunity for getting it right, though 😞
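A short sketch of the ambiguity being described:
const values = [undefined, "x"]

values.at(0)   // undefined, a real element that happens to be undefined
values.at(5)   // undefined, an out-of-bounds index
// The two cases are indistinguishable from the return value alone,
// so you still have to compare the index against values.length to tell them apart.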
Compatibility and interchangeability with the existing bracket syntax was probably not the reason behind that design decision. The at function expects an integer index argument. The official polyfill reference implementation converts non-numeric or fractional numbers to whole numbers, which is another strike in my book. The method should simply reject non-numeric, non-integer arguments. Instead we end up with the following possibility that doesn't make sense to me either: a[1.1] !== a.at(1.1). Fractional numbers should cause a crash too. Why second-guess the programmer and try to auto-correct them based on assumptions? 🤷
It returns undefined.
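To make the coercion point concrete, a small sketch (array contents are just for illustration):
const a = ["zero", "one", "two"]

a[1.1]      // undefined: brackets look up the property named "1.1"
a.at(1.1)   // "one": at() truncates 1.1 to the integer index 1
a.at("2")   // "two": non-numeric arguments are converted as well
a.at(99)    // undefined rather than an exception for an out-of-range index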
Which is quite a disappointing fact: instead of enhancing the language quality, addressing existing issues, and simplifying development, we see new standards popping up with additions to the language while keeping the old issues as well. Now at should be used instead of [], which makes a lot of sense, but will anyone learn that? Will the courses on popular websites address it? Nope.
It is clear they are trying to keep backward compatibility, but enhancing the indexing operator would be so much better for JS, developers, and the future of dev in general.