DEV Community

Discussion on: ⭐️ Interactive JavaScript Quiz #1

Comment deleted

Vedran Čačić

Note: this is going to be a long discussion. I enjoy it very much, but I don't understand DEV well enough to know whether it is good etiquette to hijack a top comment like this. So, Lydia, if you feel we're not providing value to your readers, just say so and we'll find a different venue.

C++ is a fantastic language for beginners to learn because it exposes them to fundamentals that are very important to learn early on

I agree with everything in that sentence except "early on". Seriously, was the first language you learned C++?? I know it wasn't in my case. The concepts that are important in the early stages of programming are quite different from the ones that are important in the long term, and that's fine -- as in all human endeavors. That is what separates apprentices from masters.

If you knock down a beginner (who is just trying to grasp the syntax) with undefined behavior, there is a good chance they'll never get up again. They'll just conclude programming isn't for them. (I've seen it!) You might say that's only fair, but I think it is the real shame. Just a clash of value systems, I guess.

it's still very much relevant today.

Absolutely. Languages are relevant for many different reasons. COBOL was relevant because of a huge library, FORTRAN because of its numeric capabilities, C because of portability, JS because it runs in a browser, Python because the programs are a joy to read, Lisp because... well, Lisp was never relevant. :-Q

C++ is mainly relevant because modern C compilers (which were goaded into compiling C++ too) are the most complex software humankind has ever produced, by an order of magnitude. Many decades will pass before we accomplish something similar for any other paradigm. (See Rust as a perfect example of how hard it is to get such a project off the ground, despite having intelligent, hardworking people and a much better paradigm.) So if we just want to get as close to the metal as possible without tying our code to a particular type of metal, there really is no choice.

But of course, those criteria mean nothing to beginners. For them, "runs in a browser" or "natural readability" are much more relevant. And we cannot blame them for that. For me, the criterion was "it is hardwired in my machine, so I don't have to wait while it loads from tape", so I chose BASIC.

When drawing diagrams of objects in memory, for example, we tend to naturally use arrows (pointers).

So do a test. Take someone without a C-centered education, draw such a diagram, and ask them to put into words what they see. They'll use "connected to", "describes" or even "is a value of", and you might get them to produce the word "reference", but you'll never get them to say "pointer". Simply because they've never heard that word in that context.

Unfortunately, the true meaning of "reference" has been lost.

It hasn't. Bibliographers use it all the time. Even programmers, for example when they write LaTeX documents, use \ref to refer to a label. They don't use \point. :-P It only seems that way if you live in a C-centered world.

Again, it's not pedantry. Terminology needs to be precise.

Absolutely. Only, in my opinion, "precise" means precisely the opposite of what you say. To be precise is to call the concept we're discussing a reference. To use "a reference" for "an alias" is imprecise to the point of being incorrect. Nobody does that, except C++ programmers.

About passing by reference: that's a completely separate can of worms, but if you insist, we can discuss that too. There are two undeniable truths. First, everything is "pass by value" if we call whatever is passed "a value". Pass by address in C, for example, is nothing more than passing the address by value. The phrase becomes genuinely meaningful only when speaking of a concrete variable, say "pass x by ...". Then we want to say that the ... of x is passed (by value, of course:). The ... might be x's address, its reference, its value, or even its name (as in some old versions of ALGOL).

The second undeniable truth is that if you only know the two kinds of passing from Pascal, and you speak about them in the context of the previous paragraph, you'll inevitably be confused here, because passing in JS is neither of those. It's only by serious language twisting ("when you declare a Dog d, d is not really a dog, but a pointer to a dog") that you manage to shoehorn it into one of those buckets. But it explains nothing, precisely because of the first undeniable truth: every pass is by value if we call whatever is passed "a value". If that doesn't correspond to what people usually mean when they speak about the value of an object, it's simply misleading.
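
To make "neither" concrete, here is a minimal sketch in plain JavaScript (the function and variable names are mine, purely for illustration):

```javascript
// The parameter ends up attached to the same object the caller holds,
// yet the parameter itself is only a local binding.
function rename(dog) {
  dog.name = 'Rex';        // visible to the caller: both hold the same object
  dog = { name: 'Fido' };  // invisible to the caller: only the local binding changes
}

const d = { name: 'Pluto' };
rename(d);
console.log(d.name); // 'Rex' -- the object wasn't copied (so not Pascal's pass by value),
                     // but reassigning the parameter did nothing (so not pass by reference either)
```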

Your suggestion is for us to continue shielding beginners from the truth for fear that they'll get scared away by the complexity.

That's emphatically not what I'm saying. I'm asking to use the proper terminology, "proper" being defined more broadly than Bjarne Stroustrup.

One last thing to note: Pointers exist in all languages, not just C++. Call them whatever you want.

You probably mean "pointers exist in the implementations of all languages", and that's really true if those implementations are written in C (which they usually are). But they don't have to be. You can implement Python in Java (Jython), and that implementation really has no pointers. Anywhere.

If you really think pointers exist at the language level, then you're simply not thinking broadly enough. There are no pointers in Prolog, Haskell or Turing machines. Really. :-)

At the lowest possible level, there is no way to store an "object" or an array or a string in a CPU register; all you can do is store bytes, sequentially or otherwise. Hence the universal need for pointers.

Ah, good old reductionism. It has never got us anywhere, but people still try. :-) OK, let's do this one too.

There is no "lowest possible level" (that we know of). I could equally say that there is no way to store bytes; you can only store electrical impulses or magnetic orientations -- or, going lower still, quantum states. At an even lower level, you don't store anything, you simply collapse the wave function of the memory into some observable state.

Now you'll say that this is ridiculously low and, more importantly, irrelevant to the topic. And that is completely true. But it is equally true for the modern beginner when you tell them about registers, bytes (in the context of memory), or machine instructions. It simply doesn't enter their mental model until much later. And that's a good thing. Our ability to progress depends on always raising the level of abstraction, leaving the details of implementation deep below. We must: there is a long journey upwards ahead of us. ;-)

Thread Thread

Comment deleted

Vedran Čačić

About beginning with C++: it seems you agree with me. Only, I wouldn't call it "irony"; it's just the normal way things are. It would be surprising if it were different. And wouldn't it be a great shame if you weren't a programmer today, just because C++ knocked you over one time too many when you were fragile? For example, you wouldn't be having this discussion with me. :-F

I realized that learning to program isn't necessarily trivial and that it requires a good bit of self-study and hard work.

Absolutely! But if you face the same circumstances on your first encounter with programming, that's not the lesson you will extract from it. Trust me, I've seen it (I teach many young people programming, usually using languages I didn't choose -- as you've probably realized by now). C++ has the highest "screw this, programming is just not for me" rate. (I'm sure Malbolge would have an even higher one, but no school in my vicinity starts with that. ;))

I'm biased because I like to think at a low level.

I hope I showed you that "low level" is very relative. I've seen people telling other people things like "Yeah, Django is very nice, but I like to think at a low level, how the Python beneath it actually executes -- it really helps me in my work." :-D

Yes, I see where you're coming from, but I must regretfully inform you that most of the things you've learned are by now false or at least obsolete. L1 cache is more important than registers, vectorized operations are a thing, execution is not sequential, memory access is not atomic, processors actually use quantum effects when executing some instructions, and ALUs do speculate about what's going to come next, being at some points more superstitious than my grandma. :-P

Low-level thinking might be useful when it is correct. With modern advances in technology, it's less and less so.

In the long term, it's probably safe to continue using the term "reference" in JavaScript because it's become idiomatic. Plus, it is, as you say, a natural way to talk about objects—a variable "refers to" an object in memory.

That's all I wanted you to admit. But in the meantime we've opened a bunch of other subjects... :-D

Actually, it's not. In pass by reference, the formal parameter is a "true" reference (alias). Hence the name (and behavior) of pass by reference.

I understand that. I just hoped I could avoid this line of discussion, since JS is neither. (Yes, it is pass by value if you redefine "value", but I hope I have explained why I find that unsatisfactory.) We don't pass values, and we don't pass references. We pass the objects themselves. It's just a false dilemma (from my perspective, though I understand yours).
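
If it helps, the classic litmus test, sketched in plain JavaScript (the example is mine, not from the quiz): without true references (aliases), you simply cannot write a working swap.

```javascript
// If JS had true references, this would exchange the caller's variables.
function swap(a, b) {
  const t = a;
  a = b; // rebinds the local parameter, not the caller's variable
  b = t;
}

let x = 1, y = 2;
swap(x, y);
console.log(x, y); // 1 2 -- nothing swapped, because there is no alias to pass
```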

Most modern programming languages (save for C++ and PHP, for example) don't have true references (aliases), and hence they do not have any notion of pass by reference.

... and so the notion is finally free to be used in its true linguistic sense. Hooray! :-)

Java programmers, when taught that there are these two ways of passing things into functions

So the solution is simple: don't teach it to them. :-) In fact, you can teach it to them using the true notion of reference. In Java, you can truthfully say that primitives are passed by value and objects are passed by reference -- it will only confuse them if they've heard from some C++ programmer that "reference" means "alias". But I'm sure no one would do that to poor Java programmers. :-]

I've never implemented a language, or examined a language's implementation,

Hm, interesting. For a low-level thinker, that's really strange. May I suggest you do that sometime? I think it will be a fulfilling experience. :-) I teach a course on compiler/interpreter design, but unfortunately it's in Croatian. I'm sure you can find decent ones in your preferred language. (If you want tips, Appel cs.princeton.edu/~appel/modern/ is really good, though verbose.)

Pointers are not just a "C thing"—they exist at the assembly level, where a pointer is literally a CPU register that happens to store a memory address (which is a number like any other).

Ah, in that sense, of course. Though even that's not a given: some early versions of Fortran had no dynamic memory allocation at all. Everything was static, and so could run directly from memory. Yes, you lose various niceties like recursion (you can't have a stack without indirection, of course), and functions could call other functions only to the depth of two :-), but people actually programmed in that monstrosity 50 years ago. (I'm sure someone will say the same sentence about C in 50 years. :)

And about strings: do learn about Hollerith constants some time. Loaded directly from the source, together with the length, statically. Beautiful stuff. :-D "There are more things in heaven and Earth, Horatio, / Than are dreamt of in your philosophy." ;-)

Thread Thread

Comment deleted

Vedran Čačić

So, it seems we agree on everything after all. How nice. :-D

About that compiler course... I didn't tell you the most important thing. I teach it using Python (and a little framework of mine, just to handle bureaucratic tasks like counting lines and characters for nice error messages, and buffering characters and tokens in case you want to go back in the stream). That's the only way I can produce a working compiler in 20 hours. And that's why I told you there are no pointers in it. (Yes, we use CPython, so there are pointers in the implementation of Python, but as you said, if you just mean "indirect access to memory", that's bound to appear somewhere down the ladder of abstraction.) The point (yeah, bad pun :) is that there are no pointers in the source language, no pointers in the target language (we mostly compile expressions to some stack-based VM, and transpile commands to things like JS, which is pretty much the universal architecture nowadays -- see WebAssembly ;), and no pointers in the implementation language. The people writing those certainly don't think in terms of pointers.
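
To give a flavour of what I mean, here is a toy sketch in JavaScript (not my actual course material, just an illustration of the idea): compiling arithmetic expressions to a little stack VM, with no pointers in sight.

```javascript
// Toy compiler: arithmetic AST -> stack-VM instructions, plus a tiny VM to run them.
// AST nodes are plain objects: { num: 3 } or { op: '+', left: ..., right: ... }.
function compile(node, code = []) {
  if ('num' in node) {
    code.push(['PUSH', node.num]);
  } else {
    compile(node.left, code);   // operands first (postorder),
    compile(node.right, code);
    code.push([node.op]);       // then the operator: '+', '-', '*', '/'
  }
  return code;
}

function run(code) {
  const stack = [];
  for (const [op, arg] of code) {
    if (op === 'PUSH') {
      stack.push(arg);
    } else {
      const b = stack.pop(), a = stack.pop();
      stack.push(op === '+' ? a + b : op === '-' ? a - b : op === '*' ? a * b : a / b);
    }
  }
  return stack.pop();
}

// (2 + 3) * 4
const ast = { op: '*', left: { op: '+', left: { num: 2 }, right: { num: 3 } }, right: { num: 4 } };
console.log(compile(ast));      // [['PUSH',2],['PUSH',3],['+'],['PUSH',4],['*']]
console.log(run(compile(ast))); // 20
```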

Of course, being a low-level thinker, you'll probably say it's cheating, and you'd want to use C++ as the implementation language and (if I estimated your age correctly) something like x86 as the target. But my perspective is that you'd just spend more time (and increase the risk of giving up) while not learning anything essentially new. "I guess it depends on what type of thinker you are and what you enjoy in programming," as you said. :-)

Wow, that's... actually pretty cool! I didn't even think that you could do that.

As they say, necessity is the mother of invention. You don't know you can do many things (like survive 4 days without food) until you have to. ;-) BTW both things (surviving 4 days without food and using Hollerith constants regularly) are extremely bad for your health. There are good reasons we don't do them anymore. Kids, don't do this at home. ;-)

Thread Thread

Comment deleted

Vedran Čačić

Youth is neither good nor bad -- at least that's something we old farts say to each other for consolation. ;-)

x86, on the other hand, is mostly obsolete, but that shouldn't necessarily stop you -- since your aim is not to produce something useful, but to understand things better.