Jason C. McDonald

Posted on • Originally published at indeliblebluepen.com

A Trendy Article From 2048

Thanks to a glitch in the time-space continuum (which apparently operates on C code, who knew?), I accessed the following article, written by some random coder, from a version of dev.to 30 years in the future.

The most interesting part to me is the comment section on that article. Just goes to show you, the more things change, the more things stay the same.

Disclaimer: Don't take this as advice on how to be ready for the coding world of tomorrow! This article is from the future. All it would take is one person sneezing at an inopportune moment and scaring his girlfriend's pet chihuahua to completely alter the trajectory of time, whereby none of this happens.


Taking a Look at Corundum 2.0

I recently downloaded the latest version of the Corundum web framework. In the past, I haven't been terribly impressed, as the documentation is pretty dense, and I've been pretty decently happy just hand-rolling my own JK++ code to get the job done. But I have to admit, once you work your way past the learning curve, Corundum makes life soooooo much easier.

Of course, there are hundreds of frameworks for making apps that work over the subether interlink, but in my opinion, most of the holographic interfaces people make with them still look like those dusty JavaScript web apps from the 10s (flat colors? Who does that anymore? Srsly....). Corundum interfaces really look modern.

Of course, Corundum still has its detractors. Some cite the fact it can suck up RAM really fast, but I don't see that as a problem. If you're running a computer with any less than 8 PB of memory (and a 2.5 THz processor, while we're on the subject), you should think about upgrading that dinosaur. ;) So, while a "Hello, World" app in Corundum is going to chew through about 200 TB of memory, that's a small price to pay for a clean, modern interface and the latest features.

Another common argument against Corundum that I hear is that it "automates too much." I've run into a number of old-school troglodyte coders that insist that explicit is better than implicit, a saying borrowed from Python, Go, Rust, or one of those other dusty old languages that don't really have a use other than "legacy support". (It's almost as bad as the financial sector, which is STILL using COBOL of all things.)

But I digress, before this turns into another "seriously, why are you still using ancient languages like Go" article. Don't knock those old languages too much; most of them still have a place, even if they're painful to work in. After all, Linux is still written in Rust, and it works great despite that language's terrible error handling.

Anyhow, I think the saying 'explicit is better than implicit' is way out of date now. After all, any decent IDE is going to record what you were thinking about when you designed the code, so you should be able to figure it out pretty quickly later. Corundum's intention-resolution support is excellent, and I rarely get ambiguity errors.

For coders coming from less intelligent frameworks, it will take some time to get used to letting the computer resolve ambiguity for you. For the sake of keeping code brief, Corundum actually doesn't allow you to explicitly resolve anything in the source. If it gets confused, which (again) is rare, it'll just prompt you to clarify. Once you're past the phase where this blind trust gives you heebie-jeebies, you'll find you code a lot faster.

A few benefits I've found in using Corundum:

  • Great integration with RERUN databases.

  • Supports the latest mindreading protocols, including Orwell, SpeakMind, and TaNK.

  • Great compatibility with all the architectures...unless you want to use something crazy antiquated, like Intel or AMD processors.

  • Supports full-color holographic interfaces! (Something I couldn't find in the other libraries.)

Now, I know that Corundum might not work for everything. I'm still a strong believer in being able to code without a framework. JK++ is a really capable language on its own. But if you're able to put in the time to learn Corundum initially, I think you'll find like I did that it saves a lot of effort in the long run.

Comments Section

Explicit IS better than implicit. You're still wasting time reviewing your automatically-recorded thoughts when debugging.

who doesn't use ai for debugging these days? my prof says theres no reason not to

Uhm, in practice, ai debugging doesn't catch a lot of logic errors still.

logic errors? what is this, functional programming or some old paradigm like that?

...all coding paradigms get logic errors. o.O

I lost track of the number of times my AI debugger couldn't find the problem. Desk checking FTW.

One more reason I still like old languages. Those manual debuggers rocked. I think someone ported gdb for JK++.




You think Corundum's docs are bad? You should see some of the old UNIX man pages. Of course, by saying that, I'm showing my age. (No, I did not personally know the dinosaurs.)

20 years, and they still don't know how to write good docs. You'd think the mindreading technology would have helped, but nooooooooo.


D'you suppose they might port Corundum to work with the Leaf programming language?

I think that's up to Leaf's BDFL, honestly.


clean code shouldn't have a lot of inline thought recordings, because they really confuse the whole thing. one solo open source project i found on github had a bunch of random thoughts about cheese scattered throughout it.

emacs can clean up your thought records, you know. C-x-relevant

Real coders use vim.

omg u use github still?? i didnt even know that was still around

yeah, it's mostly legacy code, but it's got some cool stuff if you don't mind the interface.

i could never get past the interface. IdeaDrop for me, girl.




Hey, I still use Python! It's a great language.

Yeah, I guess, assuming you can wrap your head around that clunky import statement and other such clunky stuff.

It's still better than how C++ did it. #include "somefile.hpp". Ouch. Just ouch.

C++? LOL, I haven't heard mention of that since college. I think my grandpa has a book on C++17 around. Ancient history, yo.

I still can't believe the #python IRC channel is still there! Bunch of troglodytes.

Be nice. They do a lot of low-level stuff so we can work with modern languages like Narf and JK++.

Yeah, yeah, okay.

I can't believe IRC is still there.

Wait, what's IRC?

Y'know, Python 18.7 ain't half bad. Doesn't even feel like the same language.



Corundum is okay, but I like FooUI better for interfaces, which has supported full-color holographic interfaces since v1.3.

FooUI?? Isn't that, like, for Nim?

Originally, but they have bindings for JK++ now.

Oh, cool, good to know. I liked FooUI back in the day.

Nim's still better than Javascript.

Dude, anything is better than Javascript. JS = pain, as they say.




Corundum is just a fad, like blockchain, neural networks (the original ones, anyway), javascript frameworks, and cloud computing. Use it, but don't get too dependent.

Nah, Corundum is solid. It'll probably be in use in 20 years still.

That's what they said about javascript frameworks......


Top comments (11)

George Offley

Ha! Am I the only one who looked to see if Corundum was actually a thing?

Fabian Holzer

Great article, Jason!

When I look back to 1995, when I got my first computer (an old 80286, previously owned by my uncle, a machine barely younger than I was, having been born in 1986), I have a measure for three decades of technology. Going from 16 MHz to something on the order of 4 cores at around 2 GHz each makes seven orders of magnitude (base 2, not base 10) on frequency, and nine orders if you count the cores. If you just extrapolate that, you're not far off: swap the GHz for THz (and the TB for PB). But the crazy exponential curve has flattened; in the last 8 years there wasn't even a doubling (compare the first-generation i7 with the latest: same number of cores, 3.0 vs 4.3 GHz, not factoring in cache).
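For anyone who wants to sanity-check that extrapolation, here's a rough back-of-the-envelope sketch in Python; the 1995 and 2018 figures are ballpark assumptions rather than exact specs:

```python
from math import log2

# 1995: a 16 MHz 286, single core; 2018: roughly 4 cores at ~2 GHz each (all in MHz)
freq_1995 = 16
freq_2018 = 2_000
cores_2018 = 4

doublings_freq = log2(freq_2018 / freq_1995)                # ~7 doublings on clock speed alone
doublings_total = log2(cores_2018 * freq_2018 / freq_1995)  # ~9 doublings counting the cores

# Naively extrapolate the 1995-2018 rate another 30 years (out to 2048)
rate = doublings_total / (2018 - 1995)
factor_2048 = 2 ** (rate * 30)

print(f"{doublings_freq:.1f} doublings on frequency, {doublings_total:.1f} with cores")
print(f"~{factor_2048:,.0f}x more by 2048 -> GHz become THz, GB and TB become PB")
```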

What I want to express: it seems to me that the "free lunch period" provided by the exponential development of the hardware's capability might be over.

And while Moore's Law might let us down, Wirth's law ("software gets slower faster than hardware gets faster") seems to hold. May I say that, to me, this would constitute a bleak outlook indeed. Even more bleak than the prospect of COBOL running financial transactions in the year 2048 CE.

I will still be more than half a decade away from retirement by then, but I certainly hope that, all fads aside, the field of software development aims for and achieves more than just incremental steps on the status quo until then. By piling up masses of rubble by sheer force, you might get pyramids, but you get neither cathedrals nor skyscrapers.

But one thing still holds true, and I hope will do so at least until 2048: the most interesting time to be a developer is, and up to this day always has been, the present.

PS: On that one I lol'd really hard: "one solo open source project i found on github had a bunch of random thoughts about cheese scattered throughout it."

Jason C. McDonald

This article was actually quite satirical, as you probably detected. In reality, I've always doubted we'll get anywhere near to the specs that the "future author" referred to, because I agree with your assessment. The point I was making was basically that, no matter how fast we make computers, we'll probably always waste that speed and capacity by default.

In a way, the future I painted here is rather dystopic; it represents that pile of rubble that you mentioned in the Alan Kay reference. I actually speak in depth about this topic in my talk, The Cake Is A Lie (and I use that Kay quote in there, btw).

Fabian Holzer

I certainly didn't miss the satirical nature. Good satire, like dystopian and utopian texts, provides a great opportunity for reflection.

I've skimmed the first of the four articles that you based your Cake talk on. I liked it as well, and I'll look into it more deeply over the next few days!

Alex Reilly

hahahaha this is great. Dev.to comments sure have gotten sassy in the past 30 years.

edA‑qa mort‑ora‑y

"D'you suppose they might port Corundum to work with the Leaf programming language?"

Awesome, I guess I will finish the language someday! :)

Jason C. McDonald

I was hoping you would find that little Easter egg. ;)

Max Cerrina

I can't believe you didn't cover the most recent version of C[^\w]{1,}!

endan

Real coders use vim

Frank Carr

Interesting...

But, what do the COBOL and VB6 job listings look like?

Chad Smith

Ha, now that's hilarious.
We were just joking around about a lot of this stuff at work the other day, about what the programmers of the future are going to say about what we used. Lol.