The broken promise of static typing

Dan Lebrero

This article originally appeared on IG's blog

I was quite surprised at a recent blog post by Uncle Bob Martin, titled: "Type Wars", in which he writes: "Therefore, I predict, that as Test Driven Development becomes ever more accepted as a necessary professional discipline, dynamic languages will become the preferred languages. The Smalltalkers will, eventually, win."

This statement didn't sit well with some people in the static typing community, who argued that in a sufficiently advanced statically typed language, types are proofs and they make unit tests mostly redundant. The Haskell community even claims that "once your code compiles it usually works"!

Be it safer refactoring, better documentation, more accurate IDE support or easier-to-understand code, for me all these claims translate into one simple promise: fewer bugs.

And I really hate bugs. I find them to be one of the worst wastes of a project's time and energy, and nothing annoys me more than getting to the end-of-iteration demo and hearing the team say, somehow proudly, "We did X story points and we fixed 20 bugs! Hurray!"

To me it sounds like, "In the last iteration we wrote more than 20 bugs, but our clients were able to find just 20! And we were paid for both writing and fixing them! Hurray!"

Charting bugs

With that in mind, I tried to find some empirical evidence that static types do actually help avoid bugs. Unfortunately, the best source I found suggests that I am out of luck, so I had to settle for a more naïve approach: searching GitHub.

The following are some charts that compare the "bug density" for different languages. By bug density I mean the average number of issues labelled "bug" per repository on GitHub. I also tried to remove some noise by only using repositories with some stars, on the assumption that a repository with no stars is one that nobody uses, so nobody will report bugs against it.
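The post doesn't include the collection script, but a minimal sketch of how such numbers could be gathered from the GitHub search API (assuming a label named "bug", and ignoring pagination, rate limits and sampling bias) might look like this:

```python
# Minimal sketch of the bug-density measurement described above.
# Not the actual script used for the charts: sample sizes, pagination
# and GitHub rate limiting are deliberately ignored for brevity.
import requests

def bug_density(language: str, min_stars: int = 0, sample: int = 30) -> float:
    """Average number of issues labelled 'bug' per repository."""
    repos = requests.get(
        "https://api.github.com/search/repositories",
        params={"q": f"language:{language} stars:>={min_stars}",
                "sort": "stars", "per_page": sample},
    ).json()["items"]
    counts = [
        requests.get(
            "https://api.github.com/search/issues",
            params={"q": f'repo:{r["full_name"]} is:issue label:bug'},
        ).json()["total_count"]
        for r in repos
    ]
    return sum(counts) / len(counts) if counts else 0.0

print(bug_density("clojure", min_stars=10))
```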

In green, in the "advanced" statically typed languages corner: Haskell, Scala and F#.
In orange, in the "old and boring" statically typed languages corner: Java, C++ and Go.
In red, in the dynamically typed languages corner: JavaScript, Ruby, Python, Clojure and Erlang.

Round 1. Languages sorted by bug density. All repos

[bug density chart]

Round 2. Languages sorted by bug density. Repos with more than 10 stars

[bug density chart]

Round 3. Languages sorted by bug density. Repos with more than 100 stars

[bug density chart]

Whilst not conclusive, the lack of any evidence in the charts that more advanced typed languages save us from writing bugs is very disturbing.

Static vs Dynamic is not the issue

The charts show no evidence of static/dynamic typing making any difference, but they do show, at least in my humble opinion, a gap between languages that focus on simplicity versus ones that don't.

Both Rob Pike (Go creator) and Rich Hickey (Clojure creator) have very good talks about simplicity being a core part of their languages.

And that simplicity means that your application is going to be easier to understand, easier to change, easier to maintain, and more flexible. All of which means that you are going to write fewer bugs.

What characterizes a simple language? Listing the things in common between Go, Erlang and Clojure, we get:

  • No manual memory management
  • No mutex-based concurrency
  • No classes
  • No inheritance
  • No complex type system
  • No multiparadigm
  • Not a lot of syntax
  • Not academic

Maybe all those shiny things that we get in our languages are actually the sharp tools we end up hurting ourselves with, creating bugs and wasting our time; maybe all they bring is a lot of additional complexity, when what we really need is a simpler language.

As Tony Hoare said:

There are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies.

Latest comments (80)

readyready15728

How is Python vastly more complicated than Ruby?

Josiah

I've been noticing similar things. Statically typed compiled languages do eliminate the possibility of certain classes of errors getting through to production, simply because the compiler will catch them. However, a lot of the time these compiled languages introduce complexities that raise the likelihood of other classes of errors that may actually be harder to troubleshoot (looking at you, Haskell, with your lazy evaluation and unpredictable performance). Another language that makes me question the perceived value of static typing is Scala. Until I had worked with Scala almost exclusively for the better part of a year, I actually spent more time fighting with the compiler over code that was logically correct (but which, due to type erasure and other things, the compiler couldn't verify) than working on real bugs in the application.

I wonder, if all of this is true, what its impact is going to be on the success or demise of Rust?

Dan Lebrero

I think that for a programming language, the marketing around it matters far more than its technical qualities.

jacmkno

Have you recalculated the data by dividing the total number of bugs by the age of the projects?

This is necessary to remove the age variable as obviously the older the project, the more bugs it will have.

But you have surely found a very valuable source of information to provide more insight into the static vs dynamic typing debate...

Eljay-Adobe

Thanks for this post Dan. It generated a lot of good discussion on this topic!

Is the bottom line "simplicity appears to be important, and static-vs-dynamic typed languages less important"?

Dan Lebrero

That is what I would like to have a conversation around! What is your feeling about it?

We have been fighting over the static-vs-dynamic thing for too long.

Thanks for reading!

Dan

Eljay-Adobe

My feelings -- just my feelings, not backed by any hard data -- are that the most important things are simplicity and writing the source code for maintainability and legibility. What Uncle Bob wrote about in his book Clean Code.

Some languages lend themselves to simplicity. For example, I'm impressed with D, Python, Lua and F# ... all of which have a clean syntax and are rather free of excessive "ceremony". Which is why I have a soft spot in my heart for those languages.

But the languages I use that pay the bills are C++ and C#, and I have a love-hate relationship with both of those languages. (More vehemence for C++, because I've been using it for a very long time.)

Bugs can be written in any language. But languages like C++ that have so many areas of undefined behavior that are easy to accidentally stumble into do no one any favors.

Languages that have contract programming, like Eiffel, D, and Ada 2012, make unit testing a lot less important because the contracts can be specified directly in the code instead of being encoded in unit tests. (That's what unit tests do: they express contracts.)
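A rough sketch of that idea in Python (which, as noted further down, has no core-language contracts); asserts stand in for Eiffel-style require/ensure clauses, and all names here are illustrative only:

```python
# Rough sketch of design-by-contract in Python; asserts stand in for
# Eiffel-style require/ensure clauses. All names are illustrative.
def contract(pre=None, post=None):
    def decorate(fn):
        def wrapper(*args, **kwargs):
            if pre is not None:
                assert pre(*args, **kwargs), f"precondition of {fn.__name__} failed"
            result = fn(*args, **kwargs)
            if post is not None:
                assert post(result), f"postcondition of {fn.__name__} failed"
            return result
        return wrapper
    return decorate

@contract(pre=lambda balance, amount: 0 < amount <= balance,
          post=lambda new_balance: new_balance >= 0)
def withdraw(balance: float, amount: float) -> float:
    return balance - amount
```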

In my experience, statically typed languages -- like Go, C++, D, F#, Swift, TypeScript -- don't offer much better protection against "making bugs" than duck typed languages like Python, JavaScript, or Boo. What static typing does provide is scaling. Small applications gain little benefit from static typing. But as applications grow, it helps to make sure the pieces fit together correctly.

Case in point: Google's Angular, which was converted from JavaScript to TypeScript. They discovered that there had been a good number of bugs in their code that were caught once they had the static typing of TypeScript. (TypeScript transpiles to JavaScript, and the type annotation information is erased. It's a transpile-time safety net.)

But I've also worked with a large system based on Objective-C, which has a mix of static type checking and runtime duck typing, due to its use of message passing to objects. (The message passing is reminiscent of Smalltalk.)

When I think of duck typed languages, I usually think of scripting languages. When I want to do something quick-and-dirty I reach for Python. When I want to make something application-like, I reach for a static typed compiled language.

But there are languages out there that bridge the two worlds of sorts. Languages that minimize the ceremony around the static typing, like OCaml, F#, and Swift. They're still all strongly typed, but the burden is more on the shoulders of the compiler, rather than forcing the developer to dot all the i's, and cross all the t's.

So I'd say that static typing catches a small category of bugs. For smaller applications, those kinds of bugs are few. For larger applications, those kinds of bugs can be crippling.

I don't know of any scripting language that supports contract programming as part of the core language. (Educate me if you know of any!)

A vastly bigger source of bugs in the programs I work on is mutable global state. By which I also mean local mutable member variables in a class instance... that's smaller-scope global state. Programs that I've seen and written that emphasize immutability, segregate immutable data from functions, and use side-effect-free functions seem to produce a lot fewer bugs.
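A toy sketch of that contrast (illustrative only, not from the original discussion):

```python
# Toy contrast between mutable global state and the immutable style
# advocated above; names and data are illustrative.
from dataclasses import dataclass

# Mutable global state: any caller may change it, so a bug can hide anywhere.
cart = {"items": [], "total": 0}

def add_item_mutable(item: str, price: int) -> None:
    cart["items"].append(item)      # side effect on shared state
    cart["total"] += price

# Immutable alternative: data in, data out; no hidden coupling.
@dataclass(frozen=True)
class Cart:
    items: tuple = ()
    total: int = 0

def add_item(c: Cart, item: str, price: int) -> Cart:
    return Cart(items=c.items + (item,), total=c.total + price)
```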

I'm not sure if the "fewer bugs" I'm seeing is because I'm a better programmer with those kinds of languages, or if I make fewer bugs in those languages because it is easier to reason about the correctness of the code. It doesn't have to do with all those languages being statically typed. I believe it does have to do with immutable data and a lack of global state bringing more simplicity.

Another vast source of bugs I've run into is null pointers. (Damn you Tony Hoare for adding in the null reference to ALGOL W!). That's another area where Haskell, F#, OCaml, Swift outshine C, C++, C#. Objective-C sort of sidestepped the problem with its treatment of the nil object quietly eating messages (well, almost quietly... the eaten message is output to the console log).

edA‑qa mort‑ora‑y

This analysis is flawed. The input variables are not controlled, nor do the conclusions logically follow from the data. I could draw an almost opposite conclusion from the same data:

Based on bug density, we clearly see that statically typed languages are the best for identifying bugs.

The fact that the "data" can be used to draw very opposing conclusions would indicate a fatal flaw in the analysis. This is a sensational piece with no merit as research.

Dan Lebrero

Thanks for the comment!

I would completely agree with your conclusion if the bugs from statically typed languages were all compilation errors. I suspect they are not.

Also, I could agree if the bug density of statically typed languages weren't all over the place. Note that Go is statically typed and one of the languages that I would call simple.

The post is not research, and it doesn't claim to be anywhere: it says "naïve", "not conclusive" and "opinion".

For research, read the link near the word naïve.

I would also suggest reading the comments and watching the videos from Blaine. They are very cool and probably closer to your taste.

Thanks again!

Dan

Lewis Cowles

Have you considered that you may have actually measured known bug count vs unknown? To me this is like comparing pennies in a jar vs missed pay-checks... It's the unknown long-term problems with systems (rounding errors, off-by-ones, partial API regressions, and design flaws) that lead to the biggest problems.

It's for sure interesting, and I'd love for there to be an answer, but I've been yo-yoing between dynamic and static without any evidence for or against either in the general case since the 90s.

Thanks for the article

Dan Lebrero

Hi Lewis,

Not sure if I understood you about the known vs unknown. Do you mean that for dynamically typed languages there are bugs that have not been reported or found, while those same bugs would have been reported in a statically typed language?

I love the pennies vs paychecks analogy. I will steal it for a future blog post ;).

I am with you in the static vs dynamic debate; that is why I wanted to propose a different one: simple vs complex. On this one, I would position myself in the "simple-by-default" camp, where doing complex things is painful and non-idiomatic. What about you?

Thanks!

Dan

Lewis Cowles

Hi Dan,

You got the known vs unknown in one. Not knowing about a bug (it not being in the issues) doesn't mean it doesn't exist, as we found with the OpenSSL bugs a few years back.

I'm glad you enjoy my analogies; I love using them as they generally help ;)

On simple vs complex: I'm sure it's a false dichotomy overall, but I definitely love the idea I keep being sold re: simplicity.

 
Dan Lebrero

I would expect something similar.

In the summary of studies that I link to in the blog, there is a reference to this talk, where "The speaker used data from Github to determine that approximately 2.7% of Python bugs are type errors".

I was quite surprised.

Dan Lebrero

Yes!

As I am somewhat fascinated with Haskell, I would love to go through all the fixed bugs in some Haskell repos to see if there are common patterns.

What would you expect to find?

Enrique Alfonso Casanovas Pedre

I've used static and dynamic languages, and I agree with the hypothesis that static languages, when used well, help you reduce the probability of bugs. In many cases, people write poor code. If you use a static language like F# or Haskell, but use it as if it were JavaScript or old C or C++, it is normal that bugs will arise. Most programmers are "primitive obsessed", which is a source of some bugs. Many like to cast all over the code too. Programming in a way that makes invalid states impossible to represent helps a lot and also saves you a lot of testing. Usually, when I get code to compile, I have very few bugs, and most of them are caused by bad communication of the requirements.

fsharpforfunandprofit.com/series/d...
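The linked series illustrates this with F# types; a loose Python approximation of making invalid states unrepresentable, with purely illustrative names, might look like:

```python
# Loose approximation of "making invalid states unrepresentable",
# in the spirit of the linked F# series; names are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class UnverifiedEmail:
    address: str

@dataclass(frozen=True)
class VerifiedEmail:
    address: str

def send_password_reset(email: VerifiedEmail) -> None:
    # A type checker rejects an UnverifiedEmail here, so the invalid
    # state never reaches this function at all.
    print(f"Reset link sent to {email.address}")

def verify(email: UnverifiedEmail, token: str) -> VerifiedEmail:
    # ...token check elided; stub for illustration...
    return VerifiedEmail(email.address)
```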

Eric Normand

Hi Dan!

Nice analysis.

When you calculate bug density, is it # of bugs divided by # of lines of code?

If so, this is really surprising! I read somewhere that bug density was pretty much a constant. So more concise languages had an advantage by being shorter.

However, your analysis shows that Java and C++ have more bugs per line of code!! So more code * more bugs = more more bugs! Ouch!

It would be cool to see the distributions of these languages. How wide are the curves? Does bug density vary widely in Java projects? What about Haskell?

Rock on!
Eric

Dan Lebrero

Hi Eric,

I have updated the post to make it clear: "By bug density I mean the average number of issues labelled "bug" per repository in GitHub"

The assumption is: "What I do expect is that roughly all developers, no matter the language, have to solve the same problems, so the open source libraries available have roughly the same functionality." David seems to disagree with this assumption. What are your thoughts?

I also remember reading somewhere that bugs are constant per line of code, but maybe what was constant was the number of lines produced or the number of lines that you can keep in your head. I am unable to find the reference right now.

Steve McConnell, in the "Code Complete" book, says: "the number of errors increases dramatically as project size increases, with very large projects having up to four times as many errors per line of code as small projects".

Great idea for another pet project. Maybe one for PurelyFunctional.tv? ;)

Thanks a lot!

Dan

Nicolas Bousquet

Many aspects affect the number of unfixed and solved bugs, like:

  • the number of features of the program.
  • its maturity.
  • its user base.
  • the impact of bugs on the users. Does a bug mean billions of dollars lost, or hundreds or thousands of people killed (think aircraft autopilots or nuclear power plant software)?
  • the team/developers/company behind it. Are they reliable, serious, experienced?

What I see in your graph is that the most used languages (C++/Java), with the biggest codebases and the most features in the software built with them, have the most bugs per repo, but that seems logical.

Seeing that, it is quite hard to draw any conclusion from that data alone.

What I surely see is that static typing serves as mandatory documentation that helps the compiler, the IDE and the developer reason about the code. There is less information available in a typical dynamic language, meaning one has to rely more on alternate solutions, and even with state-of-the-art tooling the IDE/compiler typically never catches up. More checks are done at run time, and the IDE fails to provide the same quality of tooling and context (auto-completion, refactoring, code navigation).

Dan Lebrero

Hi Nicolas,

Thanks for the comments.

The data is from GitHub, which means open source code, and from tens of thousands of repositories. I state that the approach is very naive, but I am still surprised by the results.

I agree that not all bugs are equal and you shouldn't use the same development practices in all projects.

I would include Ruby, Python and JavaScript in the list of most used languages. I do not know which codebases are the biggest or have the most features, but Steve McConnell in the "Code Complete" book says: "the number of errors increases dramatically as project size increases, with very large projects having up to four times as many errors per line of code as small projects".

I think that is the reason why monoliths need to be split into microservices at some point. My personal experience is that language expressiveness matters.

Have you looked at the comments by Blaine? They are really interesting.

Cheers,

Daniel

Eljay-Adobe

Thanks for the link to Paul Graham’s Beating the Averages and The Blub Paradox. I run into that all the time.

Trying to convince the other developers (who are very bright people) that there are alternative languages that would be more powerful and suitable for our problem domain invariably meets with deer-in-headlights blank stares.

Even contemplating alternative languages is outside most developers' comfort zone. Or more than that: outside their capacity for consideration. Even as a thought experiment.

When I look at the trends, I see object-oriented programming continuing for the foreseeable future. But I also think two language idioms will overtake object-oriented programming languages: functional programming languages, and domain-specific languages.

I consider Lisp to be a programmer's programming language. An "abstract syntax tree oriented" programming language. Paul Graham's secret super-weapon is safe.

Nicolas Bousquet

The most visible correlation in your data is that the more stars a repo has, the more bugs there are inside. Also, the respective rankings of the languages change significantly with the number of stars: Java, for example, looks quite good across all repos, but quite bad by your metric on big repos.

It may be possible that there is a correlation between language, or dynamic/static typing, and the number of bugs, but the data is really not refined enough to remove other variables, so concluding anything from it is impossible.

Sure, language expressiveness matters: it is enough to try to develop anything in assembly vs Java or Lisp to see a higher-level language work better. But there are expressive languages on both sides, and different languages may suit different problem categories too.

My impression is also that huge projects are not often done in dynamically typed languages. I feel like a dynamically typed language may leverage more of an individual's productivity but, on the contrary, is not that great when the code base scales (millions of lines of code).

The number of lines of code is not a good metric, but it is far better than thinking all repos are equal, so I would consider bugs per LOC. After that is done, you could always apply a factor between high-level and lower-level languages (C typically needs more LOC than Java).
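To make that normalization concrete, with invented numbers (a sketch, not data from the post):

```python
# Illustrative arithmetic only; the counts and the verbosity factor
# are made up for the sake of the example.
java = {"bugs": 120, "loc": 300_000}
clj = {"bugs": 60, "loc": 60_000}

density_java = java["bugs"] / java["loc"]   # 0.0004 bugs per LOC
density_clj = clj["bugs"] / clj["loc"]      # 0.0010 bugs per LOC

# Applying a verbosity factor (assuming Java needs ~3x the LOC of
# Clojure for the same functionality) changes the comparison:
adjusted_java = density_java * 3            # 0.0012 bugs per "functional unit"
print(adjusted_java > density_clj)          # True: the ranking flips
```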

Dan Lebrero

Thanks for the comments!

I don't think the data proves anything either; I hope I made that clear in the post. "Proving" is a big word that I rarely use for anything.

I don't know if you noticed, but I linked to the best source of studies on the matter that I could find.

Reading your comments, something popped to my mind.

When we talk about huge projects, do you think we plan for huge from the beginning, or do they start small and grow to be huge? In the second case, do you think it is common to switch languages?

Cheers,

Daniel

Nicolas Bousquet

About huge projects, do we know in advance? Well, I guess it is case by case.

Twitter basically started as a Ruby shop and decided quite some time ago to migrate to the JVM, Java and Scala in particular (and JavaScript for the client). I don't know, but I would say Twitter started small.

Now, I have colleagues and friends working for French civil aviation, and they decided long ago to make a new version of one of their key components. They thought big from the start. And by the way, automatic memory management was a no-go, as it is not realtime friendly, meaning many languages like Java/Clojure/Lisp were instant no-gos.

There is a saying that if you are a startup, you should go for instant productivity, because you'll always have time and money to rewrite everything if your company turns out to be successful; and if you are not successful, going more slowly to ensure better architecture, easier-to-maintain code or better performance doesn't make sense at all.

Others would say you should use what you master. I think that makes a lot of sense: it saves you time and lets you concentrate on more important aspects like finding clients, hiring the right people or creating a business plan...

Most of the companies I have worked for are big, established companies, and while there is often an emphasis on using the best tool for the job, it is also quite important to use standard tools, to ensure you can hire easily, and to ensure that people new to a project can get up to speed. They almost always choose the popular statically typed languages: Java, C++ and C. JavaScript is now widely used, but only because there is basically no way to avoid it on the web, and for years such companies tried many ways to work around it: JSF, GWT, doing it all on the server... The dislike of JavaScript by many IT specialists made the web lose years before nice reactive websites became the norm.

These companies have technical policies, and outside of proofs of concept, anything that may go to production has to use an approved technology. For my current company that's C/C++ for most legacy, Java for most new things, Scala/Spark for big data analysis, and a bit of Python, that last one being restricted to scripting and small projects that do not need to scale.

I do not necessarily say it is the right way to proceed, but the common practice is to use a statically typed language that has widespread adoption in the industry and a mature ecosystem that helps productivity.

That being said, I well remember Paul Graham's arguments about Lisp and how it helped him with his startup.

But when Yahoo bought his company, one of the first things they did was migrate the code from Lisp to a statically typed language, even if he criticized that move... The decision was criticized, maybe rightly so, but it shows that many people are not that fond of dynamically typed languages.

Dan Lebrero

I am personally quite torn about standardisation.

I always wonder what I would do if I created my own company. Would I mandate some popular language, or would I allow every team to choose whatever they wanted?

I can see a lot of good arguments on both sides, and I have seen a lot of talks about the subject, and again, nobody agrees.

What you say about the best tool for the job is a little bit paradoxical. I have had similar experiences, and I see it as "the best tool within this limited and blessed toolset". When and how do you decide to add a new tool? It is really hard to quantify the value and the cost when we keep saying things like "more maintainable" or "easier to use".

Paul Graham's essay is a classic; every developer should read it, not because of Lisp but to be aware of the Blub Paradox. It applies to all of us.

Out of curiosity, what language did your friends choose?

Thanks!

Nicolas Bousquet

I guess developers like you or me, who love our craft, want to get the most out of our time, tooling and libraries. As such, we like to have the best of the best, whatever it is.

That is the promise of languages like Lisp, where you can easily build new abstractions that best fit the problem at hand.

But many things require several or many people, either at the same time or over the years... maybe, for example, you'll not want to devote the next 20 years to maintaining the project you built in the past 5-10 years. This is where standardization makes sense. If you get better productivity for yourself but overall productivity drops, that is a net loss.

So both aspects are to be taken into account. I would say that in a big company, small independent teams, each completely responsible for its own area, including production, make sense and help scale that productivity. In today's world, in many cases, just being able to provide VMs in the cloud that respond to some kind of network queries should work, and it leaves a lot of freedom in how things are done inside.

But even that doesn't solve everything. The interactions between teams will still dictate many things, like the protocol by which data is exchanged, but also how you manage your database, the overall architecture, the tools you'll use for continuous integration and QA testing, the cloud you'll use, and how your application will shrink and scale dynamically...

There is not much we can do alone in a big company if we don't cooperate.

I am convinced the language impacts productivity somewhat, but many other things impact it more. The programming language is a tactical choice, while the bigger things are strategic choices. And while you'll want to delegate the details to great tacticians, you'll want great strategists when you are in a big company... If just switching teams means your employees need 6 months or a year to become fluent in the technology stack, that's a real downside, because the stack is only a small part of the job.

Dan Lebrero

Hi Nicolas,

You are so right. As Gerald Weinberg said:

"The Second Law of Consulting: No matter how it looks at first, it's always a people problem."

Thanks a lot for your thoughts. It has been a pleasure to have a civilized discussion.

I have started following you on Twitter just in case you decide to start blogging.

Cheers,

Dan

Phillip Carter

What if Haskell, Scala, and F# developers are super proactive about reporting bugs? What about other labels? In the F# compiler and IDE tools repository, "regression" is also used. What about repositories which could naturally have a high bug count but can't be measured the way you chose? The Clojure, Ruby, and Scala compiler repos don't have issues on GitHub, for example; F# and Golang do. I have so many other questions about the methodology.

I don't think that you can begin to draw any conclusions from this.

Dan Lebrero

By the way, F# is the only thing that has ever made me consider looking at the Microsoft stack.

Thanks a lot!

Phillip Carter

It's a good one!

Dan Lebrero

Are you suggesting that Java and C++ devs are super duper proactive? :)

The approach is naive, but I was really expecting some significant difference between "properly" typed languages and the rest.

The best explanation that I have found so far is from Bartosz Milewski. An excerpt from his book Category Theory for Programmers:

"Strong static typing is often used as an excuse for not testing the code [...] The result of this cavalier attitude is that in several studies Haskell didn’t come as strongly ahead of the pack in code quality as one would expect."

So maybe strong static typing plus proper testing is the answer to fewer bugs.

What do you think?

Phillip Carter

I'm not suggesting that some groups are more proactive than others, just offering a question that one could draw from that data. Put differently, what if Clojure developers were less annoyed by bugs than C++ developers? I see that conclusion as just as valid as those you've drawn.

I agree that static typing is not an excuse for skipping tests (even though many folks in the FP community would say otherwise...). Types can certainly eliminate a class of problems if used well, but they're certainly not a silver bullet. Tests guard against change. For any decently sized project, you need tests to protect your code against yourself :).

Dan Lebrero

"you need tests to protect your code against yourself" -> that made me laugh!

Artem Kholodnyi

What about #4 and #5, Ruby and Scala? They seem to be the opposite of simplicity. They are also OOP languages.

tomjaguarpaw

What's "density"? Issues labelled "bug" per line of source code? If so, what counts as a line of source code? Whitespace lines? Lines consisting only of trailing or opening parentheses?

Dan Lebrero

Bugs per repository

David Raab

Maybe you should add the definition to the article? I also thought it would be bugs per line of code, a measurement that is useless by itself.

But so far I think pretty much every measurement I have seen is useless.

Just to get it right: a project with 1 file and 100 lines of code in language X with 1 bug technically has a smaller "bug density" than a 50-million-line project in language Y with 3 bugs? If "yes", do you think that this is a useful measurement?

Dan Lebrero

Thanks for the feedback. I will try to make it clearer!

To your question, yes, that is what I meant by bug density. Language X will have 1 bug per repo and language Y will have 3.

To your particular example, it is useful if the 100 lines of code provide the same amount of functionality as the 50 million lines.

Of course, I don't know of any language that is 500,000 times more succinct than another, but neither do I know of any 50-million-line codebase with 3 bugs.

What I do expect is that roughly all developers, no matter the language, have to solve the same problems, so the open source libraries available have roughly the same functionality.

Thanks!

Dan

David Raab

Sure, you won't find a 50-million-line project with just 3 bugs; it will have a lot more. That's the point: the bigger the code size, the more bugs you usually have.

Usually a comparison of bugs per line of code is "better". But "better" still doesn't mean useful. Some languages are 2-3 times more succinct for the same functionality, so a more succinct language with the exact same number of bugs will automatically have a larger "bug density" (considering bugs per line of code).

The assumption that every language somehow solves the same problems is also not really correct. A lot of languages, like PHP, Python, Ruby, Perl and so on, are primarily used for web development. And a lot of stuff is only solved by using C libraries; or, in other words, not really solved at all.

Bindings to GUI frameworks like GTK or Qt, or to game engines (what you see in Python and so on), will surely never have the code size or complexity of the whole library in C (it's just a binding).

Dan Lebrero

I am indeed generalising and assuming that most of us either build websites or do ETL from source A to source B. Probably because that is what I have done for the whole of my career (boring!).

I think you are right to point out that languages rely on C or C++, but I think that is true for all of them, not just the ones that you mention.

Thanks for the comments!

Daniel

Gelu Timoficiuc

What does "not academic" even mean?

Collapse
 
danlebrero profile image
Dan Lebrero

Go/Erlang/Clojure come from industry, as a reaction to specific pain points in "real production systems".

Gelu Timoficiuc

And you are holding that Erlang's immutability or functional paradigms, for example, are not linked to academia because they are supposed to address "real production system" issues?
I think it is an artificial distinction. For example, Elm is heavily based on everything you would categorize as "academic", but its intent is to address real issues in client-side development.
How can one separate the two?

Dan Lebrero

Sorry, I do not mean that.

Academia is extremely important and should be a source of inspiration to the industry.

In fact, Curry On is one of my favourite conferences: "Academia and industry need to have a talk."

Experimentation is key to advancing the state of the art, but do you want experimental programming features in your production code? Brian Goetz, one of the Java language architects, explains it better here.

Gelu Timoficiuc

I always had the impression that, in an ideal world, programming would mean some kind of one-to-one relationship with discovered principles of math and nature, rather than invented languages based on invented principles. Something closer to ideals that are inherently perfect from logic, rather than inherently flawed human constructs (not that they are not pragmatic).
In any case, thanks for the resource!
In any case, thanks for the resource!

Dan Lebrero

That is the most beautiful thought that I have read in a long time.

Thanks for sharing!