Every day we all use open source projects, directly or indirectly, either to get some task done (as a user) or to build our own projects on top of them (as a developer). If you are a developer, you have to be a bit careful while choosing an open source library/framework to build your project on, because the future of your project is going to depend on that framework. Choosing the right framework can boost your project's progress; at the same time, choosing a bad framework can slow you down, and you may have to spend extra time debugging it, or sometimes rewrite the entire project using another framework or from scratch on your own.
Choosing the right framework depends on both your intuition and your experience (experience here means how much you have experimented with similar projects before, not the number of professional years). I am not a very experienced developer, but I've had bad experiences with past decisions I made while choosing a framework (later I had to rewrite my project); as time progressed, through experimentation I was able to pick a good-quality framework most of the time. A good framework here means the quality of the project, not the use case (obviously we will not choose a framework which doesn't suit our use case). So here I have made an attempt to list a few points that make an open source project usable or good; if you are an author planning to create your next open source framework, then this can act as a checklist for you. This post is for newbies who are into programming; if you are not a newbie, all the points below may seem obvious.
1. Project roadmap
Many open source projects are abandoned after the initial commit; the developer never gets around to adding new features, so obviously there will not be any roadmap for such projects. Be careful while using such projects, because the developer who created the project may come back one day and change the entire code base. (This has happened to me once; I had to fall back to the initial commit.) In some other cases, the repo itself can be removed, leaving no backups. If you look at successful open source projects/frameworks (like TensorFlow, React, Node, Flutter, etc.), they all have a well-defined roadmap. A roadmap also gives you a look into the future of the framework.
2. Versioning, nightly builds and latest stable releases
Versioning is very important for any project; it helps developers who build on top of the project to use a specific snapshot of the code base. Many popular open source projects use the Semantic Versioning standard to version those snapshots.
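As a small illustration, here is a minimal sketch (assuming Python and the third-party `packaging` library) of how a semantic version string breaks down into major/minor/patch parts, and how versions compare:

```python
# A minimal sketch, assuming the third-party "packaging" library is installed
# (pip install packaging). Semantic versions follow MAJOR.MINOR.PATCH.
from packaging.version import Version

v = Version("2.4.1")
print(v.major, v.minor, v.micro)            # 2 4 1

# Versions compare numerically, not as plain strings:
print(Version("1.9.0") < Version("1.10.0"))  # True ("1.10.0" is the newer release)
```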
There are (broadly) two types of releases: nightly builds and stable releases. If you want the most stable, well-tested version of the framework, choose the latest stable release; but if you want the latest cutting-edge features (and you can't wait for the next stable release), you can always choose the latest nightly build (these builds are usually published every day). If you are using a nightly build in production, be careful, because nightly builds can break at any point (the latest features may not be tested for edge cases). Also, if you are planning to pull the library/framework directly from GitHub, never pull the master branch directly; a fresh master branch may contain features that were recently merged or not tested yet (you might get into trouble if you do this, sometimes!!). Go to the releases page and check the list of version tags; tags are like stable snapshots of the code base, and you can pull one of those snapshots to make sure you are safe.
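If you are not sure whether the build you installed is a stable release or a nightly/pre-release, a quick check like the sketch below can tell you (assuming Python 3.8+, the `packaging` library, and TensorFlow as an example package name):

```python
# A minimal sketch: check whether an installed package is a stable release
# or a pre-release/nightly (dev) build. Assumes Python 3.8+ and the
# "packaging" library; "tensorflow" is just an example package name.
from importlib.metadata import version
from packaging.version import Version

installed = Version(version("tensorflow"))
if installed.is_prerelease or installed.is_devrelease:
    print(f"{installed} looks like a nightly/pre-release build, be careful!")
else:
    print(f"{installed} is a stable release")
```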
3. Dependencies and code-packaging
The majority of open source projects depend on other projects (example: TensorFlow depends on many C/C++ libraries like Eigen, Abseil, Protobuf, CUDA, etc.); if you are using such a framework, you are indirectly using its dependencies as well. If any dependency breaks, the framework breaks, so your project breaks as well!! Dependency management is an important thing which is often ignored in most projects. Many modern programming languages handle this problem by shipping a package manager along with the language compiler and runtime (example: pip for Python and npm for Node). These package managers can automatically download and install the right dependencies for you, so you need not worry much. Things become complex if the project is not released as a package, because then you have to somehow integrate the code base yourself; this is where versioning plays an important role. Whenever you feel like using an open source framework, check if its dependencies are versioned properly (if they are not versioned, the package manager always downloads the latest version, which can cause trouble).
For example, recently I was working on an audio processing project using librosa (Python). The latest version of librosa requires numba, which in turn depends on llvm-11.0, but the latest version of llvm which can be installed using apt is llvm-8.0, so I had to build llvm-11.0 from source, which caused some other problems. (These issues are common :p)
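One quick way to check whether a package's dependencies are versioned properly is to look at the constraints it declares. Here is a minimal sketch, assuming Python 3.8+ and that librosa is already installed (any package name works):

```python
# A minimal sketch: list the dependency constraints a package declares.
# Assumes Python 3.8+ (importlib.metadata) and that "librosa" is installed.
from importlib.metadata import requires

for req in requires("librosa") or []:
    # Lines like "numba>=0.51.0" (bounded versions) are a good sign;
    # bare names with no version constraint can pull in anything.
    print(req)
```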
4. Community support and popularity
This is the most important aspect if you are using a complex framework like React or TensorFlow. We all run into trouble at some point and need to find solutions quickly to proceed further; at that point the obvious thing we all do is google the problem (or just copy-paste the error message :p). If the library is popular, we usually get a Stack Overflow link which solves our problem; if the library is not so popular, the obvious places to look are the project's issues page and its community forum, where we find the common problems other developers are facing. Most successful projects have a community forum where you can discuss your problem and get a solution from other developers. (React has many such Slack channels, both official and unofficial ones; even Node.js has many such forums.)
So if you are really using the framework for something important, make sure it has proper community support; otherwise you may end up digging through the source code and losing your mind at the end!!
5. Documentation and Unit-Tests
Of course, a good project must have these things for sure. Documentation will help you understand what features are available in the framework. Writing documentation can be boring, but these days there are many tools out there which can make things easier for you; these tools can automatically export documentation in HTML format from the code comments (example: look at Python's docstring standard). It is not recommended to use a framework which does not have proper documentation, unless you are capable of reading and understanding the source code directly.
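As a small illustration, here is a hypothetical function with a Python docstring; tools like Sphinx or pdoc can export HTML documentation straight from comments like this:

```python
def resample(signal, orig_sr, target_sr):
    """Resample an audio signal to a new sampling rate.

    Args:
        signal: 1-D sequence of audio samples.
        orig_sr: original sampling rate in Hz.
        target_sr: desired sampling rate in Hz.

    Returns:
        The resampled signal as a list of floats.
    """
    # (implementation omitted -- the docstring alone is what the doc tools read;
    #  the function name and signature here are made up for illustration)
    ...
```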
A framework can have many features, but are you sure all those features work as they are intended to? This is where unit tests come into the picture. If you plan to use a project which has unit tests, you can clone the repo, build it and run the unit tests; if all of them pass, you can use the framework with confidence. Unit tests sometimes serve as an alternative to code documentation: if there is no documentation, you can use the unit tests themselves as documentation, since they show you how the functions are meant to be invoked and what input types they accept to produce the proper output.
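For example, a hypothetical unit test like the one below doubles as documentation: it shows exactly how the function is meant to be called and what it is expected to return (the names here are made up for illustration):

```python
# A hypothetical unit test (pytest style) that also documents how the API is used.
from mylib import resample  # "mylib" and "resample" are made-up names

def test_resample_halves_length():
    signal = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]
    out = resample(signal, orig_sr=8000, target_sr=4000)
    # Halving the sampling rate should roughly halve the number of samples.
    assert len(out) == len(signal) // 2
```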
6. Build-scripts and Docker images
Not all frameworks are available as pre-built executables; sometimes you have to build/compile them on your own (if you are a C/C++ developer, you deal with this every day). This is where build scripts and build-automation tools are really helpful. Many good C/C++ projects I have seen so far at least provide a Makefile which automates the build process with the right include files and install location. Some complex projects like TensorFlow make use of advanced build systems like Bazel and CMake. Build systems are here to make our life easier; imagine how time consuming it would be to figure out how to compile a project with thousands of source files on your own. These build scripts also come in handy when you make changes to the code base (many build systems can cache old objects and compile only the changed files to reduce the build time). It is even better if you find a framework which also provides a Dockerfile. A Dockerfile is used to build a Docker image, which automatically packs everything required to build on top of the framework. Docker images run as containers when executed; you can think of a container as an isolated environment where you can run the code without breaking anything directly on the host machine. (Technically, a container runs your code in an isolated namespace and also changes the root filesystem path, so the container no longer has access to the host file system.)
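If a project ships a Dockerfile, you can build and test it without touching your host setup at all. Below is a minimal sketch using the Docker SDK for Python (`pip install docker`); the image tag and the test command are just placeholders, and it assumes the Docker daemon is running and a Dockerfile sits in the current directory:

```python
# A minimal sketch using the Docker SDK for Python ("docker" package).
# Assumes the Docker daemon is running and the current directory has a Dockerfile.
# The tag "myframework:dev" and the "pytest" command are placeholders.
import docker

client = docker.from_env()

# Build an image from the project's Dockerfile.
image, build_logs = client.images.build(path=".", tag="myframework:dev")

# Run the framework's test suite inside an isolated container.
output = client.containers.run("myframework:dev", "pytest", remove=True)
print(output.decode())
```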
Thanks for reading my post. Please share your opinions in the comments section; you are always welcome to suggest/add more points :)