At a conference recently, I heard someone ask, "When's a good time to start thinking about QA?" The question was asked in earnest, but the speaker responded with a quick "Immediately!" and moved on to the next question. While "Immediately!" reflected the speaker's enthusiasm for quality assurance, it left the question-asker down in the dumps.
Knowing when to think about QA is really hard, and there's not much good advice out there. A lot of blog posts, much like that speaker, skirt the issue by saying things like "write more tests" or "use strong typing." Chances are, if you're Googling "when to start thinking about QA," those are not answers to your question. This article is.
A couple definitions
Once, when I was building an Android app, an angry supervisor asked us to "Stop throwing the app over the net to QA." I understood what the person meant, and I've heard this sort of critique a lot. The problem with the critique is that throwing your app over the net is the most important part of the QA process. The reason you throw your app over the net is that, very soon, you will be throwing it over an even bigger net called production.
I like to use the analogy of the performing arts here: there are lots of rehearsals where you bring in someone from outside to look at your work. The last rehearsal (the dress rehearsal) is almost always like this, but there are also regular points in the development of a staged production where you'll bring in a trusted expert to give their opinion. You do this because, eventually, hundreds or thousands of people will form their own opinion about your work, and it's important to make sure that your work represents your ideas. When you collaborate with a QA engineer, you're bringing someone in to watch a rehearsal.
A lot of people worry that QA engineers will treat every rehearsal like the dress rehearsal, but this is not true. Great QA engineers know exactly what stage a project or feature is at and will give feedback that will help the project or feature at that stage.
So to sum up, QA is a third-party viewpoint that gives you stage-appropriate technical feedback about your project.
The single most important step
The most important step you can take, starting today, is kicking off a relationship with a QA professional or a company providing QA. This does not require purchasing or using a service - it just means introducing yourself.
Great QA services (and we like to think we're building one!) know how to take it from here. Like a tailor or a doctor, they will start taking measurements. By monitoring a codebase, a roadmap, and specs, a QA professional will focus on the following questions:
- When are major releases planned?
- Are there any indicators that would predict increased development velocity at a given time?
- How often are there breaking changes to a service, and what are the precursors to these changes?
They ask these questions to plan when their help will be most impactful and most needed.
At Meeshkan, we ask and answer these questions by testing code bases, creating specs, and building burn charts. The purpose of this is not to provide you with information (you're already overloaded!) but to answer even more important questions:
- When will the team likely have time to digest test reports?
- Are there enough valuable indicators about quality to share them with the team?
- What is a five-alarm fire in the context of the team's business needs?
All of this means that, when you get a ping from us, you can trust that it's important. Of course, we give you access to the entire trail that leads up to our decisions, and if you really want to, you can read over the tens of thousands of tests we run on your service. But our job is knowing what to present to you, when to present it, and how it should be presented.
So, introduce yourself! It will take a matter of minutes, and you'll be on the path to cultivating a long-term relationship that underpins great QA.
The next step
You've now won half the battle. Developing a relationship with a QA professional will alleviate a lot of stress because your trusted partner knows when to step in and when to get out of the way.
The next step, and the other half of the battle, is what happens at the "rehearsal", to use the analogy from above. What are you hoping to get out of QA that you wouldn't get out of building a spec, writing a few tests, or using a great linter?
QA is all about business outcomes. It puts itself in the place of the user, looks at the current state of the UX, and assesses whether the "diff" between the current UX and the desired UX is on track. If a QA engineer crashes your app by pressing a button five times in rapid succession, they won't bring the crash to your attention unless it is linked to an underlying problem that can affect your bottom line.
We see this all the time at Meeshkan. Our algorithms regularly crash servers by using very large numbers, negative pagination, and other forms of exotic input. If you hear about this type of behavior from us, it is because we've linked the crashes to some underlying issue that we think is unsustainable. Otherwise, you won't hear from us, because it's OK if your app crashes in the wild. Don't let anyone tell you otherwise! What's not OK is for your business to consistently return only three out of twenty valid search results to a user. That's not a crash, but if search is a critical feature of your business, then that's a major problem. Great QA brings that to the fore.
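To make that concrete, here is a rough sketch of what an exotic-input probe can look like. The `/search` endpoint, its pagination parameters, and the pass/fail rules are made up for illustration; this is a simplification, not our actual implementation.

```python
# A rough sketch of exotic-input probing against a paginated endpoint.
# The /search endpoint and its parameters are hypothetical.
import requests

BASE_URL = "https://api.example.com/search"  # hypothetical service under test

# Inputs chosen to stress pagination and numeric handling.
EXOTIC_PARAMS = [
    {"page": -1, "per_page": 20},         # negative pagination
    {"page": 1, "per_page": 10**9},       # absurdly large page size
    {"page": 2**63 - 1, "per_page": 20},  # 64-bit boundary value
    {"page": 1, "per_page": 0},           # empty page request
]

def probe():
    """Send each exotic request and record anything that is not handled cleanly."""
    findings = []
    for params in EXOTIC_PARAMS:
        try:
            resp = requests.get(BASE_URL, params=params, timeout=10)
            if resp.status_code >= 500:  # a 4xx rejection is fine; a 5xx is a finding
                findings.append((params, resp.status_code))
        except requests.RequestException as exc:
            findings.append((params, f"connection error: {exc}"))
    return findings

if __name__ == "__main__":
    for params, outcome in probe():
        print(f"exotic input {params} -> {outcome}")
```

A 500 on its own is just raw data, though; as argued above, it only becomes a report when it points to something that threatens a critical feature.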
The difference between Meeshkan and traditional QA is that we do this automatically. There are several core benefits of automated QA:
- It explores a lot of potential outcomes.
- It pools data from thousands of tests to recognize patterns and make the best possible predictions.
- It moves at the speed of your development by testing every change to source code.
- The price point makes it easy for small businesses to get started.
Automated QA is not a replacement for manual QA. Someone will always need to give your app or service a spin in the wild. Sometimes your most enthusiastic users serve this role. But automated QA leads naturally to manual QA in several respects:
- It gives a tester a machine-built spec, which can act as a map of the service.
- It shows what parts of the app are the most trivial/predictable so that the manual tester can focus on the hard stuff.
- Like drivers in a self-driving car, manual testers provide a valuable source of confirmation to machine-built QA algorithms, creating a positive feedback loop.
An example
Your business deploys a new service. Congrats! 🎊 Initially, all of the service's features are in your head, and you're able to do enough testing that you feel confident when you deploy.
Soon, the service grows in complexity. You add authentication, you write to a database, and you start relying on other services. At a certain point, you begin to feel uneasy when you merge a branch to production. Why do you feel this way? You've tested the service as best you could, all the tests are passing, and still you fret about bad outcomes. It could be a logged-out user accessing logged-in resources. It could be a JSON parsing failure that leads to serving a 404. Whatever it is, there's some form of psychological resistance to merging to production.
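To make the first of those worries concrete, here is a rough sketch of the kind of check that catches it. The `/account` endpoint and the expected status codes are assumptions for the sake of illustration.

```python
# A minimal check for the "logged-out user reaches logged-in resources" worry.
# The /account endpoint and expected status codes are hypothetical.
import requests

BASE_URL = "https://api.example.com"  # hypothetical service under test

def test_protected_resource_requires_auth():
    # Deliberately send no Authorization header or session cookie.
    resp = requests.get(f"{BASE_URL}/account", timeout=10)
    # The request should be rejected outright...
    assert resp.status_code in (401, 403), (
        f"expected 401/403 for anonymous access, got {resp.status_code}"
    )
    # ...and the body should not leak account fields.
    assert "email" not in resp.text.lower(), "response appears to leak account data"
```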
The day you feel this is the day to kick off a conversation with a potential QA partner. If we're fortunate enough to earn your business, one of the first things we'll do is identify all of the points in your app that are most likely making you feel this way. For example, if you use some form of message passing, like `spawn/register` in an Erlang cluster or Google PubSub, we have a certain category of automated tests that apply to this type of setup.
These tests don't just bombard the system with garbage input. For example, we will look to see if there are PubSub topics that are written to but never read. We will also try to find common misspellings of PubSub channels and potential deadlocks. You may get a report that says the following:
Channel `customerPaid` expects input written to channel `customerRegistered`, but `customerRegistered` expects input to `customerPaid`.
We may send you only this in the report even though your app is crashing in other places. This gets back to the main idea from above - our goal is not to taunt you with the 1,000 ways we made your service crash, but to distill our findings into the five things (maximum!) you need to know.
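To give a rough sense of the mechanics behind a finding like that, here is a simplified sketch that flags channels that are published to but never subscribed to, along with likely misspellings. The `publish(...)`/`subscribe(...)` call patterns it scans for are assumptions about a hypothetical codebase, not a description of how our analysis actually works.

```python
# A simplified "written but never read" check for pub/sub channels.
# Assumes (hypothetically) that the codebase publishes and subscribes
# via calls like publish("topicName") and subscribe("topicName").
import difflib
import re
from pathlib import Path

PUBLISH_RE = re.compile(r'\bpublish\(\s*["\'](?P<topic>[\w-]+)["\']')
SUBSCRIBE_RE = re.compile(r'\bsubscribe\(\s*["\'](?P<topic>[\w-]+)["\']')

def scan(repo_root: str):
    published, subscribed = set(), set()
    for path in Path(repo_root).rglob("*.py"):
        source = path.read_text(errors="ignore")
        published.update(m.group("topic") for m in PUBLISH_RE.finditer(source))
        subscribed.update(m.group("topic") for m in SUBSCRIBE_RE.finditer(source))

    findings = []
    # Topics that are written to but never read anywhere in the codebase.
    for topic in sorted(published - subscribed):
        findings.append(f"topic '{topic}' is published but never subscribed to")
        # A near-miss name on the subscribe side is often a misspelling
        # of the same logical channel.
        close = difflib.get_close_matches(topic, list(subscribed), n=1, cutoff=0.8)
        if close:
            findings.append(f"  note: '{topic}' closely matches subscribed topic '{close[0]}'")
    return findings

if __name__ == "__main__":
    for finding in scan("."):
        print(finding)
```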
PubSub is one of hundreds of places where automated QA moves the needle. At Meeshkan, we also look at CRUD operations, calls to third-party APIs, naming conventions, and many other indicators of quality that add up to create a rock-solid service.
I've seen a lot of companies resist bringing in a QA professional because they're still experimenting with a novel idea and they're not sure it's worth testing yet. I regularly hear folks say "Why bother bringing in QA if we may change the whole thing in a month anyway?" My answer is that one month is too long. Agile teams build upon successive hypotheses in a matter of days, and one way to speed up that process is to have high-quality code early on. Meeshkan exists to provide value at exactly this stage. From the day you add Postgres or PubSub or OAuth2 to a service, our algorithms can discern whether your use of a particular technology is idiomatic and viable.
How to get started
While we hope to be your trusted partner in automated QA, we realize that you have a choice of both automated and manual QA options. For manual QA, and for writing your own tests in a Selenium-like style, we really like Rainforest QA. For automated QA, and especially if you're just getting started, we recommend creating a free Meeshkan account.
Conclusion
The purpose of this article, and why it is a bit longer than your average missive on the subject, is to avoid a kitschy or cop-out answer to "When should I start thinking about QA?" It explores the question head-on, addressing the purpose of QA, the benefits you can expect, and how to kick things off.
As businesses increasingly find themselves pivoting to keep up with market trends, the major challenge of the next decade will not be making software that's built to last. It will be homing in on market signals faster than competitors. Your software is the best signal detector there is, and modern quality assurance makes sure that what you are building returns as much signal and as little noise as possible. At Meeshkan, we accompany teams that build modern software. So what are you building?