

Originally published at avo.app

How Rappi Developers use Avo to build better product analytics

Guest post by Juan Jose Villegas, Engineering Manager at Rappi

The power of Avo is this: what is designed is what is implemented.
– Juan Jose Villegas
Engineering Manager


As an Engineering Manager at Rappi, I’m in charge of a cross-functional team of Android, iOS, and backend engineers focused on developing and maintaining the features our end users see. My role is to take those features from ideation and concept through testing and production, while staying involved from a hands-on technical perspective.

We found Avo while searching for a solution to the problems that plague many engineering teams. Rappi develops new features and ships weekly, and each new feature requires analytics. All of those features require cross-platform implementation, so we are working in at least three different codebases at any given time. Each team had its own interpretation of what should be implemented. We’re only human, so typos and mismatched events and properties were scattered throughout the code. Once a feature went into production, the BI team would start tracking activity and realize that what was defined didn’t match what was implemented. Tickets bounced back to the engineering team, and the cycle started all over again.

All of this boils down to developers' time wasted implementing analytics and fixing bugs - 25% of engineering tickets were analytics related. We also saw the BI team wasting time on cleanup projects and regression testing whenever critical events broke. QA teams were stretched trying to catch every bug before it hit production. Even then, our VPs worried that the data presented to the CEO was inaccurate, leading to large-scale decisions in the wrong direction. I’m willing to bet you’ve found similar inefficiencies in your organization, and it was these recurring issues that led us to implement Avo.

Align Cross-Platform and Cross-Functionally

Avo helps developers communicate with each other better, as well as come to a clear understanding of our requirements and deliverables with data science, business intelligence, and product teams.

Global naming conventions

Discrepancies in event and property names, whether seemingly minor (order_submitted vs. orderSubmitted) or major (order_submitted vs. orderButtonPress), are a surprisingly central pain point in data analytics, particularly when aiming for a self-serve analytics culture.

Issues with event and property names don't arise only from the implementation step (which is largely prevented by type-safe analytics code generated from a central source of truth for all platforms); they also arise from the data-design step: when multiple teams work on a product, it typically means multiple people work on the tracking plan. It's difficult enough to remember all the events you created yourself, let alone stay aware of the events other people have created.

Avo has a global namespace to ensure that all events and properties in the tracking plan have unique names so that they can be easily identified. This significantly reduces duplicate event definitions for the same user action. A global namespace for events ensures that whoever is defining events is aware of events that have similar names. Furthermore, they won't even be able to create an event or property where the only difference is casing (a source of surprisingly many frustrating data issues).

Having a global namespace for all properties ensures you won't create properties with slightly different meanings depending on which event they are sent from – a source of much confusion for data consumers. For example, a data consumer may be used to user_id meaning "the id of the user ordering something", and then all of a sudden a new event gets created with user_id referring to the user delivering the order.

Descriptions

Descriptions aren’t new for us, and I’m sure they aren’t new to you either. Whether we work in Excel, Miro, Amplitude, or Avo, there is some version of event and property descriptions that tell us basic information about how and why a data point is instrumented. What separates Avo from the rest is how it communicates this rich information across the workflow. Descriptions don’t just die in Avo. They’re presented in the generated code, so developers have easy access to them while they’re implementing. They can also get published into your other schema sources, via a Webhook or Avo’s auto-publishing into Amplitude, Mixpanel and Segment. Avo also keeps a log of all changes and comments made, so the descriptions become an ever-evolving and well-documented source of truth.

To make it absolutely clear when an event should get triggered, our event and property descriptions follow a standard template that includes a Miro link to the planning exercises and a link to an image showing how and where the event is triggered. Avo also recommends making the purpose of each event clear, and it allows you to link events directly to the metrics they should be used for – which adds further clarity and context for your future self or coworkers who might be wondering why an event exists.

Review workflows

Self-serve analytics and an open data democracy are tenets most product orgs aspire to in the age of rapid product development with an expectation for fast learning. However, I’ve found it to be unsustainable without some type of changelog and review system in place. We can define our analytics events as flawlessly as possible, but it’s meaningless if someone can come in and accidentally delete them.

We utilize Avo’s review workflows. The main branch is protected against direct changes, and every other branch has to be reviewed by someone before it is merged. Having another set of eyes on all changes means the probability of damaging events or designing incorrect data is slim.

It’s also worth noting that each change is logged in Avo and displayed to the end user as a diff log. The Avo log is designed specifically for reviewing changes to analytics objects, which is better than a git-hosted .json file, where you can’t reliably trace changes to specific events or properties because git only tracks line-level changes to a file. With Avo, it’s easy to trace which developer or analyst made which change, and we can always roll back if an error does occur.

Implement faster and better

Type-safe code

When you use Avo to generate code via the CLI or the UI, it’s type-safe and generates the same event and property names across all platforms. We can only implement data in well-defined and reliable ways.

For example, we have an event property called “category”, and we want its values to be either “market” or “restaurant”. That’s how it’s defined in our tracking plan. When we generate code, “category” is implemented as an enum whose cases are exactly the values available in the tracking plan: “market” or “restaurant”.
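
As a rough illustration, here is what that kind of generated, type-safe wrapper could look like on Android in Kotlin. The names below (Category, categorySelected, logEvent) are hypothetical and the real code Avo generates depends on your tracking plan and SDK setup; the point is that an invalid category value simply won’t compile.

```kotlin
// Hypothetical sketch; Avo's actual generated code differs per tracking plan and setup.
enum class Category(val value: String) {
    MARKET("market"),
    RESTAURANT("restaurant")
}

// The event function only accepts the enum, so a typo like "restuarant"
// fails at compile time instead of polluting production data.
fun categorySelected(category: Category) {
    logEvent(
        name = "category_selected",
        properties = mapOf("category" to category.value)
    )
}

// Placeholder for whichever analytics SDK the app actually wraps.
fun logEvent(name: String, properties: Map<String, String>) {
    println("$name -> $properties")
}
```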

Inspector and Debugger prevent implementation errors

Even with the above processes in place, there could theoretically be errors. For example, we’re still responsible for correctly implementing the timing and triggers for how each event fires. However, the mobile debugger and the Inspector help QA teams to find and eliminate errors before we ship that version. These QA tools make it nearly impossible to ship bad analytics, and eliminate the feedback loop wherein our BI team would find errors and be forced to log Jira tickets to have them resolved.

The power of Avo is this: what is designed is what is implemented.

As an added bonus, Inspector also tracks the legacy tracking code we wrote before Avo. The Inspector is like a map of what’s happening with our data and what is actually being sent. What previously took weeks of BI time spent auditing and cleaning up our existing code is now automated. Avo has given us a clearer picture of the inconsistencies in our tracking and the power to know exactly which issues to fix, so we can prioritize and resolve them.

Dedupe your workflow

Property Groups

Let me walk through an example I’m sure will resonate with you. You have a widget to sell, and you want to track the ecommerce funnel for how a user buys that widget. You have events like view_item, add_to_cart, and purchase, and each of those events has properties that tell you more about the event. You usually want to know things like item_name or item_price, so you send those along with each of the three events above. Each event can be triggered in multiple places, so you have to type item_name at least a dozen times already. Multiply that by the number of platforms your team serves. Bored yet? Oh, and Android decided to call it itemName instead, because they didn’t know we were using snake case for analytics.

Can we even be surprised that there is human error, then, when we are implementing the same thing over and over, by hand, with several different teams?

Instead, we’ve adopted Property Groups in Avo. We define groups of similar properties when we design the tracking plan and assign those Property Groups to the relevant events. Then, not only do we define the properties once, we also ensure that all necessary metadata is included with each event.
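
A minimal sketch of the idea, assuming a Kotlin codebase: the shared item properties are declared once and every funnel event reuses them. ItemProperties and the event functions below are illustrative names, not Avo’s generated API.

```kotlin
// Illustrative property group: item metadata defined once, reused by every funnel event.
data class ItemProperties(
    val itemName: String,
    val itemPrice: Double
)

private fun ItemProperties.toMap(): Map<String, String> = mapOf(
    "item_name" to itemName,
    "item_price" to itemPrice.toString()
)

fun viewItem(item: ItemProperties) = logEvent("view_item", item.toMap())
fun addToCart(item: ItemProperties) = logEvent("add_to_cart", item.toMap())
fun purchase(item: ItemProperties) = logEvent("purchase", item.toMap())

// Placeholder for the underlying analytics SDK call.
fun logEvent(name: String, properties: Map<String, String>) {
    println("$name -> $properties")
}
```

Because the group is defined in one place, renaming item_name later is a single change that flows to every event that uses it.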

Use default destinations

As mentioned previously, we have quite a few tools in our marketing stack. Downstream from Avo, we have Amplitude, AppsFlyer, Braze, Firebase, Rakam, Split.io and Facebook. Our events might go to one of these destinations or to all of them. It can be challenging for our developers to know what goes where, and this results in a lot of guesswork that has to be corrected at some point in the future.

When you configure sources in Avo, you have the option of defining defaults for given source-destination pairs. For example, if I know I always want to send Android and iOS events to Firebase, but never web, I can automate that for all events. It is possible to change defaults at the time of event creation, but having defaults in place addresses the vast majority of cases.
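
To make the routing idea concrete, here is a small sketch of defaults with per-event overrides, expressed as plain Kotlin data. In reality this configuration lives in Avo’s UI; the enum values and helper below are hypothetical.

```kotlin
// Illustrative only: source-destination defaults modeled as data.
enum class Source { ANDROID, IOS, WEB }
enum class Destination { AMPLITUDE, FIREBASE, BRAZE, APPSFLYER }

// Android and iOS events go to Firebase and Amplitude by default; web never hits Firebase.
val defaultDestinations: Map<Source, Set<Destination>> = mapOf(
    Source.ANDROID to setOf(Destination.FIREBASE, Destination.AMPLITUDE),
    Source.IOS to setOf(Destination.FIREBASE, Destination.AMPLITUDE),
    Source.WEB to setOf(Destination.AMPLITUDE)
)

// An individual event can still override the default at definition time.
fun destinationsFor(source: Source, overrides: Set<Destination>? = null): Set<Destination> =
    overrides ?: defaultDestinations.getValue(source)
```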

Publishing

At Rappi, we use Amplitude for most product and marketing insights and visualizations. If you’ve worked in Amplitude previously, you’ll know that when an external event is sent into the platform, there is a review and acceptance process in Amplitude Govern. Without Avo, this is a valuable check in the system to make sure bad data doesn’t end up flooding our analytics tool. However, we’ve already done the work of describing and verifying all data points, so this extra check becomes a time sink with no added benefit to the process.

Avo integrates directly with Amplitude’s Govern tool and bypasses this process. Instead, all descriptions and contextual metadata are sent to Govern each time a branch is merged. I don’t have to remember to update Govern, and our analysts always have the most up-to-date version of the analytics.

Where We’re At Today

We’re making great strides in improving our analytics implementations and making our developers more efficient in the process. We are working towards an overall goal of having our engineers spend only 5% of their time implementing analytics, an 80% reduction from where we are today. While we work towards that metric, we’re happy with the developer and data scientist feedback so far. Here is what my co-worker had to say:

“Developer adoption of Avo has been great. In a matter of weeks, multiple teams are implementing analytics with Avo, with developers adopting the tool seamlessly and without conflict, even if they are developers that don’t work together day-to-day. I’ve seen very good feedback from devs. It’s easier to make changes and maintain events, because they are well defined in advance.”
- Henry Martinez
Android lead

With Avo we create analytics schemas upfront, identify analytics issues fast, add consistency over time, and ensure data reliability as we help customers serve the 12+ million monthly users their businesses attract. Overall, the power of Avo is that it frees my team up to build the experiences and features that really matter for our users.
