Dara Olayebi

This Week I Learnt: gRPC & Protobuf

After a (very) long hiatus, I'm back with a writing series I'm calling "This Week I Learnt". Just over a year ago, I started a new job and it has been the most exciting and most challenging experience of my career. Exciting because I'm contributing to a product I love with a brilliant team. Challenging because imposter syndrome reared its ugly head and I spent months working to overcome daily feelings of self-doubt. I was also transitioning into a full stack role with a purely frontend engineering background, so that made everything a little more difficult.

In my first few months, I was learning a huge amount very quickly as I began to work with large, complex systems. I was exposed to system architecture and design concepts, as well as advanced software tooling, much of which I'd had very little practical experience with before. This led me to the idea of a writing series - breaking down some of these concepts, as frequently as I can, in a way that's easy to understand.

So, here goes the first one!


This Week I Learnt how gRPC (gRPC Remote Procedure Calls) and protocol buffers work.

To understand gRPC, it helps to first understand what RPC is. A remote procedure call is a style of communication that allows a program to execute code on a remote server as if it were making a local function call. In slightly simpler terms, RPC lets developers call functions on remote machines or services. It is particularly useful in distributed systems, where components are spread across multiple nodes or servers. gRPC is an open-source framework that implements the RPC model. It was developed at Google, open-sourced in 2015, and is now widely used by companies running large systems. It has several benefits over the traditional REST/SOAP approach, one of the biggest being performance.

APIs rely on a communication protocol to determine how to send and receive data. A communication protocol, in a nutshell, is a set of rules that determines how data is sent over the internet and other networks. HTTP, for example, is the most widely used protocol today, used to send data between clients and servers on the web. File Transfer Protocol (FTP) is another example, used to transfer files between computers. gRPC relies on the HTTP/2 protocol, a newer version of HTTP that adds features like binary framing and multiplexing, which is a big part of where gRPC's performance comes from.


Image credit: Imperva


In addition, APIs need to follow a specific data format for communication between clients and servers. REST APIs send and receive data as JSON or XML objects. gRPC APIs, on the other hand, use Protocol Buffers (or protobuf), a compact binary format. The schema lives in .proto files containing message definitions and a service definition, which together describe the service's methods and each request and response type. Below is an example protobuf file - it defines a UserInfo service that allows consumers to get a user's profile information and update a user's email:

// Specifies the proto version
syntax = "proto3";


// Request and response for fetching a user's profile
message GetUserProfileRequest {
    string user_id = 1;
}

message GetUserProfileResponse {
    string user_id = 1;
    string email = 2;
    int64 phone = 3;
    string avatar_url = 4;
    repeated string followers = 5; 
}

// Request and response for updating a user's email
message UpdateUserEmailRequest {
    string user_id = 1;
    string new_email = 2;
}

message UpdateUserEmailResponse {
    // You can also define enums in these files
    enum ResponseStatus {
        SUCCESS = 0;
        ERROR = 1;
    }

    ResponseStatus response = 1;
}


// API service definition, listing the two API methods and their corresponding request and response types
service UserInfo {
    rpc GetUserProfile (GetUserProfileRequest) returns (GetUserProfileResponse);
    rpc UpdateUserEmail (UpdateUserEmailRequest) returns (UpdateUserEmailResponse);
}

You can have several proto files in your codebase, each one containing its own service and message definitions.

Once proto files have been defined, they need to be compiled to translate them into code that can be understood and used by various programming languages. For example, consuming a REST API in a JavaScript application can be done via the native fetch() method or a web client library like Axios. In the gRPC world, we need a way for the data structures defined in the proto files to become usable code. This is the job of the protocol buffer compiler (protoc). It takes the proto files as its input and outputs stub code - classes and objects that represent your service. The compiler supports multiple programming languages (Java, Python, C++ and more), so it can generate stub code in whichever language you will be using to consume the API.
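
As a rough sketch, here's what that compilation step could look like for Python, assuming the example above is saved as user_info.proto and the grpcio-tools package is installed (the file name and output directory are just placeholders):

# Hypothetical invocation: generate Python message classes and gRPC stubs
python -m grpc_tools.protoc \
    --proto_path=. \
    --python_out=./generated \
    --grpc_python_out=./generated \
    user_info.proto

This would produce user_info_pb2.py (the message classes) and user_info_pb2_grpc.py (the client stub and server base class) for your application code to import.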


The image below shows the workflow involved in consuming a gRPC API:

Image credit: IONOS Digital Guide


Once compilation is done, clients can begin to interact with the API using the generated methods.
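
As an illustration, here's a minimal sketch of calling the UserInfo service from Python, assuming the modules generated in the earlier protoc step (user_info_pb2 and user_info_pb2_grpc) and a gRPC server running on localhost:50051 - the address, user ID and email are made up for the example:

import grpc

import user_info_pb2
import user_info_pb2_grpc

# Open a channel to the gRPC server (insecure, so only for local development)
with grpc.insecure_channel("localhost:50051") as channel:
    # The generated stub exposes the service's RPCs as ordinary methods,
    # which is what makes a remote call feel like a local function call
    stub = user_info_pb2_grpc.UserInfoStub(channel)

    # Fetch a user's profile
    profile = stub.GetUserProfile(
        user_info_pb2.GetUserProfileRequest(user_id="123")
    )
    print(profile.email)

    # Update the user's email
    result = stub.UpdateUserEmail(
        user_info_pb2.UpdateUserEmailRequest(user_id="123", new_email="dara@example.com")
    )
    # result.response is the ResponseStatus enum from the proto file (0 = SUCCESS)
    print(result.response)

Under the hood, the stub serialises each request message into the protobuf binary format and sends it over HTTP/2, then deserialises the response back into the generated classes.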


Helpful Links:
