Rust vs. Go: The benchmark

👋 Introduction

Lately, there's been a lot of buzz around Rust & Go. Why all the hype? Which one truly holds the edge? Let's fire up servers on 127.0.0.1 & find out.

The simple HTTP server code used for benchmarking is available on GitHub; the relevant files are reproduced below.

📦 Benchmarking Setup with wrk

To start, we'll need wrk, a powerful HTTP benchmarking tool. If you're on a Mac like me, you'll find it on Homebrew.

  • Benchmarking tool: wrk
  • Installation: Mac users can install wrk with a single Homebrew command:
  brew install wrk
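
Here's the exact wrk invocation we'll point at each server later in this post; it runs 2 threads and 100 open connections for 30 seconds and prints the latency distribution:

# -t2: 2 threads, -c100: 100 concurrent connections,
# -d30s: 30-second run, --latency: print latency percentiles
wrk -t2 -c100 -d30s --latency http://127.0.0.1:8080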

🔧 Setting Up the Rust Web Server

For Rust, you'll need a specific directory structure to get started:

|_ Cargo.toml
|_ src
   |_ main.rs
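
If you'd rather not create these files by hand, Cargo can scaffold the same layout for you (the crate name rust_server is chosen here to match the Cargo.toml shown below):

# creates rust_server/ with a Cargo.toml and src/main.rs
cargo new rust_server
cd rust_server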

Once everything is in place, starting the server with Cargo is straightforward. Note that a bare cargo run builds and runs the unoptimized debug profile, so pass --release when benchmarking:

cargo build --release
cargo run --release

🏗 Rust Web Server Code

Interested in the nitty-gritty? Dive into the server code!

Rust: main.rs

// Import necessary modules and types from the actix_web crate and standard library.
use actix_web::{web, App, HttpServer, HttpResponse, Responder, middleware::Logger};
use std::env;

// The main function is marked with `actix_web::main`, which sets up an async runtime.
// This function will return a Result that, if an error occurs, will contain an `std::io::Error`.
#[actix_web::main]
async fn main() -> std::io::Result<()> {
    // Set the environment variable for logging level to "info" for actix_web logs.
    env::set_var("RUST_LOG", "actix_web=info");
    // Initialize the env_logger logger, which will log information based on the RUST_LOG environment variable.
    env_logger::init();

    // Create and run an HTTP server.
    HttpServer::new(|| {
        // Initialize the Actix web application.
        App::new()
            // Add the Logger middleware to log all incoming requests.
            .wrap(Logger::default())
            // Define a route for the root URL ("/") that handles GET requests with the `root_handler` function.
            .route("/", web::get().to(root_handler))
    })
    // Bind the server to listen on the localhost address and port 8080.
    .bind("127.0.0.1:8080")?
    // Start the server and await its completion, handling any errors that occur.
    .run()
    .await
}

// Define an asynchronous handler function for the root URL.
// This function returns a type that implements the `Responder` trait, which can be converted into an HTTP response.
async fn root_handler() -> impl Responder {
    // Create an HTTP response with the status code 200 OK and the body "Hello, World! This is a cool web test!".
    HttpResponse::Ok().body("Hello, World! This is a cool web test!")
}

Rust: Cargo.toml

[package]
name = "rust_server"
version = "0.1.0"
edition = "2021"

[dependencies]
actix-web = "4.0"
actix-rt = "2.5"
env_logger = "0.9"

[dev-dependencies]
criterion = "0.3"

🔧 Setting Up the Go Web Server

For Go, it's effortless: go run compiles and runs the program in one step, and Go has no separate debug/release build profiles, so no extra flags are needed:

go run main.go

🏗 Go Web Server Code

Interested in the nitty-gritty? Dive into the server code!

Go: main.go

package main

import (
    "fmt"
    "log"
    "net/http"
    "time"
)

func main() {
    // Create a new ServeMux (router)
    mux := http.NewServeMux()

    // Register a handler function for the root URL
    mux.HandleFunc("/", rootHandler)

    // Create a new HTTP server, wrapping the router with the logging middleware.
    // It listens on port 8090 to match the benchmark commands below.
    server := &http.Server{
        Addr:    "127.0.0.1:8090",
        Handler: logRequest(mux),
    }

    // Start the server and log if there's an error
    fmt.Println("Server is running on http://127.0.0.1:8090")
    if err := server.ListenAndServe(); err != nil {
        log.Fatal("Error starting server: ", err)
    }
}

// rootHandler responds to requests at the root URL
func rootHandler(w http.ResponseWriter, r *http.Request) {
    fmt.Fprint(w, "Hello, World! This is a cool web test!")
}

// logRequest is a middleware that logs each request
func logRequest(next http.Handler) http.Handler {
    return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        start := time.Now()
        next.ServeHTTP(w, r)
        duration := time.Since(start)
        log.Printf("Request: %s %s, Duration: %s\n", r.Method, r.URL.Path, duration)
    })
}
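
With either server running, a quick sanity check (a hypothetical step, not part of the benchmark itself) confirms the endpoint responds before we point wrk at it:

# adjust the port to 8080 for the Rust server
curl http://127.0.0.1:8090/
Hello, World! This is a cool web test!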

🔑 Rust or Go: Key Metrics

Rust: Metrics (Release Mode Not Enabled)

wrk -t2 -c100 -d30s --latency http://127.0.0.1:8080  
Running 30s test @ http://127.0.0.1:8080
  2 threads and 100 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     7.36ms    5.32ms  68.59ms   91.98%
    Req/Sec     7.54k     1.51k   12.48k    69.83%
  Latency Distribution
     50%    5.94ms
     75%    7.67ms
     90%   10.77ms
     99%   33.52ms
  450491 requests in 30.02s, 48.98MB read
Requests/sec:  15004.41
Transfer/sec:      1.63MB

Rust: Metrics (Release Mode Enabled)

wrk -t2 -c100 -d30s --latency http://127.0.0.1:8080                                                      
Running 30s test @ http://127.0.0.1:8080
  2 threads and 100 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     6.41ms    6.39ms  95.91ms   92.02%
    Req/Sec     9.47k     2.42k   14.95k    65.50%
  Latency Distribution
     50%    4.73ms
     75%    6.31ms
     90%   10.66ms
     99%   36.55ms
  566141 requests in 30.05s, 61.55MB read
Requests/sec:  18838.67
Transfer/sec:      2.05MB

Go: Metrics

wrk -t2 -c100 -d30s --latency http://127.0.0.1:8090                                                      
Running 30s test @ http://127.0.0.1:8090
  2 threads and 100 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     3.12ms    3.13ms  46.54ms   94.85%
    Req/Sec    18.12k     3.18k   25.08k    69.95%
  Latency Distribution
     50%    2.62ms
     75%    3.23ms
     90%    4.23ms
     99%   20.40ms
  1082616 requests in 30.05s, 160.03MB read
Requests/sec:  36021.81
Transfer/sec:      5.32MB

🔍 Let's Chart It: Rust or Go?

Let's compare the performance of the Rust and Go servers:

[Chart comparing the Rust and Go server benchmark results]

📈 Rust Server Performance: Release Mode Not Enabled

  • Average Latency: 7.36ms
  • Standard Deviation of Latency: 5.32ms
  • Maximum Latency: 68.59ms
  • Requests per Second: 7.54k (average), 12.48k (maximum)
  • Latency Distribution:
    • Median (50%): 5.94ms
    • 75th Percentile: 7.67ms
    • 90th Percentile: 10.77ms
    • 99th Percentile: 33.52ms
  • Total Requests: 450,491 in 30.02 seconds
  • Data Transferred per Second: 1.63MB

πŸ“ˆ Rust Server Performance: Release Mode - Enabled

  • Average Latency: 6.41ms (down from 7.36ms)
  • Standard Deviation of Latency: 6.39ms (up from 5.32ms)
  • Maximum Latency: 95.91ms (up from 68.59ms)
  • Requests per Second:
    • Average: 9.47k (up from 7.54k)
    • Maximum: 14.95k (up from 12.48k)
  • Latency Distribution:
    • Median (50%): 4.73ms (down from 5.94ms)
    • 75th Percentile: 6.31ms (down from 7.67ms)
    • 90th Percentile: 10.66ms (up from 10.77ms)
    • 99th Percentile: 36.55ms (up from 33.52ms)
  • Total Requests: 566,141 in 30.05 seconds (up from 450,491)
  • Data Transferred per Second: 2.05MB (up from 1.63MB)

πŸ“ˆ Go Server Performance:

  • Average Latency: 3.85ms
  • Standard Deviation of Latency: 3.72ms
  • Maximum Latency: 60.31ms
  • Requests per Second: 14.71k (average), 24.66k (maximum)
  • Latency Distribution:
    • Median (50%): 3.14ms
    • 75th Percentile: 3.95ms
    • 90th Percentile: 5.45ms
    • 99th Percentile: 21.33ms
  • Total Requests: 879,736 in 30.06 seconds
  • Data Transferred per Second: 4.33MB

πŸ€” Well.. Is it Rust or is it Go?

  • Latency: The Go server has lower average, median, and 99th percentile latencies compared to the Rust server. This suggests that for each individual request, the Go server is generally able to respond faster.

  • Requests per Second (Throughput): The Go server is handling nearly double the number of requests per second compared to the Rust server. This indicates that the Go server has a higher throughput under the tested load conditions.

  • Data Transfer: The Go server is transferring more data per second than the Rust server, which aligns with its higher requests per second.

😎 Conclusion:

Well... in this benchmark, the Go server outperformed the Rust server in both latency and throughput. However, there are a few key considerations to keep in mind when interpreting these results:

  • Server Configuration: The configuration of the server, such as the use of asynchronous code, thread pool sizes, and other optimizations, can significantly impact performance.

  • Workload Characteristics: Depending on what the servers are actually doing (static file serving, database queries, CPU-bound tasks), the performance characteristics could change.

  • Benchmarking Conditions: The system on which the benchmark is run, other running processes, network conditions, and even the specifics of how wrk is used can affect the results.

  • Code Maturity and Optimizations: The specific Rust and Go code being benchmarked could be at different levels of optimization. More mature or optimized code can perform significantly better.

Top comments (7)

Emad Mokhtar

This makes sense. Go is built for microservices and Rust isn't. I'm sure Rust will outperform Go in low-level tasks. This is why I don't believe in one language for all problems.

ITmindCo • Edited

Rust must be built with:
cargo build --release
or
cargo run --release

lfpraca

As others have mentioned, your Rust code ran much more slowly than it should have because of the missing "--release" in cargo run. I downloaded the source code and ran the test for the Rust (with --release) and Go versions, and this is what I got:

Go:

wrk -t2 -c100 -d30s --latency http://127.0.0.1:8080
Running 30s test @ http://127.0.0.1:8080
  2 threads and 100 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     0.97ms    0.88ms   20.03ms   82.19%
    Req/Sec    52.82k     1.43k    56.99k    73.50%
  Latency Distribution
     50%  686.00us
     75%    1.45ms
     90%    2.12ms
     99%    3.81ms
  3153212 requests in 30.01s, 466.11MB read
Requests/sec: 105085.44
Transfer/sec:     15.53MB

Rust:

wrk -t2 -c100 -d30s --latency http://127.0.0.1:8080
Running 30s test @ http://127.0.0.1:8080
  2 threads and 100 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     0.92ms  345.33us    5.43ms   71.53%
    Req/Sec    53.37k     1.27k    55.54k    78.17%
  Latency Distribution
     50%    0.86ms
     75%    1.11ms
     90%    1.37ms
     99%    1.97ms
  3186871 requests in 30.00s, 346.47MB read
Requests/sec: 106222.68
Transfer/sec:     11.55MB

These results show that Rust achieved more requests/sec, a better average latency, and a much better max latency.

Ion Krutov • Edited

Hello, @lfpraca.
I have the same issue: my Rust HTTP server is very slow. Can you help me optimize this simple hello-world server? github.com/ionkrutov/benchmarks?ta...
I get the following result:

ab -c 1000 -n 100000 -k http://localhost:8080/hello


Server Software:        
Server Hostname:        127.0.0.1
Server Port:            8080

Document Path:          /
Document Length:        13 bytes

Concurrency Level:      1000
Time taken for tests:   9.340 seconds
Complete requests:      100000
Failed requests:        0
Keep-Alive requests:    0
Total transferred:      3200000 bytes
HTML transferred:       1300000 bytes
Requests per second:    10706.58 [#/sec] (mean)
Time per request:       93.401 [ms] (mean)
Time per request:       0.093 [ms] (mean, across all concurrent requests)
Transfer rate:          334.58 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0   48  88.3     40    1087
Processing:    11   45  14.6     46      75
Waiting:        0   18  14.4     14      71
Total:         36   93  88.9     92    1147

Percentage of the requests served within a certain time (ms)
  50%     92
  66%     93
  75%     94
  80%     95
  90%     96
  95%     97
  98%     99
  99%     99
 100%   1147 (longest request)
 
xanonid • Edited

cargo run without --release builds and runs the Rust code in the slower debug mode.

Alex Pliutau

Rust for Gophers with John Arundel packagemain.tech/p/rust-for-gophers

Alex Pliutau

We'll be publishing a post soon comparing Rust and Go in 2024; stay tuned here: packagemain.tech/