
Rust won't save us, but its ideas will

What are we saving?

Recently, I came across this article titled "Rust Won’t Save Us: An Analysis of 2023’s Known Exploited Vulnerabilities".

Being the clickbait it is, I clicked.

Quick background on me: I've worked in cybersecurity for almost 10 years. I know cybersecurity far better than I know development.

My day job is securing infrastructure and code.

An article like this piques my interest.

I've been writing programs in Rust for a few years now.

I started writing Rust because of its claim to memory safety, and it became my favorite language to use. I've even managed to ship Rust to prod in one of the coolest projects I've had the honor of being a part of.

So what is this article talking about?

TL;DR: Rust was made to solve memory-related vulnerabilities and issues, but those only make up 19.5% of the most exploited vulnerabilities in 2023. Routing and path abuse exploits tied for second place with memory vulns, followed by Default Secrets (4.9%), Request Smuggling (4.9%), and Weak Encryption (2.4%). The most abused exploit? Insecure Exposed Functions (IEF), at 48.8%.

The article goes on to make the most generic recommendations any cybersec professional would know:

  1. Vendors
    1. Develop the depth of knowledge of your engineers in the frameworks they use
    2. Harden, standardize, and audit the use of those frameworks across products
    3. Enable and expose verbose logging for your products
  2. Developers
    1. Assume all code you write is reachable from an unauthenticated context
    2. Practice defense-in-depth programming and don’t make it easy for an attacker to shell out
  3. Defenders
    1. Reduce any attack surface exposed to the internet if it's not needed there
    2. Proactively enable logging, and remote logging if possible, for all products that touch the internet
  4. Researchers
    1. Look for bugs in the places frameworks come together

Therefore, Rust won't save us.

There is some truth to that, and the advice given by the article is also correct.

But it doesn't dig into why Rust was made in the first place.

It doesn't ask the question "Can we reduce/eliminate IEF abuse similar to how we reduced memory vulnerabilities?"

Looking at IEF

What are Insecure Exposed Functions, exactly?

Let's take a look at the MITRE definition:

The product provides an Applications Programming Interface (API) or similar interface for interaction with external actors, but the interface includes a dangerous method or function that is not properly restricted.

This weakness can lead to a wide variety of resultant weaknesses, depending on the behavior of the exposed method. It can apply to any number of technologies and approaches, such as ActiveX controls, Java functions, IOCTLs, and so on.

The exposure can occur in a few different ways:

  • The function/method was never intended to be exposed to outside actors.
  • The function/method was only intended to be accessible to a limited set of actors, such as Internet-based access from a single web site.

IEF is access to functions the outside world should never have had access to in the first place.

Let's look at an example from the same page:

public void removeDatabase(String databaseName) {
  try {
    Statement stmt = conn.createStatement();
    stmt.execute("DROP DATABASE " + databaseName);
  } catch (SQLException ex) {
    ...
  }
}

In this example, we have a Java method removeDatabase that will delete a database with the name specified in the parameter.

The problem is that this method should never have been public. Declaring it public gives the rest of the application access to it, even though it should be restricted. Marking it private closes that off:

private void removeDatabase(String databaseName) {
  try {
    Statement stmt = conn.createStatement();
    stmt.execute("DROP DATABASE " + databaseName);
  } catch (SQLException ex) {
    ...
  }
}

Now let's take that same example and see what it would look like in Rust.

fn remove_database(conn: &Connection, database_name: &str) -> Result<()> {
    let mut stmt = conn.prepare(&format!("DROP DATABASE {}", database_name))?;
    stmt.execute([])?;
    Ok(())
}

In Rust, this function is private by default.

In order for this function to be public, we would have to declare it public:

pub fn remove_database(conn: &Connection, database_name: &str) -> Result<()> {
...
}

This example is a simple scoping error, or plain laziness.

It's easy to miss in Java, where private is something you have to remember to write. Rust flips the default: a function stays private unless you explicitly opt in with pub.
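
Here's a minimal sketch of how the compiler enforces that (toy function bodies, no real database crate assumed):

mod db {
    // Private by default: only code inside the `db` module can call this.
    fn remove_database(database_name: &str) {
        println!("DROP DATABASE {}", database_name);
    }

    // The module decides exactly what it exposes.
    pub fn remove_table(table_name: &str) {
        println!("DROP TABLE {}", table_name);
        remove_database("scratch"); // fine: same module
    }
}

fn main() {
    db::remove_table("sessions"); // compiles: explicitly `pub`
    // db::remove_database("users"); // error[E0603]: function `remove_database` is private
}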

"Okay, so it's private by default, big deal. Theres other ways of improperly accessing functions and abusing them."

Let's look at another example from the MITRE site:

// Android code
@Override
public boolean shouldOverrideUrlLoading(WebView view, String url) {
  if (url.substring(0, 14).equalsIgnoreCase("examplescheme:")) {
    if (url.substring(14, 25).equalsIgnoreCase("getUserInfo")) {
      writeDataToView(view, UserData);
      return false;
    } else {
      return true;
    }
  }
  return false; // added so the method compiles: URLs without our scheme load normally
}

This Android app intercepts URL loading within a WebView and performs special actions if a particular URL scheme is used, allowing the JavaScript within the WebView to communicate with the application.

A call into native code can then be initiated by passing parameters within the URL:

window.location = "examplescheme://method?parameter=value";

Because the application does not check the source, a malicious website loaded within this WebView has the same access to the API as a trusted site.

You see where this is already going.

fn should_override_url_loading(&self, url: &str) -> bool {
    // 1.
    if url.starts_with("examplescheme:") {
        // 2.
        let rest = url.strip_prefix("examplescheme:").unwrap_or("");
        // 3.
        if rest.starts_with("getUserInfo") {
            // 4.
            write_data_to_view(&UserData {});
            return false; // 5.
        } else {
            return true; // 6.
        }
    }
    true // 7.
}
  1. Check whether the URL uses our custom scheme. (The prefix check identifies the scheme; it does not verify the caller.)
  2. Real source verification logic would go here. For simplicity, we assume all URLs with the "examplescheme:" prefix are trusted, and we strip the prefix to extract the method name.
  3. Check if the method is "getUserInfo".
  4. Write data to the view (assuming UserData is sanitized).
  5. Return false: we did not handle the load ourselves, so the WebView proceeds.
  6. Return true: unknown methods on our scheme are treated as handled, so the load is cancelled.
  7. Default: cancel any load that doesn't match our scheme.

Not the prettiest Rust code, but it gives the general idea.

Instead of slicing the URL at hard-coded indices like the Java substring(0, 14) calls, we use starts_with and strip_prefix, which work safely on strings of any length.

If the Java version receives a URL shorter than 14 characters, substring throws at runtime. In Rust, even raw slicing like &url[0..14] is bounds-checked and panics loudly instead of reading out of range, and the idiomatic prefix methods sidestep the problem entirely. These are the kinds of things people are talking about when they tell you "Rust helps you build good habits."
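
A quick sketch of the difference, using a toy URL:

fn main() {
    let url = "short";

    // Prefix methods handle any length safely:
    assert!(!url.starts_with("examplescheme:"));
    assert_eq!(url.strip_prefix("examplescheme:"), None);

    // Raw slicing, like Java's substring(0, 14), is bounds-checked at runtime.
    // Uncommenting the next line panics instead of silently reading out of range:
    // let _scheme = &url[0..14];
}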

Rust refuses to compile whole classes of mistakes and forces you to think about what your code is doing and why it's doing it.

Looking at Routing Abuse

We can take it another step further.

According to the analysis, web routing and path abuse tied for second with memory corruption issues.

What is web path and routing abuse?

It's a large category to define. One example is MITRE CWE-22:

The product uses external input to construct a pathname that is intended to identify a file or directory that is located underneath a restricted parent directory, but the product does not properly neutralize special elements within the pathname that can cause the pathname to resolve to a location that is outside of the restricted directory.

Many file operations are intended to take place within a restricted directory. By using special elements such as ".." and "/" separators, attackers can escape outside of the restricted location to access files or directories that are elsewhere on the system. One of the most common special elements is the "../" sequence, which in most modern operating systems is interpreted as the parent directory of the current location. This is referred to as relative path traversal. Path traversal also covers the use of absolute pathnames such as "/usr/local/bin", which may also be useful in accessing unexpected files. This is referred to as absolute path traversal.

In many programming languages, the injection of a null byte (the 0 or NUL) may allow an attacker to truncate a generated filename to widen the scope of attack. For example, the product may add ".txt" to any pathname, thus limiting the attacker to text files, but a null injection may effectively remove this restriction.

Web path and routing abuse happens when an attacker manipulates file paths or URLs in a way that allows them to access files or directories outside of the intended area.
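
To make that concrete, here's a small sketch (made-up base directory) of how naive path joining walks right out of the directory you meant to stay in:

use std::path::Path;

fn main() {
    let base = Path::new("/var/app/uploads");

    // Attacker-controlled input containing a traversal sequence:
    let user_input = "../../etc/passwd";

    // join() happily appends the ".." components; the OS resolves them
    // to /etc/passwd the moment the file is opened.
    let resolved = base.join(user_input);
    println!("{}", resolved.display()); // /var/app/uploads/../../etc/passwd
}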

In this example, the path to a dictionary file is read from a system property and used to initialize a File object:

String filename = System.getProperty("com.domain.application.dictionaryFile");  
File dictionaryFile = new File(filename);

However, the path is not validated or modified to prevent it from containing relative or absolute path sequences before creating the File object. This allows anyone who can control the system property to determine what file is used. Ideally, the path should be resolved relative to some kind of application or user home directory.

We can rewrite the code in Rust using PathBuf:

use std::env;
use std::path::PathBuf;

fn main() {
    // Read "com.domain.application.dictionaryFile" from an environment
    // variable, the closest Rust analog to a JVM system property:
    let filename = match env::var("com.domain.application.dictionaryFile") {
        Ok(val) => val,
        Err(_) => {
            eprintln!("Error: Property 'com.domain.application.dictionaryFile' not found");
            return;
        }
    };
    // Create a PathBuf representing the file path
    let dictionary_file = PathBuf::from(filename);

    // You can perform further operations with the `dictionary_file` PathBuf
    println!("Dictionary file path: {:?}", dictionary_file);
}

We use PathBuf so the value is typed as a filesystem path rather than an arbitrary string, and every operation on it goes through path-aware APIs instead of ad-hoc string concatenation.

To be clear, PathBuf alone doesn't stop traversal: you still have to resolve the path and check that it stays under the directory you intended. But the type system steers you toward APIs like canonicalize and starts_with, where that check is natural, as sketched below.
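
A minimal sketch of that check, with a hypothetical safe_resolve helper (note that canonicalize requires the path to exist and resolves symlinks, which is exactly what we want here):

use std::io;
use std::path::{Path, PathBuf};

// Hypothetical helper: resolve `untrusted` against `base` and reject
// anything that escapes it.
fn safe_resolve(base: &Path, untrusted: &str) -> io::Result<PathBuf> {
    let base = base.canonicalize()?;
    // canonicalize() follows symlinks and collapses "..", so the check
    // below sees the real location, not the literal string.
    let candidate = base.join(untrusted).canonicalize()?;
    if candidate.starts_with(&base) {
        Ok(candidate)
    } else {
        Err(io::Error::new(
            io::ErrorKind::PermissionDenied,
            "path escapes the allowed directory",
        ))
    }
}

fn main() -> io::Result<()> {
    let dict = safe_resolve(Path::new("./data"), "dictionary.txt")?;
    println!("Dictionary file path: {:?}", dict);
    Ok(())
}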

The hero we need

Rust has inherent qualities that make it safer to use than the average language.

Rust may not save us, but the ideas it embodies will.

  • Private by default
  • Immutable by default
  • Type-safety checked at compile time
  • Borrow checker and ownership model reducing memory corruption
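
Three of those four fit in a few lines (the borrow checker deserves a post of its own):

use std::path::PathBuf;

fn helper() {} // private by default: invisible outside this module

fn main() {
    let name = String::from("prod-db");
    // name.push_str("-2"); // error[E0596]: cannot borrow `name` as mutable

    let path = PathBuf::from("/etc/app");
    // let n: i32 = path;   // error[E0308]: type mismatch caught at compile time

    helper();
    println!("{} {:?}", name, path);
}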

Rust doesn't rely on the developer to get every detail right. It lifts that responsibility off their shoulders so they can focus more on building and less on safety and correctness.

Imagine using a language that prevents these kinds of vulnerabilities.

Where values are immutable by default, functions are private unless deliberately exposed, and types are checked at compile time.

Why don't we expect this from other languages?

Why do we talk about programming languages as if there's no way to improve their inherent security as well?

Safer programming languages belong right alongside all the recommendations Horizon made.

We should expect all our languages to be safer.
