
Rust For Production

By Christina Perez
• Saturday, 09 January, 2021
• 7 min read

Rust’s main benefit is that it enables C-like performance while still keeping the memory safety we are used to when developing with languages like JavaScript and Python. In this article, I will look at nine large companies that use Rust and delve into the reasons for their choice.



More than performance, its ergonomics and focus on correctness have helped us tame sync’s complexity. Although C is the default language for low-level, full-control programming, binaries like these have strict security and correctness requirements.

One of Rust’s common selling points is complete immunity to certain classes of security vulnerabilities thanks to its powerful type system, making it an excellent choice for security-critical functions. Figma is a collaborative web-based design tool for vector graphics and interface prototyping.

They chose to rewrite their multiplayer syncing engine in Rust (previously, it was in TypeScript) to improve performance since their server couldn’t keep up with user growth. We chose Rust for this rewrite because it combines best-in-class speed with low resource usage while still offering the safety of standard server languages.

Low resource usage was particularly important to us because some performance issues with the old server were caused by the garbage collector. They rejected technologies such as C and C++ since they didn’t trust themselves to be able to handle memory management for a web-exposed service.

When a service can be deploy-and-forget, that saves valuable operations time and lets them focus on other issues. For the last 12 years, around 70 percent of the CVEs (Common Vulnerabilities and Exposures) discovered at Microsoft have been connected with memory safety.


Microsoft has tried various options to solve this issue, such as extensive developer training and static analysis tools. Facebook used Rust to rewrite its source control backend, which was written in Python.

They were looking for a compiled language to rewrite it in and were attracted to Rust because of its safety benefits. As the reasons for adoption, they mention the huge cost of bugs for Facebook and the ease of the compiler feedback loop, in contrast to static analysis and code reviews.

In addition, the company openly supports and sponsors the development of the language and its ecosystem. In this case, Rust enabled them to speed up their existing Elixir codebase while keeping everything memory safe.

While the Go version of the service was fast enough most of the time, it sometimes had large latency spikes due to Go’s memory model and garbage collector. To solve that, Discord switched to Rust, whose ownership-based memory management removes the need for a garbage collector.

For example, its type safety and borrow checker make it very easy to refactor code as product requirements change or as the team learns more about the language. Teams reach for it when they need extra performance but want to avoid the memory issues associated with C.


But Rust has far more benefits: it makes lower-level programming more accessible, has excellent support for WebAssembly (Wasm), and is fantastic for concurrency. In the future, expect Rust usage to increase as more and more companies discover how it can improve their codebases.

In the meantime, follow us on social media like Twitter and Medium to see more posts about Rust and the many other programming languages we use in our daily work. In particular, Serde was available well before Rust 1.0.0 was released (though the derive macro was unstable until 1.15.0).

Those can offer good compile-time and runtime performance, but they lock the data into their respective protocols, often with implementations available in other languages. In this guide, we’ll zoom in on both kinds of frameworks, considering API usability and performance.

Serde, the incumbent serialization/deserialization library, is elegant, flexible, fast to run, and slow to compile. You rarely need to implement those traits manually, since Serde comes with powerful derive macros that allow many tweaks to the output format.

For example, you can rename fields, define defaults for missing values on deserialization, and more. Once you have derived or implemented those traits, you can use the format crates, such as serde_json or bincode, to (de)serialize your data.
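
To make that concrete, here is a minimal sketch of the derive-based workflow, assuming serde (with its derive feature) and serde_json as dependencies; the Player struct and its field names are invented for illustration:

    use serde::{Deserialize, Serialize};

    #[derive(Serialize, Deserialize, Debug)]
    struct Player {
        // Rename the field in the serialized output.
        #[serde(rename = "playerName")]
        name: String,
        // Fall back to Default::default() (0 here) when the field is missing.
        #[serde(default)]
        score: u32,
    }

    fn main() -> Result<(), serde_json::Error> {
        let p = Player { name: "Ferris".into(), score: 42 };

        // Serialize to a JSON string: {"playerName":"Ferris","score":42}
        let json = serde_json::to_string(&p)?;
        println!("{json}");

        // Deserialize; `score` falls back to 0 because it is missing here.
        let q: Player = serde_json::from_str(r#"{"playerName":"Rustacean"}"#)?;
        println!("{q:?}");
        Ok(())
    }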


This means reducing the problem of serializing N data structures to M formats from M × N to M + N. Because Serde relies heavily on monomorphisation to facilitate great performance despite its famous flexibility, compile time has been an issue from the beginning.

To counter this, multiple crates have appeared, from miniserde to tinyserde to nanoserde. The idea behind these tools is to use runtime dispatch to reduce the code bloat caused by monomorphisation.
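
As a rough sketch of what that runtime-dispatch style looks like in practice, here is miniserde’s JSON API (as I understand it) with a made-up Point struct; note that its derives are deliberately less flexible than Serde’s:

    use miniserde::{json, Deserialize, Serialize};

    #[derive(Serialize, Deserialize, Debug)]
    struct Point {
        x: i64,
        y: i64,
    }

    fn main() {
        let p = Point { x: 3, y: 4 };

        // to_string is infallible in miniserde and returns a plain String.
        let s = json::to_string(&p);
        println!("{s}");

        // from_str returns a Result, since parsing can fail.
        let back: Point = json::from_str(&s).expect("valid JSON");
        println!("{back:?}");
    }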

The serde_json crate allows serialization to and deserialization from JSON, which is plain text and thus (somewhat) readable, at the cost of some overhead during parsing and formatting. Serializing can be done with to_string, to_vec, or to_writer, with _pretty variants to write out nicely formatted instead of minified JSON.
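
Here is a short sketch of those entry points side by side; the Config struct is made up, and the writer is just an in-memory Vec<u8> standing in for a file or socket:

    use serde::Serialize;

    #[derive(Serialize)]
    struct Config {
        name: String,
        retries: u8,
    }

    fn main() -> Result<(), Box<dyn std::error::Error>> {
        let cfg = Config { name: "demo".into(), retries: 3 };

        let minified = serde_json::to_string(&cfg)?; // compact JSON string
        let pretty = serde_json::to_string_pretty(&cfg)?; // human-readable JSON
        let bytes: Vec<u8> = serde_json::to_vec(&cfg)?; // raw bytes

        // to_writer streams into anything that implements io::Write.
        let mut out = Vec::new();
        serde_json::to_writer(&mut out, &cfg)?;
        assert_eq!(out, bytes);

        println!("{minified}\n{pretty}\n{} bytes", bytes.len());
        Ok(())
    }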

Serializing TowerData takes about a quarter of a microsecond on my machine. The overhead will vary depending on the serialized types and values.

This is another textual format with multiple language bindings like JSON, but it’s very idiosyncratic. It’s a wee bit terser than JSON at 91 bytes, but slower to work with.


Like serde_json, bincode also works with Serde to serialize or deserialize any types that implement the respective traits. Because Write is implemented for a good deal of types (notably &mut Vec<u8>, File, and TcpStream), this simple API gives us plenty of bang for the buck.
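
A hedged sketch of that usage, assuming the bincode 1.x API; the Message struct is invented, and a Vec<u8> stands in for a File or TcpStream as the writer:

    use serde::{Deserialize, Serialize};

    #[derive(Serialize, Deserialize, Debug, PartialEq)]
    struct Message {
        id: u64,
        payload: String,
    }

    fn main() -> Result<(), bincode::Error> {
        let msg = Message { id: 7, payload: "hello".into() };

        // serialize_into accepts any io::Write implementor.
        let mut buf: Vec<u8> = Vec::new();
        bincode::serialize_into(&mut buf, &msg)?;

        // serialize is the convenience variant that allocates the Vec for you.
        let bytes = bincode::serialize(&msg)?;
        assert_eq!(buf, bytes);

        // And back again from the byte slice.
        let decoded: Message = bincode::deserialize(&bytes)?;
        assert_eq!(decoded, msg);
        println!("{} bytes on the wire", bytes.len());
        Ok(())
    }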

A small downside of all this generosity is that when you get the arguments wrong, the type errors may be confusing. Serializing took roughly 35 nanoseconds and deserializing a bit less than an eighth of a microsecond.

It prides itself on being very terse on the wire, which our benchmark case validated: the data serialized to just 24 bytes. The Rust implementation of the MessagePack protocol is called RMP, which also works with Serde.
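
For reference, a small sketch of how this typically looks with the rmp-serde crate (to_vec and from_slice are the calls I believe it exposes); the Reading struct is made up for illustration:

    use serde::{Deserialize, Serialize};

    #[derive(Serialize, Deserialize, Debug, PartialEq)]
    struct Reading {
        sensor: String,
        value: f64,
    }

    fn main() -> Result<(), Box<dyn std::error::Error>> {
        let r = Reading { sensor: "temp".into(), value: 21.5 };

        // Serialize to a compact MessagePack byte vector.
        let bytes = rmp_serde::to_vec(&r)?;
        println!("{} bytes", bytes.len());

        // And back again.
        let decoded: Reading = rmp_serde::from_slice(&bytes)?;
        assert_eq!(decoded, r);
        Ok(())
    }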

Its thriftiness when it comes to space comes with a small performance overhead compared to bincode. Note that this will only construct a serde_json::Value, which is pretty fast (to the tune of only a few nanoseconds), but not exactly a serialized object.

Serialization was speedy enough at roughly 140 nanoseconds, but deserialization was, unexpectedly, slower at almost half a millisecond. At 41 bytes, it’s a good compromise between size and speed, because at 60 nanoseconds to serialize and 180 nanoseconds to deserialize, it’s roughly 1.5x slower than bincode, at roughly 70 percent of the message size.


The relatively fast serialization and the thrifty format are a natural fit for embedded systems. MessagePack might overtax the embedded CPU, whereas we often have a beefier machine to deserialize the data.

For a freely chosen polyglot format, both JSON and MessagePack best it in every respect. From Google comes a polyglot serialization format with bindings to C, C++, Java, C#, Go, Lua, and Rust, among others.

FlatBuffers appears to lack a direct representation of pointer-sized integers (e.g., usize) or of Ranges, so in this example, I just picked uint64 and an array of length 2 to represent them. Compiling this to Rust code requires the flatc compiler, which is available as a Windows binary.

Obviously, this is not our original data structure, but for the sake of comparability, we’ll benchmark serialization and deserialization via this slightly modified type. After we published this post, people asked why I left out Cap’n Proto, so I added it to the benchmark.

It works similarly to FlatBuffers, but the interface is somewhat impenetrable, so I cannot guarantee the results. UPDATE, Sept. 25, 2020: One of the Cap’n Proto crate maintainers sent a PR my way that showed I did do something wrong: I used a nested struct to represent an Option, where using an optional enum (like the ones in my optional crate) would be a better fit.


Using it for anything but benchmarking to measure the maximum theoretical performance of serialization and deserialization is downright inadvisable. What it really does is basically a memcpy, plus fixing up the occasional pointer so it can handle things like Vec and String.

Even then, JSON is the fastest of the three readable formats, which makes sense since it has seen wide industry usage and benefits from SIMD optimizations, especially with the simd-json crate. If you find problems or improvements, feel free to send me an issue or PR.

If you’re interested in monitoring and tracking performance of your Rust apps, automatically surfacing errors, and tracking slow network requests and load time, try LogRocket. Instead of guessing why problems happen, you can aggregate and report on what state your application was in when an issue occurred.
