Today we can guarantee (barring compiler bugs) memory safety in safe Rust, but not in unsafe Rust.
Not quite. An unsafe function can export its lack of safety to other code. If an unsafe function can be called with parameters which make it violate memory safety, it opens a hole in memory safety.
I assume OP's point is that unsoundness or UB in an unsafe block is not contained within that block, but can taint safe code anywhere in the program. Which is true.
Yes, it's true, but that wasn't what Animats said. And even if that was Animats' point, it doesn't invalidate the GP. The key value proposition of Rust isn't that "you'll never have memory safety bugs," but rather, that if your unsafe code is correct and sound, then safe Rust will be free of memory safety bugs. The important bit is the implication that grants one the power to make that sort of reasoning. It's important precisely because it limits the places in your code that need to be audited.
This is indeed a very powerful property. However, I have a question: is it a regular practice to ensure that code marked unsafe is actually safe for whichever parameters it receives? Or could the safety of some unsafe code depend on the way its safe wrapper is called?
If you have an unsafe function---that is, a function that is unsafe to call---then it is common practice to document the precise preconditions that the caller must uphold in order to ensure safety. This may indeed include passing a correct parameter. For example, the preconditions of the slice method `get_unchecked` require the caller to verify that the index provided is in bounds, otherwise the behavior is UB.
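To make the `get_unchecked` example concrete, here's a minimal sketch contrasting the checked and unchecked slice accessors; the `SAFETY` comment documenting why the precondition holds is the conventional practice in Rust code:

```rust
fn main() {
    let v = [10, 20, 30];
    let i = 1;

    // Safe, checked access: returns an Option, never UB.
    assert_eq!(v.get(i), Some(&20));
    assert_eq!(v.get(99), None);

    // Unchecked access: the caller must uphold the documented
    // precondition `i < v.len()`, otherwise the behavior is undefined.
    let x = unsafe {
        // SAFETY: `i` is 1 and `v.len()` is 3, so the index is in bounds.
        *v.get_unchecked(i)
    };
    assert_eq!(x, 20);
}
```

Calling `v.get_unchecked(99)` here would compile without complaint; the bounds obligation has moved from the library to the caller, which is exactly what the `unsafe` keyword signals.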
If you have a safe function that uses unsafe internally, then all possible invocations of that function should be safe. If this isn't true, then we call those sorts of APIs unsound and they are strongly discouraged. David Tolnay wrote a great blog post about it: https://docs.rs/dtolnay/0.0.7/dtolnay/macro._03__soundness_b...
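A hypothetical pair of functions (names are mine) illustrating the distinction: the first is what the post above calls unsound, because its safe signature doesn't actually hold for all inputs; the second checks the precondition internally, so every possible invocation is safe:

```rust
// UNSOUND: safe to call according to its signature, but an
// out-of-bounds `i` is UB. The unsafe block's obligation leaks
// through a safe interface.
fn byte_at_bad(data: &[u8], i: usize) -> u8 {
    unsafe { *data.get_unchecked(i) } // no bounds check!
}

// Sound: the precondition is verified before the unsafe block,
// so the safe signature is honest for every possible input.
fn byte_at_ok(data: &[u8], i: usize) -> Option<u8> {
    if i < data.len() {
        // SAFETY: we just checked that `i` is in bounds.
        Some(unsafe { *data.get_unchecked(i) })
    } else {
        None
    }
}

fn main() {
    let data = [1u8, 2, 3];
    assert_eq!(byte_at_ok(&data, 0), Some(1));
    assert_eq!(byte_at_ok(&data, 9), None);
    // byte_at_bad(&data, 9) would compile fine -- and be UB.
    let _ = byte_at_bad(&data, 0);
}
```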
The rest of your program blindly believes that the interface of your "trusted set" is safe, because you took on that responsibility when you marked it unsafe.
If you have a memory safety bug in that interface, then you can taint the rest of the program's memory safety as well, correct?
This may hang or even crash (malloc in the child), but there's no sense in which the unsafe block is "incorrect." Some unsafe code has unavoidable implications for the entire program.
Indeed, some uses of unsafe aren't meaningfully factorable. File backed memory maps or shared memory with other processes are other important real world examples that are difficult or impossible to meaningfully factor in the context of a single process. But I still think that saying "safety in Rust is factorable" is accurate beyond a mere first approximation. Figuring out how to encapsulate unsafety is, in my experience, the essential creative aspect to using unsafe at all. Because if safety wasn't something you could encapsulate, then Rust really wouldn't be what it is. (I have spent many long hours thinking about how and to what extent the safety of file backed memory maps could be encapsulated. Which is to say, factoring safety is really the ultimate goal, even if certain things remain beyond that goal.)
I’m unsure what you’re responding to. As I understood it, the comment took issue with saying that you can make guarantees about safe Rust but not unsafe Rust because unsafe Rust is not verified in that way and opens the door to violating guarantees in your “safe” code if your bugs leak out.
The function containing the unsafe block can be called from safe code. So unsafe functionality can be exported. Said unsafe functionality might not be safe for all possible inputs. That's a classic hole, as with APIs that can be exploited.
Then that's a "function containing an unsafe block," not an "unsafe function." In any case, this is still a misunderstanding of what staticassertion said. I elaborated more here: https://news.ycombinator.com/item?id=24028359
My bigger point here is that you're misunderstanding the advocated Rust value proposition. The value proposition isn't literally "Rust will forever and always eliminate all memory safety bugs in safe Rust." That's silly and no serious person with any credibility would double down on that claim.
Yes but it's easy to take this too far and conclude that e.g. Javascript is not memory safe because browsers are written in C++ and they have to interface with the kernel which is written in C. At some point you simply need to trust that the current implementation is correct and bug free. This is also a problem with formal verification. What verifies the verification?
Yes, of course you can. But that's not what Animats said. Animats specifically said "unsafe function," which means you need to use the unsafe annotation to call it. See my other reply for more elaboration: https://news.ycombinator.com/item?id=24028359
Yep. For example, it's easy to create an intentional memory leak in Rust using a safe API, which is unsafe. However, it's also easy to find such a memory leak. Rust cannot stop you from doing weird things when you want to, but it can help you prevent, or quickly find, weird things when you don't want them in your code.
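For what it's worth, leaking memory is indeed possible in entirely safe Rust (the language considers leaks safe, if undesirable). Two common routes are an `Rc` reference cycle and the explicit `Box::leak`; a minimal sketch:

```rust
use std::cell::RefCell;
use std::rc::Rc;

// A node in a linked structure. An Rc cycle is never freed,
// so building one leaks using only safe code.
struct Node {
    next: RefCell<Option<Rc<Node>>>,
}

fn main() {
    let a = Rc::new(Node { next: RefCell::new(None) });
    let b = Rc::new(Node { next: RefCell::new(Some(a.clone())) });
    // Close the cycle: a -> b -> a. The strong counts can never
    // reach zero, so both allocations leak when a and b drop.
    *a.next.borrow_mut() = Some(b.clone());
    assert_eq!(Rc::strong_count(&a), 2);

    // The simplest safe "leak" of all: deliberately give up
    // ownership and get a 'static reference back.
    let leaked: &'static mut i32 = Box::leak(Box::new(42));
    assert_eq!(*leaked, 42);
}
```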
> However, it's also easy to find such memory leak.
How? By manually inspecting all code in the project and its dependencies marked "unsafe"? That approach doesn't scale past a "hello world" level of complexity.
I don't code in Rust; I use C# for the same purpose. I remember a couple of times when I spent hours debugging weird crashes caused by stupid bugs in totally unrelated unsafe C# code.
It's much easier to find memory leaks or native memory corruption in C++ than in the unsafe subset of a memory-safe language. C++ has lots of runtime support for that (especially in debug builds) and many great tools, both in the compilers and external ones. Unsafe C# has none of them.
Simpler tools, like the debug C heap and checked iterators in the C++ standard library, catch 95% of memory corruption issues in C++. They are enabled by default in debug builds; very often, just pressing F5 in Visual Studio finds the buggy line of code in a matter of seconds.
Valgrind is Linux-only, so I don't have it. One Windows equivalent is the memory profiler under Debug / Performance Profiler / Memory Usage in Visual Studio, but Rust is not supported by Visual Studio. A cross-platform equivalent is Intel VTune Profiler, which has no support for Rust either.
I mean, if you’re talking about checked iterators... that’s checked in Rust too.
I am 100% Windows, but don’t triage these issues often enough to suggest tools; the unsafe code I write tends to be small and pretty obvious. (More like “write to this register” than “here’s a complex data structure.”)
In safe Rust. Safe C# is the same: the language doesn't even have raw pointers unless you compile with /unsafe and write the code in unsafe blocks.
> More like “write to this register” than “here’s a complex data structure.”
I sometimes write C++ DLLs precisely to implement these complex data structures. Modern CPUs have a progressively worse ratio of RAM latency to compute speed, which means you need full control over the memory layout of data on the performance-critical paths of your code, or it will be slow.
They’re checked in unsafe too. You only skip those checks if you use a specific “do this index with no checks” function. Unsafe does not change the semantics of your code; it only gives you more options.
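A small sketch of this point: ordinary indexing keeps its bounds check inside an unsafe block and still panics on out-of-range access, rather than becoming UB (catching the panic here is just for demonstration):

```rust
fn main() {
    let v = vec![1, 2, 3];
    let i = 5;

    // Even inside an unsafe block, `v[i]` has the same semantics
    // as in safe code: it is bounds-checked and panics out of range.
    let result = std::panic::catch_unwind(|| unsafe { v[i] });
    assert!(result.is_err());

    // UB only enters the picture if you explicitly opt in with
    // something like `unsafe { *v.get_unchecked(i) }`.
}
```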
Yes, and many people write that kind of code in Rust too, and use tools to help them debug it. I’m just saying that it’s not an issue for the kinds of code I write, so I can’t personally recommend tooling. I know "use GDB" isn't a great response to a Windows user, even if it is what I end up personally doing.
(It's true that I can't get the performance tools working, though, but given that I've used VS for all of 20 minutes... I'm also very interested to see if support happens natively, given how much interest there is in Rust inside Microsoft right now.)
However you put it, tooling support is way, way better for C and C++ for obvious reasons.
Other critical bits for commercial development in many industries (certs, standards, third-party support, official bindings...) are also completely lacking.
That is not a mark against Rust; it is just what happens until things get popular enough.