Undefined Behavior *Can* Format Your Drive

My last post mentioned the ‘standard’ risks of undefined behavior such as having your hard drive formatted or having nethack launched. I even added my own alliterative risk – singing sea shanties in Spanish.

The list of consequences bothered some people, who said that any compiler that would intentionally punish its users in such ways should never be used.

That’s true, but it misses the point. Undefined behavior genuinely can lead to these consequences, and I don’t know of any C/C++ compiler that can save you. Follow Apple’s buggy security guidance, for instance, and your customers’ hard drives could end up being formatted.

As of May 19th, one month after my report, I see that Apple’s security guidance has not been fixed.

With the exception of launching nethack, the problem is not that the compiler will mischievously wreak havoc. The problem is that undefined behavior means that you and the compiler have lost control of your program and, due to bad luck or malicious hackers, arbitrary code execution may ensue.

See this article for a great explanation of undefined behavior in C/C++.

Arbitrary code execution is arbitrarily bad

If you read security notices then the phrase ‘arbitrary code execution’ comes up a lot. This means that there is a way to exploit a bug such that the attacker can take control of the target machine. It may be because of a design flaw in the software, but in many cases it is undefined behavior that opens the door to chaos.

Here are some ways that undefined behavior can format your hard drive:

Buffer overruns

The ‘classic’ use of undefined behavior to execute arbitrary code is a buffer overrun. If you can cause a program to overrun a stack-based buffer with data that you supply then, in some cases, you can overwrite the return address so that the function returns to your buffer and executes your payload. The only reason these payloads don’t normally format hard drives is that there’s no money in it – encrypting hard drives for ransom or mining bitcoins are more profitable tasks.
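
Here is a minimal sketch of the kind of code that creates this risk (the function name and the 64-byte buffer are hypothetical, purely for illustration):

#include <string.h>

// If 'input' is longer than 63 characters then strcpy writes past the end of
// 'buffer'. On a traditional stack layout those extra bytes can overwrite the
// saved return address, so carefully crafted input can redirect execution
// into attacker-supplied data. The overrun itself is undefined behavior.
void CopyName(const char* input)
{
    char buffer[64];
    strcpy(buffer, input); // no length check - the classic stack buffer overrun
}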

Tools such as metasploit can be used to create exploits – here’s a handy tutorial – but creating one from scratch is more educational and I recommend trying it.

Microsoft has several compiler/linker/OS features that help to make buffer overruns more difficult to exploit. These include:

  • /GS (add code to check for stack buffer overruns)
  • /NXCOMPAT (make the stack and heap non-executable)
  • /DYNAMICBASE (move stacks, modules, and heaps to randomized addresses, aka ASLR)
  • /SAFESEH (don’t use unregistered exception handlers)

Using these switches (gcc has equivalents) is crucial to reducing your risk, but even with all of these defenses a determined attacker may be able to use a buffer overrun to execute arbitrary code. Return Oriented Programming defeats /NXCOMPAT and there are various ways to defeat /DYNAMICBASE. Even with significant assistance from the compiler and OS a buffer overrun in C/C++ is fundamentally undefined.

Note that only /GS has any runtime cost, so use these switches!

Buffer overruns in the heap or data segment can also be exploitable. Although it seems improbable, it has been shown that in some cases a buffer overrun of a single byte with a zero can be exploitable.

Use after free

Use-after-free seems to be the hot new way to use undefined behavior to execute arbitrary code. Malicious JavaScript can manipulate the heap to get a freed object and a new object to overlap. Writes to the new object then show up in the old object, and arbitrary code execution may follow.
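
Here is a contrived sketch of the pattern (names are hypothetical; the allocator is not guaranteed to reuse the freed block, and arranging that reuse is exactly what the heap manipulation is for):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    char* victim = (char*)malloc(64);
    strcpy(victim, "original object");
    free(victim);                        // 'victim' is now a dangling pointer

    char* attacker = (char*)malloc(64);  // the allocator may hand back the same block
    strcpy(attacker, "attacker-controlled data");

    printf("%s\n", victim);              // use-after-free: undefined behavior - it may
                                         // print the attacker's data, crash, or worse
    free(attacker);
    return 0;
}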

Use-after-free and buffer overruns are really just variations on reading/writing where you ought not to, and this is very bad (see: Code Red, Blaster, and Heartbleed).

Integer overflow and underflow

Signed integer overflow is undefined behavior, which lets some compilers – notably gcc and clang – remove checks that they can prove are unreachable under that assumption. This can then expose code to otherwise impossible bugs. Similar optimizations can happen because of misplaced NULL checks (dereferencing a pointer before checking it for NULL).
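
Here are two hypothetical examples of checks that a compiler is allowed to delete, because reaching them would require undefined behavior to have already occurred:

// Because signed overflow is undefined, the compiler may assume 'len + 100'
// never wraps, conclude that this condition can never be true, and remove
// the check entirely.
int IsSafeLength(int len)
{
    if (len + 100 < len)   // intended overflow check - may be optimized away
        return 0;
    return 1;
}

// Dereferencing 'p' before testing it lets the compiler assume that 'p' is
// not NULL, so the later check may also be removed.
int GetValue(int* p)
{
    int value = *p;        // undefined if p is NULL...
    if (!p)                // ...so this check can be deleted
        return 0;
    return value;
}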

I think of these optimizations like this: If I write code that assigns a constant to a variable and then checks to see if that variable has that value then I expect the compiler to optimize away the check. As an example, here is some tautological code:

#include <stdio.h>

void OptimizeAway(int parameter)
{
    bool flag = true;
    int* p = &parameter;
    if (flag && p)               // both tests are provably true
        printf("It's %d\n", *p);
}

A good compiler should optimize away both of the checks because they are provably unnecessary, and this is the same thing that happens when compilers use undefined behavior to prune their state tree. The compiler ‘knows’ that signed integer overflow cannot happen, and generating code to handle that possibility is equivalent to generating code to handle false being equal to true.

In this simple example the compiler could warn about the tautology, but in more complex cases it may not be practical. Or it may not be desirable, if the tautology only happens in some variations of a template function.

Coding errors

Apple’s current secure coding guidance can trigger undefined behavior, but in a way such that the undefined behavior is unlikely to cause problems. But still, undefined behavior in a security guidance document? Sloppy.

The larger problem with Apple’s ‘fixed’ code is that it is completely broken on 64-bit platforms, even if signed integer math is defined and even if they had used unsigned math. If you compile for a 64-bit platform that has 32-bit integers (i.e., any modern platform) then Apple’s overflow checks will not catch anything. All pairs of positive numbers will pass the checks, regardless of whether they overflow a 32-bit integer. This allows for massive buffer overruns, execution of arbitrary code, and other things generally regarded as ‘bad’.
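
Here is a sketch of the failure mode, with hypothetical names and a check of roughly the shape Apple recommends (illustrative, not their exact code):

#include <stdint.h>
#include <stdlib.h>

// On a 64-bit platform SIZE_MAX is 2^64-1, so with 32-bit ints the guard
// 'SIZE_MAX / n >= m' holds for every pair of positive values; it rejects
// nothing. Meanwhile 'n * m' is evaluated as 32-bit signed arithmetic, which
// can overflow (undefined behavior) and, if it wraps, request a far smaller
// allocation than the caller expects.
void* AllocateGrid(int n, int m)
{
    if (n > 0 && m > 0 && SIZE_MAX / n >= m)  // always passes for positive 32-bit n and m
        return malloc(n * m);                 // 32-bit multiply can overflow before the call
    return NULL;
}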

Is undefined behavior a good idea?

A full discussion of the benefits and costs of undefined behavior is above my pay grade, but I’m going to briefly opine anyway.

There are some types of undefined behavior that must stay that way. C/C++ cannot guarantee that out-of-bounds array accesses will always be caught, so their potential to corrupt memory or reveal secrets is unavoidable. Similar reasoning applies to use-after-free – there is no practical way to define this, so it must remain undefined. This isn’t a matter of compilers being malicious, this is just the C/C++ object model and performance expectations running headlong into the halting problem. The articles here and here discuss confusion about whether illegal memory accesses will cause a crash, and the short answer is ‘maybe’. C and C++ are not safe languages, and never will be.

Some constructs could be defined but are so ugly that they should not be. A classic variant that makes my eyes bleed and that should not be defined is:

i = ++i + p[++i];

Define this

Other types of undefined behavior could be defined, and many of them should be.

NULL pointer dereferences should be defined to crash the program. This requires ensuring that the pages at and before zero never get mapped. Unfortunately both Windows and Linux have historically done an imperfect job of guaranteeing that address zero is not mapped, but this is manageable.

Integer overflow should be defined as two’s-complement arithmetic (making -fwrapv the default in gcc/clang) or should be implementation defined. This does prevent some loop optimizations (see this article) but at some point we may need to decide that getting the right answer is more important than getting it fast. There should be ways to enable those optimizations without holding millions of lines of code hostage.
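
As a sketch of the kind of loop those optimizations target (hypothetical code, 32-bit int assumed):

// Because signed overflow is undefined, the compiler may assume this loop runs
// exactly 'limit + 1' times and may widen 'i' to a 64-bit index for 'data'.
// If overflow wrapped (as with -fwrapv) then limit == INT_MAX would make the
// loop infinite, and the generated code would have to allow for that.
void ZeroInclusive(float* data, int limit)
{
    for (int i = 0; i <= limit; ++i)
        data[i] = 0.0f;
}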

Excessive shifting of integral types could also easily be made implementation defined.

Conversion of out-of-range floats to int is currently undefined and could be made implementation defined.
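
Two small examples of the cases just mentioned, assuming a 32-bit int (both are currently undefined):

int ShiftTooFar(int x)
{
    return x << 40;    // shift count >= the width of int: undefined
}

int HugeFloatToInt(void)
{
    double d = 1.0e30;
    return (int)d;     // value is far outside the range of int: undefined
}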

In general I think that the C/C++ standards should reserve the undefined hammer for those cases that cannot be defined, instead of using it as a dumping ground for ambiguity. Visual C++ already behaves this way, but portable code cannot depend on this.

The standards committee for C++ is considering defining the behavior of left shifting a one into the sign bit, which seems like a move in the right direction.

Defining more behavior is no panacea however. In this article Larry Osterman discusses how the shift operator behaved differently for 32-bit and 64-bit processes. If the behavior of shift was implementation defined then it might still be defined differently for 32-bit and 64-bit processes, and his issue would probably still have occurred. However, the bugs caused by implementation defined behavior tend to be far more understandable than those caused by undefined behavior.

Know the standard

My uninformed opinions about undefined behavior don’t change the reality on the ground so I would recommend that all C/C++ developers read and understand this classic article on undefined behavior:

http://blog.llvm.org/2011/05/what-every-c-programmer-should-know_14.html

then read the thirteen pages of undefined behavior listed in section J.2 of the C standard, located here, and then, until Apple secures their security guide, consider this alternative guidance:

The CERT® C Coding Standard, Second Edition

or this scary read:

Dangerous Optimizations and the Loss of Causality

I’m not sure I agree with page 53 of Dangerous Optimizations, though. It says that the compiler is allowed to implement “a < b” by subtracting and checking whether the result is negative. However, checking for a negative result is not sufficient because the subtraction can overflow, and, despite the document’s claims, the compiler cannot hide behind undefined behavior here because ‘<’ is defined for all values.
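
A quick worked example of why subtract-and-test-the-sign gives the wrong answer:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    int a = INT_MIN, b = 1;
    printf("%d\n", a < b);  // prints 1: '<' is defined (and correct) for all values
    // '(a - b) < 0' is not a valid substitute: the subtraction overflows (which is
    // itself undefined), and on typical wrapping hardware it would produce a
    // positive number, so the sign test would wrongly report that a >= b.
    return 0;
}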

16 Responses to Undefined Behavior *Can* Format Your Drive

  1. Pingback: Buggy Security Guidance from Apple | Random ASCII

  2. Félix Cloutier says:

    You cannot define null dereferences to crash, because it would mean the C/C++ standards would interfere with the platform’s definition of virtual memory and force its use. Some platforms, especially embedded platforms (on which most development happens in C or C++), either don’t have virtual memory or don’t enable it, and some of them even have things physically at address 0. This would also be a problem with kernel development.

    This issue is better left implementation defined.

    • brucedawson says:

      Implementation defined is good.

      NULL dereferences are also tricky because there’s no real limit to the offset. A larger structure or array could easily have offsets larger than 64 KiB, and Windows at least allows allocating memory at address 64 KiB.
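
      A sketch with a hypothetical oversized structure:

      // With p == NULL, 'p->last' is an access at an offset of roughly 128 KiB,
      // well past any guard region at address zero, and possibly into memory
      // that is actually mapped, so it might not crash.
      struct Huge
      {
          char padding[128 * 1024];
          int last;
      };

      int ReadLast(struct Huge* p)
      {
          return p->last;
      }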

    • Daniel Neve says:

      How do you cope with 0 being a valid address? How would you know if a pointer is valid?

      Either you have a level of indirection (virtual address space) or you no longer have a way of identifying null pointers. (Or I’ve misunderstood the situation you’re talking about)

      • Félix Cloutier says:

        You can still place something at address 0 and give yourself a policy of never accessing it voluntarily, it’s just not going to crash if you do.

        Since I smell disbelief, here are a few examples: Atmel’s EVK1100 will map your program at address 0 (it’s likely to be the case of many other embedded boards, I just don’t really know any other); the Playstation 1 mapped its BIOS at address 0 through 64K; a lot of x86 BIOSes will (initially) map the interrupt vector table at address 0. I’m sure I could find more, and more recent examples, but hopefully you get the idea.

        • Samuel Bronson says:

          Is there some BIOS that manages to put the interrupt table somewhere else? If so, how did they manage to relocate it in real-mode? (Not counting the strange states that x86 CPUs tend to start up in, where they’re running in “real mode” but the hidden parts of the segmentation registers contain values which are otherwise unattainable in real mode.)

  3. Rich says:

    My favorite case of undefined behavior [presumably] causing your hard drive to get formatted was Diablo II under Wine – if you were foolish enough to run it as someone with write access to your boot drive, on run, it would nuke your bootloader, rendering your machine unbootable next time you rebooted…”oops”.

  4. John Payson says:

    What is needed is a recognition of compilers and execution platforms as separate entities. In many cases, a compiler and execution platform will be under the control of separate entities; if an execution platform can’t guarantee the consequences of an action, a compiler may not be able to offer any guarantee unless it adds extra code to prevent it from reaching the hardware, and the authors of the C Standard quite reasonably didn’t want to require that. On the other hand, there should be no difficulty requiring that compilers define most things as a possibly-nondeterministic choice among various documented alternatives [e.g. if int is 32 bits and long is 64, and if y happens to be INT_MAX, then “int x=y+1; long l1=x; printf(“%ld”, l1); long l2=x;” might set x to a non-deterministic superposition of -2,147,483,648 and +2,147,483,648, so l1 might get one of those values and l2 might get the other]. The behavioral model may be more complex to describe than one that imposes no requirements in such cases, but it could allow some kinds of code to be written much more efficiently.

    Among other things, it’s common for a function’s duties to include the computation of a value that may or may not be meaningful, for a caller that may or may not need it. If overflows won’t occur in any cases where a value would be used by the caller, code that could ignore such overflows might be more efficient than code which had to prevent them. The fact that the values produced by overflow might behave non-deterministically would be a non-issue if they end up getting ignored anyway.

    BTW, I think another thing that would help would be a clarification of what permission to assume X means. I would like to see the Standard clarify that such permission would entitle a compiler to perform any action which X would cause to be legitimate, but would not entitle it to perform actions which would require to be legitimate [i.e. actions which could only be illegitimate in cases where X couldn’t possibly be true]. In non-causal universes like theorem spaces both statements would be equivalent, but in a causal universe they are not.

    • brucedawson says:

      I think that that separation already exists. The concept in the standard is “implementation defined”. I agree that the C/C++ standard should move many of the undefined behaviors to being implementation defined.

      There are some types of undefined behavior (use-after-free, referencing uninitialized data, going beyond the end of an allocated array) where no implementation definition can be made without undue cost – those must necessarily remain undefined. However, I think that having signed integer overflow be undefined is an avoidable and poor decision by the standards committee.

      • John Payson says:

        Making integer overflow UB is perfectly reasonable if one regards the Standard as merely defining minimal requirements for a conforming implementation, and recognizes that it is a good and useful thing for implementations to augment the guarantees given in the Standard when practical. There are some platforms where the consequences of integer overflow would be unpredictable (e.g. a platform might have an integer-overflow trap which malfunctions if it fires at the same time as an external interrupt), and on such platforms the authors of the Standard presumably wanted to leave the job of preventing signed overflows up to the source-code author rather than the compiler. Further, there are many platforms where integer overflows don’t trap but it would still not be possible to specify the effects of integer overflow in sufficient detail to qualify as Implementation-Defined without forcing compilers to add extra code.

        I suspect the authors of the Standard thought that the notion that implementations should offer guarantees beyond those of the Standard in cases where doing so would be practical and useful was sufficiently obvious that there was no need to say it, since in many cases what it really boils down to is “don’t be obtuse”. Most people would recognize that principle without having to be told, and I don’t think I can blame the authors of the Standard for failing to predict that 15+ years later obtuseness would become fashionable.

        • brucedawson says:

          It’s reasonable to cut the authors of the standard some slack for what they wrote 15 years ago. However I think they now have a responsibility to close some of the UB loopholes since they are currently making C++ even more dangerous, subtle, and capricious than it would otherwise be.

          Putting billions of consumers at risk because some buggy platform might misbehave if an integer overflow occurs at the same time as an external interrupt does not seem justified. Better to make integer overflow be implementation defined and then that one platform (if it exists) would be nonconforming. Oh well.

      • John Payson says:

        I think the cleanest way to fix the standard would entail two things:

        1. Replace most forms of “Undefined Behavior” with “testably-constrained behavior” [and at the same time, replace most forms of “Implementation-Defined behavior” with “testably-defined behavior”], and define macros/intrinsics which code can use to assert behavioral requirements, such that if code asserts that it requires, e.g., that integer overflow not produce side-effects and that all integer variables always hold “wrapped” values, but does not require that expressions wrap (so long2=int1*int2+long1; could either promote the multiply or not, at the compiler’s convenience), then a compliant compiler would either refuse compilation or yield the desired behavior. One could then define a class of “selectively-conforming” programs which include directives sufficient to ensure compatibility with any compiler that accepts them.

        2. Define behavior for non-deterministic values, and specify (as an extension of testably-constrained behavior) what operations will convert a non-deterministic value into an Unspecified choice from among the possibilities. Given a construct like:

        x=y;
        foo(x);
        … some code that shouldn’t affect y
        bar(x);

        If y holds a deterministic value and nothing changes it, a compiler could replace x with y and eliminate the need for the latter variable. That would generally be a useful optimization, but it could wreak havoc if the system could tolerate a range of values being passed to foo() and bar() but required that the two calls receive the same value. I would suggest that a good blend of optimization-friendliness and readability could be achieved if type conversions and casts of an object to its own type were required to yield a deterministic (though possibly unspecified) value of the given type. With the code as written a compiler could replace x with y even if y might be indeterminate and subject to unexpected change [e.g. via improper aliasing], but if the code were written as “x=(int)y;” that would require that foo() and bar() be passed the same value even if y was indeterminate.

        Unlike many people who complain about UB, I understand the usefulness of the optimizations that can be enabled by loose behavioral specs. Unlike some compiler implementers, however, I recognize the usefulness of even very loose behavioral guarantees. C became popular because having programmers rather than compilers generate code to handle boundary cases allowed programmers to decide in what cases such code should be included in the executable. This required more work than other languages which would auto-generate such code whether it was needed or not, but gave programmers more flexibility. Hyper-modern C requires that programmers include boundary checks whether or not they’re needed in the machine code, thus forfeiting the flexibility advantage C would have over languages that auto-generated such code, while increasing the programmer-workload disadvantage of having to manually add boundary-case code when it’s needed (by increasing the number of cases requiring such code).

        I wonder to what extent hyper-modernists’ real interest is performance, or whether their opposition to defining what was formerly UB stems from a desire to avoid rewarding what they perceive as bad behavior.

        • brucedawson says:

          I’m not sure that your suggestion (introducing one or two more types of behavior?) would be the cleanest way of fixing the standard, compared to just replacing some instances of “undefined behavior” with “implementation defined behavior”.

          > I understand the usefulness of the optimizations that can be enabled
          > by loose behavioral specs

          I read an article a few months ago on this topic which I unfortunately cannot find. The basic thesis was that the claim of increased performance due to UB was not really true, or at least exaggerated. The basic problem is that you often need to rewrite your code to avoid UB and this then costs as much or more performance as the compiler gains by exploiting UB.

          In short, you can’t fairly compare the same program compiled with and without -fwrapv because a program that behaves correctly without -fwrapv has already been compromised.

          Anyway, too big a topic for the comments section.

          • John Payson says:

            Are you looking for http://www.complang.tuwien.ac.at/kps2015/proceedings/KPS_2015_submission_29.pdf or something else?

            I’d be delighted to discuss further elsewhere if you’d like to suggest a venue.

            The problem with “implementation-defined behavior” is that it would require that implementations document a consistent behavior. For integer overflow, that would basically require that compilers generate code which is no more efficient than what programmers could achieve if they cast everything to unsigned types before most operations and cast things back to signed types before comparisons that required signed semantics. Loosening the semantics would allow much more effective optimization.

            Loosening the semantics, however, would require defining terminology that was capable of describing what compilers would and wouldn’t be allowed to do. Rules for non-deterministic values need not be excessively complicated to describe, however. The basic gist would be that each storage location is required to hold a value of its type, but may at the implementation’s leisure be capable of simultaneously holding other values of the same domain (integers, real numbers, etc.) which need not be within range of the type. Some operators, given multiple possible values, would be required to pick one in Unspecified fashion. Any given relational operator, for example, might independently interpret the result of adding 1 to INT_MAX as being higher or lower than 12, but couldn’t do anything other than yield 1 or 0 in possibly-Unspecified fashion. Most operators, however, given such “combined” values, would yield a result that combines every possible value that could be achieved with any combination of source operands.

            • brucedawson says:

              Yes! That is exactly the article that I was thinking of. Thank you.

              I found the arguments in that article quite compelling. I’m not a compiler developer so I’m really not the best person to discuss this with. If you’re in the Seattle area it might be interesting to discuss in person, but otherwise I would imagine that discussing with the authors of that article and the C++ standards committee would be more useful.
