
C-Based Toolchain Hardening

Revision as of 09:52, 15 February 2013 by Jeffrey Walton (talk | contribs)


C-Based Toolchain Hardening is a treatment of project settings that will help you deliver reliable and secure code when using C, C++, and Objective-C in a number of development environments. This article examines the Microsoft and GCC toolchains for C, C++, and Objective-C. It will guide you through the steps you should take to create executables with a firmer defensive posture and increased integration with the platform's available security. Effectively configuring the toolchain also means your project will enjoy a number of benefits during development, including enhanced warnings and static analysis, and self-debugging code.

There are four areas to be examined when hardening the toolchain: configuration, preprocessor, compiler, and linker. Nearly all areas are overlooked or neglected when setting up a project. The neglect appears to be pandemic, and it applies to nearly all projects, including Autotools-, Makefile-, Eclipse-, Visual Studio-, and Xcode-based projects.

The article will also detail steps which quality assurance personnel can perform to ensure third party code meets organizational standards. Many organizations have Security Testing and Evaluation (ST&E) programs or operate in the US Federal arena where supply chain audits are necessary. If you audit a program with a lot of gaps, it could indicate the company providing the binaries does not have a mature engineering process or has gaps in its internal QA processes. For those who lack mature quality assurance or acceptance and testing criteria, this article also provides helpful suggestions.

Proper use of auditing tools such as checksec and readelf on Linux, and BinScope on Windows, means source code will rarely be needed for some portions of an audit. Not needing source code clears a number of legal obstacles in the acceptance testing process, since NDAs or other agreements may not be required. For those who are not aware, the US's DMCA (Public Law 105-304) has exceptions for reverse engineering and security testing and evaluation. The reverse engineering exemption is in Section 1201(f), Reverse Engineering; and the ST&E exemption is in Section 1201(j), Security Testing. If you don't need source code access, then you can decompile, re-engineer, and test without the need for consent or worry of reprisals.

Finally, OWASP's ESAPI C++ project eats its own dog food: several of the examples below are taken from its project files.


Code must be correct. It should be secure. It can be efficient.

Dr. Jon Bentley: "If it doesn't have to be correct, I can make it as fast as you'd like it to be".

Dr. Gary McGraw: "Thou shalt not rely solely on security features and functions to build secure software as security is an emergent property of the entire system and thus relies on building and integrating all parts properly".


Authors in the toolchain work hard to help projects deliver reliable, secure, and efficient programs. Their efforts are available to you at all stages of the engineering process - from project configuration and preprocessing to compiling and linking. For example, it's non-trivial to ensure line numbers in source files match debug information due to the processing of #include, considering projects can be configured with no macros, with the DEBUG macro, or with the NDEBUG macro.

Compiler writers also provide a rich set of warnings from the analysis of code during compilation. Both GCC and Visual Studio have static analysis capabilities to help find mistakes early in the development process. In addition, both toolchains have options to produce a hardened executable by taking advantage of the security offered by the platform. Since users expect trouble-free and safe code, it would be wise to use all the tools available in your war chest.

The built-in static analysis capabilities of GCC and Visual Studio are usually sufficient to ensure proper API usage and catch a number of mistakes, such as using an uninitialized variable or comparing a negative signed int against a positive unsigned int. As a concrete example (and for those not familiar with C/C++ promotion rules), a warning will be issued if a signed integer is promoted to an unsigned integer and then compared, because a side effect is that -1 > 1 after promotion! GCC and Visual Studio will not currently catch, for example, SQL injections and other tainted data usage. For that, you will need a tool designed to perform data flow analysis or taint analysis.

Some in the development community resist static analysis or refute its results. For example, when static analysis warned the Linux kernel's sys_prctl was comparing an unsigned value against less than zero, Jesper Juhl offered a patch to clean up the code. Linus Torvalds howled “No, you don't do this... GCC is crap.” For the full discussion, see [PATCH] Don't compare unsigned variable for <0 in sys_prctl() from the Linux Kernel mailing list.


Configuration tools are popular on many Linux and Unix based systems, and they present the first opportunity for hardening. Configuration tools and auto tools include Autosetup, Autoconf, Automake, config, and Configure. There are two opportunities at this stage: the first is language and platform settings, and the second is project-specific settings. The opportunities are discussed below under Platforms and Projects.

There are two downsides to many configuration tools in the toolchain: (1) security is often not a goal (by modern expectations of 'secure'), and (2) they cannot create configurations such as Debug, Test, and Release. These claims are explored below under Security Goals and Config Management.

Platforms
While language and platform settings will be discussed in detail under Compiler and Linker, there are files of interest worth mentioning: the m4 macro files and the various *.in, *.ac, and *.am files. These files are important because some projects don't honor command line settings, so the following might not produce the expected results:

$ configure CFLAGS="-g2 -O2 -Wall -fPIE" LDFLAGS="-pie"

The unexpected result - loss of position independent code or layout randomization - is due to the tools ignoring CFLAGS and LDFLAGS (more problems are discussed under Config Management). Because the auto tools don't enable some flags by default, ignore requested settings, and many projects copy/paste the insecure default settings, you will need to open some of the template files and add the settings by hand to ensure the project is configured to specification.

It's truly unfortunate that many projects do not honor a user's preferred settings, since it is so easy to do. Below is an excerpt from the ESAPI C++ Makefile:

# Merge ESAPI flags with user supplied flags. We perform the extra step to ensure 
# user options follow our options, which should give user options a preference.

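The merge itself is a one-liner. Below is a minimal sketch of the idiom under GNU Make (the MY_CFLAGS variable and the show target are hypothetical, not ESAPI's actual names): project defaults come first and user flags come last, so the user's options win when flags conflict.

```shell
# Write a tiny makefile demonstrating the flag-merge idiom.
cat > merge.mk <<'EOF'
# Project defaults (hypothetical).
MY_CFLAGS := -Wall -fPIE

# 'override' is required to modify a variable that was set on the
# command line; user flags are appended after the project's flags.
override CFLAGS := $(MY_CFLAGS) $(CFLAGS)

show: ; @echo $(CFLAGS)
EOF

make -s -f merge.mk show CFLAGS=-O0    # prints: -Wall -fPIE -O0
```

Because the user's -O0 appears last, it takes precedence over anything the project default would have chosen.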
Finally, the problem is not limited to auto tools. Non-auto'd projects, such as GCC and OpenSSL, do the same.

Another reason to open the template files is to tune optimizations. Many default configurations simply hardcode -O3 in CFLAGS. -Os is often a better choice in the embedded and mobile spaces because devices do not have a page file and are memory constrained, and minimizing code size helps keep the caches "hot".

Projects
Configuration at the project level presents opportunities to harden the application or library based upon domain specific knowledge. For example, suppose you are building OpenSSL, and you know (1) SSLv2 is insecure, (2) SSLv3 is insecure, and (3) compression is insecure. In addition, suppose you don't use hardware and engines, and only allow static linking. Given the knowledge and specifications, you would configure the OpenSSL library as follows:

$ Configure darwin64-x86_64-cc -no-hw -no-engines -no-comp -no-shared -no-dso -no-sslv2 -no-sslv3 --openssldir=...

If you think the misconfiguration cannot be found or will be overlooked, then you should rethink your position. Auditing (discussed below) will reveal that compression is enabled, and the following command will bear witness. In fact, any symbol guarded by the OPENSSL_NO_COMP preprocessor macro will bear witness, since -no-comp is translated into a CFLAGS define:

$ nm /usr/local/ssl/iphoneos/lib/libssl.a 2>/dev/null | egrep -i "ERR_load_COMP_strings"

Security Goals

You will probably be disappointed to learn that tools such as Autoconf and Automake miss many security related opportunities and ship insecure out of the box. There are a number of compiler switches and linker flags that improve the defensive posture of a program, but they are not 'on' by default. Tools like Autoconf, which are supposed to handle this situation, often provide settings that serve the lowest common denominator.

A recent discussion on the Automake mailing list illuminates the issue: Enabling compiler warning flags. Attempts to improve the default configuration were met with resistance and no action was taken. The resistance is often of the form: "<some obscure platform> does not support <established security feature>". It's noteworthy that David Wheeler, the author of Secure Programming for Linux and Unix HOWTO, was one of the folks trying to improve the posture.

Config Management

Suppose you want to support Debug, Test, and Release configurations. Later, after configuration, you would like to type make debug or make test to build the desired configuration. You will probably be disappointed to discover Autoconf does not support the concept of configurations. It's not entirely Autoconf's fault; Make is the underlying root of the problem.

Make has not evolved along with the platforms it supports. For example, most modern platforms support both executables and shared objects, yet Make only supports one CFLAGS (and CXXFLAGS) and one LDFLAGS per invocation. If you have a library and a test suite and you want to use position independent code (or layout randomization), you must make a choice: either -fPIE (compiler) and -pie (linker) for an executable, or -fPIC (compiler) and -shared (linker) for a shared object.

As another example, consider the modern engineering process that includes debug, test, and release configurations. Each requires different CFLAGS, so you must detect the requested goal and set CFLAGS accordingly, like so (adapted from the ESAPI C++ Makefile):

# Makefile
DEBUG_GOALS = $(filter $(MAKECMDGOALS), debug)
ifneq ($(DEBUG_GOALS),)
  WANT_DEBUG := 1
  WANT_TEST := 0
endif

ifeq ($(WANT_DEBUG),1)
  ESAPI_CFLAGS += -DDEBUG=1 -UNDEBUG -g3 -ggdb -O0
endif
Now consider what happens when you (1) type make debug, and then (2) type make release. Make will first build the program in a debug configuration for a session under the debugger. Next, you want a release build to ensure there are no "optimized" side effects. Upon typing make release, Make will do nothing because it considers everything up to date, despite the fact that CFLAGS and CXXFLAGS have changed. Hence, your program will actually be in a debug configuration and could abort at runtime because, for example, debug instrumentation could be present (assert calls abort() when NDEBUG is not defined).

If you think it's unlikely, then think again. A number of projects have been guilty of executing debug instrumented code in production, including critical infrastructure. The defect is due to failures in the engineering process, including Quality Assurance controls.





After all the discussions on hardening in the development stage, there are still opportunities to improve the engineering process.


Auditing a binary for compliance against an SDLC focuses on the compilation phase and the linking phase.