Review of the Book Optimized C++

Two conflicting views exist about how to write software: abstraction versus optimization. Abstraction is about getting your model right and matching software to the business environment; the software mirrors concepts in the real world. For example, when you design a financial software package, the code has concepts baked in like accounts, debits, credits, merchants, and customers. The object classes look just like the way the real world talks about the process the software facilitates. This is thought to lead to more maintainable code that can be kept in better sync with the way the business operates.

The other approach puts less emphasis on models and makes sure the software is highly competent in using computer resources. The program runs fast, uses memory appropriately, scales up to handle more data, and so on: technical, operational criteria that may not immediately map to the business drivers, but that nonetheless can lead to a quality solution in the right circumstances. Video games are an example. Get the technical aspects right, such as high-speed rendering on graphics cards, fewer page swaps and cache misses for smoother AI, and solid use of asynchronous functions for multi-player and pre-rendering of terrains, and you get a great solution. In the business world, it may mean adapting a software design to accommodate disk storage capacity or processing windows versus available compute resources, and giving up pretty code that looks like the business for code that enables more data and calculation per unit of time.

People argue about these two views all the time; Dr. Knuth’s statement about premature optimization is often cited. Despite the pros and cons, substantial gains are available through the power of abstractions. I found no more powerful statement on the subject than Alexander Stepanov’s book From Mathematics to Generic Programming, which I reviewed in January 2015. The right abstractions provide a significant amount of productivity: you can build more and cover more conceptual and functional ground. The abstraction-oriented approach has been a major part of the progress of software for many years.

Thinking in Abstractions

The abstraction approach means thinking about what the software does and writing code in a way that does not get too far into the technical weeds. The benefits of useful abstractions have made the abstraction-leaning viewpoint the dominant approach in software today. Humans are always using abstractions, so we are really talking about less versus more abstraction. A substantive technical justification for more abstraction goes as follows: computers keep getting faster, so if you want to speed up your program, handle more data, or reduce processing delays, get more computers or bigger, faster ones. Many companies that invest in systems can apply budgets toward more or better hardware at a cost that is often more appealing, and a more predictable boost, than expert coders working in earnest to wring out more speed by reshaping the code.

The other justification for abstraction is that by using programming tools that summarize and pre-package technical capabilities, you make program code easier to understand, you possibly write the code faster, the code may be easier to troubleshoot, and you gain greater leverage when the mechanics underneath those abstractions get improved. The ideas baked into those abstractions, what people call APIs, pre-package capabilities that a coder doesn’t have to write, which speeds up software development. That often makes today’s desktop, mobile, and web applications easier to build and quicker to put into active use.

The billions of web pages and millions of mobile apps are made possible through productive abstractions in numerous tools. Abstractions drive an uncountable number of connected software solutions, from the smallest embedded device the size of a speck of dust to a span of interconnected large machines the size of a city. Abstractions, even when they are not entirely efficient from the computer’s standpoint, have absolutely established their value in software development.

You Know When Slow is Not Fun

In reality, there are definitely some types of programs, or parts of a program or system, that by their very nature are less successfully built with lots of abstractions and no eye toward performance and capacity. These are programs that are relatively easy to model, design, and build, but run so poorly in speed or capacity as to make the solutions less appealing. Although speed and capacity seem acceptably addressed in the initial phases of design, when the solution is put to actual use, or after environmental factors change over time, poor speed or capacity utilization can destroy even the functionality of the program. These possibilities are often discounted in the argument for abstraction and against optimization.

Sometimes you see an article describing performance improvements, techniques, or considerations that totally make sense. Business requirements can define a situation in which delivering the program as early as possible is among the most important criteria. Unfortunately, that can also encourage embracing abstractions in a way that leads to a slow program, or one that cannot keep up with the volume of information versus the time it takes to process it. Such conditions ultimately do not serve the business. Individual developers who encounter this situation enough times may lean more toward a performance-oriented perspective.

Consumers and Businesses Upgrade Less Often

Many companies do not upgrade their hardware as often as the IT department may recommend. I’ve seen many examples of this as well as counterexamples. The existing hardware actually works, the capacity seems adequate, and all the software developer has to do is make new or existing solutions run well in that technical environment. That is a far more common reality than the literature on abstraction would suggest; consider that the main way Microsoft got Windows 10 out there was to provide a free upgrade. As the Dot-Com Era waned, Web 2.0 came and went, and software became more normalized in many areas, the value of hardware as a driver of new abstractions and productive tools seems to have declined. The ability to run the newest versions of Java, IIS, and Oracle for an accounting system is not a good reason to upgrade hardware if all you are doing are the same functions with newer syntax, configurations, and bug fixes.

Intel, the computer processor maker, does not upgrade its processors substantially enough, or often enough, to justify turnover in hardware. That eventually puts a ceiling on how far abstractions can take you versus a more detailed technical approach. Sure, new devices will be churned out, but one of their defining features may have far less to do with the speed of the device itself than with how well the software functions despite a modest-to-glacial expansion in processing headroom. That does not render abstraction-leaning thinking less relevant, but it does suggest that in the present era, unless you are using clever but expensive distributed computing designs, knowledge and skill in performance within increasingly constrained computing environments may become more useful.

Premature Optimization or Premature Abstraction?

What you emphasize in the design of software differs when considering an optimized design versus one that leans heavily on abstractions. You are more mindful of the weeds in the technical garden. You still have one or more functions you are trying to accomplish through the program, but doing those functions efficiently, quickly, and possibly with more conservative use of space are considerations that pervade how you code. You might ask yourself, “how do I define these data structures so they better fit in the CPU’s L3 cache?” versus “how do I make these data structures polymorphic enough to represent different types of financial accounts?” Different questions with potentially different results in how the program operates. With a big enough machine it may not matter, and you can write the code as quickly as you please. In a constrained enough environment (hardware, competing programs, newer, hungrier third-party/framework abstractions), you may have to drive a more intense effort toward a more efficient process.
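
As a minimal sketch of the first question (the type and field names here are my own, purely illustrative), a compact record kept in a contiguous container puts the hot data side by side, so more records fit per cache line and a scan walks memory linearly:

    #include <cstdint>
    #include <vector>

    // Illustrative sketch: a compact, contiguous record layout.
    // Hot fields sit next to each other, so many records fit in each
    // cache line and hardware prefetchers reward the linear scan.
    struct AccountBalance {
        std::uint32_t id;
        std::int64_t  cents;
    };

    std::int64_t total(const std::vector<AccountBalance>& accounts) {
        std::int64_t sum = 0;
        for (const AccountBalance& a : accounts)
            sum += a.cents;
        return sum;
    }

The polymorphic answer to the second question would typically scatter the same data behind pointers and virtual dispatch, trading that locality for flexibility.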

Abstractions are Not Bad

A larger share of the published material on software development deals in some form of generic programming or various abstractions that avoid a lot of technical granularity. Much of this material seems to foster the use of high-level tools, where the things you build can simply be upgraded by new hardware or by updates to the underlying API. Even when you use abstractions, you may still write a lot of code. That code may be less technical in nature, and you can make the case that in many situations you wrote far less code, and possibly higher-quality code in terms of maintainability, than if you had gone an alternative route.

You may spend many years immersed in the philosophy of abstractions in various guises such as design patterns, generic programming, higher-order functions, functional programming, declarative instructions, component-based frameworks, and object-oriented design. With these tools, you can build substantial solutions that accomplish a wide range of tasks. Abstractions are not a bad thing and I defer to them as often as I can. However, abstractions are not always a good thing.

When Abstractions Go Bad

An otherwise promising, reliable, thoroughly defined technology solution or design can fail dramatically despite clean, well-abstracted code with well-defined abstractions selected with proper attention to Big-O considerations. If it is too slow, uses too much space, or interferes unexpectedly with other critical solutions, there is a high possibility the solution will be rejected. The only defense at that point is that the software is maintainable and does everything, but that can matter little.

People can have different experiences with the outcomes of software implementations. In my experience, I have seen three situations develop. First, I’ve seen unacceptable performance emerge under heavy limits on hardware and/or software environment upgrades, where the sample data set used during development bore no relation to the actual volume of data in production. This can happen when not even the project sponsors have visibility into the full operating conditions surrounding the solution. Second, I have seen it happen when there was sufficient hardware capacity and the latest software environments, but highly maintainable code simply didn’t run fast enough, or it imploded under the weight of actual operational requirements that were not defined or available when the software and system were being specified. Third, I have seen attempts at optimization after the fact simply fail, primarily due to the selection of abstractions that are inaccessible to optimization, or a refusal to consider the computing environment when weighing performance.

Reasons for Improper Use of Abstractions

Poor performance and operational outcomes from the use of abstractions can come about because of the heavy push to abstract code, or to get a decisive productivity edge when using abstractions. The tone of my writing is not against abstractions; I like productivity. Yet the overemphasis on abstractions can foster a knowledge vacuum, or cognitive atrophy, regarding performance. You often hear about good abstractions and why you should use them, along with their theoretical Big-O cost. Usually, you are encouraged to use these abstractions without the ability to make an informed decision about when not to use them. The true real-world implications are usually hidden behind strong urging to stick with code others have written that is proven in production. I agree with that most of the time, but it is not always true. In some cases, it is absolutely false.

Performance is Not Gut Instinct

You must systematically profile a solution to optimize performance. Slow areas in a program are usually not where your knowledge of computer architecture and machine models suggests they should be, nor are they simply a matter of individual coding technique. I first systematically profiled a software application in 2002. It was a mind-opening experience. The tool was Compuware’s NuMega, and it did a great job visually identifying performance hot spots in a program. Third-party or generally available code profilers are the very best ways to learn about performance. Big-O notation and generic complexity analysis may not help you when using abstractions like XML parsers: despite being highly optimized in 2002, loading such parsers with 1GB+ XML documents was a real performance challenge.

In 2002, I learned that an XML parser does a gigantic amount of work when you think you are just loading an XML document and accessing data within it. Alternatively, you might approach the parsing differently, reading the XML into and out of the program a little at a time. You end up writing a lot more code, but because that code is specific to the business situation, and optimized in its trade-off of space versus computation, it typically results in a faster solution. That is just one example. The performance issue could instead involve GUIs that, while beautiful to look at, are saturated with features that make the GUI pretty but slow things down when a lot of data has to be shown in a grid. That can be an issue in GUI frameworks like Microsoft WPF, which has (wait for it ...) an XML-based screen definition language called XAML linked to a broader framework soaked in functionality. It often takes a substantial level of experience to tame WPF so you get the pretty without the bloat.
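
Returning to the XML example, here is a hedged sketch of the incremental idea. It is not a real XML parser, and the file name and tag are invented; the point is simply that memory use stays flat however large the input grows:

    #include <fstream>
    #include <iostream>
    #include <string>

    // Illustrative only: process a large text file one record at a
    // time instead of materializing the whole document in memory.
    int main() {
        std::ifstream in("orders.xml");   // hypothetical input file
        std::string line;
        long long matches = 0;
        while (std::getline(in, line)) {
            // Business-specific extraction happens here, on one
            // record at a time, rather than querying a fully loaded
            // in-memory document tree.
            if (line.find("<order ") != std::string::npos)
                ++matches;
        }
        std::cout << "records seen: " << matches << '\n';
    }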

Whatever you are profiling, across the board you will often find that a lot of time is spent processing strings. Outside of certain engineering, scientific, or narrowly focused financial solutions, software deals largely with text data, and once the volume of text reaches a certain level it becomes a primary performance-impacting aspect of the overall solution (a small sketch follows below). At other times, such as when using XML or “friendly” node-based, array-like structures, you find that merely swapping out these high-level node-based data structures tied to text data for more efficient ones may not have any real impact due to other factors. Regardless, each bit adds up, and good performance profilers help you determine the overall cost. How do you understand the tools you use in such a way as to make better performance decisions?
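
As a small illustration of the string point (a sketch under my own assumptions, not taken from any particular codebase), two costs a profiler tends to surface are repeated reallocation during concatenation and needless copying at call boundaries:

    #include <cstddef>
    #include <string>
    #include <vector>

    // 1) Growing a string by repeated concatenation reallocates many
    //    times; reserving the expected size up front allocates once.
    std::string join(const std::vector<std::string>& parts) {
        std::size_t total = 0;
        for (const auto& p : parts) total += p.size() + 1;
        std::string out;
        out.reserve(total);           // one allocation instead of many
        for (const auto& p : parts) {
            out += p;
            out += ',';
        }
        return out;
    }

    // 2) Passing by const reference avoids copying the whole text
    //    buffer on every call (pass-by-value would copy each time).
    bool looks_numeric(const std::string& s) {
        for (char c : s)
            if (c < '0' || c > '9') return false;
        return !s.empty();
    }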

Kurt Guntheroth’s Optimized C++

Kurt Guntheroth wrote a solid book, Optimized C++, in which he shares 30+ years of experience improving the performance of software. Reading it, you learn that much about writing and designing software runs counter to both academia and industry literature on best practices. My industry experience is half of Kurt’s, but even in my narrower exposure I have seen many of Kurt’s statements about performance play out. The usefulness of his book is how it catalogs many real-world behaviors of software tools, abstractions, and practices into a coherent whole. A book now exists that people can point to, in which actual experiments were executed with credible implications for the likely performance outcomes of certain coding choices. He even shows you how to do this with what you might call “the poor person’s code profiler”.
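
In that spirit, a minimal stopwatch can be built from std::chrono. This is my own sketch of the idea, not the book’s class:

    #include <chrono>
    #include <iostream>

    // Bracket a suspect region of code and print the elapsed time.
    class Stopwatch {
        std::chrono::steady_clock::time_point start_ =
            std::chrono::steady_clock::now();
    public:
        void report(const char* label) const {
            auto elapsed = std::chrono::steady_clock::now() - start_;
            auto ms = std::chrono::duration_cast<
                std::chrono::milliseconds>(elapsed);
            std::cout << label << ": " << ms.count() << " ms\n";
        }
    };

    int main() {
        Stopwatch sw;
        volatile long long sum = 0;   // volatile defeats optimization
        for (long long i = 0; i < 100000000; ++i)
            sum = sum + i;
        sw.report("sum loop");
    }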

The book is very detailed, but a few major points stand out. Kurt makes a solid case for reducing your use of dynamic memory. Declaring pointers and assigning an address from memory with the keyword new can be a performance-draining operation. It doesn’t appear so on the surface, but there can be a huge amount of code and calculation behind the scenes to make this seemingly simple abstraction work; you are indirectly adding work in the memory manager in order to gain dynamic allocation. As an alternative, declaring variables and passing references to them can be more efficient. Kurt shows how you can get a large performance win in some cases by adjusting how, or whether, you use dynamic memory.
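
A hedged sketch of the contrast (the Order type and the values are invented): the heap version pays for an allocator round trip on every iteration, while the automatic-storage version involves no allocator at all for the struct itself:

    #include <cstddef>
    #include <string>

    struct Order { std::string customer; double amount; };

    // Heap version: every iteration does hidden memory-manager work.
    double heap_total(std::size_t n) {
        double total = 0;
        for (std::size_t i = 0; i < n; ++i) {
            Order* o = new Order{"acme", 9.99};
            total += o->amount;
            delete o;
        }
        return total;
    }

    // Automatic-storage version: the Order lives on the stack and is
    // reused; no allocator is involved for the struct itself.
    double stack_total(std::size_t n) {
        double total = 0;
        Order o{"acme", 9.99};
        for (std::size_t i = 0; i < n; ++i)
            total += o.amount;
        return total;
    }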

Many programs have two elements in common use: lists and loops. Based on what I read, he describes a situation that seems straightforward but is often explained away in academic and industry literature: linear data structures, and loop processing that minimizes actual computation, are a solid path to higher performance. He compares binary algorithms with other kinds and shows that they do not always provide solid performance in a generic way; algorithms with well-defined stop conditions may provide a more reliable speed boost than progressively halving the data set.
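
A sketch of that contrast, under the assumption of a small, sorted, cache-resident array (measure before choosing either):

    #include <algorithm>
    #include <vector>

    // Linear scan with an early stop: walks memory in order, which
    // prefetchers and branch predictors handle well on small inputs.
    bool linear_contains(const std::vector<int>& sorted, int key) {
        for (int v : sorted) {
            if (v == key) return true;
            if (v > key)  return false;  // well-defined stop condition
        }
        return false;
    }

    // Binary search: fewer comparisons in theory, but each probe
    // jumps around in memory.
    bool binary_contains(const std::vector<int>& sorted, int key) {
        return std::binary_search(sorted.begin(), sorted.end(), key);
    }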

Kurt spends a lot of time on data structures and algorithms, but after taking it all in, the simple truth is that it is hard to beat arrays, sequential lists, and code that maximizes efficient processing of those kinds of structures. Counter-intuitively, he also shows how some sophisticated code expressions, such as lambdas, can be faster in some situations, particularly as predicates to higher-order functions. He covers threads and asynchronous mechanisms, shows which works better in which cases, and the results are not what you’d expect: there appears to be a better way to do asynchronous processing that doesn’t involve raw threads.
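
On the lambda point, a lambda passed to a standard algorithm is a distinct type the compiler can often inline, which is part of why the “fancier” construct can outrun a plain function pointer. The names in this sketch are illustrative:

    #include <algorithm>
    #include <cstddef>
    #include <vector>

    // A lambda predicate handed to a higher-order function; the call
    // through the lambda can be inlined, unlike most function pointers.
    std::size_t count_overdue(const std::vector<int>& days_late) {
        return static_cast<std::size_t>(
            std::count_if(days_late.begin(), days_late.end(),
                          [](int d) { return d > 30; }));
    }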

The last chapter of the book covers allocators and optimizing dynamic memory. Although the solutions he presents are useful, he admits that in the typical case your mileage will vary when attempting to enhance dynamic memory allocation; the performance boost may not be as large a gain as the time invested. In fact, carefully reducing your use of dynamic memory can be among the best ways to get or sustain higher performance.
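
In that vein, a simple sketch: before reaching for a custom allocator, you can often just cut the number of allocations, for example by reserving a vector’s capacity up front:

    #include <cstddef>
    #include <vector>

    // reserve() turns a series of geometric reallocations, each of
    // which copies or moves every element, into a single allocation.
    std::vector<int> squares(std::size_t n) {
        std::vector<int> out;
        out.reserve(n);               // one allocation instead of many
        for (std::size_t i = 0; i < n; ++i)
            out.push_back(static_cast<int>(i * i));
        return out;
    }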

Despite all that he wrote, all of which I pretty much absorbed, the most interesting piece of information had to do with I/O. Something that seems as simple as writing to a log file or showing progress on a screen, even a command-line screen, can have a large negative impact on the software. That was not the point he was making when he wrote about I/O, but you could not help but notice his observation that the difference in processing time with I/O and without it can be thousands of data points processed per second versus millions.
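
A hedged sketch of the effect (the counts are arbitrary): per-item output that flushes on every line versus accumulating the text in memory and writing it once:

    #include <iostream>
    #include <sstream>

    int main() {
        const int n = 100000;

        // Slow pattern (commented out): std::endl flushes per item.
        // for (int i = 0; i < n; ++i) std::cout << i << std::endl;

        // Faster pattern: build the output, then write it in one shot,
        // moving the I/O cost out of the hot loop.
        std::ostringstream buf;
        for (int i = 0; i < n; ++i)
            buf << i << '\n';
        std::cout << buf.str();
    }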

After reading Kurt’s book, which I learned about on quora.com, it is obvious that software performance is more than individual code techniques or Big-O and Big-Theta. Performance is not necessarily about algorithms (although the right algorithm in the right situation can make a huge difference if the data structures suit the situation). Like security, solid performance involves design. It also requires that you pay attention to which abstractions you use and be willing to adjust those abstractions, cut them down, or replace them altogether with more primitive, simpler design and coding approaches that deliver better speed, processing capacity, and storage utilization. Use intuition and informal hypotheses, but be prepared to lay aside assumptions through systematic measurement and profiling. There are many great tips in Kurt’s book regarding performance.
