Review of From Mathematics to Generic Programming

What are the properties, principles, and conditions that constitute a transformation of one kind of thing into another? The authors of From Mathematics to Generic Programming explain that gradually morphing software from a specific to a general form is perhaps the most effective way to make programs more adaptable. The audience is everyone: those who want to grow their understanding of software technology as well as those who write code. Whether that code is a small snippet embedded in a spreadsheet cell, a routine working over trillions of database records, or anything in between, the authors contend that it would benefit from an approach that is more general than specific. They set out to show the reader how this might be done.

Foundations of Generic Programming

When I first set out to share my thoughts on the book, I kept notes along the way. By the time I was finished, I had 20 pages of notes in which assumptions I had formed earlier needed to be rethought. The book may challenge your paradigms.

Alexander Stepanov, one of the authors of the book, is perhaps someone I should have known about but did not. I decided to read the book knowing nothing about the author; my motivation was plain curiosity about the subject matter. It turns out that Alexander Stepanov is the principal designer of the C++ STL, the primary library foundation for C++. He is also the primary mind behind Concepts, a proposed mechanism for a future version of C++, and this book lays out the foundational ideas for that forthcoming approach to writing software, in C++ at least.

The authors describe a collection of ideas and structured thinking they present as useful for achieving well-defined, generalized software. I do not recall any statement that theirs is the only path to generality. I suspect many who have a history of writing software have undertaken the creation of more general designs through a number of means, and those approaches are valid. What may set the authors' approach apart is the thoroughly systematic nature of their process.

The book is not a primer in template metaprogramming; other books exist for that. Nor does it describe the technical machinery necessary to make the ideas come to fruition; it assumes that machinery, in the form of compilers, languages, and type deduction facilities, already exists. Numerous pages of text are saved with this approach, which I think allowed the material to be presented with greater clarity. No, the book does not get you ready for templates; rather, it perhaps prepares you for Concepts by illustrating how that might be achieved in C++ using what exists in the language today.

Excellent Code Examples

Through this book, you see what great code looks like in C++, at least in example form, as other considerations may change some of the details. See Scott Meyers's Effective Modern C++, a book I reviewed recently, for background. The code examples could, with some effort in translation, be expressed in Java and C# through their generics mechanisms. Alexander Stepanov has been writing code for almost half a century, and his disciplined, clean brevity shows in a highly consistent manner. If nothing else, you will gain exposure to a level of code quality that could be useful in evolving how you view code.

What About Design Patterns?

Recently, I read and reviewed a book by the name of Game Programming Patterns that discussed software design patterns. I found it supremely ironic that the next book on software technology I picked up was about as far from software design patterns as one could get. Both books, Game Programming Patterns and From Mathematics to Generic Programming, deal with generalized representations of software. Which is correct, or more correct, and which should readers emphasize? A central fact is that equivalent pieces of code can do the same thing while being expressed in different ways.

Two ways exist to write software by hand: semantically or functionally. No, I am not talking about semantic or functional programming, although there is some relation to what I am about to present. Rather, I am talking about the mental picture a person has of the software they are writing. It is going to be either a semantic/intuitive frame of reference or a functional/utilitarian one. The default mode, given that we are living creatures, is the semantic form. It is how you express code when you decide what to name variables, functions, and modules (classes, components, interfaces, files, etc.). It is the default way people do things, including organizing software code to correspond to a given domain of activity. Software design patterns are an evolved attempt to express intent in a manner that balances the cognitive imperatives of people with the operational constraints of the machine.

The functional mode exists separate from the semantic one and is concerned only with the operations on a model. It does not require a semantic interpretation. However, semantic expressions of software overlay a functional substrate that must exist. You cannot have semantics alone, whereas functional representations can exist bare. While the code examples in the book have good, aesthetic semantic qualities by virtue of being written by people, the overall direction of the book is functional in nature.

Which is more correct? It really does not matter, because systematic semantic methods can improve cognitive engagement with systems while, at the same time, functional refinement is unavoidable. I know the semantic approach is customary in business software systems, whereas I imagine more functionally driven solutions would more likely be adopted in scientific, engineering, and technical infrastructure scenarios.

Still, the approach in this book can be seen as an alternative to software design patterns. The question in the book is not about the organization of code or what code is supposed to mean. The nuanced distinction, then, is not organization but structure. The question the book challenges the reader with concerns structure and process: is the structure and process founded on the right principles and expressed in such a way that its truths are more broadly applicable?

History, Philosophy and Formalism

I think when some people hear the word philosophy, it may conjure an image of intuition. Philosophy of the ancient kind has a connection to science, logic, reasoning, mathematics, and formal thought. David Hilbert speaks of confirming intuition through logical analysis to form reliable axioms underlying the propositions that define theories. The book carefully and consistently shows the evolution of thought from ancient times to the present, from intuition to formal methods, and how that can be applied to the definition of software.

Thales, Plato, Socrates, Aristotle, Pythagoras, and Euclid are all discussed in terms of how they moved civilization forward through either their discoveries or their well-communicated notions of truth. Many of those truths had a grounding in reality or otherwise pushed the mind to think in ways that expanded intellectual capacities to produce works of art, literature, and science. Great credit goes to the authors for not discounting the influence of other intellectuals from Africa, Southeast Asia, and the Middle East; they openly admit that the focus is on the European intellectual tradition, in which the lines of philosophical and scientific progression have a direct relationship to computers and software.

Computers, in some form, have been around for thousands of years, roughly 22,000 by some estimates. Maybe one or two existed in a given nation, operated by the most elite thinkers. Perhaps just a few people had one in what we would consider a primitive form; consider the Antikythera mechanism as an example. The difference is that today they are more sophisticated and available to far more people. You do not need electronics to have a computer; the process of computation is independent of machines. It is just that machines currently make computing easier in terms of speed, consistency, and variation of effects. The philosophy behind computers is ancient, and the oldest recorded philosophers dealing in questions of reason and reality wondered about a general model of the world represented by symbolic order, what we currently call number theory, abstract mathematics, and structured data modeling.

The book gets you from then to now, leading you to an approach based on the authors' work with formalized generality, expressed in C++ through templates that may evolve into a mechanism called Concepts. Whether or not you accept any of the concrete tools involved, such as the choice of programming language and type representation mechanism, you will still benefit from understanding the history and rational basis for the idea of systematic generic programming described here.

Element Relationships

I gained a greater appreciation for the idea of systematic relationships. There are perhaps three questions you can ask regarding systems, questions that appear to me fundamental to improving the internal cohesion of a system:

  • What is the relationship between elements?
  • How do you express a given relationship?
  • How do you apply that relationship?

It seems that these questions can be asked at each level of a system. On many occasions, software is written in such a way that the questions are answered intuitively. That is certainly an exceedingly fast way to proceed through the definition and organization of elements in a system. Approaching the questions rigorously could be far more useful, but the cost in time may exceed what is acceptable given a system definition or revision deadline. It is good nonetheless to know those questions exist.

The Truth of Relationships

I began to see a great example of how to execute refinement (or what others may consider refactoring) in chapters 10 and 11. The presentation of the rotate function was particularly compelling in that you see how important it is to a large portion of the C++ STL: a single function that impacts so much. Somewhere between following the process outlined by the authors and their discussion of fundamental truths, I began to wonder about the truth behind the truth. I began to think about the concept of relations as abstract things that sit at the root of everything being discussed. Relations are in the background, and only by engaging with that concept more completely might you greatly enhance all that follows from them. My thoughts on relations are:

  • Relation as a construct with properties that determine the arrangement and kind of derivative constructs.
  • Relation as the true point of interface and transposition of elements.
  • Relation as quanta: data that, when properly interpreted and oriented to other relations, constitute the fundamental proto-quanta underlying coherent structures, static or dynamic.

These are my intuitions, of course, but it would be interesting to see work on such things as they may apply more practically to systems of various kinds (not just computing). A reductive question that becomes challenging to address at each rung of follow-up inquiry concerns the relations that underlie more apparent entities. Answering such questions, and those they spawn, through a method of spiral inquiry from the most to the least obvious description may lead to greater efficiency, generality, scale, and breadth of transformation.

History Shows Transcendence

The book is rich with historical accounts and context. You see Aristotle the thinker establish a precedent for contemplating reality that defines science today. Thales, Euclid, Pythagoras, and others travel forth from their homelands and are rewarded with an expanded awareness of the possibilities for thought, reason, and understanding. The trend of the past, though there is darkness in it, also shows that the ability of people to transcend limitation is inevitable. Many branches of knowledge applied in the world through science, engineering, medicine, philosophy, law, art, and recreation indirectly further people's capacity to understand and engage with reality more completely. Software development is one thread of this, aligned with the more Euclidean aspects of the journey. The authors present a document that encourages a certain form of inquiry melded with practical objectives.

Good ideas are not the province of singular sources, as the authors note. A major example is relativity theory. It is commonly thought, and popularized, that Albert Einstein is its sole source. It turns out that Henri Poincaré, David Hilbert, and a few others were working on the same ideas at around the same time, in some cases with no knowledge of each other's efforts. See the relativity priority dispute for more information. Good ideas are universal and are simply waiting, not for the right person, but for someone to put in the time harnessing the right thought process and tools. While Poincaré, Hilbert, and Einstein came from different cultural traditions, they had access to common knowledge and the reigning empirical approaches of the day. Free knowledge was a major catalyst.

What I discerned from Lobachevsky and Gauss is that a theory may be dependent on context. Lobachevsky evolved certain aspects of Euclidean geometry by considering a larger context. I generalize this to say that, hypothetically, scale may influence whether propositions hold, which affects the universality of a theory. That conclusion became somewhat more concrete to me after reading about Gauss's proposed experiment to test such ideas by surveying a triangle formed by actual mountain peaks. After you read chapter 9, you may see a similar implication.

David Hilbert's 20th Century

While the idea of a pure formalist approach may seem appealing, I have to disagree with David Hilbert's contention that consistency equates to truth to the extent a theory expresses that quality. Meanwhile, you need more modern-day David Hilberts to revisit, re-contextualize, and join together disparate but related efforts in a field of study. What they produce greatly raises the productivity and intellectual range of those who follow, as in the case of Emmy Noether when she advanced the field of abstract algebra.

One of the major things David Hilbert did, and what the authors of the book may also be doing, is to streamline existing knowledge into a new form that improves others' access to it. The heirs of Hilbert then produce a more effective foundation that strengthens the discoveries that follow from observation of the systems and collections of information they establish. The modern space program, digital computers, the Internet, and microwave ovens, among many other things, are more likely as a result of this work.

Diamond Clear Description of Objects

What are these things we call objects in object-oriented design? The authors discuss data, values, value types, objects, and a few other things, and describe them in a clearer and more accurate way than is commonly expressed. Chapter 10 is where this begins, and it is also where the material takes on a slightly more technical flavor. Emphasis on slightly. Programming languages and language mechanisms are examined to inform the reader of how well they support the practical realization of generic programming.

Broad Qualities of Generic Software

What is the foundation of a general approach to software? That question is answered consistently and stated unambiguously throughout the book. I have collected what I think the authors intend as the major criteria for generic software in terms of design:

  • Abstraction. Introduced in Chapter 1.
  • Versatility. Introduced in Chapter 7.
  • Domain Independence. Introduced in Chapter 8.
  • Conceptually Complete. Introduced in Chapter 10.

You will not see items with these headings in the book; rather, they reflect my synopsis of several paragraphs in the cited chapters. All except domain independence are subjective qualities. Abstraction may not be appreciated the same way by all reviewers. That does not mean an element of software is not abstract, but perhaps that it is not abstract enough. You can say that all of these qualities observed together are representative of software that is more general in nature.

C++ Concepts

People are talking about C++ Concepts. Well, what are they? They are explained in the usual places that discuss the evolution of C++. This book, however, is written by the principal designer behind Concepts, and you are treated to a very close examination of this approach to software. The thinking behind Concepts is not new; what is new is making it work at a larger scale. At its core, Concepts is about confirming one's judgment about the relationships among elements and proving that judgment through correct designs. You can express nearly any concept using Concepts, but a given representation of a concept may not be valid even though it functions operationally. Concepts, the programming language mechanism, may become a means to verify assumptions in a way that is productive and produces a larger body of correct software.
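To make that concrete, here is a minimal sketch in the C++20 concepts syntax that eventually shipped. The book itself, predating that syntax, states such requirements in comments (for instance "// requires Semigroup(T)"); the MultiplicativeSemigroup name and the exact requirements below are my own illustrative choices, not the book's definitions:

```cpp
#include <concepts>
#include <cstdint>

// Illustrative concept: a regular type with a closed operator*.
// The compiler checks the syntax; associativity itself is a semantic
// axiom that Concepts cannot verify.
template <typename T>
concept MultiplicativeSemigroup = std::regular<T> && requires(T a, T b) {
    { a * b } -> std::convertible_to<T>;
};

// Repeated-squaring power, a running example in the book, constrained
// so it compiles for any type modeling the concept.
template <MultiplicativeSemigroup T>
T power(T x, uint64_t n) {   // precondition: n > 0
    while ((n & 1) == 0) {   // strip trailing zero bits of n
        x = x * x;
        n >>= 1;
    }
    T result = x;
    n >>= 1;
    while (n != 0) {
        x = x * x;
        if (n & 1) result = result * x;
        n >>= 1;
    }
    return result;
}
```

The payoff is that power(3, 13) and a call with, say, a matrix type carrying an associative operator* compile from the same definition; the constraint documents and enforces the relationship among elements instead of leaving it implicit.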

I did think of variant data types for a moment while reading about Concepts. I have used those data types far back into my past, and they may bear some of the capabilities for materializing the Concepts mechanism. Between variant data types and prototype-based objects in JavaScript, you may already have the basic elements necessary to make Concepts real in that environment. Concepts seem intriguing, but will they really work when brought into the C++ vernacular? Time will tell.

Software Design Laws

Early in the book, the authors introduce the Law of Useful Return. Additional laws are presented along the way, and by the end of Chapter 11 these laws are listed together. The authors give excellent guidance in Chapter 11 on how to decide when each law applies. They inform good software design:

  • Law of Useful Return = If you create data that might be useful in the future, keep it by giving it to the caller.
  • Law of Separating Types = At higher levels of generality, define type mapping in terms of type structure. Do not defer to implied transformations conducted by compilers.
  • Law of Completeness = Broaden the base of procedures that could support a functional scenario to the extent feasible.
  • Law of Interface Refinement = It is okay to redesign and rewrite an interface if the improvement is truly worthwhile. Real world usage may show what the interface really needs to be.

Much of this may seem like common sense. But common sense can get blurred and skewed: good design principles you once knew can decay in your application of them over time as you undoubtedly consider other things. Clear definitions such as these restate those principles in an unambiguous way.
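As a small, hedged illustration of the Law of Useful Return, in the spirit of the book's quotient-remainder discussion (this exact signature is my own): a division computes the remainder as a byproduct, so hand both results to the caller rather than discarding one.

```cpp
#include <cstdint>
#include <utility>

// Law of Useful Return: computing a quotient produces the remainder
// anyway, so return both instead of forcing the caller to divide twice.
std::pair<uint64_t, uint64_t>
quotient_remainder(uint64_t a, uint64_t b) {  // precondition: b != 0
    return { a / b, a % b };
}
```

std::div in the C library and the C++11 change that made std::rotate return its new middle follow the same principle.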

Shortcuts Can Live Forever

A function by the name of get_temporary_buffer was written by one of the authors of the book. The story of this function is told, and in it the author expresses regret for a suboptimal design that appears to be in effect to this day. What I learned is that you benefit from getting as close as you can to the most correct and ideal representation of a solution before you have to move to the next objective, because the decisions you make can persist for a long time. Some call this technical debt, and such debt can grow beyond repayment.

Second, if you cannot make a solution that is the universal embodiment of all implementations, a good alternative is to provide a really good reference implementation others can observe. I still cannot believe ten years passed before the std::rotate function's lessons learned migrated into the standard governing C++. Such conditions can give rise to all manner of clever hacks in response to unanswered shortcomings, and to the establishment of cargo cult practices.

Masterful Refactoring

I began to see equally valid examples of software functions that did the exact same thing in different ways, at least in how they were written and structured and in how each approached the same process. Each new example was a more generalized form of the prior version. What I learned is that if you push further, perhaps with a strong investment of time, patience, research, or just analysis, an even more general yet equally applicable approach can emerge. Such an approach may apply to more than just the area of the system examined. I see it as refactoring that is localized but could potentially apply globally.

Chapter 11 is a marvel that showcases the process of refinement on procedures that themselves deal in data reorganization. Tremendous progress is made there in showing the potential of generality as a driver of concise, comprehensive, functional software architecture: results following from a well-defined process of refinement that may impart greater consolidation, stronger form, and even stronger reuse and efficiency. This is not guaranteed, but the potential exists.
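For flavor, here is a sketch of one classic formulation of rotate that the chapter's refinements orbit around, the three-reverses version for bidirectional iterators (the book derives several variants with different iterator requirements; the name and wording here are mine):

```cpp
#include <algorithm>
#include <iterator>

// Rotate [first, last) so the element at middle becomes the new first,
// via three reversals. Requires a bidirectional iterator.
template <typename I>  // I models BidirectionalIterator
I rotate_by_reversal(I first, I middle, I last) {
    std::reverse(first, middle);
    std::reverse(middle, last);
    std::reverse(first, last);
    // Law of Useful Return: report where the old first element landed.
    return std::next(first, std::distance(middle, last));
}
```

The interesting refinements come from asking what weaker requirements, such as forward-only traversal, still permit; that is exactly the question of structure the book keeps posing.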

Software Performance

Chapter 12 calls into question assumptions about software performance. In-place edits of data are not always faster than edits involving the copying of data. We also learn about Josef Stein, who sped up an ancient Euclidean procedure by exploiting the qualities of the computer.
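The procedure in question is the greatest common divisor. A sketch of Stein's binary gcd, under my own naming, replaces Euclid's divisions with shifts and subtraction, operations machines tend to perform faster:

```cpp
#include <cstdint>
#include <utility>

// Stein's binary gcd: only shifts, comparisons, and subtraction.
uint64_t stein_gcd(uint64_t a, uint64_t b) {
    if (a == 0) return b;
    if (b == 0) return a;
    int shift = 0;
    while (((a | b) & 1) == 0) {   // factor out common powers of two
        a >>= 1; b >>= 1; ++shift;
    }
    while ((a & 1) == 0) a >>= 1;  // make a odd
    do {
        while ((b & 1) == 0) b >>= 1;  // make b odd
        if (a > b) std::swap(a, b);    // keep a <= b
        b -= a;                        // odd minus odd is even
    } while (b != 0);
    return a << shift;  // restore the common factors of two
}
```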

Henri Poincaré is discussed, and I get the sense that the constructivist view he espoused may in fact be the ruling ideology of software development. Poincaré's point is to prove ideas in tangible form rather than accept as true ideas that exist primarily as proposals. That includes age-old mantras about performance. The right adjustments, proven in real form, can make a huge difference in what we know to be true.

Crypto

Now following the order of the book's chapters, chapter 13 is where the journey begins to wind down. As the authors state, this is the chapter where all that was presented before is put into action. We look at software security in the form of encryption and decryption, the means of keeping information secret on a computer.

We look at some of the history of computer security in the area of cryptography, including the happenings at Bletchley Park during wartime. The events there show us how the pursuit of cryptanalysis may have accelerated the emergence of digital computers. Alan Turing and others feature prominently in that tale.

Later we look at various approaches useful to the composition of cryptographic keys, including the Miller-Rabin primality test. The approach from Agrawal, Kayal, and Saxena, described by Andrew Granville, is put in context as being more accurate than Miller-Rabin but not fast enough to be acceptable in many cryptographic solutions. I understand from that reading that sometimes the best, most accurate, and most precise solution may not be chosen due to operational concerns. Practicality often rules the day.
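To show the shape of the test, here is an illustrative Miller-Rabin sketch for 64-bit inputs, with my own naming and assuming a compiler that provides unsigned __int128 (GCC and Clang do); real key generation runs the same idea on multi-precision integers:

```cpp
#include <cstdint>

using u64  = uint64_t;
using u128 = unsigned __int128;  // compiler extension, assumed available

// Overflow-safe modular multiplication via a 128-bit intermediate.
u64 mulmod(u64 a, u64 b, u64 m) { return u64((u128)a * b % m); }

// Modular exponentiation by repeated squaring.
u64 powmod(u64 base, u64 exp, u64 m) {
    u64 result = 1;
    base %= m;
    while (exp > 0) {
        if (exp & 1) result = mulmod(result, base, m);
        base = mulmod(base, base, m);
        exp >>= 1;
    }
    return result;
}

// One Miller-Rabin round with witness a, where n - 1 = d * 2^r, d odd.
bool passes_round(u64 n, u64 a, u64 d, int r) {
    u64 x = powmod(a, d, n);
    if (x == 1 || x == n - 1) return true;
    for (int i = 1; i < r; ++i) {
        x = mulmod(x, x, n);
        if (x == n - 1) return true;
    }
    return false;  // a is a witness that n is composite
}

bool is_probably_prime(u64 n) {
    const u64 small[] = {2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37};
    if (n < 2) return false;
    for (u64 p : small) {
        if (n == p) return true;
        if (n % p == 0) return false;
    }
    u64 d = n - 1;
    int r = 0;
    while ((d & 1) == 0) { d >>= 1; ++r; }
    for (u64 a : small)  // fixed witnesses, reportedly deterministic for 64-bit n
        if (!passes_round(n, a, d, r)) return false;
    return true;
}
```

For cryptographic key sizes the inputs are far larger and random witnesses give a probabilistic guarantee, which is exactly the practicality trade the chapter discusses.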

I think the authors did a great job in chapter 13 distilling a broad set of information into a few pages. They note that present-day cryptographic solutions may hinge on how efficiently inputs can be factored, a dependency I think could be weaker in practice than the widely accepted view admits, if all we rely upon is the public absence of practical methods for factoring large inputs. That aside, the presentation of cryptography is very clear and fully coherent in one chapter.

Highly Rated

This is a superb book. I am not 100% convinced of the idea of generic programming as represented here, but I nonetheless think the approach presented is solid. I reserve some skepticism only for cases where the present-day, conventional machine model dominates all other concerns; I am talking about extreme, tightly specified performance and not the general case of acceptable operation. For the latter, I can see that the approach of this book is worth pursuing. The authors do show, in a conceptual way, how performance through the generic programming approach could meet or beat conventional methods. The only way to know for sure is to try it. If it works, you have a more valuable design that is adaptable, reliable, and efficient, which is a good thing indeed.

Chapter 14 ends the book, beginning with a highly condensed overview of the preceding chapters. You necessarily conduct the generic software process from a more specific state to one with broader reach by reframing the operating context as one that is, itself, less specific. The authors maintain that speed is important, since an adaptable solution that is slower may be considered less desirable and seldom put into practice.

Thinking about the book as a whole, I feel very good about it. The language is superb, crisp, and clear. The book concerns a single primary topic, but the information presented is multi-faceted, mentally stimulating, conceptually invigorating, and rich with ideas and excellent advice. Proceeding through the narratives gave me a deeper, clearer understanding of a systematic thought process, one that breaks with stereotypes of such processes by not eschewing intuition and prerogative in the choices one may exercise in defining the approach to a solution.

At the beginning, I did not know anything about the author; I was simply curious. While the book's premise was contrary to what I understood as a governing practice in software (that a specific solution is the path to higher performance), I nonetheless approached it with an open mind and decided to explore, at least conceptually, the generic approach. What I know now is that between specificity and generality, neither outweighs the other. The better approach depends entirely on your goals, resources, and allowances of time.

In that light, perhaps all generic solutions are actually specific solutions of a different kind, at least until an even more generic solution comes along. What we call specific solutions today are simply more general than their predecessors in some form; they have yet to be expressed in a way that is tangibly more adaptable in an acceptable fashion. All that to say there is much here to benefit solutions of many kinds. The greatest benefit, what this work may encourage, may be the evolution of reason and intuition in a more powerful direction, to create solutions that work better to reveal and advance reality more usefully.
