A brilliant article over at Dr. Dobb's Journal asks a great question about computer programming languages. The article, titled "Theory Versus Practice: The Great Divide in Programming Languages," explores the difference between programming languages designed to run on today's computers and computers designed for the best programming languages. The article is short and concise, and the author builds his case very well.
Programs for Computers versus Computers for Programs
As the author states, the difference is almost irrelevant since the outcome may be the same. I like his conclusion about how a language based on mathematics would differ from the computer programming languages of today. The kinds of languages you could have, depending on how you design the computer, could lead to very different ways of designing programs.
The bottom-line truth is that programming languages from the 1970s to today are designed to translate human words into sequences of 1's and 0's. The patterns of 1's and 0's, the logic processor instructions, and the memory and device addressing system mean that any programming language you design is limited in how it can be defined.
All languages have to mirror the underlying computer in some way. CPU instructions, RAM, disk storage, and devices are concepts that are reflected in programming languages to greater or lesser degrees. Tracking raw sequences of 1's and 0's, or assembly mnemonic codes, is mentally challenging for most people. Today's programming languages strive to streamline the representation of these concepts in order to boost the productivity of the programmer.
Once the source code is defined, it is translated into a representation that the computer can directly execute. Programming languages can differ only so much from the underlying computer's structure and operation, so most programming languages of a given class take similar approaches to how source code is defined. In the end, all a programming language does is help the programmer define a computer program in a way that is easier than writing sequences of 1's and 0's while remaining compatible with what the computer requires.
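One concrete way to watch this translation happen is Python's built-in `dis` module, which prints the lower-level instructions that a high-level expression becomes. This is a minimal sketch; the `area` function here is just an invented example:

```python
import dis

# A small, invented example: one friendly high-level expression...
def area(w, h):
    return w * h

# ...that the interpreter has already translated into a sequence of
# simple stack-machine instructions, one step closer to the hardware.
dis.dis(area)
```

The printed instructions (loads, a multiply, a return) are not yet CPU machine code, but they illustrate the same point: the surface notation a person writes is rewritten into a fixed repertoire of primitive operations.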
The Way Computers Are Now
The C programming language, created in the 1970s, appears to be the language that most fully maps to the way computers are designed while remaining reasonably productive. Given the way computers are actually built, assembly language is the perfect fit, but it is generally unsuitable for building programs the way people prefer to build them.
Programming languages such as Java, Visual Basic, Python, and PHP seem to be among the languages most people prefer. They all correspond to the design of today's computers. Those languages and their descendants will remain useful as long as computers consist of the basic components they currently use, designed the way those components are designed now: CPUs from Intel, RAM, disk storage, and devices.
The Way Computers Could Be
One thing is certain: computing, in terms of a computing mechanism, is about unit patterns. You can have 1's and 0's, a DNA sequence of G, A, C, and T, or the decimal digits 0 through 9, as examples. You could even have vibrating crystals in which a resonant frequency of sound is the unit of computation. Or you could have light waves in a vacuum, with the photon, the spacing between photons, or some other attribute of a photon as the unit of computation. However you do it, the way you interact with the computer is the same: you have to submit inputs as data or code.
When we used punch cards, the code was a pattern of holes punched out of a strip of stiff paper or metal. That was nice because there was no way for the code to be erased or to contract a virus. The downside was that when you needed to update the code, you had to punch out a new strip. The punch card was designed for the computers of its time. It would appear, then, that computers determine the shape and boundaries of a programming language.
Controlling a Computer
You cannot really control a computer. The control is, for the most part, an illusion. When power runs through a computer, current cycles through different circuits, taking one path through the circuitry or another. While the computer is on, it sits there channeling electrical current through wires. The input of data and the output of information is an effect of mathematics: an output is calculated for reception by people, or as input to other processes. Controlling a computer means successfully activating, or preventing the activation of, different states of operation. I have to stop there, since going further would require technical language.
Designing a Programming Language for a Computer
The author of "Theory Versus Practice: The Great Divide in Programming Languages" makes a good point about the efficacy of actual mathematics versus the computer programming languages in use today. It is such a profound point that you wonder whether it would be possible to define a computer that operated according to algebraic notation rather than basic arithmetic.
The most sophisticated business formula for financial processing has to be reduced to basic arithmetic when translated into computer-based form. Scientific formulas have to be broken down into basic arithmetic in order to work with current computer systems. There are exceptions, such as Mathematica, but even programs like it present the convenience of higher-level mathematical notation while translating it back into basic arithmetic, or its assembly-language equivalent, behind the scenes. As such, basic arithmetic and its corresponding order of operations is an inescapable aspect of the way programming languages must represent mathematics.
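To illustrate with an invented example (not one from the article): a compound-interest formula that a programmer writes as a single expression is, underneath, nothing but a fixed sequence of basic arithmetic steps:

```python
# Hypothetical example: compound interest, A = P * (1 + r/n)**(n*t),
# written the way a programmer usually would...
def compound(p, r, n, t):
    return p * (1 + r / n) ** (n * t)

# ...and the same formula spelled out as the basic arithmetic steps
# the computer actually performs, one operation at a time.
def compound_steps(p, r, n, t):
    ratio = r / n               # divide
    base = 1 + ratio            # add
    periods = n * t             # multiply
    growth = base ** periods    # exponentiate (itself built from simpler ops)
    return p * growth           # multiply

print(compound(1000, 0.05, 12, 10))        # both print the same amount
print(compound_steps(1000, 0.05, 12, 10))
```

The "formula" the programmer sees is a convenience of notation; what survives translation is only the divide, add, multiply, and exponentiate operations, executed in order-of-operations sequence.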
Mathematica costs a lot of money, and it is not at all certain that you can produce the range of programs with it that you can with general-purpose languages like C, C++, Java, Visual Basic, and PHP. It would seem the author of that great article would welcome both a computer better designed for higher mathematics as people actually use it and computer languages based more faithfully on higher mathematics.
Functional languages like Lisp, Haskell, and Microsoft's F# seem closest to the structure of mathematics, though they do not adopt all of its notation and symbolic conventions. For now, it looks like those wanting higher mathematical expression in a general-purpose language will have to wait a while longer. Until then, the hundreds of programming languages available today, along with specialized environments like Mathematica, offer something for just about everyone.