The Gap Between Standard and Scientific/Engineering Computing

Much is said in industry trade journals about Big Data and Business Intelligence. Companies and other organizations want to take the information accumulated in databases and make better sense of it. The published conversation often turns to the desire to unearth new insights and build on information to better manage operations, save costs, limit waste, and enrich products and solutions. One way to realize these goals is through data technology and the skills that go with it.

Let’s discuss what it means that Fortran remains the dominant computer language for ultra big data and super high-end computer modeling. Observing this shows that what has been normal practice in corporate IT and web companies, namely SQL databases and web scripting front-ends, may not always serve as the right tools for in-depth, performance-critical data processing.

Conventional Data Technology

Speak to anyone involved with computing in most business or home contexts about data and the conversation will eventually lead in one of two directions: either data technology is based on relational databases, or it is based on NoSQL-oriented solutions. Both are highly standardized, with relational data technology the more widely adopted and understood. During the mid-2000s, however, a growing concern was how well relational databases could meet requirements around depth, scale, and speed.

Relational databases are well understood in standard computing environments. The frequency with which they are used can give a false sense that they are the dominant solution for high-capacity, high-speed processing of data models that reveal the answers to difficult problems. Sometimes you run across information that reminds you that general computing is not the baseline for other relevant areas of computing.

The New HPC May Introduce the Previously Established HPC

The term HPC has seen a resurgence in recent times. This emphasis on high performance computing highlights the fact that most of our usual tools, programming languages, data technologies, and other systems tools are suitable only up to a certain level. If you are really clever, talented, skilled, and educated, you can take the conventional tools and stretch them, but you eventually hit a ceiling.

Google may embark on that journey away from stretching the state of the conventional art toward embracing solutions better fitted to the computing problems it faces. Essentially, the idea of commodity x86 servers clustered together with the right algorithms works for a certain class of problems, but there is a ceiling best overcome through better-fitted solutions.

Google’s evolution may be instructive for us all as we contemplate the long-term relevance of our technology investments, be they tools, processes, methodologies, or worldview. What we see with HPC is that the dominant viewpoints about the relevance and primacy of our tools have to be understood in context, alongside those things that may work better for high-scale calculation or widely distributed visualization clients.

So Fortran Rules

In light of all this, it was somewhat surprising to see revelations that, even after all the new computer languages that have arisen since the 1950s, Fortran still reigns supreme in processing really big data and in defining many of the most sophisticated models that exist. That few technologies outside C/C++ can come close to what Fortran achieves is quite remarkable given the level of investment in newer technologies.

Skills in Fortran Are Not Important to Corporate IT

The reality is that corporate IT and web companies generally gravitate toward newer and turnkey technologies. Fortran has no widely known GUI technology for Windows or Apple desktops, smartphones, tablets, or the Web. Certainly, tools exist somewhere that could make this possible, but young explorers entering the world of high-impact, compelling user interfaces are likely to see more conventional toolsets advertised. That means no Fortran in the user interface.

SQL is not the most obvious thing in the world, but it is widely publicized in technology instruction and the trade press. It is the default go-to for data processing in the corporate or web data center, with NoSQL frameworks and solutions close behind.

The common reaction in this environment is that what is old is broken and what is new must somehow be better. The reality is that it all depends. I would never recommend using C++ to write a web application, as C# does a much better job on average, but there are exceptional cases where PHP, C, or both might be more suitable than C# or Java.

I would never recommend using flat files to run a company’s financial systems, as SQL provides better tools for data accuracy, consistency, and transactional integrity in tracking debits and credits. Yet there are cases where flat files, a far, far older approach to data processing, are exactly the right solution.
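The transactional-integrity point can be made concrete with a small sketch using Python's built-in sqlite3 module (the table, account names, and amounts are hypothetical, purely for illustration): a debit and its matching credit either both commit or both roll back, a guarantee a flat file cannot give you on its own.

```python
import sqlite3

# Hypothetical two-account ledger used only for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance REAL)")
conn.execute("INSERT INTO accounts VALUES ('cash', 100.0), ('payables', 0.0)")
conn.commit()

def transfer(conn, src, dst, amount):
    """Move money between accounts atomically: the debit and the
    credit either both commit or both roll back."""
    try:
        with conn:  # the connection context manager wraps a transaction
            conn.execute(
                "UPDATE accounts SET balance = balance - ? WHERE name = ?",
                (amount, src))
            conn.execute(
                "UPDATE accounts SET balance = balance + ? WHERE name = ?",
                (amount, dst))
    except sqlite3.Error:
        pass  # on failure, neither side of the entry is applied

transfer(conn, "cash", "payables", 40.0)
balances = dict(conn.execute("SELECT name, balance FROM accounts"))
print(balances)  # debits and credits stay consistent
```

With a flat file, keeping the two updates consistent through a crash or a concurrent writer is on you; the database makes it a one-line guarantee.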

Can you fit nearly all corporate IT and web systems into the framework of well-understood systems composed of Microsoft, Oracle, or LAMP-based structures? The evidence of the past few decades suggests that the answer is yes.

Certainly, the time spent and success experienced using today’s conventional technologies reinforce the primacy of those technologies for those of us who have benefited from them. They become our hammer, and as the saying goes, when all you have is a hammer, everything becomes a nail.

That also means that technically superior technologies like Fortran are not necessarily imperative for people to learn, as the marketable appeal of the skill may meet with limited reception in employment. A surge of big data pursuits in the entrepreneurial, corporate, and government spheres may see a reawakening of older tools and methods whose prior art predates today’s patented solutions. Or it may be that returning to big iron and paid compilers is a step too far compared to more affordable x86 servers and generalized software technologies. That’s a case to be made or not. Nonetheless, learning these older tools can be an opportunity should conditions require them.

By Michael Gautier

