Elite Programmers Make Mistakes Too

The Go programming language came from top engineers and language designers, yet humans cannot achieve perfection. Fortunately, one such error was found early. The headline: Devs Working To Stop Go Math Error Bugging Crypto Software. The error affects security. Think data breaches, information leaks, and spyware. Even as it is fixed, what other errors remain? Despite this, Go is still a great language for server-side web development and networked middleware.


Confirmed … Insecure Computer Processors

Latest from Intel: We’ve Found Severe Bugs in Secretive Management Engine, Affecting Millions. What I am about to discuss is a topic that has grown in my mind over the years. When I started out in information technology 20-some years ago, I had no idea how the journey would unfold. I was immersed in IT. Like many, I held a naive expectation of the bountiful possibilities technology could unlock. IT was treated like a kind of magic backed by logic, reasoning, and empirical process. Fast forward to the mid-2000s and a different reality began to unfold.

Data breaches started out slowly. First, you had worms. Remember the old Slammer worm? At the time, we could blame it on oversights by Microsoft. In our collective hopeful attitude, we overlooked such things, thinking in the back of our minds that they would get fixed. Every exploit, data breach, vulnerability, or spyware problem was ultimately explained away with the phrase, “technology changes and things get better.” Over the vast course of time that may be true, but when you are living right now with bad technology that somehow destroys your life, none of those hopeful aspirations serve as consolation.

The data breaches kept expanding in number and scale. The entire time, we blamed this piece of software or that, and much of that blame was deserved. An unwavering faith in hardware led us to think that all would be well if only the software could be written better. If only Intel could produce a more secure computer architecture. Intel purchased McAfee, and many of us thought security improvements were sure to come. Surely, the best engineers could solve these problems. Yet all the tech companies, with all the PhDs money could buy and huge advances in machine learning, could not put Humpty back together again.

Just over 20 years of security flaws. Let’s say we had a top tech engineer and PhD workforce of about 200,000 individuals throughout that time. I am excluding everyone who doesn’t work on core technology (operating systems, drivers, chip designs) upon which pedestrian technology (websites, desktop programs, batch jobs, and apps) is built. That represents roughly 8,320,000,000 hours across all individuals on a 40-hour work week, and that is being conservative. All that money spent, 8.32 billion hours of collective effort, and the results show that digital computer technology is inherently flawed and fragile when it comes to information privacy, security, and protection.

Cryptography systems have been broken. SSL is broken. RSA is broken. PKI in general is broken. Sure, you could say that many breaches occurred due to a failure to update systems. On the other hand, what about the expectation that the original code was supposed to be the best in the first place? Stating that all software has bugs or that flaws are inevitable only proves the point. It is unwise to trust monetary transactions and the communication of sensitive information, such as social security numbers and the like, to digital technology. Given the issues in digital systems, the best move is to revert many systems of a monetary nature, and those involving sensitive individual information, to the analog systems and processes of the pre-Internet era.

Net neutrality reversal? Who cares. Let the big companies charge a huge toll, because the medium itself is untrustworthy to begin with. Alternative voices are proposing the creation of a new Internet. The counterargument: on what basis or track record can anyone prove they have the methodology, first principles, and end-to-end conceptual model to define a better Internet, when not even billion-dollar companies can pull it off? Apple’s Face ID was recently cracked by a 10-year-old. Absolutely nobody understands 100%, rock-solid digital security. If they did, they would have to burn the entire tech status quo to the ground and put in place new technology that is 100% incompatible with what came before. Anyone pursuing that is hopefully not using the cloud, as it turns out that Over 400 of the World’s Most Popular Websites Record Your Every Keystroke.

Top hackers have long known everything I’ve stated above. They are intimately familiar with the fact that digital technology is an open book. As soon as you decrypt data, it is transparently visible on the computer itself, from the CPU to the hard drive and everything in between. You cannot hide information in a computer. That is part of the reason Why Hackers Reuse Malware: it works. When you think deeply about the interacting parts of a computer, as defined since von Neumann, it really is like Neo’s Matrix in terms of the ability to fully bend it in directions good and not so good.

The latest question on Slashdot asks How Are So Many Software Vulnerabilities Possible? The question covers only part of the picture when you consider that the hardware itself is vulnerable as well. Looking just at software, however, a major contributor to the problem is an overabundance of emphasis on the business concept of opportunity cost. Beyond that are the narrow deadlines and the unwillingness to slow down long enough to curate and cultivate a security architecture upon which a system is defined. It is still a features-and-functionality business, with security treated as a separate issue given less investment. Unfortunately, even if the developers write their business programs well, there remains the problem of the operating system, database technology, drivers, hardware, and so on. A weak link breaks the entire chain, and digital technology has many weak links.

Finally, I end with this post, Everything is Broken. That post states more eloquently what I am implying here, though I reach a slightly different conclusion. Continue to use computers. Use them for fun, games, convenience, moving data around, and talking to friends and family. Useful automation awaits when private individuals’ information is not part of the mix. Just be aware that while casual or professional activities may be okay, using the Web, mobile, and computers for sensitive communications is a huge risk.

Samsung Galaxy Linux Computer in a Phone

Demo of Linux running on a Samsung smartphone. Very smooth and convenient. A way to get full desktop functionality on an ARM chip (Qualcomm Snapdragon or Exynos; I’m fine with either): run full desktop software and have the ability to write programs for that same processor. Ubuntu Edge has returned. It would be nice to see Fedora and Red Hat as well. Anyway, this is nothing new, but Samsung’s approach provides a solid alternative in terms of a widely deployed computing environment that is productive and flexible.

Qualcomm Laptop

Qualcomm chips and others like them power the majority of smartphones and smartwatches. Such chips are increasingly showing up in cloud data centers. Some of us speculate that Apple will eventually produce a MacBook that runs exclusively on a chip like this. Microsoft already tried it with the Surface RT, which failed for reasons unrelated to the chips in question.

Now, it turns out that Google is feverishly working to clean up their data centers due to a hardware problem that could introduce a vicious form of malware. Indeed, the computer security situation is potentially quite precarious. The right triggers can result in Google data centers hosting Gmail and Docs leaking info Equifax style.

The hardware with this potential issue is the same hardware running Microsoft Azure, Office 365, Amazon.com, and Amazon’s cloud. It is the same hardware running Facebook’s sites, Twitter, and many banking sites. Basically, all the hardware of a certain brand used by most large companies has the same flaws Google is working to root out.

Meanwhile, this flaw in all this big-company hardware may also exist in everyday laptops. That is unfortunate and potentially a huge letdown. You go to all this trouble to work with computer architecture, assembly language, systems concepts, component specifications, and default assumptions, only to learn that the machine isn’t what it is billed to be.

A sea change in hardware would start things anew. It would provide a chance at hardware that works fully as advertised. Most importantly, it would allow the tech community at large (IEEE, universities, and research institutions) to revisit computers from an open specification that also fundamentally addresses security.

C++ UI using Functional Composition

I think the following would be a neat concept, if doable: produce a UI in C++ using functional composition similar to what you have in functional programming languages. An example is Buckaroo’s Reactive Terminal Demo, which is quite solid. Functional composition has a mathematical orientation, as can be seen in the StackOverflow question about function composition in C++. An existing framework by the name of Sodium is on GitHub and is seeing some adoption. I looked into Sodium and understand that the C++ version is still a work in progress. Meanwhile, I have been reading a few functional programming books over the last 2 or 3 weeks, and there are some good perspectives that I will discuss later.

Following my brief research into functional programming proper (namely Haskell, Lisp, and Clojure), I concluded that a structure similar to the following may be useful. The implementation is in C++ and follows the latest writing of David Vandevoorde, Nicolai Josuttis, and Douglas Gregor in their book, C++ Templates: The Complete Guide. Anyway, if I can get the following working, then I’ll have more to say on the intersection of functional programming and C++. As it stands, the latest build scripts for Fedora Linux and the basic skeletons for the .hxx and .cxx files are as far as I’ve gotten. The following link is to the git commit reflecting these changes.


Most Used Operating System: MINIX

Professor Andrew S. Tanenbaum made an operating system 30 years ago named MINIX. He wrote a letter recently in which he highlighted the little-known tidbit that his operating system may rank among the most widely deployed systems of all time. The surprise is that, although he could have made billions, he licensed his operating system out of the goodness of his heart so that even major corporations could use it and make billions. Harvesting free stuff to make proprietary, closed stuff. Meanwhile, the move validates MINIX as an industrial-strength microkernel.

Compiled Code is Faster (Python vs C)

Paul Pacheco, who works at American Airlines, explains in succinct layman’s terms how C code executes faster than Python. Steve Baker concludes that present-day computers and their processors are designed to accommodate the C and C++ programming languages. That makes sense, given that C is said to have used features of the PDP-11 during the creation of UNIX. It is likely that the PDP-11 influenced C. Likewise, if the PDP-11 influenced the Intel x86, then it follows that C’s design aligns with Intel-based computers.