2017 IT Downsize

HP is cutting jobs, and that has people questioning the viability of the tech sector. An economic downturn may be on the horizon. Maybe. Looking at the layoffs of the last few years, they appear to point to product saturation in some categories and waning interest in others. Many other causes of layoffs and an expected downturn apply as well.

The safest job in IT is the network admin. It is difficult to outsource that position when the network is actually down. Who’s going to remote in?

The second safest job in IT is the help desk. Not every position, but at some point, mailing in broken equipment only goes so far. Some things require onsite assistance.

The rest is truly up in the air. Opportunities exist, but they seem increasingly specialized. A workable combination is subject-matter expertise in one or more business and operational disciplines along with the ability to integrate solutions well.

Anyway, the next one to three years will bear careful observation.


Some Meaning in Hewlett Packard Retail Cuts

HP’s Enterprise division is doing well, but the consumer business is under stress. HP announced it would cut thousands of jobs on top of the thousands already cut over the last few years. I did not realize HP had 50,000 employees. More cutting is probably on the horizon.

None of this means that HP will stop selling computers at retail in the consumer space. Rather, it is increasingly difficult to offer a product that is losing relevance for a growing number of people. Tastes and preferences have changed, and the only company holding steady is Apple.

What we can learn from this is that computers have passed a threshold in terms of what people expect out of them. I find them convenient to type on, but I recognize that an Apple iPad Pro with an attached keyboard may work just as well, plus you can write on it. Laptops and desktops simply could not compete in the area of convenience. These other devices are more convenient.

Apple MacBook Innovation

Not too long ago you couldn’t put much stock in rumors about Apple products. Times are a little different now. Apple has kept the processors and circuitry of the MacBook Air and 13″ Pro lines up to date at a reasonable clip. New rumors are out that seem very plausible regarding changes to the design of MacBooks. If true, this could represent a real sea change in what people expect from computer design.

The rumors concern touch-screen-based function keys and e-ink technology for the keyboard keys. That’s all most people know. If the past is any indication, this is just the tip of the iceberg.

I will admit that I am not too intrigued by these types of changes, but I will also admit to having been quietly impressed with some of the designs that have emerged from Apple. The refresh of the MacBook line with the Core M processors to produce a super-thin laptop was very interesting. I now see the benefits of the Retina display. I am hoping for a superb design and performance update in the MacBook Pro refresh as well.

Whatever the case may be, the announcement of new MacBooks will be a strategic setup for the 10th anniversary of the iPhone in 2017. MacBooks are the primary fuel for app creation across iPhone, iPad, Apple TV (tvOS), and Apple Watch (watchOS). It will be interesting to see what comes.

Apple MacBook Refresh Could Bring E-Ink Enabled Keyboard

Premature Optimization is NOT Evil

One of the biggest lies in the computer programming arena is the use of Donald Knuth’s quote that “premature optimization is the root of all evil.” He was not wrong if you read the full quote, which spans more than a sentence: “We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%.” The problem is that several generations of software developers have been brought up on the abbreviated version, out of context. Do not believe it.

A major factor in deciding not to optimize software was Moore’s Law. That law concerns transistor density, not the speed of a computer, and it doesn’t actually hold up as a promise of speed. When people thought computers would just keep getting faster (doubling or tripling in speed), they decided buying a new, bigger, faster computer was cheaper than spending the time to optimize the program. I personally agree with the economic argument in situations where it applies. Still, you can never quite count on today’s data input/output profile remaining constant.

The point of view that you can just throw more hardware at the problem broke down somewhere around 2001 to 2008, depending on how you look at it. The fact is, computers have barely crossed the 5GHz barrier, and even when they do, that is nowhere close to the 1THz barrier and beyond, at least in the near future. The main approach we have today to increasing performance with new computers is simply adding more of them and running code across multiple cores and multiple computers at the same time.
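As a rough illustration (mine, not from the original post), here is a minimal C++ sketch of that “add more cores” style of scaling: one computation split into several tasks that the runtime can schedule on separate cores.

```cpp
#include <algorithm>
#include <cstddef>
#include <future>
#include <numeric>
#include <vector>

// Sum a vector by splitting the work into `parts` independent tasks.
// std::async can run each chunk on its own core; the results are
// combined at the end. Same answer as a single-threaded loop, but the
// work can spread across the hardware instead of one faster clock.
long parallel_sum(const std::vector<long>& data, unsigned parts) {
    if (parts == 0) parts = 1;
    std::size_t chunk = data.size() / parts + 1;
    std::vector<std::future<long>> tasks;
    for (std::size_t begin = 0; begin < data.size(); begin += chunk) {
        std::size_t end = std::min(begin + chunk, data.size());
        tasks.push_back(std::async(std::launch::async, [&data, begin, end] {
            return std::accumulate(data.begin() + begin,
                                   data.begin() + end, 0L);
        }));
    }
    long total = 0;
    for (auto& t : tasks) total += t.get();  // wait for and combine chunks
    return total;
}
```

The chunking arithmetic is the whole trick: the speedup comes from doing the same work in parallel pieces, not from any single piece running faster.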

Cloud computing is the ultimate expression of this. That also partly explains Amazon’s fortunes in the cloud (closely followed by Microsoft), as many enterprises cannot get similar price/performance in their own data centers. Scaling a software program through the cloud is an inexpensive way to get performance, until the compute hours cross a certain threshold.

What’s left to do?

Write optimized code from the get-go. Or optimize what may not even be code: database structures, the data that code and other processes rely upon, even simple things like the structure of your file folders. The trade-offs between complex and simple file formats are not clear-cut; sometimes a complicated file format leads to a faster program than a simple one, often because of how much information you need to retain while processing the data. How the data is shaped, where it lives, and where it must go has the biggest impact on performance. Code must be organized accordingly.
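To make the “shape of the data” point concrete, here is a small sketch of my own (not code from the post) contrasting two layouts of the same records. Both hold identical data, but the second keeps each field contiguous in memory, which is typically friendlier to the cache when you scan a single field.

```cpp
#include <vector>

// Array-of-structs: each record's fields sit together in memory.
// Scanning just the prices still drags every record's other fields
// through the cache.
struct RecordAoS {
    double price;
    double weight;
    int id;
};

// Struct-of-arrays: the same information, reshaped so each field is
// one contiguous array. A scan over prices touches only price data.
struct RecordsSoA {
    std::vector<double> price;
    std::vector<double> weight;
    std::vector<int> id;
};

double total_price_aos(const std::vector<RecordAoS>& rs) {
    double total = 0.0;
    for (const auto& r : rs) total += r.price;
    return total;
}

double total_price_soa(const RecordsSoA& rs) {
    double total = 0.0;
    for (double p : rs.price) total += p;
    return total;
}
```

The answers are identical either way; the difference shows up in how much memory traffic each scan generates, which is exactly the kind of decision that is hard to retrofit later.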

Sometimes you are in an enterprise with longer equipment upgrade cycles, resulting in tighter hardware constraints by the standards of today’s software frameworks and libraries. Software still has to run fast, and it can, if you stop counting on scaling it later on newer hardware that will not arrive within your near-term deadlines. Perhaps you are deploying to mobile and need the best performance you can get in a tightly constrained space. While mobile processors are often said to be several times faster than the supercomputers of the 1970s and 1980s, they are still not as fast as today’s desktop and laptop processors. You still need good performance in what is, by today’s standards, a constrained processing envelope. Even laptop battery life can be influenced by efficient code. As it turns out, just like security, you cannot easily retrofit solid execution performance after the fact. You have to start with performance as a goal and have the skills to write high-performance software.

I found several good starting points for this:

Why are programmers so anti-optimization? Quora answer by Kurt Guntheroth, author of Optimized C++

Writing High-Performance .NET Code

Systems Performance: Enterprise and the Cloud

New 4K and 8K Displays Probably Means New GUI Standards

Many desktop GUI toolkits are based on 96 pixels per inch or something similar. Most of these toolkits work well at 1080p and lower resolutions. Now, with 4K and 8K (and my dream resolution of 16K) displays, the number of pixels involved exceeds, for now, what the default settings of these toolkits handle well. That can mean areas of a GUI program that are too small, too narrow, or barely noticeable.

Yes, 4K and 8K resolutions are a huge leap in the amount of visual information you can pack onto the screen.

I ran into this very directly recently while writing a C++ program using the Fast Light Toolkit (FLTK 1.3.3). The program logic dynamically adjusts screen areas based on the overall resolution. It is accurate to say I use a proportional layout approach that resizes various GUI widgets based on percentages translated into pixels. The results are excellent.
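The proportional layout idea can be sketched roughly like this. This is a simplified illustration, not the actual program, and the helper names are hypothetical; the point is that widget geometry is derived from percentages of the window size, so the same proportions hold at 1080p and at 4K.

```cpp
// Convert a percentage of the window's width or height into pixels,
// rounding to the nearest whole pixel. Widget geometry derived this
// way scales with resolution instead of being hard-coded.
int pct_to_px(double percent, int total_px) {
    return static_cast<int>(percent * total_px / 100.0 + 0.5);
}

struct Rect { int x, y, w, h; };

// Hypothetical helper: compute a widget's rectangle from percentages
// of the window size. The resulting pixels differ per resolution,
// but the on-screen proportions stay the same.
Rect layout(double x_pct, double y_pct, double w_pct, double h_pct,
            int win_w, int win_h) {
    return { pct_to_px(x_pct, win_w), pct_to_px(y_pct, win_h),
             pct_to_px(w_pct, win_w), pct_to_px(h_pct, win_h) };
}
```

For example, a widget declared as 50% wide computes to 960 pixels at 1920 wide and 1920 pixels at 3840 wide, occupying the same share of the screen either way.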

Except you cannot easily control fonts in FLTK. The font size in my program is too small. I can solve it, but not the way I want. In the 1.3.x series of the toolkit, API ease of use and results fall apart in the area of fonts. After a few hours of research I learned there are code libraries from others who ran into the same issues. While they do provide font overrides for FLTK, the situation gave me pause to consider the future. Do I really want extra code at the application level just for fonts? Not really. In fact, I am attempting to slim things down.
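One common shape for such an application-level workaround, sketched here as an assumption rather than as FLTK’s actual API, is to compute a resolution-aware font size yourself and then hand the result to whatever font-size call the toolkit exposes:

```cpp
#include <algorithm>

// Hypothetical workaround: scale a base font size by the ratio of the
// actual screen height to a 1080p baseline, clamped to sane bounds.
// The result would then be passed to the toolkit's font-size setter.
int scaled_font_size(int base_size, int screen_h) {
    int size = static_cast<int>(base_size * (screen_h / 1080.0) + 0.5);
    return std::max(8, std::min(size, 96));
}
```

A 14-point base size stays 14 at 1080p and becomes 28 at 2160p. It works, but it is exactly the sort of per-application font plumbing the post argues a toolkit should be handling itself.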

While I expect native GUI toolkits to adapt (this is not a problem for certain GUI toolkits for .NET and Java, by the way), it would probably be best to have great alternatives that are more adaptable and forward-looking: alternatives that are heavily vector-based. Some exist, but a few of the ones I saw had issues. This is less of an issue for mobile platforms (phones and tablets) today, as they have yet to reach 4K across the board, but that time is coming soon.

Major Tech Lesson from the Samsung Galaxy Note 7 Recall

The Silicon Valley mantra of “fail fast” does not work in every case. Avoid Silicon Valley mantras when working on information technology. Real life is more complicated. Facebook took many years to become a major company. Apple, Microsoft, and several other large companies had, and likely still have, their intense moments, but good things take time.

The Galaxy Note 7 situation appears linked to attempts to get a top-grade phone out weeks ahead of the iPhone 7 announcement. A laudable goal, and indeed a clear act of decisive leadership. The problem is that complex engineering cannot be rushed, even when you have an established process.

The “good old days” of the 1980s and 1990s featured information technology and processes that were much simpler than what exists today. Back then, you could rush things out, and even with a few rough edges, you could still succeed. Today, customers expect finished items with no show-stopping flaws. Along with that, all the simplified graphical user interfaces and hardware form factors are possible because of more, not less, complexity behind the scenes. You cannot rush any of it and have a stable product.

The lesson is, if you are going to engineer a solid solution, you must take your time. Plan well. Test thoroughly and triple-check everything. Otherwise, the cost may be your very livelihood.

Samsung Permanently Discontinues Galaxy Note 7

Great Decision Made on Software Patents 

Software is not patentable. Software is a form of language: a specification, not a technology. Software enables technology. The computer itself, which is the actual technology, is merely reconfigured in terms of the signals it receives, processes, and outputs based on software triggers. The computer as a technology is widespread and in the public domain, and software is the common language for reconfiguring computers to do certain things following generic, repeatable steps. Things like opening a file, showing a file on a display, and sending the file to someone else. That file could be a text message, a photo, a document, or a set of inputs into a database or data-driven business situation.