The other announcement, Amazon Aurora, means you can dump those SQL Server, Oracle, and DB2 databases. Use those $200 Chromebooks to access structured data in the cloud. Scrappy start-ups have a much faster path to scaling their operations. No more huge capital sunk into IT equipment, because Amazon is the data center in the sky.
No more local IT and masses of equipment on site. Maybe just a few small routers in the data center. Contract IT is called in on demand rather than maintained as salaried staff. The operating bill goes from really big to almost nonexistent. That is the glimpse of future IT life these announcements represent.
Microsoft and Google hope to join Amazon on this journey to be the world’s data center. Amazon got there first and has the most successful model to date. Since Amazon had no stake in traditional IT, they were free to totally redefine the landscape. Indeed, they are using that freedom to remake IT in as pure a form as they can. The truth is, Amazon and those that follow in their footsteps are taking on the IT model themselves. Theirs is IT on steroids, but to you it looks like things getting smaller and nimbler.
This is not always an optimal solution. Local IT gives you speed, control, and opportunities you do not gain with the cloud construct. Traditional IT is solid, though its quirks can be unsettling. Regardless, well-defined IT, applied in alignment with the true possibilities for an organization that utilizes technology, is a huge opportunity.
Absent that realization, a more advanced form of these cloud services 10 or 20 years from now may put the term IT into disuse. What was IT may evolve into a cadre of highly adept power users who craft solutions under the auspices of the business analyst department. You’ll have programmer/analysts, marketing officers, and others running about knitting new solutions. Meanwhile, the skills of sustaining network infrastructure, conducting tech support, and defining software architectures will be a distant memory. When your $200 Chromebook fails, you just buy another, because all your systems are in the cloud.
When you run a distro such as today’s latest Ubuntu LTS release, your Linux kernel version is 3.13. The latest version as of this writing is 3.17. That is usually not a problem, as Ubuntu developers backport relevant, significant updates from the latest kernel into the kernel supported in Ubuntu. Now, the question is, are there benefits to running the latest Linux kernel version? I think so, but I think the reasons for an older kernel in an LTS release are valid as well. It is all about stability rather than features. That is not the important story today, though. The point today is how the situation will develop in early 2016. Continue reading
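As a small aside on checking where your own system stands: kernel version strings compare numerically, not lexically (3.9 is older than 3.13). This is a minimal sketch, assuming a standard `major.minor.patch` release string; the `kernel_tuple` helper is illustrative, not part of any Ubuntu tooling.

```python
import platform

def kernel_tuple(version):
    """Parse the numeric part of a kernel release string,
    e.g. '3.13.0-24-generic' -> (3, 13, 0)."""
    numeric = version.split("-")[0]
    return tuple(int(part) for part in numeric.split("."))

# Compare the running kernel against the current mainline release.
running = platform.release()   # e.g. '3.13.0-24-generic'
mainline = "3.17"
if kernel_tuple(running) < kernel_tuple(mainline):
    print("Running kernel is behind mainline", mainline)
```

Tuple comparison handles the 3.9-versus-3.13 case correctly, which a plain string comparison would get wrong.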
Web platforms are among the tools I have used to present and capture information since 1998. Yes, I have written much about, and shared research on, native client app technologies applicable to desktops and mobile. In my opinion, principles applicable to web development can be relevant in the native code space and vice versa. Regardless of platform, scaling up the performance of systems is something I have explored in depth. I was curious to see what Kyle Loudon had to say about performance in his book, Developing Large Web Applications. Continue reading
The EFF is leading an effort to ensure computer APIs are not eligible for copyright. When I read the post on Slashdot, I thought it was a good idea. As I read the notice on the EFF’s website, I realized there are substantive ideas behind the effort. It seems that 77 computer scientists, many of whom are leading thinkers in the field of computing, are collectively saying that copyrighting computer APIs is not a good idea.
Several parties are involved in the matter, and I have absolutely nothing to say about that, because what is important is the concept itself. The question is, should computer APIs be copyrighted, patented, etc.? The questions this raises in terms of societal imperatives are centrally pertinent. During my thinking on this, I identified three questions to explore related to this issue. Continue reading
Encryption is a valuable means for keeping information relatively secure. The question remains: is there a valid case for designing encryption so that there is another way to undo it besides the normal one? Ars Technica has presented an article describing support for the ability to undo encryption in the case of malicious parties. The position of the technology community is that encryption is a neutral technology, neither negative nor positive, and that if you make it possible to undo it outside the normal mechanisms, you make encryption useless. It would be the same as not having encryption at all. Others advocate for the ability to undo encryption, as that can help in certain circumstances. What is the right way to go? Let’s explore this for a moment. Continue reading
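To make the “normal way to undo it” concrete, here is a minimal sketch using an illustrative XOR stream cipher (a toy, not a production algorithm): the same operation encrypts and decrypts, and only the matching key recovers the plaintext. Any built-in second way to undo it would amount to a second key that works for everyone, which is the community’s objection.

```python
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte with the repeating key. Applying the same
    operation twice with the same key returns the original data."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

message = b"meet at noon"
key = b"correct key"
ciphertext = xor_cipher(message, key)

# Only the original key undoes the encryption; a different
# "backdoor" key yields garbage, not the plaintext.
assert xor_cipher(ciphertext, key) == message
assert xor_cipher(ciphertext, b"backdoor key") != message
```

The point of the sketch is the symmetry: the cipher itself is neutral, and the key alone decides who can reverse it.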
Aesthetics and ergonomics in the overall sense of a technology are crucial to gaining more from it. The companies that lead in this area change over time, and it is clear that success in this aspect of a technology’s representation and use can improve the adoption of a particular distribution of it. In computer technologies operating in software, there is a tense balance between appealing aesthetics, direct practicality, technical quality, and sustainable technical design. The extent of some of these qualities may be low relative to the investment made in others. The priority of some may defer to others, and that can be acceptable if the overall solution holds up well enough for the target use. A great undertaking in the design of a solution, then, would be a defined process for raising the overall user experience by realizing higher quality in multiple areas simultaneously. Continue reading