Dennis Schmidt wrote a book called “Way-farer”, set in a distant future (as envisioned back in 1978) where humans have come to the planet of Kensho. In it, a young man called Jerome starts on his road to enlightenment through the Way of the Sword. Rumor has it that a Master lives in the mountains who might teach him, if he manages to convince the Master to do so. He eventually finds the Master and, when he presents his request, is denied. He perseveres, however, and eventually the Master confronts him about his motives. This confrontation is perfectly reusable for the quest of many into the “Way of Microservices”. So here is my adaptation:
The Master squatted, peering into his eyes, his face only a few inches from Jerome’s.
“Who are you?” Abrupt. Harsh.
“What are you?”
“What do you seek?”
“The Way of the Microservice.”
For a Software Development professional, one of the earliest truths you learn is that estimating the delivery effort for a new system is surprisingly hard. A project manager at IBM once told me, back in the nineties, that the difference between a senior developer and a junior one can be found in the finely tuned padding they add to their estimates. This is not to say that you won’t quickly develop a sense for the amount of work needed when the subject domain is in your field of experience; what you learn over time is how much to add for what wasn’t specified explicitly. Frederick Brooks (of The Mythical Man-Month fame) called this the difference between essential complexity and accidental complexity. It leads to the familiar frustration among developers that, even though their estimates were spot-on, the system was nevertheless late, and they get blamed for the discrepancy.
An often-heard remark, especially at management level, is that software development needs to be “industrialized”: just like the production of cars and kitchen appliances, writing software needs to become standardized to the extent that estimates necessarily improve. This is related to the idea that a higher level of maturity in the software development process will automatically bring costs down. What really messes up these expectations, however, is that, in a very fundamental way, producing software is not like producing a car.
Software Quality is a nice subject to write about. So much to say, with plenty of viewpoints to choose from. Wikipedia tells me it is about functional quality (how well the software conforms to its design and requirements) or structural quality (the non-functional aspects). There’s an ISO standard on Software Quality Requirements and Evaluation (the ISO/IEC 25000 series), and another one on Process Improvement and Capability Determination (ISO/IEC 15504). Via that route we can then continue into the process for making a quality product and invoke CMMI, the Capability Maturity Model Integration.
Actually, there’s so much on the subject that you’d wonder why it is even an issue. I personally think there is a simple reason for all this: for software, quality is an inherently vague quality! [sic]
What the whole discussion is about, just to get down to the basics of it, is whether dynamic, interpreted languages will ever get on an equal footing with something like C on a mobile platform. The central point of contention is memory management: Java popularized an approach where you don’t have to clean up after yourself (or even: can’t clean up), because a Garbage Collector will do that for you. If you are a Software Developer or Architect and like reading a well-researched article about the dangers of over-reliance on features such as Garbage Collection, please do read Dan’s article.
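To make that contrast concrete, here is a minimal Java sketch of my own (not taken from Dan’s article): while a strong reference exists, the object is guaranteed reachable, but once you drop it, reclaiming the memory is entirely up to the collector. There is no `free()` to call, and `System.gc()` is only a hint.

```java
import java.lang.ref.WeakReference;

public class GcDemo {
    // Returns true while a strong reference still keeps the buffer reachable.
    static boolean reachableWhileStrong() {
        byte[] buffer = new byte[1024];
        WeakReference<byte[]> weak = new WeakReference<>(buffer);

        // Guaranteed: the collector may not touch an object with a live strong reference.
        boolean reachable = weak.get() != null;

        buffer = null;   // drop the only strong reference; that's all you *can* do in Java
        System.gc();     // merely a hint -- the JVM decides if and when to reclaim

        // From here on, weak.get() may return the array or null: the
        // programmer has no deterministic control over when memory is freed.
        return reachable;
    }

    public static void main(String[] args) {
        System.out.println("reachable while strongly referenced: " + reachableWhileStrong());
    }
}
```

On a desktop JVM this nondeterminism is mostly harmless; on a memory-constrained mobile device, that loss of control is exactly what the debate is about.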
What got me thinking was the vehemence as well as the technical focus of the discussion, for something non-techies are more and more getting interested in: non-functional requirements.
In the nineties I had my first Project Management related training, where I was introduced to the Standish Group’s Chaos Study. This study collected, through questionnaires, information on IT related projects and their success. The study asked for projects to be classified as “Failed”, “Challenged”, or “Successful”, and asked respondents what factors influenced this outcome. The score for successful projects in 2004 was abysmal: only 29%, with 53% “Challenged” (over budget, late, or incomplete). For the training, this study was used to emphasize (among other things) the need for good project management, by referring to the list of factors influencing the result.
At that time I had already seen several projects become challenged due to bad (“less than good”?) project management. However, if you read recent articles referring to the Chaos study, you’ll find that many question the project success classification, challenging the study’s findings by asking whether a project’s success was measured correctly. As an all-round software guy, I know that project success is hard to grasp, but I still found the study’s results very useful because of that list of factors. In my eyes, that was the most significant finding, not the bare success scores. Then again, I accepted the premise that a lot of projects don’t qualify as the success they’re presented as. Some people just seem to prefer denial.
It has taken some time, but my first Arduino sketch worked! I found a great set of pushbuttons with embedded LEDs, perfect for building a Boeing-style MCP, and just got the LEDs to work with my Arduino.
XML is an amazingly successful standard for representing data in a portable, cross-platform readable way. Unfortunately, it also suffers from the limitation Crocodile Dundee noted about certain bush-foods: you can live on it, but it tastes like sh*t. XML is mostly unfit for human consumption.
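To show what I mean, here is a made-up record (the element names are my own invention): a single data point that would fit on one short line in most notations, wrapped in all the ceremony XML requires.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<waypoint>
  <name>EHAM</name>
  <latitude>52.3086</latitude>
  <longitude>4.7639</longitude>
</waypoint>
```

Every value is fenced in by a matching pair of tags, which is exactly what makes XML so robust for machines to parse, and so tedious for humans to read and write.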