
Coding Standards and Naming Conventions
Wednesday, November 30, 2005

In our noble quest to engineer complex software applications, we have, within our vast arsenal of tools, Coding Standards and Naming Conventions.

It seems obvious why organizations use them, but why should a group of highly trained engineers want to write uniform-looking code?

It boils down to the following axiom:
    The implementation of coding standards and naming conventions is an attempt to control quality.
According to statisticians and expert Quality Assurance professionals, including W. Edwards Deming, quality control is not about building things to specification, but about controlling the variability of individual parts forming a whole system.

The idea of variability control leading to higher quality was first implemented in Japan after WWII. Statisticians brought new methodologies to the devastated country, claiming they would help manufacture industrial products with higher levels of quality. The experiment worked quite well, leading to a rapidly growing economy based on an unexpected manufacturing revolution.

At the time, North American business circles held the naive theory that Japanese manufacturing processes were better than North America's because of cultural differences, i.e., the Japanese worked harder and better. We now know that their processes were better because they learnt to decrease variability in the smaller parts used to make bigger components.

When manufacturing physical components, it is easy to measure variability. For example, if we are building a car engine, we can measure all the parts to determine how each piece differs from an "ideal" designed standard. In this case, each piece is probably created with the aid of some kind of mold and automated machinery that makes copies of the model over and over again.

In software engineering, if we are to prove the statistical fact that less variability results in greater quality, we first have to figure out what "variability in lines of code" means - especially when every program ever written is different from every other.

When we build or grow software, our working realm is the abstract and conceptual world. The executing parts of a system are ideas and processes that only exist while a computer is turned on. These abstract systems are put together by different types of minds with different experiences and competencies.

Not having molds to churn out lines of code, we try to reduce the variability of the code we write by implementing and enforcing coding standards and naming conventions. In other words:
    by limiting the variability, we will have a better chance of maintaining and enhancing any particular program, regardless of its complexity or length.
The difference between executability and maintainability becomes key in the process: writing code that compiles and runs at least once is easy - however, writing code that is maintainable is almost an art form.

Software applications are complex to design and implement. This complexity, together with the ever-evolving needs of users, makes codelines non-static entities that must keep changing and adapting to new requirements.

Unfortunately, we can't physically open the hood of our J2EE banking systems and install new components at will. To maintain and modify a particular business rule, we must look at the whole system and figure out how the new change will affect everything else. Thus, by having standardized code styles and well-known naming conventions, the task of maintaining any system becomes easier, as we can go back to the source and understand why a certain algorithm was written the way it was.

However, can we realistically expect our software engineers, who have vastly different experiences, to program identical looking and measurable lines of code?

Software engineers come from an array of different backgrounds. If we give a group of developers a particular problem to solve, we'd probably end up with as many solutions as there are engineers. The best we can do, I think, is to clearly explain why these standards are required and get the buy-in from our development groups. Enforcement levels will vary from organization to organization, but every company should implement coding standards and naming conventions.

I'm a big proponent of any Java shop using Sun's Java Coding Conventions. Most Java developers I know follow them, which makes it easy to come on board at any time, or in any situation, with minimal training required.
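To give a rough idea of what following those conventions looks like (the class and the names below are made up purely for illustration), here is a small snippet written in the Sun style: class names in UpperCamelCase, methods and variables in lowerCamelCase, constants in all caps with underscores, four-space indentation, and opening braces on the declaration line:

    // Hypothetical example written to follow Sun's Java Coding Conventions.
    public class AccountBalanceCalculator {

        // Constants: all uppercase, words separated by underscores.
        private static final double INTEREST_RATE = 0.05;

        // Instance variables: lowerCamelCase nouns.
        private double currentBalance;

        public AccountBalanceCalculator(double initialBalance) {
            this.currentBalance = initialBalance;
        }

        // Methods: lowerCamelCase verbs describing what they do.
        public double calculateYearEndBalance() {
            return currentBalance * (1 + INTEREST_RATE);
        }
    }

Nothing clever is going on in the code itself; the point is that anyone who knows the conventions can read it without asking who wrote it.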

Overall, the less variability there is in the lines of code we write, the better chance we'll have to successfully maintain any piece of software, no matter how complex it is. Moreover, the easier the maintenance phase becomes, the less cost we'll incur fixing defects or adding new features to our ever evolving codelines.


7:37 PM | 2 comment(s) |


Lumping Organizational Experience
Monday, November 28, 2005

Should a corporation lump organizational experience into one whole number? Can any corporation have "100 years of Java experience"?

The thought just came to me while reading some details on Nadler's Congruence Model, or fancy talk for corporate alignment of different things: training, strategic goals, task assignment, etc., etc.

The thought of accumulating organizational experience is ridiculous, but it's worth thinking about. Why is it wrong?

Assume Company X employs ten Java developers, and each claims to have ten years of experience writing Java applications - These developers have worked at different companies over the last ten years. In terms of work years, we can claim that the corporation does, indeed, have one hundred years of Java experience. I.e., each developer has worked with the technology for that long; each developer has mastered the language; Company X is doing nothing wrong by claiming one hundred years of Java experience.

Of course, there is a fallacy in Company X's argument - We need to take into consideration all the overlapping knowledge. Just because our ten developers know how to use threads in a program, it doesn't mean that our corporation is ten times better at writing multi-threaded applications. It's the "Mythical Man-Month" of yesteryear.

Why am I even writing about such silly things? I was bored, so there :)

BTW, I've been playing with Java since it came out in 1995. However, I don't claim to have ten years of Java experience. Oh, what sweet memories I recall of my first copy/paste Java applet - Java applets were ahead of their time.

BTW2, I've seen such claims from some companies: "Over 40 years of management experience put together." Do you believe them? What does it even mean?


12:32 AM | 0 comment(s) |


Google's Violentainment
Wednesday, November 16, 2005

It's easy to point judgmental fingers at Google nowadays. The company touches almost everything that matters on the Internet. They do search, news, maps, directions, online searchable books, pictures, videos, desktop search, web site analysis, etc.

They have become so good at deploying usable software applications that even Microsoft thinks they are a big threat to its bread and butter.

With Google having so much influence in the new millennium, will Google's mistakes have any influence on global culture? It's probably an answer left to the ages and the social archaeologists of the future. Still, we should start asking such silly questions now.

Hence, my questions: I read news.google.com almost every day, as I'm sure many other people do. When I read the "Entertainment" section of news.google.com, I saw the following headline: "Neighbor: Man threatened to take down cops" (one of the latest gun-related tragedies to take hold of US news).

Can it be confusing to some of the more gullible Internauts?

Is Google trying to desensitize us and confuse our reality by mixing up real human tragedy with entertainment? With intent? I doubt it was with intent - It was probably another lazy pigeon.

Physical comedy is quite acceptable - Who doesn't like the "Three Stooges"? But, some mistakes should be fixed - I most certainly wouldn't want my tragedy to be the source of amusement, unless I'm being compensated for it. "I'll roshambo you for it."



8:35 PM | 0 comment(s) |


How to make fast food unfast
Friday, November 11, 2005

The appeal of fast food joints is, literally, the quick service, i.e., you order something, you pay for it, and you expect it to be ready as you get your change back.

It's not that complicated of an equation, I think: the faster you make a burger, the faster you can serve a new client - And in the process make the client happy because, in our time-saturated North American lifestyle, time is money.

So, I go to a local burger joint and duly fall in line to order my "fast" lunch. I look around and see a new promotion of "toasted" sandwiches. I see the "Steak Sandwich" and proceed to order it. The cashier informs me that it could take four to five minutes to prepare. I nod indicating that it is fine, I can wait - There is nothing that will stop me from enjoying that steak sandwich.

Five minutes go by; then six; then seven. Around the eighth minute my lunch is ready, but I'm left questioning the "fast food" oath of serving me greasy (and unhealthy, if eaten without moderation) fast food - After all, I have the right to get fatter as quickly as I want to.

I put my thinking hat on to dispense the only thought of the year and ponder the joint's strategy of introducing a lunch item that takes so long to prepare that the customer-greasy-food relationship is now in jeopardy, for they made me wait eight whole minutes to eat my food.

I understand that they need to jump on the new "toasted sandwich" bandwagon, as everyone and their dog is introducing these toasted things.

Have you noticed there are a lot more "new" oven-toasted lunch choices? As a marketplace, do we really need that many toasted sandwiches for lunch? Well, it's like the tooth-whitening thing - How white can my teeth really get?

But I digress... I almost canceled my order, with the thought of acquiring a trusty original burger - That thing is ready in fifteen seconds - Perhaps it's from a different type of cow, since it takes so little time to cook.

As far as business strategies go, this one was probably not well tested in a real-world situation, i.e., rush hour. Imagine that, by chance, the next twenty people in line order the steak sandwich for lunch. Yeap. It's a linearly growing function - I personally wouldn't want to be the twentieth person in line.

To their strategists' credit, this is probably a temporary promotion to compete with the other "fast" food giants and their toasted treats - The promotion does have the feel of a "me too" strategy in order to have one more toasted meal option to capture a piece of the lunch market.

As for the success of the strategy of introducing a new sandwich into the mix, it did work in my case - They have my six dollars, but it's probably the last time I buy the steak sandwich from them: it took way too long to prepare.

If I were them, I would personally reconsider the fast-food oath they swore to uphold...


6:32 PM | 0 comment(s) |


Pricing Software - Modest How To Guide
Friday, November 04, 2005

Pricing software products is not rocket science, but it is a mixture of computer science, software engineering, and accounting.

In order to understand internal software pricing issues, we need to first understand the costs of production: In almost all software development undertakings, the main cost driver is the work-hours necessary to complete the system.

In this case, a cost driver is an activity whose cost increases or decreases as the volume of the activity increases or decreases. I.e. The greater the number of hours you spend engineering the product, the greater the cost.

While building a software product, it makes no difference what operating system you are working under (Windows or Linux); what architecture you are following (.NET or J2EE); what programming language you are using (C++ or Java). The underlying cost of your development effort will always be dependent on work-hours required.

So, knowing that the number of engineering hours is directly proportional to the cost of building software, how, then, do you price a complete product - either a shrink-wrapped application or a custom application?

There are different ways you can arrive at the price of a product. The easiest method is the "Cost-Plus" methodology: find the total cost of production, then add a bit more on top, depending on the profits you want to generate.
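As a quick sketch of the idea (the class name and all the figures below are invented just for illustration), the Cost-Plus calculation really is just arithmetic:

    // Cost-Plus sketch: total cost of production plus a desired markup.
    // All numbers are hypothetical.
    public class CostPlusPricing {

        // price = total cost * (1 + desired profit margin)
        static double costPlusPrice(double totalCost, double desiredMargin) {
            return totalCost * (1 + desiredMargin);
        }

        public static void main(String[] args) {
            double totalCost = 150000.0; // fixed costs + variable costs
            System.out.println(costPlusPrice(totalCost, 0.25)); // prints 187500.0
        }
    }

The hard part, as we'll see below, is not the markup; it's knowing the total cost in the first place.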

The costs of manufacturing or building anything can be classified as Fixed Costs and Variable Costs.

Fixed Costs
Fixed costs are not affected by cost drivers - This means that you can be engineering four different software products at the same time and still only pay for one building's worth of office space, as long as all your employees fit in one building.

Fixed costs include, but are not limited to: rent of your office space, computer equipment, salaries of your executive team, and sometimes your own workforce's salary.

Variable Costs
Variable costs are directly proportional to cost drivers. I.e. The more work-hours you require to complete a project, the higher the variable cost.

Determining either cost is a matter of classification: fixed costs are more in tune with operational costs; variable costs, on the other hand, are attached to cost drivers. How, then, do we determine the variable costs of building software in a software engineering environment?

In order to be as accurate as possible when determining variable costs, you need a good understanding of Requirements Engineering and must be able to generate accurate schedule estimates.

It all boils down to what the system is supposed to do and who will develop it. I.e. The more you know about both the needs of the client and the skills of your team, the more accurate your estimates of variable costs will be.

Accuracy of estimates
Estimates are necessary in order to price application development. For example, I would like to know whether a particular undertaking will be profitable or not. In other words, we need to calculate costs before the product is built; and the more accurate the estimate, the better the price will reflect the total costs and desired profit margins.

One of the biggest problems we face due to inaccurate estimates, aside from stakeholders unhappy about receiving late software, is free work.

We all know that there is no such thing as "free work" - It's similar to the law of conservation of energy: energy can be transformed, but neither created nor destroyed - In the end, someone has to pay for the "free" part.

For example, if the price of your product is based on your engineering department's estimate of 100 hours of work, and the production task is actually completed after 150 hours of effort, then the extra 50 hours are, essentially, "free" work.

Who pays for the extra 50 hours? The 50 hours of "free work" eat up your profit margins. I.e. Your employees are being paid regardless of the tardiness (a fixed cost), so, one way or another, your company is paying in lost productivity.
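To make the arithmetic concrete (the hourly cost and the quoted price below are made up for the example), here is a sketch of what those 50 unbilled hours do to the margin:

    // Sketch of how an estimate overrun eats the profit margin.
    // All figures are hypothetical.
    public class FreeWorkExample {
        public static void main(String[] args) {
            double hourlyCost = 75.0;      // what one engineering hour costs us
            double estimatedHours = 100.0; // the estimate the price was based on
            double actualHours = 150.0;    // what the work really took
            double price = 10000.0;        // quoted from the 100-hour estimate

            double plannedProfit = price - estimatedHours * hourlyCost; // 2500.0
            double actualProfit = price - actualHours * hourlyCost;     // -1250.0

            System.out.println("Planned profit: " + plannedProfit);
            System.out.println("Actual profit: " + actualProfit);
        }
    }

In this made-up case, the "free" 50 hours don't just shrink the margin; they turn the project into a loss.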

Using the past to predict the future
What about using historical data to make estimates and pricing decisions? This, in fact, works very well if, and only if, you have accurate historical data.

If your company or software shop has been around for some time and has been using the same workforce from project to project, it has the advantage of historical data, or organizational memory, of past performances. I.e. Knowing how long it took to complete a particular project gives you a good reference for quoting a similar project as accurately as possible.

Sometimes, though, using historical data is much harder than we would expect, as most times history is only available as an afterthought in some software manager's memory.

This very reason, enterprise amnesia, has led to the proliferation of formal documentation methodologies in modern Software Engineering parlance. I.e. CMM initiatives look very appealing, since the more you know about, and have documented, your processes, the better you'll become at predicting costs and pricing.

As a recommendation (I'm sure we all do it), we should always keep metrics on past projects' results in order to make comparisons from project to project - Even if these tracking methods lack the sophistication and standards that CMM (or any other framework) gives you.

Note that there are other estimation methods that promise to yield accurate variable costs. You've probably heard of some of them: LOC (Lines of Code), function points, etc. They are all dandy and have professional-looking formulas with different ratios you can use in different situations; however, who is to say that 10 lines of code in a week are not productive? In other words, be cautious when using such methods - They are not applicable to every case.

Total Cost
And we've arrived at the simplified equation:
    Total Cost of Software Development = Fixed Costs + Variable Costs.
Answering the question of pricing becomes a matter of figuring out the break-even point, or the exact volume of "whatevers" you need to sell in order to recuperate your costs of production. In other words, how many "whatevers" you need to sell to make 0 (zero) profit. Once you know this number (I mean, who wants to generate zero profit?), it's easy to see how you determine profits above your costs: how many units to sell, and at what price, to generate, say, 25% profit.

In most cases, you determine the break-even point as follows:
    Break-Even Point = Fixed Costs / Contribution Margin per Unit,
    where Contribution Margin per Unit = Revenue per Unit - Variable Cost per Unit
With such a simple formula, you wonder why it is so hard to come up with a good pricing model for software.

The answer is painfully obvious: in Software Engineering we don't have a "unit" of software, nor do we know exactly how much our variable costs will be, as we are always working from estimates which, as discussed above, may or may not be accurate. The formula is there; it just needs to be properly understood to be applied to a software product.
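Still, as a sketch only - pretending for a moment that a single license is our "unit" and that every figure below is invented - applying the formula looks like this:

    // Break-even sketch, treating one hypothetical license as the "unit".
    // All figures are invented for illustration.
    public class BreakEvenExample {
        public static void main(String[] args) {
            double fixedCosts = 200000.0;       // rent, equipment, base salaries
            double revenuePerUnit = 500.0;      // price of one license
            double variableCostPerUnit = 100.0; // support/distribution per license

            double contributionMargin = revenuePerUnit - variableCostPerUnit; // 400.0
            double breakEvenUnits = fixedCosts / contributionMargin;          // 500.0

            System.out.println("Licenses to sell to break even: " + breakEvenUnits);
        }
    }

Sell fewer than 500 of these imaginary licenses and you're underwater; everything sold past that point contributes to profit.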

Will we ever be able to accurately price our software product to be competitive in the market place and, at the same time, generate profits in the long run? The answer is a big resounding yes - Many companies are very successful software vendors, so it is possible.

In addition, new models of distribution allow new software companies to take a hit on production cost, to be made up later in sales volume and 0 (zero) distribution cost. I.e. Distributing software via the Internet is quite cost-effective - Forget about paying retailers a cut of revenues for a measly 3D (physical) point of sale - Who buys software from a software retailer, anyway?

Total cost of production, then, depends on the what (Requirements Engineering) and the who (the team implementing the requirements) - Everything else around product production becomes the how (methodologies), which is part of your cost drivers - Which we now understand how to calculate.

Will the simplified total cost and break-even point formulas work in every case? Probably. But you have to look at your overall operation to come up with all the fixed and variable costs - Do you need help arriving at these costs within your organization or for future projects?

Notes:
  1. In order to make this read manageable, I have ignored all market theory, as my intention is to bring out the internal issues of costing software products and not to discuss micro/macroeconomics.

  2. When we are discussing fixed costs, we need to understand that, given enough elapsed time, nothing will remain a fixed cost. Hence, a fixed cost is only relevant within a specified period of time or volume of whatever you are building.

    Another important point to keep in mind is that determining what a fixed cost is varies from company to company.

  3. I'm not getting into the issues of proper team design, but you must realize that highly effective software development teams need good direction and leadership - Whether as a Software Architect or Project Manager, someone has to estimate and schedule the work - I think the task of estimation, though, is more appropriately executed by someone with technical skills, rather than only managerial skills. I.e. It's the Architect's or Team Lead's responsibility to estimate project schedules, of course, with the input of team members. In more detail, estimate the overall time with the aid of each engineer's estimates for his or her specific tasks - The parts of the estimate make the whole.


7:45 AM | 2 comment(s) |

