Sunday, August 12, 2012

Cooperation vs. Competition

I read this excellent article in the July issue of Scientific American, called “The Evolution of Cooperation”. The article was centered on game theory, but its essence helped me better frame my understanding of several deep issues. Often there are interesting, abstract, high-level constructs that, once understood, nicely overlay a sea of information and lead to a better understanding.

Our modern era stresses the benefits of competition. From the earliest ideas of a ‘free hand’ that would set people on a level playing field, we have been inundated with the notion that things would be better if we were freer to compete. Those notions have always bothered me, since in order to gain the upper hand, people inevitably bend towards pushing the boundaries of the rules, eventually going too far. Thus it is no surprise that the Olympics needs very serious drug testing, since the rewards of winning often outweigh the risks of getting caught. Competition may start out fair, but there are always people willing to abuse it, and gradually, as more of them do better, the rest follow. Eventually it always degrades.

So these philosophies of unfettered competition either sound naive or seem to be pushed by people who are already at the margins of fairness. They’re already bending the rules, and they’d like to get away with bending them harder. I do hear what they are saying, but I suspect it is either self-serving spin or a failure to accept the full range of expected behaviors from our species.

What the article clarified is that competition really only exists on the back of cooperation. That is, if there are no rules, there are no rules to bend; there is no competition, just chaos. So in order to compete, everyone first has to agree on a set of rules. Cooperation is thus by far the deeper principle: it must be in place first. In that sense it seems more than obvious as a concept; after all, we’ve come together to form societies, countries, companies, etc., and these entities are all held together by rules that define how one should behave within them. By sheer scale, it seems that we cooperate far more often than we compete, but if one only reads modern literature, it would seem as if the opposite were true. No doubt that is because the adherents of competition are more vocal than those of cooperation.

What fascinated me about the article was that its game theory models predicted that cooperation tends to reinforce cooperation. That is, if it takes root somewhere, it tends to grow, and while the two ebb and flow with respect to each other, the formal model balances them out in its steady state. Again, not really a surprise, but it does heavily contradict those who preach progress through increased competition. It points towards that philosophy being unbalanced (and thus unsustainable).
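
The article’s actual models aren’t reproduced here, but a minimal sketch of the classic iterated prisoner’s dilemma (standard payoffs, two toy strategies, all of it my own illustration) shows the dynamic: mutual cooperation compounds round after round, while defection buys a one-round advantage and then stagnates.

    # Illustrative only, not the article's model: an iterated prisoner's dilemma
    # with the standard payoff matrix, comparing a reciprocating strategy
    # ("tit for tat") against unconditional defection.
    PAYOFF = {  # (my move, their move) -> my score; C = cooperate, D = defect
        ("C", "C"): 3, ("C", "D"): 0,
        ("D", "C"): 5, ("D", "D"): 1,
    }

    def tit_for_tat(opponent_history):
        """Cooperate first, then copy whatever the opponent did last."""
        return "C" if not opponent_history else opponent_history[-1]

    def always_defect(opponent_history):
        return "D"

    def play(strategy_a, strategy_b, rounds=100):
        score_a = score_b = 0
        moves_a, moves_b = [], []
        for _ in range(rounds):
            a = strategy_a(moves_b)          # each strategy sees the other's past moves
            b = strategy_b(moves_a)
            score_a += PAYOFF[(a, b)]
            score_b += PAYOFF[(b, a)]
            moves_a.append(a)
            moves_b.append(b)
        return score_a, score_b

    # Two cooperators reinforce each other: (300, 300).
    print(play(tit_for_tat, tit_for_tat))
    # A defector gains a small first-round edge, then both sides stall: (104, 99).
    print(play(always_defect, tit_for_tat))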

So far, all of what I’ve said applies generally across the behavior of our organizations; what, you may be wondering, does it have to do with software? The underlying cultures of software development have always tended towards programmers getting more ‘freedom’. That’s a recurring theme, and one that has always bothered me. Freedom in its simplest form is just a lack of rules, and as society shouts so frequently, a lack of rules means that it is easier to compete. So the essence of our cultural values in programming is individuals competing against each other, in the guise of being able to push their own creative boundaries for their solutions. It’s every programmer for themselves.

That is fine when the output is limited to the amount of work a single individual can do. If we wrote programs that were no larger than one man-year of effort, programmers would do best if they were free to code them in whatever eclectic manner they chose. However, the era of small software ended a long time ago. What we most often build now requires a huge number of man-years of effort. Big systems. The little stuff exists already; it is the big stuff we are struggling with.

In all the different projects I’ve worked on, one thing has always remained true: if the project is large, it will absolutely fail if the underlying programmers don’t all get on the same page together. That is, it is a team effort, and a poorly functioning team will never be successful. It doesn’t actually matter what the ‘page’ is; it doesn’t matter what the standards, style, architecture or technology are. All that matters is that a group of people come together and agree on how the system will be constructed; otherwise the process of constructing it quickly breaks down and the whole thing fails.

One easy way to determine who is really on a team is by what they say. The members of a team never lie to each other, even if what is being said is not pleasant. Lying is a deliberate misdirection, something you might do to your competitors to maintain an advantage over them. Telling the truth as you know it (even if it turns out to be wrong) is what you do when you are cooperating: you are trading your understanding for theirs, so that it is shared and everyone gets on the same page. Withholding information, particularly in this case, is a form of lying. If you let an opportunity pass without speaking up about something important, then you are doing so for competitive reasons. You are deliberately choosing not to alter the direction, and you are essentially lying by implying that you have no ‘other’ information to share.

Given a huge rise in expectations for software, and that the things we seek to build far exceed the complexity manageable by a single individual, getting better results for software development comes directly from getting programmers to cooperate more with each other. However, our culture (and certainly many of the discussions on the web) shows that we programmers are highly competitive. We’d rather have our freedoms, and maximize them, even at the expense of the whole project, or the industry, failing. And when that sort of problem happens, we’re quick to blame each other, rather than accepting that the root cause was a failure (on our part and theirs) to get onto the same page and cooperate with each other. We see this over and over in the industry: excessive squabbling about how ‘theirs’ is wrong and ‘ours’ is better, about ‘right’ and about ‘perfect’. Now, one doesn’t expect to get cooperation at every level in the industry, our species would never accept that, but what we’ve seen so far is that we are far more competitive than we should be. Everyone is out for their own personal glory (although ‘glory’ is not always defined in the same way), and as a result everyone heads out in their own unique direction, waving the banner of ‘freedom’.

The consequences of this behavior have been obvious. The failure rate for software is stunningly high, and there is more energy applied to telling others they are wrong than there is to civil discussion of what would be better. The number of programmers in the field has been increasing, but the number of innovations in software (not hardware) has been dwindling. We occasionally see some interesting new ideas, but generally they are confined to a small group of individuals, little pockets here and there. There hasn’t been a real major shift in technologies for well over a decade now. Just a few fragments.

So that article was really informative in laying a basis for learning how to improve things. Competition pushes us, but it also stagnates us. It motivates us, but it also limits us. If we want to move forward, the only way to do so is via cooperation, and in order to cooperate we all have to be willing to play by the same rules. The specifics of the rules don’t matter, just that we all play by them. That is sometimes a hard thing for people to accept, and certainly in our modern age it has become harder, but it really represents the only way to move forward. We can choose to all do things differently, but that choice also means we’ve put a cap on what we can do, and what we have now isn’t particularly impressive. Or we can come together to build really big, spectacular things. Most of us want to build better software, but in order to do that, we have to give up many of our cherished freedoms. Excellence comes with a price attached.

2 comments:

  1. Please feel free to add comments :-)

  2. I know this is an old post, but since you talked about something similar on your Irrational Focus blog, I thought it might still be top-of-mind.

    I've reflected on the idea that the reason we need so much agreement in software projects on things like what OS to use, what language to use, what API to use, and what design scheme to use, is the way our systems and languages are constructed: you can run into such serious incompatibilities that the work of one programmer won't function with the work of the others if they don't all agree on these factors. This has been mitigated somewhat with web standards, which are again an agreement of a different sort. The problem is they're a similar type of standard to what we've been using all along, and they don't adjust to change that well. There's still a lot that is very "sticky." You change a critical aspect, and most of the structure falls apart.

    It seems to me this is a problem with our software. We keep expecting standards in how pieces of software communicate to be the "glue" that holds everything together. The problem is we keep having to rewrite old functionality to fit into the new scheme, or we hack together some way of translating between how different modules communicate.

    When we find out that one method is insufficient, we want to change it, but we have this tremendous weight of the installed base to deal with, both in terms of volume of software, and the knowledge base in programmers' heads. So I can see the validity of the argument that too much cooperation is demanded. The thing is we need a better software architecture to really make more independent work possible. You can't do it with the runtime architectures that most use now.

    One idea I have in this regard is that with software modules, we need to move more towards descriptions of functionality, and depend less on labeling and hard jump points; the latter can be described in our current parlance as "early binding." What I mean is "description" that a computer can understand, which likely involves mathematics. I mean it in the sense that a code base can describe the (virtual) machine it expects to run on, which a "machine of machines" (or "runtime of runtimes") can adapt itself to, and it can provide descriptions of its own functionality, such that finding functionality becomes a computable process. What I mean to describe here isn't so much a new programming language, but rather a new way of packaging software modules, such that other software that wants to use their functionality doesn't have to conform to their way of doing things. The goal is that it won't matter what OS, language, or API programmers are using, in terms of code compatibility; what will matter more is standards of software/system behavior. Such a "runtime of runtimes" should be configurable so that it can accept or reject software packages that conform or don't conform to the behaviors it allows. Secondly, code modules should be able to access external functionality via a query that's computable, rather than having to be bound early either to code within a runtime environment that only recognizes one way of doing things, or to a communication standard that also only recognizes one way of communicating. The informal name I've given this idea is "free objects."

    I don't have the first clue about how to pursue this goal, but it's an idea that's been gelling in my head for a long time.
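
    As a purely illustrative sketch (the names and the Registry class below are entirely hypothetical, nothing like a real design), the "finding functionality becomes a computable process" part might look roughly like modules registering machine-readable descriptions of what they provide, with callers querying those descriptions instead of binding early to a named entry point:

        # Hypothetical sketch only: modules publish queryable descriptions of what
        # they provide, and callers look up functionality by description instead of
        # binding early to a hard-coded name or jump point.
        from dataclasses import dataclass, field
        from typing import Any, Callable, Dict, List

        @dataclass
        class Capability:
            provides: str                                              # e.g. "sort"
            properties: Dict[str, Any] = field(default_factory=dict)  # e.g. {"stable": True}
            implementation: Callable[..., Any] = None                  # the code behind the description

        class Registry:
            """Stand-in for the 'runtime of runtimes': it holds descriptions, answers
            queries, and could also reject capabilities whose behavior it disallows."""
            def __init__(self):
                self._capabilities: List[Capability] = []

            def register(self, cap: Capability):
                self._capabilities.append(cap)

            def find(self, provides: str, **required) -> Callable[..., Any]:
                """Return the first implementation whose description satisfies the query."""
                for cap in self._capabilities:
                    if cap.provides == provides and all(
                        cap.properties.get(k) == v for k, v in required.items()
                    ):
                        return cap.implementation
                raise LookupError(f"nothing provides {provides!r} with {required}")

        # A module registers what it does, not where it lives:
        registry = Registry()
        registry.register(Capability("sort", {"stable": True}, implementation=sorted))

        # A caller asks for behavior ("a stable sort") rather than importing a name:
        stable_sort = registry.find("sort", stable=True)
        print(stable_sort([3, 1, 2]))                                   # [1, 2, 3]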

