Creative Destruction in Software Development: As design becomes more critical to software, programmers must find new challenges to realize user value

I've been thinking a lot lately about the relationship between front-end design and programming in software development, and this article hit me at just the right time (thanks for tweeting it, Wren!). Basically, Allen argues that as programmers become more productive, more resources get spent on the indeterminate, creative tasks of design than on the increasingly well-understood and less error-prone tasks of programming. I'm generally inclined towards his idea of a designer-driven industry, because I'd love nothing more than to spend my time exercising my artistic side. However, I think he gets a few things wrong, because the state of our industry is not as cut and dried as he makes it out to be.

It would be impossible to positively address Allen's larger point without pointing out the elephants in the room. Yes, I laughed out loud at the mention of cookies as a data store. But the biggest pachyderm has to be the paltry sample size behind such a sweeping conclusion. One project does not an industry trend make. This guy works with Microsoft frameworks; surely he's seen bigger, more involved projects than the one he describes as spending more time in design than in development. The more complex the functional requirements, the longer it takes to implement them, and productivity gains can only shave so many man-hours off an enterprise project.

Any well-understood problem and solution will be faster to implement than one that takes a lot of planning. In the web world, we developers spend a lot of time developing the same basic applications over and over. While they don't all have the same function to the user, the relationships between the data entities on the backend are often pretty damn similar. How many of us build apps with account models? How many have entities that aggregate other entities? The reason ActiveRecord can model these joins and finds so well is that there aren't too many permutations of the data relationships in the database-backed web app industry.
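
To make that concrete, here's a minimal sketch of the shape I'm describing - an account that aggregates other records - assuming the activerecord and sqlite3 gems and using placeholder model names; the has_many/belongs_to pairing and the join query at the end are what nearly every one of these apps repeats:

    # A minimal sketch of the "account that aggregates other entities" pattern.
    # Assumes the activerecord and sqlite3 gems; model names are placeholders.
    require "active_record"

    ActiveRecord::Base.establish_connection(adapter: "sqlite3", database: ":memory:")

    ActiveRecord::Schema.define do
      create_table :accounts, force: true do |t|
        t.string :email
      end
      create_table :posts, force: true do |t|
        t.references :account
        t.string :title
      end
    end

    class Account < ActiveRecord::Base
      has_many :posts            # the aggregation nearly every app declares
    end

    class Post < ActiveRecord::Base
      belongs_to :account
    end

    account = Account.create!(email: "someone@example.com")
    account.posts.create!(title: "Hello")

    # The join and find that ActiveRecord generates for free:
    puts Account.joins(:posts).where(posts: { title: "Hello" }).count  # => 1

Swap posts for photos, orders, or status updates and the schema barely changes - which is exactly the point.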

Of course the world doesn't really need another CMS built from scratch, which is why we have fancy frameworks like Django and Rails to help with the tedious stuff. But they are able to make this work easier precisely because it gets done so often. If these frameworks and tools allow us to focus on where we "add value", reflection might lead us to conclude that a lot of that value is relatively marginal, resulting from us following conventions, copying the work of many others, and not truly charting new territory.

This isn't to say that we all need to become geniuses. Nothing wrong with borrowing code here and there and implementing yet another e-store. And to the extent projects use the same data models and incorporate algorithms that get used over and over, solutions will become more and more similar and easily realized. One could even say the software becomes commodified, a phenomenon that Tim O'Reilly discussed in his seminal 2004 paper on open source software.

To the same degree that O'Reilly found that commodified hardware placed the power of the computer industry in the hands of software developers (see Microsoft's meteoric rise to dominance, for instance), perhaps commodified software solutions are placing the power in the hands of designers. After all, if the plumbing is all the same, differentiating the product in a hyper-competitive market inevitably relies on a unique, compelling look and a novel, more usable interface for manipulating the same data and functionality. Moreover, designers still have creativity and freshness to inject even if we coders are tapped out.

I agree with Allen that this is a good thing if it is indeed the core phenomenon. It means developers are doing their job right: we are becoming more efficient and consistent at making computers easily employable for the business goals they were created to serve. There's more to it than that, though - true innovation in software is probably rarer than in design (perhaps I speak from ignorance of the design world) and we end up using the same software design patterns over and over. Look at the explosion of social networking apps - all with similar data models, all clustering around the same messaging systems, all having a Twitter-like status update feature, all modeling friendships, and so on.

The task of defining core business requirements unites designers and developers (and managers and testers) in a common goal, and to the extent we're converging on better processes and eschewing the need for specialization, we are realizing a relatively old utopia of ubiquitous computer programmability. There's always been a dream of making instructing computers a matter of mere design - where creating and refining the interface is not an afterthought but the primary manner in which logic flows are defined. I'm totally committed to reaching that world (viva Schumpeter!), where computers cease to be these mystic systems only the priesthood of programmers can invoke through their object-oriented incantations, and people are freed to design and imagineer instead of holding the CPU's hand.

Indeed, what we're talking about is the shifting of creativity from algorithmic programming to visual design - from pleading with the computer to pleading with the user's brain. This creative problem solving with respect to the intangible requirements of aesthetics and cognitive processes is what design is all about. At work, we use WordPress for simple sites because most customers just need basic content management capabilities. Given that common requirement set, coupled with a large array of plugins to customize functionality further, it's a great way to use our designers' talents to drive value rather than having developers reinvent the wheel over and over.

But this raises the question: why are developers constantly reinventing the wheel instead of blazing new trails? Web application requirements have become stale and predictable, with everybody and their broker wanting a Facebook clone. Too often, developers have not taken the initiative to lead the way in creating new software paradigms with their unique understanding of computer architecture and logic models. This is especially egregious when we do what's technically safe rather than bothering to learn the skills that best suit the business ends of the project. We should be raising the bar of what we all expect from the computer, instead of just getting better at doing the same ol' apps. To the extent that we've let dull clients set the standards, we developers have commodified ourselves. And, hey, the same goes for designers if you look at other industries that have been around for more than 30 years.

There's also the matter of our discipline - the programming of computers - not just becoming more standardized, but also better planned. After decades of struggling to clearly articulate functional and non-functional requirements, unit-level expectations, and invariant definitions, we've made progress (at least, the best of us have). The article's author mentions how we spend more time in testing than we ever did before, and this can be coupled with a better, more consistent understanding of what problems we're solving and what customer needs are being met. We've become better players of the game because we've better defined what it means to win.
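
As a small, hedged illustration of what "defining what it means to win" looks like on the programming side, here's a unit-level expectation written with Minitest from Ruby's standard library; the Cart class and its discount rule are invented for the example, but this kind of executable expectation is exactly the artifact most projects lacked a couple of decades ago:

    # A hypothetical business rule pinned down as an executable expectation.
    # Uses Minitest from Ruby's standard library; Cart is invented for illustration.
    require "minitest/autorun"

    class Cart
      def initialize
        @prices = []
      end

      def add(price)
        @prices << price
      end

      # Invented rule under test: orders of $100 or more get 10% off.
      def total
        subtotal = @prices.sum
        subtotal >= 100 ? subtotal * 0.9 : subtotal
      end
    end

    class CartTest < Minitest::Test
      def test_discount_applies_at_one_hundred_dollars
        cart = Cart.new
        cart.add(60)
        cart.add(40)
        assert_in_delta 90.0, cart.total   # the requirement, stated as a check
      end

      def test_no_discount_below_the_threshold
        cart = Cart.new
        cart.add(99)
        assert_in_delta 99.0, cart.total
      end
    end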

By contrast, look at the design discipline. What does it mean to have arrived at "the right" design? How does one demonstrate that a particular interface layout is more usable than another? Certainly, there are approaches like A/B testing that can validate design decisions, but they aren't used nearly as often as programming tests are. Essentially, designers are where developers were twenty years ago. If developers have shed at least some of the aura of magic, designers have clung to their mystique. If costs accrue to the design and away from development, at least that is a function of the genuinely indeterminate nature of visual and user experience goals.
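
To be fair to the A/B testing I just mentioned, the math isn't the hard part. Here's a rough sketch of a two-proportion z-test on made-up signup numbers, just to show that "layout A converts better than layout B" can be stated and checked about as crisply as a failing unit test:

    # A rough sketch of a two-proportion z-test for an A/B comparison.
    # The visitor and signup counts below are made up for illustration.
    def z_score(conv_a, n_a, conv_b, n_b)
      p_a    = conv_a.to_f / n_a
      p_b    = conv_b.to_f / n_b
      pooled = (conv_a + conv_b).to_f / (n_a + n_b)
      stderr = Math.sqrt(pooled * (1 - pooled) * (1.0 / n_a + 1.0 / n_b))
      (p_a - p_b) / stderr
    end

    # Layout A: 120 signups out of 2,000 visitors; layout B: 90 out of 2,000.
    z = z_score(120, 2000, 90, 2000)
    puts format("z = %.2f", z)                     # => z = 2.13
    # |z| > 1.96 corresponds to p < 0.05 (two-tailed), so A's higher signup
    # rate is unlikely to be noise at this sample size.
    puts "significant at the 5% level" if z.abs > 1.96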

That's not to say this design authority is without problems. Designers are subject to the same conceit as developers when it comes to interfaces and user expectations, where we design for our own convenience instead of for the user. As Jaron Lanier argues in "The Serfdom of Crowds", an excerpt from his book "You Are Not a Gadget":

From an engineering point of view, the difference between a social-networking site and the Web as it existed before such sites is a small detail. You could always create a list of links to your friends on your website, and you could always send emails to a circle of friends announcing whatever you cared to. All the social-networking services offer is a prod to use the Web in a particular way, according to a particular philosophy. If you read something written by someone who used the term "single" in a custom-composed, unique sentence, you will inevitably get a first whiff of the subtle experience of the author, something you would not get from a multiple-choice database. The benefits of semi-automated self-presentation are illusory. Enlightened designers leave open the possibility of metaphysical specialness either in humans or in the potential for unforeseen creative processes that we can't yet capture in software systems. That kind of modesty is the signature quality of being human-centered. (emphasis added)

Although I'm twisting an anti-social-networking manifesto to my purposes, Lanier has a point. If web application software is becoming commodified, maybe it's because we're neglecting the new possibilities of "unforeseen creative processes we can't yet capture in software systems". We're expecting people to conform their needs to our supply, rather than meeting their demand.

Of course, design has - and has always had - a role to play in presenting these new systems to users, but designers are also guilty of the same reduction of the user to algorithms and processes that we are. We should be adapting to the user's needs, not the other way around. That is not solely a task for user experience specialists; it is something we developers need to think about more when we model domains or design APIs, because those constructs always constrain every other aspect of the product.

Our complacency is transforming the user against his or her will, making him or her a less creative and more predictable part of our system. But the way to make money in software is to fit our product into the user's life, not the user into our product. There are both design and programming aspects to this problem, but since we programmers are the ones communicating most intimately with the computer, I can't help but feel the majority of the blame lies with us.

Allen is looking in the right direction, but he's drawing too narrow a conclusion. If you're spending the majority of your time on marketing and packaging - which, at their most shallow, are what designers do - you're in an industry that is not innovating. Engineering has always been the gateway to new markets, new vectors for human creativity and expression, and profits. If we're spending more time solving design problems than engineering problems, then perhaps we need to find new engineering problems and raise the bar. That, after all, is supposed to be the upside of creative destruction.

Written on Thursday, February 04, 2010 | Tags: programming, web-applications