Thursday, February 21, 2013

Programmers are not interchangeable

[UPDATE: just found this in my drafts folder from a few months ago - not sure why I never published it at the time, but since YJ is out-and-proud, I thought I should push the button.]

This 2007 essay by Paul Graham (the Y Combinator founder) has been doing the rounds recently (trending on HN), and it struck a chord with me because of the extract below, and because of the new project with which I am involved - YunoJuno (www.yunojuno.com).

YJ is founded on precisely this belief - that certain 'craft skills' (design, development, UX, etc.) are not interchangeable, and that the success of a project depends very largely on the specific individuals involved.

To that end, we want to create exactly the kind of relationship that Paul outlines below - "Maybe we could define a new kind of organization that combined the efforts of individuals without requiring them to be interchangeable." 

It's a vision we share, and one that I hope YunoJuno will be a part of (specifically the part that looks after the individuals). If you have a skill, and you want to be treated as an individual, but also to feel part of something greater, then head on over and 'Join the Family'.

(We were in beta when I wrote this, planning to launch in earnest in the new year. Not any more - we are open for business.)
One of the defining qualities of organizations since there have been such a thing is to treat individuals as interchangeable parts. This works well for more parallelizable tasks, like fighting wars. For most of history a well-drilled army of professional soldiers could be counted on to beat an army of individual warriors, no matter how valorous. But having ideas is not very parallelizable. 
And that's what programs are: ideas. 
It's not merely true that organizations dislike the idea of depending on individual genius, it's a tautology. It's part of the definition of an organization not to. Of our current concept of an organization, at least. 
Maybe we could define a new kind of organization that combined the efforts of individuals without requiring them to be interchangeable. Arguably a market is such a form of organization, though it may be more accurate to describe a market as a degenerate case—as what you get by default when organization isn't possible. 
Probably the best we'll do is some kind of hack, like making the programming parts of an organization work differently from the rest. Perhaps the optimal solution is for big companies not even to try to develop ideas in house, but simply to buy them. But regardless of what the solution turns out to be, the first step is to realize there's a problem. There is a contradiction in the very phrase "software company." The two words are pulling in opposite directions. Any good programmer in a large organization is going to be at odds with it, because organizations are designed to prevent what programmers strive for.

Desktop ambivalence (in a post-OS world)


There was a time when your choice of desktop OS said something about you - you were a Windows nerd or an Apple hipster - hell, Apple even based an entire advertising campaign on it. That was then. This is now.

A couple of years ago I was exclusively a Windows person. I wrote software for the .NET platform, which made Visual Studio my tool of choice, and the software I wrote ran on Windows servers. Even if I was working with VMs, I was running Windows Server VMs inside my Windows desktop OS.

I was always aware of Apple, but as I wasn't a hipster, and had never opened Photoshop, I left them to one side. When I was forced to use OSX I felt quite strongly that it was inferior to Windows.

Then I went to work at a creative agency, where I was swimming in an ocean of Mac-luvin'. I started using one on the odd occasions when I needed to borrow an office laptop. I still didn't like OSX, but I could get along with it.

At the same time I started playing around with Python, and I decided to scratch an itch I had around my complete lack of understanding of Linux as an OS. So I installed Ubuntu on a VM, and started playing with that.

Then I was given an iPad by work. And I bought myself a Windows 7 phone. So now I had Windows 7, Windows Phone 7, iOS, OSX, and Ubuntu in my life. I even installed Windows 8 when it came out (and paid for the upgrade!), but uninstalled it a week later (you can read about that here).

I finally realised that I had become ambivalent about the OS - and started concentrating instead on the tools I was using. I dropped Visual Studio in favour of Sublime Text, and dropped Word in favour of anything that worked with Markdown (which is pretty much anything).

I still don't like OSX - the fact that maximising a window doesn't, you know, maximise it, and the woeful Finder being my pet hates - but I bought myself a MacBook Air as my primary computer simply because I like the hardware, and frankly, I don't really care about the OS.

A beautifully designed piece of hardware running beautifully designed, and simple, software (stand up iA Writer), running on an OS that just gets out of my way, is all I need these days. Are we in the post-OS-as-brand world?

Postscript: the logical conclusion of this is no (discernible) OS at all. Over Christmas I thought I'd be a good son and back up the laptop I bought for my mother a couple of years ago. She uses it infrequently, but enough, and I figured she probably had something on it worth saving. In fact there was nothing - at all - on it. Every single thing she does on her computer is web-based. She is the Chrome OS poster child (grandmother).