We Learn By Changing Our Minds
Like plenty of others around that time, I was laid off at the end of
September 2008.1 My then-employer was running out of money. Fortunately, I
had some savings to rely on and I was able to take on a bit of freelance work as
well to keep afloat while I looked for the next gig. But it was late in the year,
the US stock market was crashing, and I lived in a small town with
relatively few technology jobs.
The day after Christmas that year, a local tech CEO invited me to meet for
coffee, which I happily accepted. I was grateful for the professional connection
and figured that, if things went well, I could maybe pick up some freelance
work. There was one slight wrinkle, though—they were a dyed-in-the-wool
Microsoft shop. And I had fairly strong, somewhat public, negative feelings
about Microsoft technologies.
Now, I came of age in the software industry during Slashdot's heyday in the late
1990s, and I aligned philosophically, then as now, with those advocating for open,
non-proprietary technologies. Microsoft, on the other hand, wanted to control it
all. Bill Gates had written a book advocating private communication networks to
supplant the open internet, Microsoft was under investigation by the DOJ for
anti-trust violations, and their attitude toward the FOSS world was actively
hostile. In short, Microsoft sought to destroy the very future I
hoped for. I'd also spent a fair bit of time working with Microsoft's software
in corporate IT contexts, and found it buggy and cumbersome. And the licensing
model remains a nightmare for anyone except Microsoft.2
"I avoid Microsoft technology like the plague"
As our coffee wrapped up, the CEO and I spent a few moments discussing
transitioning their server infrastructure from Windows to Linux or FreeBSD, which
he fairly quickly dismissed; I assured him I'd continue to suggest it. Coffee
morphed into an interview, with the CEO handing me off to an impromptu panel
interview with the engineering leadership. In the interview, one of the managers
on the panel decided to bring up my personal website and came across a sentence
he wanted me to explain—"I avoid Microsoft technology like the plague."3 I
hand-waved away the more political aspects of the topic and focused instead on
one item—the cumbersome implications of Microsoft's licensing model on software
architecture for multi-tenant web-based software.4 On New Year's Eve 2008,
after another more focused interview, they offered me a job, which I accepted.
When I decided to accept the job, I made an explicit choice to challenge any
technological prejudices which had come along for the ride but which were not
truly connected to any deeply held principles. I ran Windows Vista (which
sucked), I used Visual Studio (which was a revelation), I used IIS (which
sucked), and I learned C#, which remains one of my favorite languages to work
in.
"The Mac Team Has More Fun"5
The autumn before I joined, the company had released an initial Mac version of
its software. Development of this initial version had been contracted out to a
separate shop, and few in the company took it seriously. Despite a resurgence of
Mac use over the previous 10 years, particularly among technologists, the
pervasive view within the company was that Macs and their users weren't worth
the trouble.
But the initial Mac version had demonstrated that there was enough of a market
to invest further. The company's flagship Windows desktop software was in the
final year of a multi-year re-platforming and set to launch the following fall.
After some initial prototyping by the one engineer who believed in the Mac
project, a decision was made to build, in-house, a Mac version aligned with the
Windows version. This was a major undertaking that would require adapting a
20-year-old, Windows-only C++ codebase
to run on the Mac and also figuring out a way to share a substantial portion of
the C#/.NET code written for the soon-to-be-released Windows version.
A small team, most of whom joined the company around the same time I did, formed
around the leadership of that one engineer who believed in the Mac version. I
happened to sit near this group, which gave me a front-row seat as they suffered
through the first nine months of wrangling that C++ codebase and figuring out how to
wield that shared C# code by embedding Mono and building a native Cocoa UI
around it.6
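For readers curious what "embedding Mono" means in practice, here is a minimal
sketch, not the team's actual code: the assembly, namespace, class, and method
names below are hypothetical placeholders. The idea is to host the Mono runtime
inside the native process, load the shared C# assembly, and invoke managed
methods from C, with the Cocoa UI layered on top.

```c
#include <mono/jit/jit.h>
#include <mono/metadata/assembly.h>
#include <mono/metadata/object.h>

int main(void) {
    // Start the embedded Mono runtime and load the shared C# assembly.
    // "SharedCore" is a hypothetical name for illustration only.
    MonoDomain *domain = mono_jit_init("SharedCore");
    MonoAssembly *assembly = mono_domain_assembly_open(domain, "SharedCore.dll");
    MonoImage *image = mono_assembly_get_image(assembly);

    // Look up a managed class and a zero-argument method by name.
    MonoClass *klass = mono_class_from_name(image, "SharedCore", "Calculator");
    MonoMethod *method = mono_class_get_method_from_name(klass, "Run", 0);

    // Call into the C# code from native code; a Cocoa UI would sit above this layer.
    MonoObject *exception = NULL;
    mono_runtime_invoke(method, NULL, NULL, &exception);

    mono_jit_cleanup(domain);
    return 0;
}
```

A real integration also has to marshal data and exceptions back and forth across
the native/managed boundary, which is where most of the complexity lives.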
The new Windows version, which I had been working on since March, shipped in
November 2009, and the Mac team was expanding. I'd been a Mac user since 1995
and had grown fond of the team, so I was happy to join them when asked. The same
week the new Windows version launched, the team released a barely functional
alpha version of the new Mac app, which some customers actually downloaded and
installed. We released a new alpha almost weekly for twenty-something releases.
In Summer 2010, we finally released what we called "Beta 1". Within a week of
release, nearly 30% of active users were running the Mac beta.
As we had discovered during the Alpha period, the Mac market for our software
was actually sizable. That 30% number remained fairly consistent over the years
I remained with the company. We shipped the first GM release in early October
2010, and I moved on from the Mac team in February 2011.
How can we learn, except by changing our minds?
I stayed with that company for a little more than five years after leaving the
Mac team, during which I was able to collaborate with some truly excellent
engineers on work that I continue to be proud of years later. And along the way,
many beliefs—which would have at one time been axiomatic for me or others—were
challenged and, ultimately, revised or set aside (disconfirmed?) because they
didn't hold up under pressure.7 Our customers were better for it. We were
individually better for it. The company was better for it.
Over the course of my career, confirmation bias has been one of the
most common sources of problems for individuals and organizations. A particularly
pernicious manifestation occurs when a strong but context-sensitive opinion
or belief is elevated to the level of a fundamental value or principle,
independent of context.
Dogma can serve as a powerful unifying force, which, for a time, may seem
useful. But dogma makes us brittle—like shooting the moon, one essentially has
to be perfect, which is unlikely. Our brains are wired to preserve our sense of
integrity, so broad dogma also makes us more susceptible to a host of other
cognitive biases, including the backfire effect and belief
bias. When a group does it, it can add the in-group bias as
well.
To work in an organization whose value system is unacceptable or incompatible
with one’s own condemns a person both to frustration and to nonperformance.
—Peter Drucker, Managing Oneself
For individuals, this can be challenging enough, as the ability to find a
compatible organization or team is inversely proportional to the number of
opinions-cum-moral-imperatives one holds. For organizations, the risk is often
existential—adapting to changes which invalidate those core beliefs is nearly
impossible without painfully undoing the unity built around that shared belief,
sometimes unmaking the organization in the process.
I have found Paul Saffo's Strong Opinions, Weakly Held and Richard
Feynman's admonitions against 'cargo cult science' to be instructive and
helpful here. For myself, I have made it a practice to regularly challenge
those of my own beliefs which are falsifiable, and to wrestle with which of my
primary beliefs ought to become secondary based on new information.
Growth is literally the stuff of life. Dogma inhibits growth by inhibiting
learning, which makes us brittle. Keep it to a minimum.